Mar 12 13:35:35.362142 ip-10-0-142-16 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Mar 12 13:35:35.362313 ip-10-0-142-16 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Mar 12 13:35:35.362451 ip-10-0-142-16 systemd[1]: kubelet.service: Failed with result 'resources'.
Mar 12 13:35:35.363006 ip-10-0-142-16 systemd[1]: Failed to start Kubernetes Kubelet.
Mar 12 13:35:45.362035 ip-10-0-142-16 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Mar 12 13:35:45.362058 ip-10-0-142-16 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 98ae652a6c06470fad697f48b621e293 --
Mar 12 13:37:44.347786 ip-10-0-142-16 systemd[1]: Starting Kubernetes Kubelet...
Mar 12 13:37:44.860240 ip-10-0-142-16 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 12 13:37:44.860240 ip-10-0-142-16 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 12 13:37:44.860240 ip-10-0-142-16 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 12 13:37:44.860240 ip-10-0-142-16 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Mar 12 13:37:44.860240 ip-10-0-142-16 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 12 13:37:44.861320 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.861225 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 12 13:37:44.863803 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863785 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 12 13:37:44.863803 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863803 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 12 13:37:44.863868 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863807 2576 feature_gate.go:328] unrecognized feature gate: Example2
Mar 12 13:37:44.863868 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863812 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 12 13:37:44.863868 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863816 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 12 13:37:44.863868 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863821 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 12 13:37:44.863868 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863826 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 12 13:37:44.863868 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863829 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 12 13:37:44.863868 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863832 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 12 13:37:44.863868 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863835 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 12 13:37:44.863868 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863838 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 12 13:37:44.863868 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863840 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 12 13:37:44.863868 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863843 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 12 13:37:44.863868 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863846 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 12 13:37:44.863868 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863848 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 12 13:37:44.863868 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863851 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 12 13:37:44.863868 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863853 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 12 13:37:44.863868 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863856 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 12 13:37:44.863868 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863858 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 12 13:37:44.863868 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863861 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 12 13:37:44.863868 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863863 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 12 13:37:44.864331 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863866 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 12 13:37:44.864331 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863879 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 12 13:37:44.864331 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863882 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 12 13:37:44.864331 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863884 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 12 13:37:44.864331 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863887 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 12 13:37:44.864331 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863889 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 12 13:37:44.864331 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863892 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 12 13:37:44.864331 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863894 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 12 13:37:44.864331 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863897 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 12 13:37:44.864331 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863900 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 12 13:37:44.864331 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863903 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 12 13:37:44.864331 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863905 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 12 13:37:44.864331 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863919 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 12 13:37:44.864331 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863925 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 12 13:37:44.864331 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863929 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 12 13:37:44.864331 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863932 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 12 13:37:44.864331 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863935 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 12 13:37:44.864331 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863938 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 12 13:37:44.864331 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863940 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 12 13:37:44.864805 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863943 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 12 13:37:44.864805 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863946 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 12 13:37:44.864805 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863948 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 12 13:37:44.864805 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863952 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 12 13:37:44.864805 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863954 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 12 13:37:44.864805 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863957 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 12 13:37:44.864805 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863959 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 12 13:37:44.864805 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863962 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 12 13:37:44.864805 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863965 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 12 13:37:44.864805 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863967 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 12 13:37:44.864805 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863971 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 12 13:37:44.864805 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863974 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 12 13:37:44.864805 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863977 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 12 13:37:44.864805 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863979 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 12 13:37:44.864805 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863983 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 12 13:37:44.864805 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863986 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 12 13:37:44.864805 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863989 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 12 13:37:44.864805 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863991 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 12 13:37:44.864805 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863994 2576 feature_gate.go:328] unrecognized feature gate: Example
Mar 12 13:37:44.864805 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863996 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 12 13:37:44.865286 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.863999 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 12 13:37:44.865286 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.864001 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 12 13:37:44.865286 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.864004 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 12 13:37:44.865286 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.864006 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 12 13:37:44.865286 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.864008 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 12 13:37:44.865286 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.864011 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 12 13:37:44.865286 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.864014 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 12 13:37:44.865286 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.864016 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 12 13:37:44.865286 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.864018 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 12 13:37:44.865286 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.864021 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 12 13:37:44.865286 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.864023 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 12 13:37:44.865286 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.864025 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 12 13:37:44.865286 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.864028 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 12 13:37:44.865286 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.864031 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 12 13:37:44.865286 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.864034 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 12 13:37:44.865286 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.864038 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 12 13:37:44.865286 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.864040 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 12 13:37:44.865286 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.864043 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 12 13:37:44.865286 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.864046 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 12 13:37:44.865286 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.864048 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 12 13:37:44.865800 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.864051 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 12 13:37:44.865800 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.864053 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 12 13:37:44.865800 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.864056 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 12 13:37:44.865800 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.864058 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 12 13:37:44.865800 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.864061 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 12 13:37:44.865800 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.864064 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 12 13:37:44.866676 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866663 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 12 13:37:44.866676 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866677 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 12 13:37:44.866741 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866680 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 12 13:37:44.866741 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866684 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 12 13:37:44.866741 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866687 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 12 13:37:44.866741 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866690 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 12 13:37:44.866741 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866693 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 12 13:37:44.866741 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866696 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 12 13:37:44.866741 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866698 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 12 13:37:44.866741 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866701 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 12 13:37:44.866741 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866704 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 12 13:37:44.866741 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866707 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 12 13:37:44.866741 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866709 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 12 13:37:44.866741 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866712 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 12 13:37:44.866741 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866714 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 12 13:37:44.866741 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866717 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 12 13:37:44.866741 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866719 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 12 13:37:44.866741 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866722 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 12 13:37:44.866741 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866724 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 12 13:37:44.866741 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866727 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 12 13:37:44.866741 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866730 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 12 13:37:44.866741 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866733 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 12 13:37:44.867216 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866735 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 12 13:37:44.867216 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866738 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 12 13:37:44.867216 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866740 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 12 13:37:44.867216 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866743 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 12 13:37:44.867216 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866746 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 12 13:37:44.867216 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866749 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 12 13:37:44.867216 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866751 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 12 13:37:44.867216 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866754 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 12 13:37:44.867216 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866757 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 12 13:37:44.867216 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866759 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 12 13:37:44.867216 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866762 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 12 13:37:44.867216 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866764 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 12 13:37:44.867216 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866767 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 12 13:37:44.867216 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866769 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 12 13:37:44.867216 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866772 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 12 13:37:44.867216 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866774 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 12 13:37:44.867216 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866776 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 12 13:37:44.867216 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866779 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 12 13:37:44.867216 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866781 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 12 13:37:44.867216 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866783 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 12 13:37:44.867730 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866786 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 12 13:37:44.867730 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866788 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 12 13:37:44.867730 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866790 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 12 13:37:44.867730 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866793 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 12 13:37:44.867730 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866795 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 12 13:37:44.867730 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866797 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 12 13:37:44.867730 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866801 2576 feature_gate.go:328] unrecognized feature gate: Example
Mar 12 13:37:44.867730 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866803 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 12 13:37:44.867730 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866805 2576 feature_gate.go:328] unrecognized feature gate: Example2
Mar 12 13:37:44.867730 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866808 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 12 13:37:44.867730 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866810 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 12 13:37:44.867730 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866813 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 12 13:37:44.867730 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866816 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 12 13:37:44.867730 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866822 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 12 13:37:44.867730 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866825 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 12 13:37:44.867730 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866828 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 12 13:37:44.867730 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866831 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 12 13:37:44.867730 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866834 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 12 13:37:44.867730 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866837 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 12 13:37:44.867730 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866840 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 12 13:37:44.868219 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866843 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 12 13:37:44.868219 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866845 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 12 13:37:44.868219 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866848 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 12 13:37:44.868219 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866850 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 12 13:37:44.868219 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866853 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 12 13:37:44.868219 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866856 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 12 13:37:44.868219 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866858 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 12 13:37:44.868219 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866861 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 12 13:37:44.868219 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866863 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 12 13:37:44.868219 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866867 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 12 13:37:44.868219 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866869 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 12 13:37:44.868219 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866872 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 12 13:37:44.868219 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866874 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 12 13:37:44.868219 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866876 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 12 13:37:44.868219 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866879 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 12 13:37:44.868219 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866881 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 12 13:37:44.868219 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866883 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 12 13:37:44.868219 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866886 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 12 13:37:44.868219 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866888 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 12 13:37:44.868690 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866892 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 12 13:37:44.868690 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866895 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 12 13:37:44.868690 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866898 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 12 13:37:44.868690 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866901 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 12 13:37:44.868690 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.866903 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 12 13:37:44.868690 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.866981 2576 flags.go:64] FLAG: --address="0.0.0.0"
Mar 12 13:37:44.868690 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.866990 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 12 13:37:44.868690 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.866997 2576 flags.go:64] FLAG: --anonymous-auth="true"
Mar 12 13:37:44.868690 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867002 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 12 13:37:44.868690 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867007 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 12 13:37:44.868690 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867010 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 12 13:37:44.868690 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867015 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 12 13:37:44.868690 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867019 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 12 13:37:44.868690 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867023 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 12 13:37:44.868690 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867027 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 12 13:37:44.868690 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867030 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 12 13:37:44.868690 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867034 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 12 13:37:44.868690 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867037 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 12 13:37:44.868690 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867040 2576 flags.go:64] FLAG: --cgroup-root=""
Mar 12 13:37:44.868690 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867043 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 12 13:37:44.868690 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867046 2576 flags.go:64] FLAG: --client-ca-file=""
Mar 12 13:37:44.868690 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867049 2576 flags.go:64] FLAG: --cloud-config=""
Mar 12 13:37:44.868690 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867052 2576 flags.go:64] FLAG: --cloud-provider="external"
Mar 12 13:37:44.869243 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867055 2576 flags.go:64] FLAG: --cluster-dns="[]"
Mar 12 13:37:44.869243 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867059 2576 flags.go:64] FLAG: --cluster-domain=""
Mar 12 13:37:44.869243 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867062 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 12 13:37:44.869243 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867065 2576 flags.go:64] FLAG: --config-dir=""
Mar 12 13:37:44.869243 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867068 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 12 13:37:44.869243 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867071 2576 flags.go:64] FLAG: --container-log-max-files="5"
Mar 12 13:37:44.869243 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867076 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 12 13:37:44.869243 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867079 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 12 13:37:44.869243 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867082 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 12 13:37:44.869243 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867085 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 12 13:37:44.869243 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867088 2576 flags.go:64] FLAG: --contention-profiling="false"
Mar 12 13:37:44.869243 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867092 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 12 13:37:44.869243 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867095 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 12 13:37:44.869243 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867099 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 12 13:37:44.869243 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867101 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 12 13:37:44.869243 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867107 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 12 13:37:44.869243 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867111 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 12 13:37:44.869243 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867113 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 12 13:37:44.869243 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867117 2576 flags.go:64] FLAG: --enable-load-reader="false"
Mar 12 13:37:44.869243 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867120 2576 flags.go:64] FLAG: --enable-server="true"
Mar 12 13:37:44.869243 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867123 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 12 13:37:44.869243 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867128 2576 flags.go:64] FLAG: --event-burst="100"
Mar 12 13:37:44.869243 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867131 2576 flags.go:64] FLAG: --event-qps="50"
Mar 12 13:37:44.869243 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867134 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 12 13:37:44.869243 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867137 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 12 13:37:44.869921 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867140 2576 flags.go:64] FLAG: --eviction-hard=""
Mar 12 13:37:44.869921 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867144 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 12 13:37:44.869921 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867147 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 12 13:37:44.869921 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867150 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 12 13:37:44.869921 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867153 2576 flags.go:64] FLAG: --eviction-soft=""
Mar 12 13:37:44.869921 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867155 2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 12 13:37:44.869921 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867158 2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 12 13:37:44.869921 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867161 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 12 13:37:44.869921 ip-10-0-142-16 kubenswrapper[2576]:
I0312 13:37:44.867164 2576 flags.go:64] FLAG: --experimental-mounter-path="" Mar 12 13:37:44.869921 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867167 2576 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 12 13:37:44.869921 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867169 2576 flags.go:64] FLAG: --fail-swap-on="true" Mar 12 13:37:44.869921 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867172 2576 flags.go:64] FLAG: --feature-gates="" Mar 12 13:37:44.869921 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867176 2576 flags.go:64] FLAG: --file-check-frequency="20s" Mar 12 13:37:44.869921 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867179 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 12 13:37:44.869921 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867182 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 12 13:37:44.869921 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867185 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 12 13:37:44.869921 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867188 2576 flags.go:64] FLAG: --healthz-port="10248" Mar 12 13:37:44.869921 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867191 2576 flags.go:64] FLAG: --help="false" Mar 12 13:37:44.869921 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867194 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-142-16.ec2.internal" Mar 12 13:37:44.869921 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867197 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 12 13:37:44.869921 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867200 2576 flags.go:64] FLAG: --http-check-frequency="20s" Mar 12 13:37:44.869921 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867203 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Mar 12 13:37:44.869921 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867207 2576 flags.go:64] FLAG: 
--image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Mar 12 13:37:44.869921 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867210 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 12 13:37:44.870521 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867214 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 12 13:37:44.870521 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867217 2576 flags.go:64] FLAG: --image-service-endpoint="" Mar 12 13:37:44.870521 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867220 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 12 13:37:44.870521 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867223 2576 flags.go:64] FLAG: --kube-api-burst="100" Mar 12 13:37:44.870521 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867226 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 12 13:37:44.870521 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867230 2576 flags.go:64] FLAG: --kube-api-qps="50" Mar 12 13:37:44.870521 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867233 2576 flags.go:64] FLAG: --kube-reserved="" Mar 12 13:37:44.870521 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867236 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 12 13:37:44.870521 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867239 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 12 13:37:44.870521 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867242 2576 flags.go:64] FLAG: --kubelet-cgroups="" Mar 12 13:37:44.870521 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867245 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 12 13:37:44.870521 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867248 2576 flags.go:64] FLAG: --lock-file="" Mar 12 13:37:44.870521 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867251 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" 
Mar 12 13:37:44.870521 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867254 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 12 13:37:44.870521 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867257 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 12 13:37:44.870521 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867262 2576 flags.go:64] FLAG: --log-json-split-stream="false" Mar 12 13:37:44.870521 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867265 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 12 13:37:44.870521 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867268 2576 flags.go:64] FLAG: --log-text-split-stream="false" Mar 12 13:37:44.870521 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867271 2576 flags.go:64] FLAG: --logging-format="text" Mar 12 13:37:44.870521 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867274 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 12 13:37:44.870521 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867277 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 12 13:37:44.870521 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867280 2576 flags.go:64] FLAG: --manifest-url="" Mar 12 13:37:44.870521 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867282 2576 flags.go:64] FLAG: --manifest-url-header="" Mar 12 13:37:44.870521 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867287 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 12 13:37:44.870521 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867290 2576 flags.go:64] FLAG: --max-open-files="1000000" Mar 12 13:37:44.871147 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867294 2576 flags.go:64] FLAG: --max-pods="110" Mar 12 13:37:44.871147 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867297 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 12 13:37:44.871147 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867300 2576 flags.go:64] 
FLAG: --maximum-dead-containers-per-container="1" Mar 12 13:37:44.871147 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867303 2576 flags.go:64] FLAG: --memory-manager-policy="None" Mar 12 13:37:44.871147 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867307 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 12 13:37:44.871147 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867312 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 12 13:37:44.871147 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867316 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Mar 12 13:37:44.871147 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867321 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Mar 12 13:37:44.871147 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867330 2576 flags.go:64] FLAG: --node-status-max-images="50" Mar 12 13:37:44.871147 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867333 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 12 13:37:44.871147 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867336 2576 flags.go:64] FLAG: --oom-score-adj="-999" Mar 12 13:37:44.871147 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867339 2576 flags.go:64] FLAG: --pod-cidr="" Mar 12 13:37:44.871147 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867342 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b3115b2610585407ab0742648cfbe39c72f57482889f0e778f5ac6fdc482217b" Mar 12 13:37:44.871147 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867348 2576 flags.go:64] FLAG: --pod-manifest-path="" Mar 12 13:37:44.871147 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867351 2576 flags.go:64] FLAG: --pod-max-pids="-1" Mar 12 13:37:44.871147 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867354 2576 flags.go:64] FLAG: --pods-per-core="0" Mar 12 13:37:44.871147 ip-10-0-142-16 kubenswrapper[2576]: I0312 
13:37:44.867356 2576 flags.go:64] FLAG: --port="10250" Mar 12 13:37:44.871147 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867359 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 12 13:37:44.871147 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867362 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0f61c1b1ae1754c6c" Mar 12 13:37:44.871147 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867366 2576 flags.go:64] FLAG: --qos-reserved="" Mar 12 13:37:44.871147 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867368 2576 flags.go:64] FLAG: --read-only-port="10255" Mar 12 13:37:44.871147 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867371 2576 flags.go:64] FLAG: --register-node="true" Mar 12 13:37:44.871147 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867374 2576 flags.go:64] FLAG: --register-schedulable="true" Mar 12 13:37:44.871147 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867377 2576 flags.go:64] FLAG: --register-with-taints="" Mar 12 13:37:44.871763 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867381 2576 flags.go:64] FLAG: --registry-burst="10" Mar 12 13:37:44.871763 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867383 2576 flags.go:64] FLAG: --registry-qps="5" Mar 12 13:37:44.871763 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867386 2576 flags.go:64] FLAG: --reserved-cpus="" Mar 12 13:37:44.871763 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867389 2576 flags.go:64] FLAG: --reserved-memory="" Mar 12 13:37:44.871763 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867392 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 12 13:37:44.871763 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867395 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 12 13:37:44.871763 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867398 2576 flags.go:64] FLAG: --rotate-certificates="false" Mar 12 13:37:44.871763 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867401 2576 
flags.go:64] FLAG: --rotate-server-certificates="false" Mar 12 13:37:44.871763 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867407 2576 flags.go:64] FLAG: --runonce="false" Mar 12 13:37:44.871763 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867410 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 12 13:37:44.871763 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867413 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 12 13:37:44.871763 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867416 2576 flags.go:64] FLAG: --seccomp-default="false" Mar 12 13:37:44.871763 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867419 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 12 13:37:44.871763 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867422 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 12 13:37:44.871763 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867426 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 12 13:37:44.871763 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867429 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 12 13:37:44.871763 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867433 2576 flags.go:64] FLAG: --storage-driver-password="root" Mar 12 13:37:44.871763 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867436 2576 flags.go:64] FLAG: --storage-driver-secure="false" Mar 12 13:37:44.871763 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867440 2576 flags.go:64] FLAG: --storage-driver-table="stats" Mar 12 13:37:44.871763 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867442 2576 flags.go:64] FLAG: --storage-driver-user="root" Mar 12 13:37:44.871763 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867445 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 12 13:37:44.871763 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867448 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 12 
13:37:44.871763 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867451 2576 flags.go:64] FLAG: --system-cgroups="" Mar 12 13:37:44.871763 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867469 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Mar 12 13:37:44.871763 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867475 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 12 13:37:44.872397 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867478 2576 flags.go:64] FLAG: --tls-cert-file="" Mar 12 13:37:44.872397 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867481 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 12 13:37:44.872397 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867485 2576 flags.go:64] FLAG: --tls-min-version="" Mar 12 13:37:44.872397 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867488 2576 flags.go:64] FLAG: --tls-private-key-file="" Mar 12 13:37:44.872397 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867491 2576 flags.go:64] FLAG: --topology-manager-policy="none" Mar 12 13:37:44.872397 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867494 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 12 13:37:44.872397 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867497 2576 flags.go:64] FLAG: --topology-manager-scope="container" Mar 12 13:37:44.872397 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867500 2576 flags.go:64] FLAG: --v="2" Mar 12 13:37:44.872397 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867505 2576 flags.go:64] FLAG: --version="false" Mar 12 13:37:44.872397 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867510 2576 flags.go:64] FLAG: --vmodule="" Mar 12 13:37:44.872397 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867514 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 12 13:37:44.872397 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.867517 2576 flags.go:64] FLAG: 
--volume-stats-agg-period="1m0s" Mar 12 13:37:44.872397 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867618 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Mar 12 13:37:44.872397 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867622 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Mar 12 13:37:44.872397 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867625 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Mar 12 13:37:44.872397 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867629 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Mar 12 13:37:44.872397 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867632 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Mar 12 13:37:44.872397 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867635 2576 feature_gate.go:328] unrecognized feature gate: Example2 Mar 12 13:37:44.872397 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867649 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Mar 12 13:37:44.872397 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867653 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Mar 12 13:37:44.872397 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867656 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Mar 12 13:37:44.872397 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867662 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Mar 12 13:37:44.872397 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867664 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Mar 12 13:37:44.873001 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867667 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Mar 12 13:37:44.873001 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867669 2576 
feature_gate.go:328] unrecognized feature gate: GatewayAPIController Mar 12 13:37:44.873001 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867672 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Mar 12 13:37:44.873001 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867675 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Mar 12 13:37:44.873001 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867678 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 12 13:37:44.873001 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867680 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Mar 12 13:37:44.873001 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867683 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 12 13:37:44.873001 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867685 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Mar 12 13:37:44.873001 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867688 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Mar 12 13:37:44.873001 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867690 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Mar 12 13:37:44.873001 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867693 2576 feature_gate.go:328] unrecognized feature gate: Example Mar 12 13:37:44.873001 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867696 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Mar 12 13:37:44.873001 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867698 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Mar 12 13:37:44.873001 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867700 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Mar 12 13:37:44.873001 
ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867703 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Mar 12 13:37:44.873001 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867706 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Mar 12 13:37:44.873001 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867708 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Mar 12 13:37:44.873001 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867712 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 12 13:37:44.873001 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867716 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Mar 12 13:37:44.873514 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867718 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Mar 12 13:37:44.873514 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867721 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Mar 12 13:37:44.873514 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867723 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Mar 12 13:37:44.873514 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867726 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Mar 12 13:37:44.873514 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867728 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Mar 12 13:37:44.873514 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867733 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Mar 12 13:37:44.873514 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867736 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Mar 12 13:37:44.873514 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867739 2576 feature_gate.go:328] 
unrecognized feature gate: ImageStreamImportMode Mar 12 13:37:44.873514 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867741 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 12 13:37:44.873514 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867744 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Mar 12 13:37:44.873514 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867747 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Mar 12 13:37:44.873514 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867751 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Mar 12 13:37:44.873514 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867753 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Mar 12 13:37:44.873514 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867756 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Mar 12 13:37:44.873514 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867758 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Mar 12 13:37:44.873514 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867761 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 12 13:37:44.873514 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867765 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Mar 12 13:37:44.873514 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867767 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Mar 12 13:37:44.873514 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867770 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 12 13:37:44.873514 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867772 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Mar 12 13:37:44.874035 ip-10-0-142-16 
kubenswrapper[2576]: W0312 13:37:44.867775 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Mar 12 13:37:44.874035 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867778 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Mar 12 13:37:44.874035 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867780 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 12 13:37:44.874035 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867782 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Mar 12 13:37:44.874035 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867785 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 12 13:37:44.874035 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867787 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Mar 12 13:37:44.874035 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867790 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Mar 12 13:37:44.874035 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867792 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Mar 12 13:37:44.874035 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867795 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Mar 12 13:37:44.874035 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867798 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Mar 12 13:37:44.874035 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867800 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Mar 12 13:37:44.874035 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867803 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Mar 12 13:37:44.874035 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867805 2576 feature_gate.go:328] unrecognized feature gate: 
VSphereConfigurableMaxAllowedBlockVolumesPerNode Mar 12 13:37:44.874035 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867808 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Mar 12 13:37:44.874035 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867811 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Mar 12 13:37:44.874035 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867813 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Mar 12 13:37:44.874035 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867816 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Mar 12 13:37:44.874035 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867820 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Mar 12 13:37:44.874035 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867824 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Mar 12 13:37:44.874548 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867827 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 12 13:37:44.874548 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867830 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 12 13:37:44.874548 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867833 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 12 13:37:44.874548 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867836 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 12 13:37:44.874548 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867840 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 12 13:37:44.874548 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867843 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 12 13:37:44.874548 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867846 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 12 13:37:44.874548 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867849 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 12 13:37:44.874548 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867851 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 12 13:37:44.874548 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867854 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 12 13:37:44.874548 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867858 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 12 13:37:44.874548 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867861 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 12 13:37:44.874548 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867863 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 12 13:37:44.874548 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867866 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 12 13:37:44.874548 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867869 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 12 13:37:44.874548 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867871 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 12 13:37:44.874548 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.867874 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 12 13:37:44.874982 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.868699 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Mar 12 13:37:44.875627 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.875608 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.8"
Mar 12 13:37:44.875659 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.875627 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 12 13:37:44.875692 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875678 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 12 13:37:44.875692 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875684 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 12 13:37:44.875692 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875687 2576 feature_gate.go:328] unrecognized feature gate: Example2
Mar 12 13:37:44.875692 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875691 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 12 13:37:44.875794 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875694 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 12 13:37:44.875794 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875697 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 12 13:37:44.875794 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875700 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 12 13:37:44.875794 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875703 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 12 13:37:44.875794 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875706 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 12 13:37:44.875794 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875709 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 12 13:37:44.875794 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875712 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 12 13:37:44.875794 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875715 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 12 13:37:44.875794 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875717 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 12 13:37:44.875794 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875720 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 12 13:37:44.875794 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875722 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 12 13:37:44.875794 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875725 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 12 13:37:44.875794 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875727 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 12 13:37:44.875794 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875730 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 12 13:37:44.875794 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875733 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 12 13:37:44.875794 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875736 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 12 13:37:44.875794 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875738 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 12 13:37:44.875794 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875741 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 12 13:37:44.875794 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875743 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 12 13:37:44.875794 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875746 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 12 13:37:44.876307 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875749 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 12 13:37:44.876307 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875751 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 12 13:37:44.876307 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875754 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 12 13:37:44.876307 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875757 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 12 13:37:44.876307 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875759 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 12 13:37:44.876307 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875762 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 12 13:37:44.876307 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875764 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 12 13:37:44.876307 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875767 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 12 13:37:44.876307 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875770 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 12 13:37:44.876307 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875773 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 12 13:37:44.876307 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875775 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 12 13:37:44.876307 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875778 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 12 13:37:44.876307 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875782 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 12 13:37:44.876307 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875786 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 12 13:37:44.876307 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875790 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 12 13:37:44.876307 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875793 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 12 13:37:44.876307 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875795 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 12 13:37:44.876307 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875798 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 12 13:37:44.876307 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875802 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 12 13:37:44.876825 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875807 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 12 13:37:44.876825 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875810 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 12 13:37:44.876825 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875813 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 12 13:37:44.876825 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875816 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 12 13:37:44.876825 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875818 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 12 13:37:44.876825 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875821 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 12 13:37:44.876825 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875824 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 12 13:37:44.876825 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875827 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 12 13:37:44.876825 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875829 2576 feature_gate.go:328] unrecognized feature gate: Example
Mar 12 13:37:44.876825 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875832 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 12 13:37:44.876825 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875835 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 12 13:37:44.876825 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875837 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 12 13:37:44.876825 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875840 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 12 13:37:44.876825 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875842 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 12 13:37:44.876825 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875845 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 12 13:37:44.876825 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875847 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 12 13:37:44.876825 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875850 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 12 13:37:44.876825 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875852 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 12 13:37:44.876825 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875855 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 12 13:37:44.876825 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875857 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 12 13:37:44.877314 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875860 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 12 13:37:44.877314 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875863 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 12 13:37:44.877314 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875865 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 12 13:37:44.877314 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875868 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 12 13:37:44.877314 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875870 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 12 13:37:44.877314 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875873 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 12 13:37:44.877314 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875876 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 12 13:37:44.877314 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875879 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 12 13:37:44.877314 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875882 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 12 13:37:44.877314 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875884 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 12 13:37:44.877314 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875887 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 12 13:37:44.877314 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875889 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 12 13:37:44.877314 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875892 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 12 13:37:44.877314 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875894 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 12 13:37:44.877314 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875896 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 12 13:37:44.877314 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875899 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 12 13:37:44.877314 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875901 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 12 13:37:44.877314 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875916 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 12 13:37:44.877314 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875919 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 12 13:37:44.877314 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875922 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 12 13:37:44.877823 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875924 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 12 13:37:44.877823 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875927 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 12 13:37:44.877823 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.875930 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 12 13:37:44.877823 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.875935 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Mar 12 13:37:44.877823 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876035 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Mar 12 13:37:44.877823 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876040 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Mar 12 13:37:44.877823 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876043 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Mar 12 13:37:44.877823 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876046 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Mar 12 13:37:44.877823 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876049 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Mar 12 13:37:44.877823 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876052 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 12 13:37:44.877823 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876054 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Mar 12 13:37:44.877823 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876057 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Mar 12 13:37:44.877823 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876059 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Mar 12 13:37:44.877823 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876062 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Mar 12 13:37:44.877823 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876066 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Mar 12 13:37:44.878186 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876068 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Mar 12 13:37:44.878186 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876071 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Mar 12 13:37:44.878186 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876073 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Mar 12 13:37:44.878186 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876076 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Mar 12 13:37:44.878186 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876079 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Mar 12 13:37:44.878186 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876081 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Mar 12 13:37:44.878186 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876084 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Mar 12 13:37:44.878186 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876086 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Mar 12 13:37:44.878186 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876089 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Mar 12 13:37:44.878186 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876091 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Mar 12 13:37:44.878186 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876094 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Mar 12 13:37:44.878186 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876096 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 12 13:37:44.878186 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876099 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Mar 12 13:37:44.878186 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876101 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Mar 12 13:37:44.878186 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876104 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Mar 12 13:37:44.878186 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876106 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Mar 12 13:37:44.878186 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876109 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Mar 12 13:37:44.878186 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876111 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Mar 12 13:37:44.878186 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876114 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Mar 12 13:37:44.878186 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876116 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Mar 12 13:37:44.878688 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876119 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Mar 12 13:37:44.878688 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876121 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Mar 12 13:37:44.878688 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876123 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Mar 12 13:37:44.878688 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876126 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Mar 12 13:37:44.878688 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876128 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Mar 12 13:37:44.878688 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876131 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Mar 12 13:37:44.878688 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876133 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 12 13:37:44.878688 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876136 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Mar 12 13:37:44.878688 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876139 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Mar 12 13:37:44.878688 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876141 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Mar 12 13:37:44.878688 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876144 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Mar 12 13:37:44.878688 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876147 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Mar 12 13:37:44.878688 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876150 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 12 13:37:44.878688 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876152 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 12 13:37:44.878688 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876155 2576 feature_gate.go:328] unrecognized feature gate: Example
Mar 12 13:37:44.878688 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876158 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Mar 12 13:37:44.878688 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876160 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Mar 12 13:37:44.878688 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876163 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 12 13:37:44.878688 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876166 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Mar 12 13:37:44.879154 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876168 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Mar 12 13:37:44.879154 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876171 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Mar 12 13:37:44.879154 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876173 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Mar 12 13:37:44.879154 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876175 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Mar 12 13:37:44.879154 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876178 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Mar 12 13:37:44.879154 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876180 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Mar 12 13:37:44.879154 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876183 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Mar 12 13:37:44.879154 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876185 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Mar 12 13:37:44.879154 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876188 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Mar 12 13:37:44.879154 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876190 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Mar 12 13:37:44.879154 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876193 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Mar 12 13:37:44.879154 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876195 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 12 13:37:44.879154 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876198 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Mar 12 13:37:44.879154 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876200 2576 feature_gate.go:328] unrecognized feature gate: Example2
Mar 12 13:37:44.879154 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876203 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Mar 12 13:37:44.879154 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876205 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Mar 12 13:37:44.879154 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876208 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Mar 12 13:37:44.879154 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876210 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Mar 12 13:37:44.879154 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876213 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 12 13:37:44.879154 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876215 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Mar 12 13:37:44.879720 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876218 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Mar 12 13:37:44.879720 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876220 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Mar 12 13:37:44.879720 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876222 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Mar 12 13:37:44.879720 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876225 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Mar 12 13:37:44.879720 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876227 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Mar 12 13:37:44.879720 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876231 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 12 13:37:44.879720 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876235 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Mar 12 13:37:44.879720 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876239 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Mar 12 13:37:44.879720 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876242 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Mar 12 13:37:44.879720 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876246 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Mar 12 13:37:44.879720 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876249 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Mar 12 13:37:44.879720 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876251 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Mar 12 13:37:44.879720 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876254 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Mar 12 13:37:44.879720 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876256 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 12 13:37:44.879720 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876258 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Mar 12 13:37:44.879720 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:44.876261 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Mar 12 13:37:44.880190 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.876266 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Mar 12 13:37:44.880190 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.876393 2576 server.go:962] "Client rotation is on, will bootstrap in background"
Mar 12 13:37:44.881006 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.880991 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 12 13:37:44.882061 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.882048 2576 server.go:1019] "Starting client certificate rotation"
Mar 12 13:37:44.882178 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.882156 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Mar 12 13:37:44.882212 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.882198 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Mar 12 13:37:44.911584 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.911557 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 12 13:37:44.915068 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.914975 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 12 13:37:44.933479 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.933443 2576 log.go:25] "Validated CRI v1 runtime API"
Mar 12 13:37:44.939477 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.939443 2576 log.go:25] "Validated CRI v1 image API"
Mar 12 13:37:44.941551 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.941530 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 12 13:37:44.945986 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.945962 2576 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 b3e6445f-878b-4797-9ec4-dbc6114971f7:/dev/nvme0n1p3 c59cf079-6e84-445b-a5a3-07a1e5793f98:/dev/nvme0n1p4]
Mar 12 13:37:44.946081 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.945986 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Mar 12 13:37:44.951286 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.951265 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Mar 12 13:37:44.952288 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.952167 2576 manager.go:217] Machine: {Timestamp:2026-03-12 13:37:44.950152618 +0000 UTC m=+0.462086636 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100072 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec22a50e1650419f4ceb19bef90e924a SystemUUID:ec22a50e-1650-419f-4ceb-19bef90e924a BootID:98ae652a-6c06-470f-ad69-7f48b621e293 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6094848 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:c3:d7:5c:7c:b3 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:c3:d7:5c:7c:b3 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:32:f9:0e:bb:d9:f5 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 12 13:37:44.952288 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.952287 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 12 13:37:44.952400 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.952387 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.96.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260303-1 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 12 13:37:44.953711 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.953681 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 12 13:37:44.953869 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.953713 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-142-16.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim"
:null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 12 13:37:44.953916 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.953878 2576 topology_manager.go:138] "Creating topology manager with none policy" Mar 12 13:37:44.953916 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.953887 2576 container_manager_linux.go:306] "Creating device plugin manager" Mar 12 13:37:44.953916 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.953905 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 12 13:37:44.954813 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.954802 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 12 13:37:44.956765 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.956753 2576 state_mem.go:36] "Initialized new in-memory state store" Mar 12 13:37:44.956889 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.956880 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet" Mar 12 13:37:44.959721 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.959709 2576 kubelet.go:491] "Attempting to sync node with API server" Mar 12 13:37:44.959759 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.959726 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 12 13:37:44.959759 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.959739 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 12 13:37:44.959759 
ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.959750 2576 kubelet.go:397] "Adding apiserver pod source" Mar 12 13:37:44.959853 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.959764 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 12 13:37:44.961056 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.961041 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Mar 12 13:37:44.961138 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.961061 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Mar 12 13:37:44.964824 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.964795 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.9-3.rhaos4.20.gitb9ac835.el9" apiVersion="v1" Mar 12 13:37:44.968203 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.968185 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 12 13:37:44.970919 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.970899 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 12 13:37:44.971005 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.970926 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 12 13:37:44.971005 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.970935 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 12 13:37:44.971005 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.970941 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 12 13:37:44.971005 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.970947 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 12 13:37:44.971005 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.970953 2576 
plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 12 13:37:44.971005 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.970959 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 12 13:37:44.971005 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.970964 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 12 13:37:44.971005 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.970971 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 12 13:37:44.971005 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.970978 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 12 13:37:44.971005 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.970987 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 12 13:37:44.971005 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.970995 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 12 13:37:44.971948 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.971933 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 12 13:37:44.971948 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.971945 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Mar 12 13:37:44.972333 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:44.972305 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-142-16.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Mar 12 13:37:44.972436 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:44.972314 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group 
\"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Mar 12 13:37:44.975903 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.975888 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Mar 12 13:37:44.975988 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.975927 2576 server.go:1295] "Started kubelet" Mar 12 13:37:44.976043 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.976003 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Mar 12 13:37:44.976148 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.976080 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 12 13:37:44.976200 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.976166 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Mar 12 13:37:44.977511 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.977495 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 12 13:37:44.977584 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.977558 2576 server.go:317] "Adding debug handlers to kubelet server" Mar 12 13:37:44.977602 ip-10-0-142-16 systemd[1]: Started Kubernetes Kubelet. 
Mar 12 13:37:44.985358 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.985327 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-142-16.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 13:37:44.986072 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.986045 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Mar 12 13:37:44.986567 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:44.985380 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-16.ec2.internal.189c1b917e09a4be default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-16.ec2.internal,UID:ip-10-0-142-16.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-142-16.ec2.internal,},FirstTimestamp:2026-03-12 13:37:44.975901886 +0000 UTC m=+0.487835904,LastTimestamp:2026-03-12 13:37:44.975901886 +0000 UTC m=+0.487835904,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-16.ec2.internal,}"
Mar 12 13:37:44.986708 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.986645 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 12 13:37:44.988272 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:44.988241 2576 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Mar 12 13:37:44.988370 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.988354 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Mar 12 13:37:44.988439 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.988356 2576 volume_manager.go:295] "The desired_state_of_world populator starts"
Mar 12 13:37:44.988439 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.988390 2576 volume_manager.go:297] "Starting Kubelet Volume Manager"
Mar 12 13:37:44.988584 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:44.988566 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-16.ec2.internal\" not found"
Mar 12 13:37:44.988636 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.988610 2576 factory.go:55] Registering systemd factory
Mar 12 13:37:44.988636 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.988626 2576 factory.go:223] Registration of the systemd container factory successfully
Mar 12 13:37:44.988826 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.988800 2576 reconstruct.go:97] "Volume reconstruction finished"
Mar 12 13:37:44.988921 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.988827 2576 reconciler.go:26] "Reconciler: start to sync state"
Mar 12 13:37:44.990089 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.990073 2576 factory.go:153] Registering CRI-O factory
Mar 12 13:37:44.990089 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.990090 2576 factory.go:223] Registration of the crio container factory successfully
Mar 12 13:37:44.990201 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.990161 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 12 13:37:44.990201 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.990188 2576 factory.go:103] Registering Raw factory
Mar 12 13:37:44.990262 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.990217 2576 manager.go:1196] Started watching for new ooms in manager
Mar 12 13:37:44.990805 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:44.990791 2576 manager.go:319] Starting recovery of all containers
Mar 12 13:37:44.991224 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:44.991197 2576 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-142-16.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Mar 12 13:37:44.991339 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:44.991277 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Mar 12 13:37:45.002477 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.002428 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-bf8cv"
Mar 12 13:37:45.003194 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.003180 2576 manager.go:324] Recovery completed
Mar 12 13:37:45.007492 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.007476 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 13:37:45.010484 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.010452 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-bf8cv"
Mar 12 13:37:45.010589 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.010472 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-16.ec2.internal" event="NodeHasSufficientMemory"
Mar 12 13:37:45.010589 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.010542 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-16.ec2.internal" event="NodeHasNoDiskPressure"
Mar 12 13:37:45.010589 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.010560 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-16.ec2.internal" event="NodeHasSufficientPID"
Mar 12 13:37:45.011162 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.011149 2576 cpu_manager.go:222] "Starting CPU manager" policy="none"
Mar 12 13:37:45.011162 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.011162 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Mar 12 13:37:45.011246 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.011179 2576 state_mem.go:36] "Initialized new in-memory state store"
Mar 12 13:37:45.012183 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:45.012106 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-16.ec2.internal.189c1b918019caee default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-16.ec2.internal,UID:ip-10-0-142-16.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-142-16.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-142-16.ec2.internal,},FirstTimestamp:2026-03-12 13:37:45.01051467 +0000 UTC m=+0.522448692,LastTimestamp:2026-03-12 13:37:45.01051467 +0000 UTC m=+0.522448692,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-16.ec2.internal,}"
Mar 12 13:37:45.014067 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.014050 2576 policy_none.go:49] "None policy: Start"
Mar 12 13:37:45.014140 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.014072 2576 memory_manager.go:186] "Starting memorymanager" policy="None"
Mar 12 13:37:45.014140 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.014087 2576 state_mem.go:35] "Initializing new in-memory state store"
Mar 12 13:37:45.066264 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.066244 2576 manager.go:341] "Starting Device Plugin manager"
Mar 12 13:37:45.089652 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:45.066284 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Mar 12 13:37:45.089652 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.066295 2576 server.go:85] "Starting device plugin registration server"
Mar 12 13:37:45.089652 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.066617 2576 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 12 13:37:45.089652 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.066632 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 12 13:37:45.089652 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.066717 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 12 13:37:45.089652 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.066802 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 12 13:37:45.089652 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.066810 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 12 13:37:45.089652 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:45.067442 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Mar 12 13:37:45.089652 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:45.067533 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-142-16.ec2.internal\" not found"
Mar 12 13:37:45.089652 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.079552 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Mar 12 13:37:45.089652 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.080916 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Mar 12 13:37:45.089652 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.080949 2576 status_manager.go:230] "Starting to sync pod status with apiserver"
Mar 12 13:37:45.089652 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.080975 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 12 13:37:45.089652 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.080982 2576 kubelet.go:2451] "Starting kubelet main sync loop"
Mar 12 13:37:45.089652 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:45.081016 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Mar 12 13:37:45.089652 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.083906 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Mar 12 13:37:45.167350 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.167317 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 13:37:45.168987 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.168973 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-16.ec2.internal" event="NodeHasSufficientMemory"
Mar 12 13:37:45.169077 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.169003 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-16.ec2.internal" event="NodeHasNoDiskPressure"
Mar 12 13:37:45.169077 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.169014 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-16.ec2.internal" event="NodeHasSufficientPID"
Mar 12 13:37:45.169077 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.169039 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-142-16.ec2.internal"
Mar 12 13:37:45.176533 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.176511 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-142-16.ec2.internal"
Mar 12 13:37:45.176533 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:45.176535 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-142-16.ec2.internal\": node \"ip-10-0-142-16.ec2.internal\" not found"
Mar 12 13:37:45.181874 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.181854 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-142-16.ec2.internal"]
Mar 12 13:37:45.181967 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.181934 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 13:37:45.182920 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.182904 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-16.ec2.internal" event="NodeHasSufficientMemory"
Mar 12 13:37:45.183019 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.182934 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-16.ec2.internal" event="NodeHasNoDiskPressure"
Mar 12 13:37:45.183019 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.182948 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-16.ec2.internal" event="NodeHasSufficientPID"
Mar 12 13:37:45.185334 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.185318 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 13:37:45.185488 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.185472 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal"
Mar 12 13:37:45.185538 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.185505 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 13:37:45.186091 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.186063 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-16.ec2.internal" event="NodeHasSufficientMemory"
Mar 12 13:37:45.186091 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.186063 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-16.ec2.internal" event="NodeHasSufficientMemory"
Mar 12 13:37:45.186209 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.186097 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-16.ec2.internal" event="NodeHasNoDiskPressure"
Mar 12 13:37:45.186209 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.186122 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-16.ec2.internal" event="NodeHasSufficientPID"
Mar 12 13:37:45.186209 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.186125 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-16.ec2.internal" event="NodeHasNoDiskPressure"
Mar 12 13:37:45.186209 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.186142 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-16.ec2.internal" event="NodeHasSufficientPID"
Mar 12 13:37:45.188415 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.188396 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-16.ec2.internal"
Mar 12 13:37:45.188531 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.188433 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Mar 12 13:37:45.189201 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.189183 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-16.ec2.internal" event="NodeHasSufficientMemory"
Mar 12 13:37:45.189286 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.189211 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-16.ec2.internal" event="NodeHasNoDiskPressure"
Mar 12 13:37:45.189286 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.189227 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-16.ec2.internal" event="NodeHasSufficientPID"
Mar 12 13:37:45.189286 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.189250 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6c70f15327caadb681531ec95696eaee-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal\" (UID: \"6c70f15327caadb681531ec95696eaee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal"
Mar 12 13:37:45.189401 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.189286 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6c70f15327caadb681531ec95696eaee-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal\" (UID: \"6c70f15327caadb681531ec95696eaee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal"
Mar 12 13:37:45.197985 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:45.197965 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-16.ec2.internal\" not found"
Mar 12 13:37:45.210285 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:45.210260 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-16.ec2.internal\" not found" node="ip-10-0-142-16.ec2.internal"
Mar 12 13:37:45.214982 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:45.214965 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-16.ec2.internal\" not found" node="ip-10-0-142-16.ec2.internal"
Mar 12 13:37:45.290057 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.290017 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6c70f15327caadb681531ec95696eaee-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal\" (UID: \"6c70f15327caadb681531ec95696eaee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal"
Mar 12 13:37:45.290057 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.290055 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6c70f15327caadb681531ec95696eaee-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal\" (UID: \"6c70f15327caadb681531ec95696eaee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal"
Mar 12 13:37:45.290260 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.290090 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6158045a7611177883f6a632f015315d-config\") pod \"kube-apiserver-proxy-ip-10-0-142-16.ec2.internal\" (UID: \"6158045a7611177883f6a632f015315d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-16.ec2.internal"
Mar 12 13:37:45.290260 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.290132 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6c70f15327caadb681531ec95696eaee-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal\" (UID: \"6c70f15327caadb681531ec95696eaee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal"
Mar 12 13:37:45.290260 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.290132 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6c70f15327caadb681531ec95696eaee-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal\" (UID: \"6c70f15327caadb681531ec95696eaee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal"
Mar 12 13:37:45.298791 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:45.298725 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-16.ec2.internal\" not found"
Mar 12 13:37:45.390270 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.390224 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6158045a7611177883f6a632f015315d-config\") pod \"kube-apiserver-proxy-ip-10-0-142-16.ec2.internal\" (UID: \"6158045a7611177883f6a632f015315d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-16.ec2.internal"
Mar 12 13:37:45.390361 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.390293 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6158045a7611177883f6a632f015315d-config\") pod \"kube-apiserver-proxy-ip-10-0-142-16.ec2.internal\" (UID: \"6158045a7611177883f6a632f015315d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-16.ec2.internal"
Mar 12 13:37:45.399329 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:45.399303 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-16.ec2.internal\" not found"
Mar 12 13:37:45.499637 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:45.499602 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-16.ec2.internal\" not found"
Mar 12 13:37:45.514833 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.514803 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal"
Mar 12 13:37:45.516375 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.516357 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-16.ec2.internal"
Mar 12 13:37:45.600533 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:45.600409 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-16.ec2.internal\" not found"
Mar 12 13:37:45.700919 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:45.700875 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-16.ec2.internal\" not found"
Mar 12 13:37:45.801453 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:45.801416 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-16.ec2.internal\" not found"
Mar 12 13:37:45.881698 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.881606 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 12 13:37:45.882221 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.881765 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Mar 12 13:37:45.902053 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:45.902026 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-16.ec2.internal\" not found"
Mar 12 13:37:45.947731 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.947696 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Mar 12 13:37:45.986870 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.986837 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Mar 12 13:37:46.000017 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:45.999986 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Mar 12 13:37:46.002171 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:46.002149 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-16.ec2.internal\" not found"
Mar 12 13:37:46.013164 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.013113 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-03-11 13:32:45 +0000 UTC" deadline="2027-10-25 23:29:34.396029264 +0000 UTC"
Mar 12 13:37:46.013164 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.013159 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14217h51m48.382874143s"
Mar 12 13:37:46.021405 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.021378 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-l4thp"
Mar 12 13:37:46.029666 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.029646 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-l4thp"
Mar 12 13:37:46.102699 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:46.102663 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-16.ec2.internal\" not found"
Mar 12 13:37:46.203254 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:46.203186 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-16.ec2.internal\" not found"
Mar 12 13:37:46.207606 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.207584 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Mar 12 13:37:46.288092 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.288048 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-16.ec2.internal"
Mar 12 13:37:46.300137 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.300108 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Mar 12 13:37:46.301249 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.301233 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal"
Mar 12 13:37:46.320358 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.320323 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Mar 12 13:37:46.374271 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:46.374232 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6158045a7611177883f6a632f015315d.slice/crio-a39f6e216700cca45e7c21891d872003119b4fe705f89ae7ec0fb3cf167f66b5 WatchSource:0}: Error finding
container a39f6e216700cca45e7c21891d872003119b4fe705f89ae7ec0fb3cf167f66b5: Status 404 returned error can't find the container with id a39f6e216700cca45e7c21891d872003119b4fe705f89ae7ec0fb3cf167f66b5 Mar 12 13:37:46.379492 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.379451 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 13:37:46.383029 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:46.383004 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c70f15327caadb681531ec95696eaee.slice/crio-a7139f0179e0b8e18742f72c584bb21619ea08c3566b00322d63ca067412edd3 WatchSource:0}: Error finding container a7139f0179e0b8e18742f72c584bb21619ea08c3566b00322d63ca067412edd3: Status 404 returned error can't find the container with id a7139f0179e0b8e18742f72c584bb21619ea08c3566b00322d63ca067412edd3 Mar 12 13:37:46.400938 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.400915 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Mar 12 13:37:46.747578 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.747546 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Mar 12 13:37:46.960673 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.960639 2576 apiserver.go:52] "Watching apiserver" Mar 12 13:37:46.966768 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.966737 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Mar 12 13:37:46.968216 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.968190 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9rk6","openshift-cluster-node-tuning-operator/tuned-rjb6h","openshift-multus/multus-gd9sq","openshift-ovn-kubernetes/ovnkube-node-8qtnl","openshift-image-registry/node-ca-p28bf","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal","openshift-multus/multus-additional-cni-plugins-6wcbt","openshift-multus/network-metrics-daemon-wbtg8","openshift-network-diagnostics/network-check-target-fnfhc","openshift-network-operator/iptables-alerter-trfvm","kube-system/global-pull-secret-syncer-x4rlt","kube-system/konnectivity-agent-w7qxr","kube-system/kube-apiserver-proxy-ip-10-0-142-16.ec2.internal"] Mar 12 13:37:46.972519 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.972498 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6wcbt" Mar 12 13:37:46.974717 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.974691 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-rjb6h" Mar 12 13:37:46.977221 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.977198 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-gd9sq" Mar 12 13:37:46.977342 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.977299 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Mar 12 13:37:46.977881 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.977862 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Mar 12 13:37:46.977983 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.977933 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Mar 12 13:37:46.978043 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.978009 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Mar 12 13:37:46.978437 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.978424 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Mar 12 13:37:46.978507 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.978443 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-wq45j\"" Mar 12 13:37:46.979126 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.979025 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-8w8rg\"" Mar 12 13:37:46.979126 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.979114 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Mar 12 13:37:46.979240 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.979133 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Mar 12 13:37:46.979583 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.979568 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" Mar 12 13:37:46.981419 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.981405 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-4s45x\"" Mar 12 13:37:46.981726 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.981709 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Mar 12 13:37:46.981912 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.981888 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-p28bf" Mar 12 13:37:46.984580 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.984562 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9rk6" Mar 12 13:37:46.987515 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.986826 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Mar 12 13:37:46.987515 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.986958 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-s9hkx\"" Mar 12 13:37:46.987515 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.987076 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Mar 12 13:37:46.987515 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.987235 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Mar 12 13:37:46.987880 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.987664 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Mar 12 13:37:46.987880 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.987746 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Mar 12 13:37:46.989125 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.989103 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wbtg8" Mar 12 13:37:46.989285 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:46.989258 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wbtg8" podUID="37e0d595-d2d4-4caf-8700-ffaeb5c3401f" Mar 12 13:37:46.991776 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.991756 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fnfhc" Mar 12 13:37:46.991862 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:46.991834 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fnfhc" podUID="a2b57a5c-1736-4008-9bcc-41669382f70a" Mar 12 13:37:46.994189 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.994168 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-trfvm" Mar 12 13:37:46.995554 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.995528 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Mar 12 13:37:46.995650 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.995601 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Mar 12 13:37:46.995650 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.995615 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Mar 12 13:37:46.995743 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.995711 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Mar 12 13:37:46.995849 ip-10-0-142-16 kubenswrapper[2576]: I0312 
13:37:46.995834 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Mar 12 13:37:46.995926 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.995908 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-2jq4m\"" Mar 12 13:37:46.995978 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.995926 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Mar 12 13:37:46.996061 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.996045 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Mar 12 13:37:46.996185 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.996169 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-2xpcw\"" Mar 12 13:37:46.996808 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.996792 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-w7qxr" Mar 12 13:37:46.998270 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.998226 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bf0068fb-0001-44e4-9194-485a2618370f-etc-sysctl-conf\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h" Mar 12 13:37:46.998270 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.998251 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdbk5\" (UniqueName: \"kubernetes.io/projected/37e0d595-d2d4-4caf-8700-ffaeb5c3401f-kube-api-access-kdbk5\") pod \"network-metrics-daemon-wbtg8\" (UID: \"37e0d595-d2d4-4caf-8700-ffaeb5c3401f\") " pod="openshift-multus/network-metrics-daemon-wbtg8" Mar 12 13:37:46.998394 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.998269 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dlpd\" (UniqueName: \"kubernetes.io/projected/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-kube-api-access-2dlpd\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq" Mar 12 13:37:46.998394 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.998286 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-host-slash\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" Mar 12 13:37:46.998394 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.998303 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/1aca7d52-35bb-4dbf-8bb1-4a032145c336-serviceca\") pod \"node-ca-p28bf\" (UID: \"1aca7d52-35bb-4dbf-8bb1-4a032145c336\") " pod="openshift-image-registry/node-ca-p28bf" Mar 12 13:37:46.998394 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.998323 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37e0d595-d2d4-4caf-8700-ffaeb5c3401f-metrics-certs\") pod \"network-metrics-daemon-wbtg8\" (UID: \"37e0d595-d2d4-4caf-8700-ffaeb5c3401f\") " pod="openshift-multus/network-metrics-daemon-wbtg8" Mar 12 13:37:46.998394 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.998337 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/990c57e8-518a-4bf2-b8ee-a937df89bcb3-sys-fs\") pod \"aws-ebs-csi-driver-node-f9rk6\" (UID: \"990c57e8-518a-4bf2-b8ee-a937df89bcb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9rk6" Mar 12 13:37:46.998394 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.998353 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bf0068fb-0001-44e4-9194-485a2618370f-etc-sysconfig\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h" Mar 12 13:37:46.998394 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.998386 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bf0068fb-0001-44e4-9194-485a2618370f-etc-sysctl-d\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h" Mar 12 13:37:46.998734 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.998414 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-host-var-lib-cni-multus\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq" Mar 12 13:37:46.998734 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.998438 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-host-var-lib-kubelet\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq" Mar 12 13:37:46.998734 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.998480 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-multus-daemon-config\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq" Mar 12 13:37:46.998734 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.998504 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9288d2b3-d104-4c0d-8417-6d70d1c70098-system-cni-dir\") pod \"multus-additional-cni-plugins-6wcbt\" (UID: \"9288d2b3-d104-4c0d-8417-6d70d1c70098\") " pod="openshift-multus/multus-additional-cni-plugins-6wcbt" Mar 12 13:37:46.998734 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.998528 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9288d2b3-d104-4c0d-8417-6d70d1c70098-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6wcbt\" (UID: 
\"9288d2b3-d104-4c0d-8417-6d70d1c70098\") " pod="openshift-multus/multus-additional-cni-plugins-6wcbt" Mar 12 13:37:46.998734 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.998539 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Mar 12 13:37:46.998734 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.998552 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9288d2b3-d104-4c0d-8417-6d70d1c70098-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6wcbt\" (UID: \"9288d2b3-d104-4c0d-8417-6d70d1c70098\") " pod="openshift-multus/multus-additional-cni-plugins-6wcbt" Mar 12 13:37:46.998734 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.998575 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-host-run-k8s-cni-cncf-io\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq" Mar 12 13:37:46.998734 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.998599 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-log-socket\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" Mar 12 13:37:46.998734 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.998623 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8qtnl\" (UID: 
\"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" Mar 12 13:37:46.998734 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.998628 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Mar 12 13:37:46.998734 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.998640 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-zxwpr\"" Mar 12 13:37:46.998734 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.998652 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/990c57e8-518a-4bf2-b8ee-a937df89bcb3-device-dir\") pod \"aws-ebs-csi-driver-node-f9rk6\" (UID: \"990c57e8-518a-4bf2-b8ee-a937df89bcb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9rk6" Mar 12 13:37:46.998734 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.998685 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bf0068fb-0001-44e4-9194-485a2618370f-etc-modprobe-d\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h" Mar 12 13:37:46.999309 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.998745 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bf0068fb-0001-44e4-9194-485a2618370f-run\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h" Mar 12 13:37:46.999309 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.998776 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bf0068fb-0001-44e4-9194-485a2618370f-lib-modules\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h" Mar 12 13:37:46.999309 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.998807 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-host-var-lib-cni-bin\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq" Mar 12 13:37:46.999309 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.998834 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-host-run-multus-certs\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq" Mar 12 13:37:46.999309 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.998861 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft6pj\" (UniqueName: \"kubernetes.io/projected/990c57e8-518a-4bf2-b8ee-a937df89bcb3-kube-api-access-ft6pj\") pod \"aws-ebs-csi-driver-node-f9rk6\" (UID: \"990c57e8-518a-4bf2-b8ee-a937df89bcb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9rk6" Mar 12 13:37:46.999309 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.998883 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bf0068fb-0001-44e4-9194-485a2618370f-tmp\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h" Mar 12 13:37:46.999309 ip-10-0-142-16 kubenswrapper[2576]: I0312 
13:37:46.998910 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-var-lib-openvswitch\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" Mar 12 13:37:46.999309 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.998939 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-os-release\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq" Mar 12 13:37:46.999309 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.998978 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/990c57e8-518a-4bf2-b8ee-a937df89bcb3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-f9rk6\" (UID: \"990c57e8-518a-4bf2-b8ee-a937df89bcb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9rk6" Mar 12 13:37:46.999309 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.998995 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bf0068fb-0001-44e4-9194-485a2618370f-sys\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h" Mar 12 13:37:46.999309 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.999010 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-host-run-netns\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" Mar 12 13:37:46.999309 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.999021 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x4rlt" Mar 12 13:37:46.999309 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.999024 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-run-systemd\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" Mar 12 13:37:46.999309 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.999048 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-host-cni-netd\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" Mar 12 13:37:46.999309 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.999063 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e5825179-9ef4-493d-9fc5-0868f3084a95-env-overrides\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" Mar 12 13:37:46.999309 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.999020 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Mar 12 13:37:46.999309 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:46.999071 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x4rlt" podUID="05421762-de84-4ca6-bedf-542c55460252" Mar 12 13:37:46.999946 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.999088 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9288d2b3-d104-4c0d-8417-6d70d1c70098-os-release\") pod \"multus-additional-cni-plugins-6wcbt\" (UID: \"9288d2b3-d104-4c0d-8417-6d70d1c70098\") " pod="openshift-multus/multus-additional-cni-plugins-6wcbt" Mar 12 13:37:46.999946 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.999124 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf0068fb-0001-44e4-9194-485a2618370f-etc-kubernetes\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h" Mar 12 13:37:46.999946 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.999142 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-host-run-netns\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq" Mar 12 13:37:46.999946 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.999156 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-multus-conf-dir\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq" Mar 12 13:37:46.999946 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.999188 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-host-cni-bin\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" Mar 12 13:37:46.999946 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.999206 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1aca7d52-35bb-4dbf-8bb1-4a032145c336-host\") pod \"node-ca-p28bf\" (UID: \"1aca7d52-35bb-4dbf-8bb1-4a032145c336\") " pod="openshift-image-registry/node-ca-p28bf" Mar 12 13:37:46.999946 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.999246 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bf0068fb-0001-44e4-9194-485a2618370f-var-lib-kubelet\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h" Mar 12 13:37:46.999946 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.999309 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-cni-binary-copy\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq" Mar 12 13:37:46.999946 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.999346 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e5825179-9ef4-493d-9fc5-0868f3084a95-ovn-node-metrics-cert\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" Mar 12 13:37:46.999946 ip-10-0-142-16 
kubenswrapper[2576]: I0312 13:37:46.999373 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fbjg\" (UniqueName: \"kubernetes.io/projected/e5825179-9ef4-493d-9fc5-0868f3084a95-kube-api-access-6fbjg\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" Mar 12 13:37:46.999946 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.999396 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-system-cni-dir\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq" Mar 12 13:37:46.999946 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.999419 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-hostroot\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq" Mar 12 13:37:46.999946 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.999449 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/990c57e8-518a-4bf2-b8ee-a937df89bcb3-etc-selinux\") pod \"aws-ebs-csi-driver-node-f9rk6\" (UID: \"990c57e8-518a-4bf2-b8ee-a937df89bcb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9rk6" Mar 12 13:37:46.999946 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.999498 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-node-log\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" Mar 12 13:37:46.999946 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.999519 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-multus-cni-dir\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq" Mar 12 13:37:46.999946 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.999538 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/990c57e8-518a-4bf2-b8ee-a937df89bcb3-registration-dir\") pod \"aws-ebs-csi-driver-node-f9rk6\" (UID: \"990c57e8-518a-4bf2-b8ee-a937df89bcb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9rk6" Mar 12 13:37:46.999946 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.999578 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf0068fb-0001-44e4-9194-485a2618370f-host\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h" Mar 12 13:37:47.000593 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.999610 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-etc-openvswitch\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" Mar 12 13:37:47.000593 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.999639 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-run-ovn\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" Mar 12 13:37:47.000593 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.999654 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9288d2b3-d104-4c0d-8417-6d70d1c70098-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6wcbt\" (UID: \"9288d2b3-d104-4c0d-8417-6d70d1c70098\") " pod="openshift-multus/multus-additional-cni-plugins-6wcbt" Mar 12 13:37:47.000593 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.999669 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-multus-socket-dir-parent\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq" Mar 12 13:37:47.000593 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.999684 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-etc-kubernetes\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq" Mar 12 13:37:47.000593 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.999718 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bf0068fb-0001-44e4-9194-485a2618370f-etc-systemd\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h" Mar 12 13:37:47.000593 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.999742 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bf0068fb-0001-44e4-9194-485a2618370f-etc-tuned\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h" Mar 12 13:37:47.000593 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.999761 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e5825179-9ef4-493d-9fc5-0868f3084a95-ovnkube-config\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" Mar 12 13:37:47.000593 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.999821 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9288d2b3-d104-4c0d-8417-6d70d1c70098-cni-binary-copy\") pod \"multus-additional-cni-plugins-6wcbt\" (UID: \"9288d2b3-d104-4c0d-8417-6d70d1c70098\") " pod="openshift-multus/multus-additional-cni-plugins-6wcbt" Mar 12 13:37:47.000593 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.999867 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-systemd-units\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" Mar 12 13:37:47.000593 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.999889 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9288d2b3-d104-4c0d-8417-6d70d1c70098-cnibin\") pod \"multus-additional-cni-plugins-6wcbt\" (UID: \"9288d2b3-d104-4c0d-8417-6d70d1c70098\") " 
pod="openshift-multus/multus-additional-cni-plugins-6wcbt" Mar 12 13:37:47.000593 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.999907 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hg2b\" (UniqueName: \"kubernetes.io/projected/9288d2b3-d104-4c0d-8417-6d70d1c70098-kube-api-access-5hg2b\") pod \"multus-additional-cni-plugins-6wcbt\" (UID: \"9288d2b3-d104-4c0d-8417-6d70d1c70098\") " pod="openshift-multus/multus-additional-cni-plugins-6wcbt" Mar 12 13:37:47.000593 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.999940 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fljp6\" (UniqueName: \"kubernetes.io/projected/bf0068fb-0001-44e4-9194-485a2618370f-kube-api-access-fljp6\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h" Mar 12 13:37:47.000593 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:46.999990 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-host-kubelet\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" Mar 12 13:37:47.000593 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.000014 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-run-openvswitch\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" Mar 12 13:37:47.000593 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.000035 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-host-run-ovn-kubernetes\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" Mar 12 13:37:47.001340 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.000050 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e5825179-9ef4-493d-9fc5-0868f3084a95-ovnkube-script-lib\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" Mar 12 13:37:47.001340 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.000078 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf969\" (UniqueName: \"kubernetes.io/projected/1aca7d52-35bb-4dbf-8bb1-4a032145c336-kube-api-access-gf969\") pod \"node-ca-p28bf\" (UID: \"1aca7d52-35bb-4dbf-8bb1-4a032145c336\") " pod="openshift-image-registry/node-ca-p28bf" Mar 12 13:37:47.001340 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.000096 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-cnibin\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq" Mar 12 13:37:47.001340 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.000111 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/990c57e8-518a-4bf2-b8ee-a937df89bcb3-socket-dir\") pod \"aws-ebs-csi-driver-node-f9rk6\" (UID: \"990c57e8-518a-4bf2-b8ee-a937df89bcb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9rk6" Mar 12 13:37:47.001340 ip-10-0-142-16 kubenswrapper[2576]: 
I0312 13:37:47.000425 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-z88dg\"" Mar 12 13:37:47.001340 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.000570 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Mar 12 13:37:47.001340 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.000606 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Mar 12 13:37:47.032874 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.032843 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-03-11 13:32:46 +0000 UTC" deadline="2027-09-30 16:08:44.23741271 +0000 UTC" Mar 12 13:37:47.032874 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.032870 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13610h30m57.204545952s" Mar 12 13:37:47.086283 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.086231 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal" event={"ID":"6c70f15327caadb681531ec95696eaee","Type":"ContainerStarted","Data":"a7139f0179e0b8e18742f72c584bb21619ea08c3566b00322d63ca067412edd3"} Mar 12 13:37:47.087535 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.087492 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-16.ec2.internal" event={"ID":"6158045a7611177883f6a632f015315d","Type":"ContainerStarted","Data":"a39f6e216700cca45e7c21891d872003119b4fe705f89ae7ec0fb3cf167f66b5"} Mar 12 13:37:47.089511 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.089490 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Mar 12 13:37:47.100864 
ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.100834 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/990c57e8-518a-4bf2-b8ee-a937df89bcb3-sys-fs\") pod \"aws-ebs-csi-driver-node-f9rk6\" (UID: \"990c57e8-518a-4bf2-b8ee-a937df89bcb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9rk6" Mar 12 13:37:47.101015 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.100869 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bf0068fb-0001-44e4-9194-485a2618370f-etc-sysconfig\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h" Mar 12 13:37:47.101015 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.100896 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bf0068fb-0001-44e4-9194-485a2618370f-etc-sysctl-d\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h" Mar 12 13:37:47.101015 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.100917 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-host-var-lib-cni-multus\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq" Mar 12 13:37:47.101015 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.100941 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-host-var-lib-kubelet\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq" Mar 12 
13:37:47.101015 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.100956 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-multus-daemon-config\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq" Mar 12 13:37:47.101015 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.100971 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9288d2b3-d104-4c0d-8417-6d70d1c70098-system-cni-dir\") pod \"multus-additional-cni-plugins-6wcbt\" (UID: \"9288d2b3-d104-4c0d-8417-6d70d1c70098\") " pod="openshift-multus/multus-additional-cni-plugins-6wcbt" Mar 12 13:37:47.101015 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.100999 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/990c57e8-518a-4bf2-b8ee-a937df89bcb3-sys-fs\") pod \"aws-ebs-csi-driver-node-f9rk6\" (UID: \"990c57e8-518a-4bf2-b8ee-a937df89bcb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9rk6" Mar 12 13:37:47.101303 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.101038 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9288d2b3-d104-4c0d-8417-6d70d1c70098-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6wcbt\" (UID: \"9288d2b3-d104-4c0d-8417-6d70d1c70098\") " pod="openshift-multus/multus-additional-cni-plugins-6wcbt" Mar 12 13:37:47.101303 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.101061 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9288d2b3-d104-4c0d-8417-6d70d1c70098-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-6wcbt\" (UID: \"9288d2b3-d104-4c0d-8417-6d70d1c70098\") " pod="openshift-multus/multus-additional-cni-plugins-6wcbt" Mar 12 13:37:47.101303 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.101092 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-host-var-lib-cni-multus\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq" Mar 12 13:37:47.101303 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.101095 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-host-run-k8s-cni-cncf-io\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq" Mar 12 13:37:47.101303 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.101129 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-host-run-k8s-cni-cncf-io\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq" Mar 12 13:37:47.101303 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.101138 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-log-socket\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" Mar 12 13:37:47.101303 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.101162 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-host-var-lib-kubelet\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq" Mar 12 13:37:47.101303 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.101167 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" Mar 12 13:37:47.101303 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.101199 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a3207bc4-cf48-4af8-b377-bbd83ed2fe47-iptables-alerter-script\") pod \"iptables-alerter-trfvm\" (UID: \"a3207bc4-cf48-4af8-b377-bbd83ed2fe47\") " pod="openshift-network-operator/iptables-alerter-trfvm" Mar 12 13:37:47.101303 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.101227 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drrgf\" (UniqueName: \"kubernetes.io/projected/a3207bc4-cf48-4af8-b377-bbd83ed2fe47-kube-api-access-drrgf\") pod \"iptables-alerter-trfvm\" (UID: \"a3207bc4-cf48-4af8-b377-bbd83ed2fe47\") " pod="openshift-network-operator/iptables-alerter-trfvm" Mar 12 13:37:47.101303 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.101254 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7c9fffc5-b110-46de-8d20-728d29e23748-konnectivity-ca\") pod \"konnectivity-agent-w7qxr\" (UID: \"7c9fffc5-b110-46de-8d20-728d29e23748\") " pod="kube-system/konnectivity-agent-w7qxr" Mar 12 13:37:47.101303 
ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.101282 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/990c57e8-518a-4bf2-b8ee-a937df89bcb3-device-dir\") pod \"aws-ebs-csi-driver-node-f9rk6\" (UID: \"990c57e8-518a-4bf2-b8ee-a937df89bcb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9rk6" Mar 12 13:37:47.101754 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.101320 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bf0068fb-0001-44e4-9194-485a2618370f-etc-modprobe-d\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h" Mar 12 13:37:47.101754 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.101345 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bf0068fb-0001-44e4-9194-485a2618370f-run\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h" Mar 12 13:37:47.101754 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.101369 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bf0068fb-0001-44e4-9194-485a2618370f-lib-modules\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h" Mar 12 13:37:47.101754 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.101396 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-host-var-lib-cni-bin\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq" Mar 12 13:37:47.101754 
ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.101419 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-host-run-multus-certs\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq" Mar 12 13:37:47.101754 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.101450 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ft6pj\" (UniqueName: \"kubernetes.io/projected/990c57e8-518a-4bf2-b8ee-a937df89bcb3-kube-api-access-ft6pj\") pod \"aws-ebs-csi-driver-node-f9rk6\" (UID: \"990c57e8-518a-4bf2-b8ee-a937df89bcb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9rk6" Mar 12 13:37:47.101754 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.101500 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bf0068fb-0001-44e4-9194-485a2618370f-tmp\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h" Mar 12 13:37:47.101754 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.101526 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-var-lib-openvswitch\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" Mar 12 13:37:47.101754 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.101551 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a3207bc4-cf48-4af8-b377-bbd83ed2fe47-host-slash\") pod \"iptables-alerter-trfvm\" (UID: \"a3207bc4-cf48-4af8-b377-bbd83ed2fe47\") " 
pod="openshift-network-operator/iptables-alerter-trfvm" Mar 12 13:37:47.101754 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.101580 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-os-release\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq" Mar 12 13:37:47.101754 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.101636 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/990c57e8-518a-4bf2-b8ee-a937df89bcb3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-f9rk6\" (UID: \"990c57e8-518a-4bf2-b8ee-a937df89bcb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9rk6" Mar 12 13:37:47.101754 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.101685 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bf0068fb-0001-44e4-9194-485a2618370f-sys\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h" Mar 12 13:37:47.101754 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.101707 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-host-run-netns\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" Mar 12 13:37:47.101754 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.101719 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-multus-daemon-config\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " 
pod="openshift-multus/multus-gd9sq"
Mar 12 13:37:47.101754 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.101732 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-run-systemd\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl"
Mar 12 13:37:47.102244 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.101762 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-host-cni-netd\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl"
Mar 12 13:37:47.102244 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.101802 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e5825179-9ef4-493d-9fc5-0868f3084a95-env-overrides\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl"
Mar 12 13:37:47.102244 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.101813 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-host-cni-netd\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl"
Mar 12 13:37:47.102244 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.101854 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9288d2b3-d104-4c0d-8417-6d70d1c70098-os-release\") pod \"multus-additional-cni-plugins-6wcbt\" (UID: \"9288d2b3-d104-4c0d-8417-6d70d1c70098\") " pod="openshift-multus/multus-additional-cni-plugins-6wcbt"
Mar 12 13:37:47.102244 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.101921 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf0068fb-0001-44e4-9194-485a2618370f-etc-kubernetes\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h"
Mar 12 13:37:47.102244 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.101964 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7c9fffc5-b110-46de-8d20-728d29e23748-agent-certs\") pod \"konnectivity-agent-w7qxr\" (UID: \"7c9fffc5-b110-46de-8d20-728d29e23748\") " pod="kube-system/konnectivity-agent-w7qxr"
Mar 12 13:37:47.102244 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.102003 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-host-run-netns\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq"
Mar 12 13:37:47.102244 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.102018 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bf0068fb-0001-44e4-9194-485a2618370f-etc-sysctl-d\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h"
Mar 12 13:37:47.102244 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.102034 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-multus-conf-dir\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq"
Mar 12 13:37:47.102244 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.102053 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-log-socket\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl"
Mar 12 13:37:47.102244 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.102059 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-host-cni-bin\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl"
Mar 12 13:37:47.102244 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.102084 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl"
Mar 12 13:37:47.102244 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.102153 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1aca7d52-35bb-4dbf-8bb1-4a032145c336-host\") pod \"node-ca-p28bf\" (UID: \"1aca7d52-35bb-4dbf-8bb1-4a032145c336\") " pod="openshift-image-registry/node-ca-p28bf"
Mar 12 13:37:47.102841 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.102151 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/990c57e8-518a-4bf2-b8ee-a937df89bcb3-device-dir\") pod \"aws-ebs-csi-driver-node-f9rk6\" (UID: \"990c57e8-518a-4bf2-b8ee-a937df89bcb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9rk6"
Mar 12 13:37:47.102841 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.102521 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bf0068fb-0001-44e4-9194-485a2618370f-var-lib-kubelet\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h"
Mar 12 13:37:47.102841 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.102575 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/05421762-de84-4ca6-bedf-542c55460252-dbus\") pod \"global-pull-secret-syncer-x4rlt\" (UID: \"05421762-de84-4ca6-bedf-542c55460252\") " pod="kube-system/global-pull-secret-syncer-x4rlt"
Mar 12 13:37:47.102841 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.102549 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9288d2b3-d104-4c0d-8417-6d70d1c70098-system-cni-dir\") pod \"multus-additional-cni-plugins-6wcbt\" (UID: \"9288d2b3-d104-4c0d-8417-6d70d1c70098\") " pod="openshift-multus/multus-additional-cni-plugins-6wcbt"
Mar 12 13:37:47.102841 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.102623 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-cni-binary-copy\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq"
Mar 12 13:37:47.102841 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.102764 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e5825179-9ef4-493d-9fc5-0868f3084a95-ovn-node-metrics-cert\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl"
Mar 12 13:37:47.102841 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.102829 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6fbjg\" (UniqueName: \"kubernetes.io/projected/e5825179-9ef4-493d-9fc5-0868f3084a95-kube-api-access-6fbjg\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl"
Mar 12 13:37:47.103145 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.102889 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/990c57e8-518a-4bf2-b8ee-a937df89bcb3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-f9rk6\" (UID: \"990c57e8-518a-4bf2-b8ee-a937df89bcb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9rk6"
Mar 12 13:37:47.103145 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.102867 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/05421762-de84-4ca6-bedf-542c55460252-original-pull-secret\") pod \"global-pull-secret-syncer-x4rlt\" (UID: \"05421762-de84-4ca6-bedf-542c55460252\") " pod="kube-system/global-pull-secret-syncer-x4rlt"
Mar 12 13:37:47.103145 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.102922 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bf0068fb-0001-44e4-9194-485a2618370f-var-lib-kubelet\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h"
Mar 12 13:37:47.103145 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.103006 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bf0068fb-0001-44e4-9194-485a2618370f-run\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h"
Mar 12 13:37:47.103145 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.103091 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-os-release\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq"
Mar 12 13:37:47.103145 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.103113 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-system-cni-dir\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq"
Mar 12 13:37:47.103145 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.103134 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-host-run-netns\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl"
Mar 12 13:37:47.103486 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.103170 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-hostroot\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq"
Mar 12 13:37:47.103486 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.103190 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-system-cni-dir\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq"
Mar 12 13:37:47.103486 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.103204 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/990c57e8-518a-4bf2-b8ee-a937df89bcb3-etc-selinux\") pod \"aws-ebs-csi-driver-node-f9rk6\" (UID: \"990c57e8-518a-4bf2-b8ee-a937df89bcb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9rk6"
Mar 12 13:37:47.103486 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.103260 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-hostroot\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq"
Mar 12 13:37:47.103486 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.103315 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bf0068fb-0001-44e4-9194-485a2618370f-etc-modprobe-d\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h"
Mar 12 13:37:47.103486 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.103324 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf0068fb-0001-44e4-9194-485a2618370f-etc-kubernetes\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h"
Mar 12 13:37:47.103980 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.103959 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9288d2b3-d104-4c0d-8417-6d70d1c70098-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6wcbt\" (UID: \"9288d2b3-d104-4c0d-8417-6d70d1c70098\") " pod="openshift-multus/multus-additional-cni-plugins-6wcbt"
Mar 12 13:37:47.104193 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.104173 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9288d2b3-d104-4c0d-8417-6d70d1c70098-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6wcbt\" (UID: \"9288d2b3-d104-4c0d-8417-6d70d1c70098\") " pod="openshift-multus/multus-additional-cni-plugins-6wcbt"
Mar 12 13:37:47.104266 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.104210 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Mar 12 13:37:47.104506 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.104452 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9288d2b3-d104-4c0d-8417-6d70d1c70098-os-release\") pod \"multus-additional-cni-plugins-6wcbt\" (UID: \"9288d2b3-d104-4c0d-8417-6d70d1c70098\") " pod="openshift-multus/multus-additional-cni-plugins-6wcbt"
Mar 12 13:37:47.104622 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.104553 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-multus-conf-dir\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq"
Mar 12 13:37:47.104622 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.104613 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-host-run-netns\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq"
Mar 12 13:37:47.104722 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.104612 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e5825179-9ef4-493d-9fc5-0868f3084a95-env-overrides\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl"
Mar 12 13:37:47.104722 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.101879 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bf0068fb-0001-44e4-9194-485a2618370f-etc-sysconfig\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h"
Mar 12 13:37:47.104722 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.104688 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-host-cni-bin\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl"
Mar 12 13:37:47.104722 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.104697 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-host-var-lib-cni-bin\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq"
Mar 12 13:37:47.104722 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.104678 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1aca7d52-35bb-4dbf-8bb1-4a032145c336-host\") pod \"node-ca-p28bf\" (UID: \"1aca7d52-35bb-4dbf-8bb1-4a032145c336\") " pod="openshift-image-registry/node-ca-p28bf"
Mar 12 13:37:47.104926 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.104748 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-run-systemd\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl"
Mar 12 13:37:47.104926 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.104823 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-var-lib-openvswitch\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl"
Mar 12 13:37:47.104926 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.104900 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bf0068fb-0001-44e4-9194-485a2618370f-sys\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h"
Mar 12 13:37:47.105052 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.104956 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-node-log\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl"
Mar 12 13:37:47.105052 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.105002 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-multus-cni-dir\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq"
Mar 12 13:37:47.105052 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.105029 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/990c57e8-518a-4bf2-b8ee-a937df89bcb3-registration-dir\") pod \"aws-ebs-csi-driver-node-f9rk6\" (UID: \"990c57e8-518a-4bf2-b8ee-a937df89bcb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9rk6"
Mar 12 13:37:47.105224 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.105059 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf0068fb-0001-44e4-9194-485a2618370f-host\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h"
Mar 12 13:37:47.105224 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.105087 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-etc-openvswitch\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl"
Mar 12 13:37:47.105224 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.105116 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-run-ovn\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl"
Mar 12 13:37:47.105224 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.105146 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/990c57e8-518a-4bf2-b8ee-a937df89bcb3-etc-selinux\") pod \"aws-ebs-csi-driver-node-f9rk6\" (UID: \"990c57e8-518a-4bf2-b8ee-a937df89bcb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9rk6"
Mar 12 13:37:47.105224 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.105149 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/05421762-de84-4ca6-bedf-542c55460252-kubelet-config\") pod \"global-pull-secret-syncer-x4rlt\" (UID: \"05421762-de84-4ca6-bedf-542c55460252\") " pod="kube-system/global-pull-secret-syncer-x4rlt"
Mar 12 13:37:47.105224 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.105212 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztsv5\" (UniqueName: \"kubernetes.io/projected/a2b57a5c-1736-4008-9bcc-41669382f70a-kube-api-access-ztsv5\") pod \"network-check-target-fnfhc\" (UID: \"a2b57a5c-1736-4008-9bcc-41669382f70a\") " pod="openshift-network-diagnostics/network-check-target-fnfhc"
Mar 12 13:37:47.105501 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.105273 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bf0068fb-0001-44e4-9194-485a2618370f-lib-modules\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h"
Mar 12 13:37:47.105501 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.105293 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9288d2b3-d104-4c0d-8417-6d70d1c70098-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6wcbt\" (UID: \"9288d2b3-d104-4c0d-8417-6d70d1c70098\") " pod="openshift-multus/multus-additional-cni-plugins-6wcbt"
Mar 12 13:37:47.105501 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.105419 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-multus-socket-dir-parent\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq"
Mar 12 13:37:47.105501 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.105487 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-cni-binary-copy\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq"
Mar 12 13:37:47.105685 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.105519 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-etc-kubernetes\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq"
Mar 12 13:37:47.105685 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.105551 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bf0068fb-0001-44e4-9194-485a2618370f-etc-systemd\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h"
Mar 12 13:37:47.105685 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.105573 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bf0068fb-0001-44e4-9194-485a2618370f-etc-tuned\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h"
Mar 12 13:37:47.105685 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.105584 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-host-run-multus-certs\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq"
Mar 12 13:37:47.105685 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.105602 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e5825179-9ef4-493d-9fc5-0868f3084a95-ovnkube-config\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl"
Mar 12 13:37:47.105685 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.105646 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9288d2b3-d104-4c0d-8417-6d70d1c70098-cni-binary-copy\") pod \"multus-additional-cni-plugins-6wcbt\" (UID: \"9288d2b3-d104-4c0d-8417-6d70d1c70098\") " pod="openshift-multus/multus-additional-cni-plugins-6wcbt"
Mar 12 13:37:47.105927 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.105685 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-systemd-units\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl"
Mar 12 13:37:47.105927 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.105718 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9288d2b3-d104-4c0d-8417-6d70d1c70098-cnibin\") pod \"multus-additional-cni-plugins-6wcbt\" (UID: \"9288d2b3-d104-4c0d-8417-6d70d1c70098\") " pod="openshift-multus/multus-additional-cni-plugins-6wcbt"
Mar 12 13:37:47.106119 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.106088 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5hg2b\" (UniqueName: \"kubernetes.io/projected/9288d2b3-d104-4c0d-8417-6d70d1c70098-kube-api-access-5hg2b\") pod \"multus-additional-cni-plugins-6wcbt\" (UID: \"9288d2b3-d104-4c0d-8417-6d70d1c70098\") " pod="openshift-multus/multus-additional-cni-plugins-6wcbt"
Mar 12 13:37:47.106418 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.106272 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fljp6\" (UniqueName: \"kubernetes.io/projected/bf0068fb-0001-44e4-9194-485a2618370f-kube-api-access-fljp6\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h"
Mar 12 13:37:47.106418 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.106333 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-host-kubelet\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl"
Mar 12 13:37:47.106418 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.106364 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-run-openvswitch\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl"
Mar 12 13:37:47.106613 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.106394 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e5825179-9ef4-493d-9fc5-0868f3084a95-ovnkube-config\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl"
Mar 12 13:37:47.106613 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.106550 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-multus-cni-dir\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq"
Mar 12 13:37:47.106710 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.106614 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/990c57e8-518a-4bf2-b8ee-a937df89bcb3-registration-dir\") pod \"aws-ebs-csi-driver-node-f9rk6\" (UID: \"990c57e8-518a-4bf2-b8ee-a937df89bcb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9rk6"
Mar 12 13:37:47.106710 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.106678 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-node-log\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl"
Mar 12 13:37:47.106801 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.106729 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bf0068fb-0001-44e4-9194-485a2618370f-host\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h"
Mar 12 13:37:47.106801 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.106783 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-etc-openvswitch\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl"
Mar 12 13:37:47.106898 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.106835 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-run-ovn\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl"
Mar 12 13:37:47.107022 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.106982 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-host-run-ovn-kubernetes\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl"
Mar 12 13:37:47.107184 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.107110 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e5825179-9ef4-493d-9fc5-0868f3084a95-ovnkube-script-lib\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl"
Mar 12 13:37:47.107346 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.107167 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gf969\" (UniqueName: \"kubernetes.io/projected/1aca7d52-35bb-4dbf-8bb1-4a032145c336-kube-api-access-gf969\") pod \"node-ca-p28bf\" (UID: \"1aca7d52-35bb-4dbf-8bb1-4a032145c336\") " pod="openshift-image-registry/node-ca-p28bf"
Mar 12 13:37:47.107346 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.107286 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-cnibin\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq"
Mar 12 13:37:47.107489 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.107391 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9288d2b3-d104-4c0d-8417-6d70d1c70098-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6wcbt\" (UID: \"9288d2b3-d104-4c0d-8417-6d70d1c70098\") " pod="openshift-multus/multus-additional-cni-plugins-6wcbt"
Mar 12 13:37:47.107548 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.107498 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-multus-socket-dir-parent\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq"
Mar 12 13:37:47.107607 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.107590 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bf0068fb-0001-44e4-9194-485a2618370f-etc-systemd\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h"
Mar 12 13:37:47.107777 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.107316 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/990c57e8-518a-4bf2-b8ee-a937df89bcb3-socket-dir\") pod \"aws-ebs-csi-driver-node-f9rk6\" (UID: \"990c57e8-518a-4bf2-b8ee-a937df89bcb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9rk6"
Mar 12 13:37:47.107777 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.107743 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bf0068fb-0001-44e4-9194-485a2618370f-etc-sysctl-conf\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h"
Mar 12 13:37:47.108405 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.107919 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdbk5\" (UniqueName: \"kubernetes.io/projected/37e0d595-d2d4-4caf-8700-ffaeb5c3401f-kube-api-access-kdbk5\") pod \"network-metrics-daemon-wbtg8\" (UID: \"37e0d595-d2d4-4caf-8700-ffaeb5c3401f\") " pod="openshift-multus/network-metrics-daemon-wbtg8"
Mar 12 13:37:47.108405 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.107956 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2dlpd\" (UniqueName: \"kubernetes.io/projected/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-kube-api-access-2dlpd\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq"
Mar 12 13:37:47.108405 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.107987 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-host-slash\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl"
Mar 12 13:37:47.108405 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.107998 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9288d2b3-d104-4c0d-8417-6d70d1c70098-cni-binary-copy\") pod \"multus-additional-cni-plugins-6wcbt\" (UID: \"9288d2b3-d104-4c0d-8417-6d70d1c70098\") " pod="openshift-multus/multus-additional-cni-plugins-6wcbt"
Mar 12 13:37:47.108405 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.108019 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1aca7d52-35bb-4dbf-8bb1-4a032145c336-serviceca\") pod \"node-ca-p28bf\" (UID: \"1aca7d52-35bb-4dbf-8bb1-4a032145c336\") " pod="openshift-image-registry/node-ca-p28bf"
Mar 12 13:37:47.108405 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.108094 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-etc-kubernetes\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq"
Mar 12 13:37:47.108405 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.108099 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-host-kubelet\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl"
Mar 12 13:37:47.108405 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.108150 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/990c57e8-518a-4bf2-b8ee-a937df89bcb3-socket-dir\") pod \"aws-ebs-csi-driver-node-f9rk6\" (UID: \"990c57e8-518a-4bf2-b8ee-a937df89bcb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9rk6"
Mar 12 13:37:47.108405 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.108168 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-systemd-units\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl"
Mar 12 13:37:47.108405 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.108230 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-cnibin\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq"
Mar 12 13:37:47.108932 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.108423 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName:
\"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-run-openvswitch\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" Mar 12 13:37:47.108932 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.108588 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bf0068fb-0001-44e4-9194-485a2618370f-etc-sysctl-conf\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h" Mar 12 13:37:47.108932 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.108676 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37e0d595-d2d4-4caf-8700-ffaeb5c3401f-metrics-certs\") pod \"network-metrics-daemon-wbtg8\" (UID: \"37e0d595-d2d4-4caf-8700-ffaeb5c3401f\") " pod="openshift-multus/network-metrics-daemon-wbtg8" Mar 12 13:37:47.108932 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.108818 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-host-run-ovn-kubernetes\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" Mar 12 13:37:47.108932 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:47.108906 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 13:37:47.109161 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.108909 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9288d2b3-d104-4c0d-8417-6d70d1c70098-cnibin\") pod \"multus-additional-cni-plugins-6wcbt\" (UID: \"9288d2b3-d104-4c0d-8417-6d70d1c70098\") " 
pod="openshift-multus/multus-additional-cni-plugins-6wcbt" Mar 12 13:37:47.109161 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:47.109004 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37e0d595-d2d4-4caf-8700-ffaeb5c3401f-metrics-certs podName:37e0d595-d2d4-4caf-8700-ffaeb5c3401f nodeName:}" failed. No retries permitted until 2026-03-12 13:37:47.608982727 +0000 UTC m=+3.120916747 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/37e0d595-d2d4-4caf-8700-ffaeb5c3401f-metrics-certs") pod "network-metrics-daemon-wbtg8" (UID: "37e0d595-d2d4-4caf-8700-ffaeb5c3401f") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 13:37:47.109418 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.109290 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e5825179-9ef4-493d-9fc5-0868f3084a95-host-slash\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" Mar 12 13:37:47.109418 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.109386 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1aca7d52-35bb-4dbf-8bb1-4a032145c336-serviceca\") pod \"node-ca-p28bf\" (UID: \"1aca7d52-35bb-4dbf-8bb1-4a032145c336\") " pod="openshift-image-registry/node-ca-p28bf" Mar 12 13:37:47.109566 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.109503 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bf0068fb-0001-44e4-9194-485a2618370f-tmp\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h" Mar 12 13:37:47.109619 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.109579 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e5825179-9ef4-493d-9fc5-0868f3084a95-ovnkube-script-lib\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" Mar 12 13:37:47.111378 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.111352 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e5825179-9ef4-493d-9fc5-0868f3084a95-ovn-node-metrics-cert\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" Mar 12 13:37:47.111616 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.111575 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bf0068fb-0001-44e4-9194-485a2618370f-etc-tuned\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h" Mar 12 13:37:47.134251 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.134173 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dlpd\" (UniqueName: \"kubernetes.io/projected/b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39-kube-api-access-2dlpd\") pod \"multus-gd9sq\" (UID: \"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39\") " pod="openshift-multus/multus-gd9sq" Mar 12 13:37:47.134966 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.134889 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf969\" (UniqueName: \"kubernetes.io/projected/1aca7d52-35bb-4dbf-8bb1-4a032145c336-kube-api-access-gf969\") pod \"node-ca-p28bf\" (UID: \"1aca7d52-35bb-4dbf-8bb1-4a032145c336\") " pod="openshift-image-registry/node-ca-p28bf" Mar 12 13:37:47.136344 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.136316 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-fljp6\" (UniqueName: \"kubernetes.io/projected/bf0068fb-0001-44e4-9194-485a2618370f-kube-api-access-fljp6\") pod \"tuned-rjb6h\" (UID: \"bf0068fb-0001-44e4-9194-485a2618370f\") " pod="openshift-cluster-node-tuning-operator/tuned-rjb6h" Mar 12 13:37:47.137106 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.137031 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hg2b\" (UniqueName: \"kubernetes.io/projected/9288d2b3-d104-4c0d-8417-6d70d1c70098-kube-api-access-5hg2b\") pod \"multus-additional-cni-plugins-6wcbt\" (UID: \"9288d2b3-d104-4c0d-8417-6d70d1c70098\") " pod="openshift-multus/multus-additional-cni-plugins-6wcbt" Mar 12 13:37:47.137691 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.137661 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft6pj\" (UniqueName: \"kubernetes.io/projected/990c57e8-518a-4bf2-b8ee-a937df89bcb3-kube-api-access-ft6pj\") pod \"aws-ebs-csi-driver-node-f9rk6\" (UID: \"990c57e8-518a-4bf2-b8ee-a937df89bcb3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9rk6" Mar 12 13:37:47.165246 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.165209 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fbjg\" (UniqueName: \"kubernetes.io/projected/e5825179-9ef4-493d-9fc5-0868f3084a95-kube-api-access-6fbjg\") pod \"ovnkube-node-8qtnl\" (UID: \"e5825179-9ef4-493d-9fc5-0868f3084a95\") " pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" Mar 12 13:37:47.165246 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.165244 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdbk5\" (UniqueName: \"kubernetes.io/projected/37e0d595-d2d4-4caf-8700-ffaeb5c3401f-kube-api-access-kdbk5\") pod \"network-metrics-daemon-wbtg8\" (UID: \"37e0d595-d2d4-4caf-8700-ffaeb5c3401f\") " pod="openshift-multus/network-metrics-daemon-wbtg8" Mar 12 13:37:47.209743 
ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.209693 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/05421762-de84-4ca6-bedf-542c55460252-original-pull-secret\") pod \"global-pull-secret-syncer-x4rlt\" (UID: \"05421762-de84-4ca6-bedf-542c55460252\") " pod="kube-system/global-pull-secret-syncer-x4rlt" Mar 12 13:37:47.209743 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.209751 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/05421762-de84-4ca6-bedf-542c55460252-kubelet-config\") pod \"global-pull-secret-syncer-x4rlt\" (UID: \"05421762-de84-4ca6-bedf-542c55460252\") " pod="kube-system/global-pull-secret-syncer-x4rlt" Mar 12 13:37:47.209987 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.209777 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztsv5\" (UniqueName: \"kubernetes.io/projected/a2b57a5c-1736-4008-9bcc-41669382f70a-kube-api-access-ztsv5\") pod \"network-check-target-fnfhc\" (UID: \"a2b57a5c-1736-4008-9bcc-41669382f70a\") " pod="openshift-network-diagnostics/network-check-target-fnfhc" Mar 12 13:37:47.209987 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.209838 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a3207bc4-cf48-4af8-b377-bbd83ed2fe47-iptables-alerter-script\") pod \"iptables-alerter-trfvm\" (UID: \"a3207bc4-cf48-4af8-b377-bbd83ed2fe47\") " pod="openshift-network-operator/iptables-alerter-trfvm" Mar 12 13:37:47.209987 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:47.209843 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Mar 12 13:37:47.209987 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.209863 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-drrgf\" (UniqueName: \"kubernetes.io/projected/a3207bc4-cf48-4af8-b377-bbd83ed2fe47-kube-api-access-drrgf\") pod \"iptables-alerter-trfvm\" (UID: \"a3207bc4-cf48-4af8-b377-bbd83ed2fe47\") " pod="openshift-network-operator/iptables-alerter-trfvm" Mar 12 13:37:47.209987 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.209888 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7c9fffc5-b110-46de-8d20-728d29e23748-konnectivity-ca\") pod \"konnectivity-agent-w7qxr\" (UID: \"7c9fffc5-b110-46de-8d20-728d29e23748\") " pod="kube-system/konnectivity-agent-w7qxr" Mar 12 13:37:47.209987 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:47.209907 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05421762-de84-4ca6-bedf-542c55460252-original-pull-secret podName:05421762-de84-4ca6-bedf-542c55460252 nodeName:}" failed. No retries permitted until 2026-03-12 13:37:47.709888796 +0000 UTC m=+3.221822815 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/05421762-de84-4ca6-bedf-542c55460252-original-pull-secret") pod "global-pull-secret-syncer-x4rlt" (UID: "05421762-de84-4ca6-bedf-542c55460252") : object "kube-system"/"original-pull-secret" not registered Mar 12 13:37:47.209987 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.209943 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a3207bc4-cf48-4af8-b377-bbd83ed2fe47-host-slash\") pod \"iptables-alerter-trfvm\" (UID: \"a3207bc4-cf48-4af8-b377-bbd83ed2fe47\") " pod="openshift-network-operator/iptables-alerter-trfvm" Mar 12 13:37:47.209987 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.209978 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7c9fffc5-b110-46de-8d20-728d29e23748-agent-certs\") pod \"konnectivity-agent-w7qxr\" (UID: \"7c9fffc5-b110-46de-8d20-728d29e23748\") " pod="kube-system/konnectivity-agent-w7qxr" Mar 12 13:37:47.210318 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.210005 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/05421762-de84-4ca6-bedf-542c55460252-dbus\") pod \"global-pull-secret-syncer-x4rlt\" (UID: \"05421762-de84-4ca6-bedf-542c55460252\") " pod="kube-system/global-pull-secret-syncer-x4rlt" Mar 12 13:37:47.210318 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.210201 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/05421762-de84-4ca6-bedf-542c55460252-dbus\") pod \"global-pull-secret-syncer-x4rlt\" (UID: \"05421762-de84-4ca6-bedf-542c55460252\") " pod="kube-system/global-pull-secret-syncer-x4rlt" Mar 12 13:37:47.210500 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.210447 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a3207bc4-cf48-4af8-b377-bbd83ed2fe47-host-slash\") pod \"iptables-alerter-trfvm\" (UID: \"a3207bc4-cf48-4af8-b377-bbd83ed2fe47\") " pod="openshift-network-operator/iptables-alerter-trfvm" Mar 12 13:37:47.210580 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.210497 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7c9fffc5-b110-46de-8d20-728d29e23748-konnectivity-ca\") pod \"konnectivity-agent-w7qxr\" (UID: \"7c9fffc5-b110-46de-8d20-728d29e23748\") " pod="kube-system/konnectivity-agent-w7qxr" Mar 12 13:37:47.210580 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.210556 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/05421762-de84-4ca6-bedf-542c55460252-kubelet-config\") pod \"global-pull-secret-syncer-x4rlt\" (UID: \"05421762-de84-4ca6-bedf-542c55460252\") " pod="kube-system/global-pull-secret-syncer-x4rlt" Mar 12 13:37:47.211297 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.211232 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a3207bc4-cf48-4af8-b377-bbd83ed2fe47-iptables-alerter-script\") pod \"iptables-alerter-trfvm\" (UID: \"a3207bc4-cf48-4af8-b377-bbd83ed2fe47\") " pod="openshift-network-operator/iptables-alerter-trfvm" Mar 12 13:37:47.213650 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.213623 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7c9fffc5-b110-46de-8d20-728d29e23748-agent-certs\") pod \"konnectivity-agent-w7qxr\" (UID: \"7c9fffc5-b110-46de-8d20-728d29e23748\") " pod="kube-system/konnectivity-agent-w7qxr" Mar 12 13:37:47.248224 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:47.248190 2576 projected.go:289] 
Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 13:37:47.248224 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:47.248227 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 13:37:47.248495 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:47.248241 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ztsv5 for pod openshift-network-diagnostics/network-check-target-fnfhc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 13:37:47.248495 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:47.248369 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a2b57a5c-1736-4008-9bcc-41669382f70a-kube-api-access-ztsv5 podName:a2b57a5c-1736-4008-9bcc-41669382f70a nodeName:}" failed. No retries permitted until 2026-03-12 13:37:47.748342083 +0000 UTC m=+3.260276104 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ztsv5" (UniqueName: "kubernetes.io/projected/a2b57a5c-1736-4008-9bcc-41669382f70a-kube-api-access-ztsv5") pod "network-check-target-fnfhc" (UID: "a2b57a5c-1736-4008-9bcc-41669382f70a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 13:37:47.250267 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.250239 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-drrgf\" (UniqueName: \"kubernetes.io/projected/a3207bc4-cf48-4af8-b377-bbd83ed2fe47-kube-api-access-drrgf\") pod \"iptables-alerter-trfvm\" (UID: \"a3207bc4-cf48-4af8-b377-bbd83ed2fe47\") " pod="openshift-network-operator/iptables-alerter-trfvm" Mar 12 13:37:47.282645 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.282608 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6wcbt" Mar 12 13:37:47.290448 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.290417 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-rjb6h" Mar 12 13:37:47.291511 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:47.291450 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9288d2b3_d104_4c0d_8417_6d70d1c70098.slice/crio-fdb20bd80c537756763da55f280acd828666ada97ab75d11669a3c2b7a884cac WatchSource:0}: Error finding container fdb20bd80c537756763da55f280acd828666ada97ab75d11669a3c2b7a884cac: Status 404 returned error can't find the container with id fdb20bd80c537756763da55f280acd828666ada97ab75d11669a3c2b7a884cac Mar 12 13:37:47.298055 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:47.298028 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf0068fb_0001_44e4_9194_485a2618370f.slice/crio-0a3658d1df4d503807763032ca08453b34e65b38680bdd29556e5ad3c0cbbab6 WatchSource:0}: Error finding container 0a3658d1df4d503807763032ca08453b34e65b38680bdd29556e5ad3c0cbbab6: Status 404 returned error can't find the container with id 0a3658d1df4d503807763032ca08453b34e65b38680bdd29556e5ad3c0cbbab6 Mar 12 13:37:47.299432 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.299402 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-gd9sq" Mar 12 13:37:47.303788 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.303758 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" Mar 12 13:37:47.307225 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:47.307198 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3dfbf90_71a8_4aa2_a65c_3f9e522e8d39.slice/crio-26bd6425887d20c0acba4b356e5911a818ca67008289ae6bdfb7d7960c491f1b WatchSource:0}: Error finding container 26bd6425887d20c0acba4b356e5911a818ca67008289ae6bdfb7d7960c491f1b: Status 404 returned error can't find the container with id 26bd6425887d20c0acba4b356e5911a818ca67008289ae6bdfb7d7960c491f1b Mar 12 13:37:47.308754 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.308722 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-p28bf" Mar 12 13:37:47.314187 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.314169 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9rk6" Mar 12 13:37:47.314682 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:47.314653 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5825179_9ef4_493d_9fc5_0868f3084a95.slice/crio-9726f82246f3cc61ae71e9596ebd4caab4e8ccbdb20317c1e8e91f41e04d96c4 WatchSource:0}: Error finding container 9726f82246f3cc61ae71e9596ebd4caab4e8ccbdb20317c1e8e91f41e04d96c4: Status 404 returned error can't find the container with id 9726f82246f3cc61ae71e9596ebd4caab4e8ccbdb20317c1e8e91f41e04d96c4 Mar 12 13:37:47.318747 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:47.318719 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1aca7d52_35bb_4dbf_8bb1_4a032145c336.slice/crio-2fcbcd31b8c38fb7e87b6dd852fb52c5399a7818c49967cace700ad0e366b964 WatchSource:0}: Error finding container 
2fcbcd31b8c38fb7e87b6dd852fb52c5399a7818c49967cace700ad0e366b964: Status 404 returned error can't find the container with id 2fcbcd31b8c38fb7e87b6dd852fb52c5399a7818c49967cace700ad0e366b964 Mar 12 13:37:47.320039 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.320017 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-trfvm" Mar 12 13:37:47.324728 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.324471 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-w7qxr" Mar 12 13:37:47.325518 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:47.325480 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod990c57e8_518a_4bf2_b8ee_a937df89bcb3.slice/crio-a255cd113145f25ddb3f1f64ba6fb5191eed0d8b0a136dd20dcdefb5f74185be WatchSource:0}: Error finding container a255cd113145f25ddb3f1f64ba6fb5191eed0d8b0a136dd20dcdefb5f74185be: Status 404 returned error can't find the container with id a255cd113145f25ddb3f1f64ba6fb5191eed0d8b0a136dd20dcdefb5f74185be Mar 12 13:37:47.329717 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:47.329544 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3207bc4_cf48_4af8_b377_bbd83ed2fe47.slice/crio-c098d122d74ac1a22513d8e92e88e4648190557a01f419e926da4a3d12ed6782 WatchSource:0}: Error finding container c098d122d74ac1a22513d8e92e88e4648190557a01f419e926da4a3d12ed6782: Status 404 returned error can't find the container with id c098d122d74ac1a22513d8e92e88e4648190557a01f419e926da4a3d12ed6782 Mar 12 13:37:47.332437 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:37:47.332402 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c9fffc5_b110_46de_8d20_728d29e23748.slice/crio-352725c3cd134066170d75868027cc8c3ac767ad6cc35cdc9cca59e28429439c WatchSource:0}: Error finding container 352725c3cd134066170d75868027cc8c3ac767ad6cc35cdc9cca59e28429439c: Status 404 returned error can't find the container with id 352725c3cd134066170d75868027cc8c3ac767ad6cc35cdc9cca59e28429439c Mar 12 13:37:47.613268 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.613172 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37e0d595-d2d4-4caf-8700-ffaeb5c3401f-metrics-certs\") pod \"network-metrics-daemon-wbtg8\" (UID: \"37e0d595-d2d4-4caf-8700-ffaeb5c3401f\") " pod="openshift-multus/network-metrics-daemon-wbtg8" Mar 12 13:37:47.613422 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:47.613313 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 13:37:47.613422 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:47.613387 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37e0d595-d2d4-4caf-8700-ffaeb5c3401f-metrics-certs podName:37e0d595-d2d4-4caf-8700-ffaeb5c3401f nodeName:}" failed. No retries permitted until 2026-03-12 13:37:48.613369877 +0000 UTC m=+4.125303898 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/37e0d595-d2d4-4caf-8700-ffaeb5c3401f-metrics-certs") pod "network-metrics-daemon-wbtg8" (UID: "37e0d595-d2d4-4caf-8700-ffaeb5c3401f") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 13:37:47.714171 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.714130 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/05421762-de84-4ca6-bedf-542c55460252-original-pull-secret\") pod \"global-pull-secret-syncer-x4rlt\" (UID: \"05421762-de84-4ca6-bedf-542c55460252\") " pod="kube-system/global-pull-secret-syncer-x4rlt" Mar 12 13:37:47.714338 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:47.714291 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Mar 12 13:37:47.714379 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:47.714361 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05421762-de84-4ca6-bedf-542c55460252-original-pull-secret podName:05421762-de84-4ca6-bedf-542c55460252 nodeName:}" failed. No retries permitted until 2026-03-12 13:37:48.714346241 +0000 UTC m=+4.226280250 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/05421762-de84-4ca6-bedf-542c55460252-original-pull-secret") pod "global-pull-secret-syncer-x4rlt" (UID: "05421762-de84-4ca6-bedf-542c55460252") : object "kube-system"/"original-pull-secret" not registered Mar 12 13:37:47.814997 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:47.814947 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztsv5\" (UniqueName: \"kubernetes.io/projected/a2b57a5c-1736-4008-9bcc-41669382f70a-kube-api-access-ztsv5\") pod \"network-check-target-fnfhc\" (UID: \"a2b57a5c-1736-4008-9bcc-41669382f70a\") " pod="openshift-network-diagnostics/network-check-target-fnfhc" Mar 12 13:37:47.815192 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:47.815150 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 13:37:47.815192 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:47.815177 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 13:37:47.815192 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:47.815188 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ztsv5 for pod openshift-network-diagnostics/network-check-target-fnfhc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 13:37:47.815326 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:47.815245 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a2b57a5c-1736-4008-9bcc-41669382f70a-kube-api-access-ztsv5 podName:a2b57a5c-1736-4008-9bcc-41669382f70a nodeName:}" failed. 
No retries permitted until 2026-03-12 13:37:48.815230113 +0000 UTC m=+4.327164119 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-ztsv5" (UniqueName: "kubernetes.io/projected/a2b57a5c-1736-4008-9bcc-41669382f70a-kube-api-access-ztsv5") pod "network-check-target-fnfhc" (UID: "a2b57a5c-1736-4008-9bcc-41669382f70a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 12 13:37:48.033249 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:48.033193 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-03-11 13:32:46 +0000 UTC" deadline="2027-11-03 13:52:40.420478322 +0000 UTC"
Mar 12 13:37:48.033249 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:48.033227 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14424h14m52.38725418s"
Mar 12 13:37:48.091618 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:48.091271 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcbt" event={"ID":"9288d2b3-d104-4c0d-8417-6d70d1c70098","Type":"ContainerStarted","Data":"fdb20bd80c537756763da55f280acd828666ada97ab75d11669a3c2b7a884cac"}
Mar 12 13:37:48.092808 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:48.092774 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-w7qxr" event={"ID":"7c9fffc5-b110-46de-8d20-728d29e23748","Type":"ContainerStarted","Data":"352725c3cd134066170d75868027cc8c3ac767ad6cc35cdc9cca59e28429439c"}
Mar 12 13:37:48.094932 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:48.094888 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-trfvm" event={"ID":"a3207bc4-cf48-4af8-b377-bbd83ed2fe47","Type":"ContainerStarted","Data":"c098d122d74ac1a22513d8e92e88e4648190557a01f419e926da4a3d12ed6782"}
Mar 12 13:37:48.096155 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:48.096105 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gd9sq" event={"ID":"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39","Type":"ContainerStarted","Data":"26bd6425887d20c0acba4b356e5911a818ca67008289ae6bdfb7d7960c491f1b"}
Mar 12 13:37:48.097426 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:48.097389 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-rjb6h" event={"ID":"bf0068fb-0001-44e4-9194-485a2618370f","Type":"ContainerStarted","Data":"0a3658d1df4d503807763032ca08453b34e65b38680bdd29556e5ad3c0cbbab6"}
Mar 12 13:37:48.099380 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:48.099354 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-16.ec2.internal" event={"ID":"6158045a7611177883f6a632f015315d","Type":"ContainerStarted","Data":"492d0f05efbcc1c302f576cebe9535f68af6729fef676fb7ed7729014dc6bc3a"}
Mar 12 13:37:48.101873 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:48.101802 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9rk6" event={"ID":"990c57e8-518a-4bf2-b8ee-a937df89bcb3","Type":"ContainerStarted","Data":"a255cd113145f25ddb3f1f64ba6fb5191eed0d8b0a136dd20dcdefb5f74185be"}
Mar 12 13:37:48.104015 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:48.103990 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-p28bf" event={"ID":"1aca7d52-35bb-4dbf-8bb1-4a032145c336","Type":"ContainerStarted","Data":"2fcbcd31b8c38fb7e87b6dd852fb52c5399a7818c49967cace700ad0e366b964"}
Mar 12 13:37:48.105262 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:48.105234 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" event={"ID":"e5825179-9ef4-493d-9fc5-0868f3084a95","Type":"ContainerStarted","Data":"9726f82246f3cc61ae71e9596ebd4caab4e8ccbdb20317c1e8e91f41e04d96c4"}
Mar 12 13:37:48.118066 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:48.118001 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-16.ec2.internal" podStartSLOduration=2.117972927 podStartE2EDuration="2.117972927s" podCreationTimestamp="2026-03-12 13:37:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:37:48.116684699 +0000 UTC m=+3.628618723" watchObservedRunningTime="2026-03-12 13:37:48.117972927 +0000 UTC m=+3.629906957"
Mar 12 13:37:48.623076 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:48.623029 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37e0d595-d2d4-4caf-8700-ffaeb5c3401f-metrics-certs\") pod \"network-metrics-daemon-wbtg8\" (UID: \"37e0d595-d2d4-4caf-8700-ffaeb5c3401f\") " pod="openshift-multus/network-metrics-daemon-wbtg8"
Mar 12 13:37:48.623291 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:48.623242 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 12 13:37:48.623364 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:48.623311 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37e0d595-d2d4-4caf-8700-ffaeb5c3401f-metrics-certs podName:37e0d595-d2d4-4caf-8700-ffaeb5c3401f nodeName:}" failed. No retries permitted until 2026-03-12 13:37:50.623290349 +0000 UTC m=+6.135224353 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/37e0d595-d2d4-4caf-8700-ffaeb5c3401f-metrics-certs") pod "network-metrics-daemon-wbtg8" (UID: "37e0d595-d2d4-4caf-8700-ffaeb5c3401f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 12 13:37:48.724034 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:48.723986 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/05421762-de84-4ca6-bedf-542c55460252-original-pull-secret\") pod \"global-pull-secret-syncer-x4rlt\" (UID: \"05421762-de84-4ca6-bedf-542c55460252\") " pod="kube-system/global-pull-secret-syncer-x4rlt"
Mar 12 13:37:48.724678 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:48.724214 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Mar 12 13:37:48.724678 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:48.724284 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05421762-de84-4ca6-bedf-542c55460252-original-pull-secret podName:05421762-de84-4ca6-bedf-542c55460252 nodeName:}" failed. No retries permitted until 2026-03-12 13:37:50.724265899 +0000 UTC m=+6.236199908 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/05421762-de84-4ca6-bedf-542c55460252-original-pull-secret") pod "global-pull-secret-syncer-x4rlt" (UID: "05421762-de84-4ca6-bedf-542c55460252") : object "kube-system"/"original-pull-secret" not registered
Mar 12 13:37:48.825835 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:48.825433 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztsv5\" (UniqueName: \"kubernetes.io/projected/a2b57a5c-1736-4008-9bcc-41669382f70a-kube-api-access-ztsv5\") pod \"network-check-target-fnfhc\" (UID: \"a2b57a5c-1736-4008-9bcc-41669382f70a\") " pod="openshift-network-diagnostics/network-check-target-fnfhc"
Mar 12 13:37:48.825835 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:48.825708 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 12 13:37:48.825835 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:48.825727 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 12 13:37:48.825835 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:48.825740 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ztsv5 for pod openshift-network-diagnostics/network-check-target-fnfhc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 12 13:37:48.825835 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:48.825802 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a2b57a5c-1736-4008-9bcc-41669382f70a-kube-api-access-ztsv5 podName:a2b57a5c-1736-4008-9bcc-41669382f70a nodeName:}" failed. No retries permitted until 2026-03-12 13:37:50.825782702 +0000 UTC m=+6.337716711 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-ztsv5" (UniqueName: "kubernetes.io/projected/a2b57a5c-1736-4008-9bcc-41669382f70a-kube-api-access-ztsv5") pod "network-check-target-fnfhc" (UID: "a2b57a5c-1736-4008-9bcc-41669382f70a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 12 13:37:49.084731 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:49.081431 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wbtg8"
Mar 12 13:37:49.084731 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:49.081603 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wbtg8" podUID="37e0d595-d2d4-4caf-8700-ffaeb5c3401f"
Mar 12 13:37:49.084731 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:49.084506 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x4rlt"
Mar 12 13:37:49.084731 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:49.084626 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x4rlt" podUID="05421762-de84-4ca6-bedf-542c55460252"
Mar 12 13:37:49.088192 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:49.086276 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fnfhc"
Mar 12 13:37:49.088192 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:49.086419 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fnfhc" podUID="a2b57a5c-1736-4008-9bcc-41669382f70a"
Mar 12 13:37:50.641836 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:50.641800 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37e0d595-d2d4-4caf-8700-ffaeb5c3401f-metrics-certs\") pod \"network-metrics-daemon-wbtg8\" (UID: \"37e0d595-d2d4-4caf-8700-ffaeb5c3401f\") " pod="openshift-multus/network-metrics-daemon-wbtg8"
Mar 12 13:37:50.642306 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:50.641946 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 12 13:37:50.642306 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:50.642009 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37e0d595-d2d4-4caf-8700-ffaeb5c3401f-metrics-certs podName:37e0d595-d2d4-4caf-8700-ffaeb5c3401f nodeName:}" failed. No retries permitted until 2026-03-12 13:37:54.641989413 +0000 UTC m=+10.153923433 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/37e0d595-d2d4-4caf-8700-ffaeb5c3401f-metrics-certs") pod "network-metrics-daemon-wbtg8" (UID: "37e0d595-d2d4-4caf-8700-ffaeb5c3401f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 12 13:37:50.742819 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:50.742780 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/05421762-de84-4ca6-bedf-542c55460252-original-pull-secret\") pod \"global-pull-secret-syncer-x4rlt\" (UID: \"05421762-de84-4ca6-bedf-542c55460252\") " pod="kube-system/global-pull-secret-syncer-x4rlt"
Mar 12 13:37:50.742992 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:50.742926 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Mar 12 13:37:50.742992 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:50.742990 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05421762-de84-4ca6-bedf-542c55460252-original-pull-secret podName:05421762-de84-4ca6-bedf-542c55460252 nodeName:}" failed. No retries permitted until 2026-03-12 13:37:54.742973405 +0000 UTC m=+10.254907424 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/05421762-de84-4ca6-bedf-542c55460252-original-pull-secret") pod "global-pull-secret-syncer-x4rlt" (UID: "05421762-de84-4ca6-bedf-542c55460252") : object "kube-system"/"original-pull-secret" not registered
Mar 12 13:37:50.843899 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:50.843857 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztsv5\" (UniqueName: \"kubernetes.io/projected/a2b57a5c-1736-4008-9bcc-41669382f70a-kube-api-access-ztsv5\") pod \"network-check-target-fnfhc\" (UID: \"a2b57a5c-1736-4008-9bcc-41669382f70a\") " pod="openshift-network-diagnostics/network-check-target-fnfhc"
Mar 12 13:37:50.844083 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:50.844064 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 12 13:37:50.844124 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:50.844085 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 12 13:37:50.844124 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:50.844099 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ztsv5 for pod openshift-network-diagnostics/network-check-target-fnfhc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 12 13:37:50.844185 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:50.844170 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a2b57a5c-1736-4008-9bcc-41669382f70a-kube-api-access-ztsv5 podName:a2b57a5c-1736-4008-9bcc-41669382f70a nodeName:}" failed. No retries permitted until 2026-03-12 13:37:54.844142776 +0000 UTC m=+10.356076800 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-ztsv5" (UniqueName: "kubernetes.io/projected/a2b57a5c-1736-4008-9bcc-41669382f70a-kube-api-access-ztsv5") pod "network-check-target-fnfhc" (UID: "a2b57a5c-1736-4008-9bcc-41669382f70a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 12 13:37:51.083565 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:51.082846 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fnfhc"
Mar 12 13:37:51.083565 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:51.082985 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fnfhc" podUID="a2b57a5c-1736-4008-9bcc-41669382f70a"
Mar 12 13:37:51.083565 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:51.083402 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wbtg8"
Mar 12 13:37:51.083565 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:51.083536 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wbtg8" podUID="37e0d595-d2d4-4caf-8700-ffaeb5c3401f"
Mar 12 13:37:51.083890 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:51.083626 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x4rlt"
Mar 12 13:37:51.083890 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:51.083727 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x4rlt" podUID="05421762-de84-4ca6-bedf-542c55460252"
Mar 12 13:37:53.082058 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:53.082020 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fnfhc"
Mar 12 13:37:53.082576 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:53.082156 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fnfhc" podUID="a2b57a5c-1736-4008-9bcc-41669382f70a"
Mar 12 13:37:53.082658 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:53.082640 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wbtg8"
Mar 12 13:37:53.082786 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:53.082766 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wbtg8" podUID="37e0d595-d2d4-4caf-8700-ffaeb5c3401f"
Mar 12 13:37:53.082971 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:53.082942 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x4rlt"
Mar 12 13:37:53.083091 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:53.083060 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x4rlt" podUID="05421762-de84-4ca6-bedf-542c55460252"
Mar 12 13:37:54.674936 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:54.674893 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37e0d595-d2d4-4caf-8700-ffaeb5c3401f-metrics-certs\") pod \"network-metrics-daemon-wbtg8\" (UID: \"37e0d595-d2d4-4caf-8700-ffaeb5c3401f\") " pod="openshift-multus/network-metrics-daemon-wbtg8"
Mar 12 13:37:54.675403 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:54.675059 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 12 13:37:54.675403 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:54.675130 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37e0d595-d2d4-4caf-8700-ffaeb5c3401f-metrics-certs podName:37e0d595-d2d4-4caf-8700-ffaeb5c3401f nodeName:}" failed. No retries permitted until 2026-03-12 13:38:02.675108607 +0000 UTC m=+18.187042636 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/37e0d595-d2d4-4caf-8700-ffaeb5c3401f-metrics-certs") pod "network-metrics-daemon-wbtg8" (UID: "37e0d595-d2d4-4caf-8700-ffaeb5c3401f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 12 13:37:54.775759 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:54.775724 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/05421762-de84-4ca6-bedf-542c55460252-original-pull-secret\") pod \"global-pull-secret-syncer-x4rlt\" (UID: \"05421762-de84-4ca6-bedf-542c55460252\") " pod="kube-system/global-pull-secret-syncer-x4rlt"
Mar 12 13:37:54.775999 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:54.775900 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Mar 12 13:37:54.775999 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:54.775976 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05421762-de84-4ca6-bedf-542c55460252-original-pull-secret podName:05421762-de84-4ca6-bedf-542c55460252 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:02.775955629 +0000 UTC m=+18.287889639 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/05421762-de84-4ca6-bedf-542c55460252-original-pull-secret") pod "global-pull-secret-syncer-x4rlt" (UID: "05421762-de84-4ca6-bedf-542c55460252") : object "kube-system"/"original-pull-secret" not registered
Mar 12 13:37:54.876479 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:54.876364 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztsv5\" (UniqueName: \"kubernetes.io/projected/a2b57a5c-1736-4008-9bcc-41669382f70a-kube-api-access-ztsv5\") pod \"network-check-target-fnfhc\" (UID: \"a2b57a5c-1736-4008-9bcc-41669382f70a\") " pod="openshift-network-diagnostics/network-check-target-fnfhc"
Mar 12 13:37:54.876690 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:54.876621 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 12 13:37:54.876690 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:54.876646 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 12 13:37:54.876690 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:54.876660 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ztsv5 for pod openshift-network-diagnostics/network-check-target-fnfhc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 12 13:37:54.877152 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:54.876732 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a2b57a5c-1736-4008-9bcc-41669382f70a-kube-api-access-ztsv5 podName:a2b57a5c-1736-4008-9bcc-41669382f70a nodeName:}" failed. No retries permitted until 2026-03-12 13:38:02.876710344 +0000 UTC m=+18.388644351 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-ztsv5" (UniqueName: "kubernetes.io/projected/a2b57a5c-1736-4008-9bcc-41669382f70a-kube-api-access-ztsv5") pod "network-check-target-fnfhc" (UID: "a2b57a5c-1736-4008-9bcc-41669382f70a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 12 13:37:55.082328 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:55.082232 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fnfhc"
Mar 12 13:37:55.082511 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:55.082334 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fnfhc" podUID="a2b57a5c-1736-4008-9bcc-41669382f70a"
Mar 12 13:37:55.082511 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:55.082418 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wbtg8"
Mar 12 13:37:55.082628 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:55.082550 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wbtg8" podUID="37e0d595-d2d4-4caf-8700-ffaeb5c3401f"
Mar 12 13:37:55.082628 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:55.082593 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x4rlt"
Mar 12 13:37:55.082762 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:55.082657 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x4rlt" podUID="05421762-de84-4ca6-bedf-542c55460252"
Mar 12 13:37:57.084966 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:57.084923 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wbtg8"
Mar 12 13:37:57.085391 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:57.084991 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fnfhc"
Mar 12 13:37:57.085391 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:57.085113 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wbtg8" podUID="37e0d595-d2d4-4caf-8700-ffaeb5c3401f"
Mar 12 13:37:57.085623 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:57.085579 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x4rlt"
Mar 12 13:37:57.088968 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:57.085715 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x4rlt" podUID="05421762-de84-4ca6-bedf-542c55460252"
Mar 12 13:37:57.088968 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:57.085841 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fnfhc" podUID="a2b57a5c-1736-4008-9bcc-41669382f70a"
Mar 12 13:37:59.081419 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:59.081379 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fnfhc"
Mar 12 13:37:59.081419 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:59.081407 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x4rlt"
Mar 12 13:37:59.081942 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:37:59.081430 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wbtg8"
Mar 12 13:37:59.081942 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:59.081532 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fnfhc" podUID="a2b57a5c-1736-4008-9bcc-41669382f70a"
Mar 12 13:37:59.081942 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:59.081697 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x4rlt" podUID="05421762-de84-4ca6-bedf-542c55460252"
Mar 12 13:37:59.081942 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:37:59.081796 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wbtg8" podUID="37e0d595-d2d4-4caf-8700-ffaeb5c3401f"
Mar 12 13:38:01.084284 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:01.084258 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fnfhc"
Mar 12 13:38:01.084777 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:01.084372 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wbtg8"
Mar 12 13:38:01.084777 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:01.084380 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fnfhc" podUID="a2b57a5c-1736-4008-9bcc-41669382f70a"
Mar 12 13:38:01.084777 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:01.084519 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wbtg8" podUID="37e0d595-d2d4-4caf-8700-ffaeb5c3401f"
Mar 12 13:38:01.084777 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:01.084573 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x4rlt"
Mar 12 13:38:01.084777 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:01.084655 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x4rlt" podUID="05421762-de84-4ca6-bedf-542c55460252"
Mar 12 13:38:02.738226 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:02.738179 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37e0d595-d2d4-4caf-8700-ffaeb5c3401f-metrics-certs\") pod \"network-metrics-daemon-wbtg8\" (UID: \"37e0d595-d2d4-4caf-8700-ffaeb5c3401f\") " pod="openshift-multus/network-metrics-daemon-wbtg8"
Mar 12 13:38:02.738742 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:02.738337 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 12 13:38:02.738742 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:02.738412 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37e0d595-d2d4-4caf-8700-ffaeb5c3401f-metrics-certs podName:37e0d595-d2d4-4caf-8700-ffaeb5c3401f nodeName:}" failed. No retries permitted until 2026-03-12 13:38:18.738393378 +0000 UTC m=+34.250327406 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/37e0d595-d2d4-4caf-8700-ffaeb5c3401f-metrics-certs") pod "network-metrics-daemon-wbtg8" (UID: "37e0d595-d2d4-4caf-8700-ffaeb5c3401f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 12 13:38:02.839312 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:02.839272 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/05421762-de84-4ca6-bedf-542c55460252-original-pull-secret\") pod \"global-pull-secret-syncer-x4rlt\" (UID: \"05421762-de84-4ca6-bedf-542c55460252\") " pod="kube-system/global-pull-secret-syncer-x4rlt"
Mar 12 13:38:02.839525 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:02.839444 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Mar 12 13:38:02.839600 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:02.839549 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05421762-de84-4ca6-bedf-542c55460252-original-pull-secret podName:05421762-de84-4ca6-bedf-542c55460252 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:18.839527657 +0000 UTC m=+34.351461662 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/05421762-de84-4ca6-bedf-542c55460252-original-pull-secret") pod "global-pull-secret-syncer-x4rlt" (UID: "05421762-de84-4ca6-bedf-542c55460252") : object "kube-system"/"original-pull-secret" not registered
Mar 12 13:38:02.940261 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:02.940221 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztsv5\" (UniqueName: \"kubernetes.io/projected/a2b57a5c-1736-4008-9bcc-41669382f70a-kube-api-access-ztsv5\") pod \"network-check-target-fnfhc\" (UID: \"a2b57a5c-1736-4008-9bcc-41669382f70a\") " pod="openshift-network-diagnostics/network-check-target-fnfhc"
Mar 12 13:38:02.940434 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:02.940355 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 12 13:38:02.940434 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:02.940375 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 12 13:38:02.940434 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:02.940387 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ztsv5 for pod openshift-network-diagnostics/network-check-target-fnfhc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 12 13:38:02.940571 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:02.940439 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a2b57a5c-1736-4008-9bcc-41669382f70a-kube-api-access-ztsv5 podName:a2b57a5c-1736-4008-9bcc-41669382f70a nodeName:}" failed. No retries permitted until 2026-03-12 13:38:18.940426214 +0000 UTC m=+34.452360218 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-ztsv5" (UniqueName: "kubernetes.io/projected/a2b57a5c-1736-4008-9bcc-41669382f70a-kube-api-access-ztsv5") pod "network-check-target-fnfhc" (UID: "a2b57a5c-1736-4008-9bcc-41669382f70a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 12 13:38:03.084198 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:03.084121 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wbtg8"
Mar 12 13:38:03.084198 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:03.084152 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x4rlt"
Mar 12 13:38:03.084410 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:03.084118 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fnfhc"
Mar 12 13:38:03.084410 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:03.084246 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wbtg8" podUID="37e0d595-d2d4-4caf-8700-ffaeb5c3401f"
Mar 12 13:38:03.084410 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:03.084320 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fnfhc" podUID="a2b57a5c-1736-4008-9bcc-41669382f70a" Mar 12 13:38:03.084410 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:03.084395 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x4rlt" podUID="05421762-de84-4ca6-bedf-542c55460252" Mar 12 13:38:05.082300 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:05.082261 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fnfhc" Mar 12 13:38:05.082906 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:05.082398 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wbtg8" Mar 12 13:38:05.082906 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:05.082402 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fnfhc" podUID="a2b57a5c-1736-4008-9bcc-41669382f70a" Mar 12 13:38:05.082906 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:05.082488 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-x4rlt" Mar 12 13:38:05.082906 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:05.082596 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x4rlt" podUID="05421762-de84-4ca6-bedf-542c55460252" Mar 12 13:38:05.082906 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:05.082690 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wbtg8" podUID="37e0d595-d2d4-4caf-8700-ffaeb5c3401f" Mar 12 13:38:06.142450 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:06.142189 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gd9sq" event={"ID":"b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39","Type":"ContainerStarted","Data":"eaee507ad71c00bde6e2dba2e6b31a4b413806afff3d151bad84fd8a2b8fba36"} Mar 12 13:38:06.144017 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:06.143962 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-rjb6h" event={"ID":"bf0068fb-0001-44e4-9194-485a2618370f","Type":"ContainerStarted","Data":"7837be8c570b416ad20a733a9fa75cca25dfa511f8eee3776870927074161411"} Mar 12 13:38:06.146561 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:06.146496 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" 
event={"ID":"e5825179-9ef4-493d-9fc5-0868f3084a95","Type":"ContainerStarted","Data":"51a90150188586960b96bf5455b706c5fdf75bbf84fd8aff008691a54422f554"} Mar 12 13:38:06.146561 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:06.146529 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" event={"ID":"e5825179-9ef4-493d-9fc5-0868f3084a95","Type":"ContainerStarted","Data":"22cd6ba3280fe5d9397b9454c5cf58cc861bc3fbadf478e96ea34b8f4da4ebe8"} Mar 12 13:38:06.146561 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:06.146543 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" event={"ID":"e5825179-9ef4-493d-9fc5-0868f3084a95","Type":"ContainerStarted","Data":"5c9b4c982e7830e14984d37f47004277cd97c2fcf4a535199f3aad96b3285197"} Mar 12 13:38:06.162406 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:06.162359 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-gd9sq" podStartSLOduration=2.491863342 podStartE2EDuration="21.162344708s" podCreationTimestamp="2026-03-12 13:37:45 +0000 UTC" firstStartedPulling="2026-03-12 13:37:47.309156003 +0000 UTC m=+2.821090015" lastFinishedPulling="2026-03-12 13:38:05.979637377 +0000 UTC m=+21.491571381" observedRunningTime="2026-03-12 13:38:06.161849548 +0000 UTC m=+21.673783576" watchObservedRunningTime="2026-03-12 13:38:06.162344708 +0000 UTC m=+21.674278734" Mar 12 13:38:06.181362 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:06.181316 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-rjb6h" podStartSLOduration=2.729362841 podStartE2EDuration="21.181299488s" podCreationTimestamp="2026-03-12 13:37:45 +0000 UTC" firstStartedPulling="2026-03-12 13:37:47.299958429 +0000 UTC m=+2.811892442" lastFinishedPulling="2026-03-12 13:38:05.751895069 +0000 UTC m=+21.263829089" observedRunningTime="2026-03-12 13:38:06.181218893 +0000 
UTC m=+21.693152919" watchObservedRunningTime="2026-03-12 13:38:06.181299488 +0000 UTC m=+21.693233516" Mar 12 13:38:07.084776 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:07.084745 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wbtg8" Mar 12 13:38:07.084944 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:07.084879 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wbtg8" podUID="37e0d595-d2d4-4caf-8700-ffaeb5c3401f" Mar 12 13:38:07.085291 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:07.085272 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x4rlt" Mar 12 13:38:07.085395 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:07.085361 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x4rlt" podUID="05421762-de84-4ca6-bedf-542c55460252" Mar 12 13:38:07.085445 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:07.085433 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fnfhc" Mar 12 13:38:07.085532 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:07.085515 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fnfhc" podUID="a2b57a5c-1736-4008-9bcc-41669382f70a" Mar 12 13:38:07.149872 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:07.149834 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-w7qxr" event={"ID":"7c9fffc5-b110-46de-8d20-728d29e23748","Type":"ContainerStarted","Data":"f116ff99dae6e1c7e5348385ee10302d69ff64376d1754bdea414cec8a823d43"} Mar 12 13:38:07.151155 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:07.151133 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-trfvm" event={"ID":"a3207bc4-cf48-4af8-b377-bbd83ed2fe47","Type":"ContainerStarted","Data":"d10f00a6e51402bcdaa02b46de83fe45f285cb6140571a51601d27480489fc5c"} Mar 12 13:38:07.152410 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:07.152389 2576 generic.go:358] "Generic (PLEG): container finished" podID="6c70f15327caadb681531ec95696eaee" containerID="7d3b469eb15162a748cbdd3d431113089f93836e78863a2a949939221ea31acf" exitCode=0 Mar 12 13:38:07.152521 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:07.152487 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal" event={"ID":"6c70f15327caadb681531ec95696eaee","Type":"ContainerDied","Data":"7d3b469eb15162a748cbdd3d431113089f93836e78863a2a949939221ea31acf"} Mar 12 13:38:07.153822 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:07.153802 2576 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9rk6" event={"ID":"990c57e8-518a-4bf2-b8ee-a937df89bcb3","Type":"ContainerStarted","Data":"ad45d3cfa9992e4f13c37924962d455755e4bed0380b03ca684743bd3077c2c8"} Mar 12 13:38:07.154966 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:07.154944 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-p28bf" event={"ID":"1aca7d52-35bb-4dbf-8bb1-4a032145c336","Type":"ContainerStarted","Data":"b808982a749172afb2449475e02591d9c0e9cd61e0c5f6f17ba0c4e5f254f33d"} Mar 12 13:38:07.157530 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:07.157509 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" event={"ID":"e5825179-9ef4-493d-9fc5-0868f3084a95","Type":"ContainerStarted","Data":"9cd4b91be46707f53215d2e381324fa60bf304f6dacbd6754018b8623147731f"} Mar 12 13:38:07.157619 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:07.157534 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" event={"ID":"e5825179-9ef4-493d-9fc5-0868f3084a95","Type":"ContainerStarted","Data":"9a4d956a92c0df948814212a44b069903c0915b63491e77a32667c0ff4619817"} Mar 12 13:38:07.157619 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:07.157543 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" event={"ID":"e5825179-9ef4-493d-9fc5-0868f3084a95","Type":"ContainerStarted","Data":"9e0d108b8f7047298cbd938ecb5aa9ffa98b357f61a1d3d8bccc05e92d8e7f2d"} Mar 12 13:38:07.158765 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:07.158747 2576 generic.go:358] "Generic (PLEG): container finished" podID="9288d2b3-d104-4c0d-8417-6d70d1c70098" containerID="07e84a69f00cff989d61839f8b37315852d30eb3efbef7f43e8e079e240c517b" exitCode=0 Mar 12 13:38:07.158826 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:07.158808 2576 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcbt" event={"ID":"9288d2b3-d104-4c0d-8417-6d70d1c70098","Type":"ContainerDied","Data":"07e84a69f00cff989d61839f8b37315852d30eb3efbef7f43e8e079e240c517b"} Mar 12 13:38:07.184549 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:07.184329 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-w7qxr" podStartSLOduration=3.766505338 podStartE2EDuration="22.184313795s" podCreationTimestamp="2026-03-12 13:37:45 +0000 UTC" firstStartedPulling="2026-03-12 13:37:47.334362782 +0000 UTC m=+2.846296794" lastFinishedPulling="2026-03-12 13:38:05.752171244 +0000 UTC m=+21.264105251" observedRunningTime="2026-03-12 13:38:07.169832814 +0000 UTC m=+22.681766842" watchObservedRunningTime="2026-03-12 13:38:07.184313795 +0000 UTC m=+22.696247822" Mar 12 13:38:07.217491 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:07.217416 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-p28bf" podStartSLOduration=8.38567401 podStartE2EDuration="22.21739874s" podCreationTimestamp="2026-03-12 13:37:45 +0000 UTC" firstStartedPulling="2026-03-12 13:37:47.322022484 +0000 UTC m=+2.833956488" lastFinishedPulling="2026-03-12 13:38:01.153747198 +0000 UTC m=+16.665681218" observedRunningTime="2026-03-12 13:38:07.216882756 +0000 UTC m=+22.728816788" watchObservedRunningTime="2026-03-12 13:38:07.21739874 +0000 UTC m=+22.729332767" Mar 12 13:38:07.924518 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:07.924495 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Mar 12 13:38:08.081073 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:08.080918 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-03-12T13:38:07.924513163Z","UUID":"ff2fffe8-9164-422a-93ab-96598c8eb836","Handler":null,"Name":"","Endpoint":""} Mar 12 13:38:08.082587 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:08.082555 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Mar 12 13:38:08.082587 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:08.082586 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Mar 12 13:38:08.161854 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:08.161820 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal" event={"ID":"6c70f15327caadb681531ec95696eaee","Type":"ContainerStarted","Data":"31fb1d84d8f37b5f5a4c0f086f78cf921f3a70a2a6a423a37c2649e28b354828"} Mar 12 13:38:08.163473 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:08.163427 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9rk6" event={"ID":"990c57e8-518a-4bf2-b8ee-a937df89bcb3","Type":"ContainerStarted","Data":"5c8fc77b5490241f735a5755e501eccf37fb015fd0f56cd13eedfd5e789966b5"} Mar 12 13:38:08.199352 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:08.199300 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-16.ec2.internal" podStartSLOduration=22.199282214 podStartE2EDuration="22.199282214s" podCreationTimestamp="2026-03-12 13:37:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:38:08.199107741 +0000 UTC m=+23.711041767" 
watchObservedRunningTime="2026-03-12 13:38:08.199282214 +0000 UTC m=+23.711216240" Mar 12 13:38:08.199600 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:08.199575 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-trfvm" podStartSLOduration=4.7790285059999995 podStartE2EDuration="23.199566669s" podCreationTimestamp="2026-03-12 13:37:45 +0000 UTC" firstStartedPulling="2026-03-12 13:37:47.331324386 +0000 UTC m=+2.843258390" lastFinishedPulling="2026-03-12 13:38:05.751862534 +0000 UTC m=+21.263796553" observedRunningTime="2026-03-12 13:38:07.232789196 +0000 UTC m=+22.744723226" watchObservedRunningTime="2026-03-12 13:38:08.199566669 +0000 UTC m=+23.711500696" Mar 12 13:38:09.081902 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:09.081437 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fnfhc" Mar 12 13:38:09.081902 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:09.081437 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wbtg8" Mar 12 13:38:09.081902 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:09.081574 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fnfhc" podUID="a2b57a5c-1736-4008-9bcc-41669382f70a" Mar 12 13:38:09.081902 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:09.081638 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wbtg8" podUID="37e0d595-d2d4-4caf-8700-ffaeb5c3401f" Mar 12 13:38:09.081902 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:09.081696 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x4rlt" Mar 12 13:38:09.081902 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:09.081812 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x4rlt" podUID="05421762-de84-4ca6-bedf-542c55460252" Mar 12 13:38:09.172623 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:09.172553 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" event={"ID":"e5825179-9ef4-493d-9fc5-0868f3084a95","Type":"ContainerStarted","Data":"b857fb1c175a158a7924bc3a223b91f2a3759c46ca3e622ef4a3b3d85c9b41f2"} Mar 12 13:38:09.310245 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:09.310156 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-w7qxr" Mar 12 13:38:09.310867 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:09.310851 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-w7qxr" Mar 12 13:38:10.176630 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:10.176591 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9rk6" event={"ID":"990c57e8-518a-4bf2-b8ee-a937df89bcb3","Type":"ContainerStarted","Data":"c8b9ac14c7b467b58b1701433814583d4d1acdaf4a83385b9dec4d73e15f5244"} Mar 12 13:38:10.177074 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:10.176796 2576 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-w7qxr" Mar 12 13:38:10.177279 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:10.177263 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-w7qxr" Mar 12 13:38:10.195848 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:10.195801 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-f9rk6" podStartSLOduration=3.398800913 podStartE2EDuration="25.195786941s" podCreationTimestamp="2026-03-12 13:37:45 +0000 UTC" firstStartedPulling="2026-03-12 13:37:47.327964743 +0000 UTC m=+2.839898751" lastFinishedPulling="2026-03-12 13:38:09.124950759 +0000 UTC m=+24.636884779" observedRunningTime="2026-03-12 13:38:10.195668323 +0000 UTC m=+25.707602350" watchObservedRunningTime="2026-03-12 13:38:10.195786941 +0000 UTC m=+25.707720967" Mar 12 13:38:11.082260 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:11.082069 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x4rlt" Mar 12 13:38:11.082497 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:11.082330 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x4rlt" podUID="05421762-de84-4ca6-bedf-542c55460252" Mar 12 13:38:11.082497 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:11.082069 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fnfhc" Mar 12 13:38:11.082497 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:11.082396 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fnfhc" podUID="a2b57a5c-1736-4008-9bcc-41669382f70a" Mar 12 13:38:11.082497 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:11.082069 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wbtg8" Mar 12 13:38:11.082497 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:11.082477 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wbtg8" podUID="37e0d595-d2d4-4caf-8700-ffaeb5c3401f" Mar 12 13:38:11.182264 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:11.182209 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" event={"ID":"e5825179-9ef4-493d-9fc5-0868f3084a95","Type":"ContainerStarted","Data":"ce6ba9bf542d13220b3bd01889917e6238057ab7f6c618393ca5318cc4773919"} Mar 12 13:38:11.216388 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:11.216336 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" podStartSLOduration=7.720322284 podStartE2EDuration="26.21632167s" podCreationTimestamp="2026-03-12 13:37:45 +0000 UTC" firstStartedPulling="2026-03-12 13:37:47.31690159 +0000 UTC m=+2.828835610" lastFinishedPulling="2026-03-12 13:38:05.812900989 +0000 UTC m=+21.324834996" observedRunningTime="2026-03-12 13:38:11.216137822 +0000 UTC m=+26.728071850" watchObservedRunningTime="2026-03-12 13:38:11.21632167 +0000 UTC m=+26.728255726" Mar 12 13:38:12.185753 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:12.185720 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" Mar 12 13:38:12.186377 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:12.185765 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" Mar 12 13:38:12.186377 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:12.185775 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" Mar 12 13:38:12.204360 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:12.204300 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" Mar 12 13:38:12.204587 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:12.204550 2576 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl" Mar 12 13:38:12.484140 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:12.484046 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-x4rlt"] Mar 12 13:38:12.484291 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:12.484158 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x4rlt" Mar 12 13:38:12.484291 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:12.484239 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x4rlt" podUID="05421762-de84-4ca6-bedf-542c55460252" Mar 12 13:38:12.487598 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:12.487554 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wbtg8"] Mar 12 13:38:12.487720 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:12.487691 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wbtg8" Mar 12 13:38:12.487816 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:12.487792 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wbtg8" podUID="37e0d595-d2d4-4caf-8700-ffaeb5c3401f"
Mar 12 13:38:12.490245 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:12.490214 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-fnfhc"]
Mar 12 13:38:12.490354 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:12.490331 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fnfhc"
Mar 12 13:38:12.490435 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:12.490419 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fnfhc" podUID="a2b57a5c-1736-4008-9bcc-41669382f70a"
Mar 12 13:38:14.081832 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:14.081801 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fnfhc"
Mar 12 13:38:14.082316 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:14.081801 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x4rlt"
Mar 12 13:38:14.082316 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:14.081899 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fnfhc" podUID="a2b57a5c-1736-4008-9bcc-41669382f70a"
Mar 12 13:38:14.082316 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:14.081805 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wbtg8"
Mar 12 13:38:14.082316 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:14.081953 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x4rlt" podUID="05421762-de84-4ca6-bedf-542c55460252"
Mar 12 13:38:14.082316 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:14.082036 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wbtg8" podUID="37e0d595-d2d4-4caf-8700-ffaeb5c3401f"
Mar 12 13:38:14.189976 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:14.189941 2576 generic.go:358] "Generic (PLEG): container finished" podID="9288d2b3-d104-4c0d-8417-6d70d1c70098" containerID="e513b696e83b200a65562e49cdd62e40cfb2d568c470802f8a713c372acfb8f9" exitCode=0
Mar 12 13:38:14.190160 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:14.190050 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcbt" event={"ID":"9288d2b3-d104-4c0d-8417-6d70d1c70098","Type":"ContainerDied","Data":"e513b696e83b200a65562e49cdd62e40cfb2d568c470802f8a713c372acfb8f9"}
Mar 12 13:38:15.193550 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:15.193339 2576 generic.go:358] "Generic (PLEG): container finished" podID="9288d2b3-d104-4c0d-8417-6d70d1c70098" containerID="b43252918603fcec5b340cb0a636cd67b04286659e5f07b694d9015628e66b47" exitCode=0
Mar 12 13:38:15.193550 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:15.193432 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcbt" event={"ID":"9288d2b3-d104-4c0d-8417-6d70d1c70098","Type":"ContainerDied","Data":"b43252918603fcec5b340cb0a636cd67b04286659e5f07b694d9015628e66b47"}
Mar 12 13:38:16.081342 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:16.081318 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fnfhc"
Mar 12 13:38:16.081538 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:16.081379 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x4rlt"
Mar 12 13:38:16.081538 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:16.081485 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fnfhc" podUID="a2b57a5c-1736-4008-9bcc-41669382f70a"
Mar 12 13:38:16.081538 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:16.081525 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x4rlt" podUID="05421762-de84-4ca6-bedf-542c55460252"
Mar 12 13:38:16.081657 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:16.081562 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wbtg8"
Mar 12 13:38:16.081657 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:16.081620 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wbtg8" podUID="37e0d595-d2d4-4caf-8700-ffaeb5c3401f"
Mar 12 13:38:16.197327 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:16.197241 2576 generic.go:358] "Generic (PLEG): container finished" podID="9288d2b3-d104-4c0d-8417-6d70d1c70098" containerID="a7c48ad41e2a86d855a0a7b4182318e6cfa6a23c33efd2ffda16d35121ffd119" exitCode=0
Mar 12 13:38:16.197327 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:16.197287 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcbt" event={"ID":"9288d2b3-d104-4c0d-8417-6d70d1c70098","Type":"ContainerDied","Data":"a7c48ad41e2a86d855a0a7b4182318e6cfa6a23c33efd2ffda16d35121ffd119"}
Mar 12 13:38:18.081512 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.081476 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x4rlt"
Mar 12 13:38:18.081921 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.081515 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fnfhc"
Mar 12 13:38:18.081921 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.081570 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wbtg8"
Mar 12 13:38:18.081921 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:18.081662 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-x4rlt" podUID="05421762-de84-4ca6-bedf-542c55460252"
Mar 12 13:38:18.081921 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:18.081812 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wbtg8" podUID="37e0d595-d2d4-4caf-8700-ffaeb5c3401f"
Mar 12 13:38:18.081921 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:18.081886 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fnfhc" podUID="a2b57a5c-1736-4008-9bcc-41669382f70a"
Mar 12 13:38:18.754281 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.754243 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37e0d595-d2d4-4caf-8700-ffaeb5c3401f-metrics-certs\") pod \"network-metrics-daemon-wbtg8\" (UID: \"37e0d595-d2d4-4caf-8700-ffaeb5c3401f\") " pod="openshift-multus/network-metrics-daemon-wbtg8"
Mar 12 13:38:18.754484 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:18.754422 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 12 13:38:18.754548 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:18.754523 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37e0d595-d2d4-4caf-8700-ffaeb5c3401f-metrics-certs podName:37e0d595-d2d4-4caf-8700-ffaeb5c3401f nodeName:}" failed. No retries permitted until 2026-03-12 13:38:50.754498022 +0000 UTC m=+66.266432050 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/37e0d595-d2d4-4caf-8700-ffaeb5c3401f-metrics-certs") pod "network-metrics-daemon-wbtg8" (UID: "37e0d595-d2d4-4caf-8700-ffaeb5c3401f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 12 13:38:18.822209 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.822178 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-16.ec2.internal" event="NodeReady"
Mar 12 13:38:18.822371 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.822362 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Mar 12 13:38:18.855371 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.855336 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/05421762-de84-4ca6-bedf-542c55460252-original-pull-secret\") pod \"global-pull-secret-syncer-x4rlt\" (UID: \"05421762-de84-4ca6-bedf-542c55460252\") " pod="kube-system/global-pull-secret-syncer-x4rlt"
Mar 12 13:38:18.855558 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:18.855477 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Mar 12 13:38:18.855558 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:18.855531 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05421762-de84-4ca6-bedf-542c55460252-original-pull-secret podName:05421762-de84-4ca6-bedf-542c55460252 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:50.855517672 +0000 UTC m=+66.367451690 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/05421762-de84-4ca6-bedf-542c55460252-original-pull-secret") pod "global-pull-secret-syncer-x4rlt" (UID: "05421762-de84-4ca6-bedf-542c55460252") : object "kube-system"/"original-pull-secret" not registered
Mar 12 13:38:18.863216 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.863192 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-658bf4bccb-bwzwj"]
Mar 12 13:38:18.866760 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.866742 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj"
Mar 12 13:38:18.867337 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.867314 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-f747dc4bb-qfvq7"]
Mar 12 13:38:18.870870 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.870857 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-f747dc4bb-qfvq7"
Mar 12 13:38:18.883595 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.883572 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Mar 12 13:38:18.883595 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.883582 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Mar 12 13:38:18.883759 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.883623 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Mar 12 13:38:18.886846 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.886830 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Mar 12 13:38:18.889938 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.889925 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Mar 12 13:38:18.903036 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.903012 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Mar 12 13:38:18.922800 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.922766 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Mar 12 13:38:18.923434 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.923412 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-4j9p7"]
Mar 12 13:38:18.924276 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.924260 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Mar 12 13:38:18.925860 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.925845 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Mar 12 13:38:18.926083 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.926069 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-9d7nr\""
Mar 12 13:38:18.926329 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.926317 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-b6zd6\""
Mar 12 13:38:18.927139 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.927126 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-vw6kv"]
Mar 12 13:38:18.927312 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.927293 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-4j9p7"
Mar 12 13:38:18.930491 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.930473 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-zklgq"]
Mar 12 13:38:18.930598 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.930586 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-vw6kv"
Mar 12 13:38:18.933746 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.933731 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-76bdd9f478-q8flt"]
Mar 12 13:38:18.933910 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.933893 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-zklgq"
Mar 12 13:38:18.937022 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.937004 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-76bdd9f478-q8flt"
Mar 12 13:38:18.951374 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.951355 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-wlnh5"]
Mar 12 13:38:18.954716 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.954702 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-wlnh5"
Mar 12 13:38:18.956088 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.956069 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/327ec7eb-a5de-4709-9fbc-6bbe58d87674-trusted-ca\") pod \"image-registry-658bf4bccb-bwzwj\" (UID: \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\") " pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj"
Mar 12 13:38:18.956164 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.956102 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs8hw\" (UniqueName: \"kubernetes.io/projected/c11ab84b-7d66-4e49-a456-66c0ee53f865-kube-api-access-vs8hw\") pod \"router-default-f747dc4bb-qfvq7\" (UID: \"c11ab84b-7d66-4e49-a456-66c0ee53f865\") " pod="openshift-ingress/router-default-f747dc4bb-qfvq7"
Mar 12 13:38:18.956164 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.956126 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c11ab84b-7d66-4e49-a456-66c0ee53f865-service-ca-bundle\") pod \"router-default-f747dc4bb-qfvq7\" (UID: \"c11ab84b-7d66-4e49-a456-66c0ee53f865\") " pod="openshift-ingress/router-default-f747dc4bb-qfvq7"
Mar 12 13:38:18.956257 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.956164 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztsv5\" (UniqueName: \"kubernetes.io/projected/a2b57a5c-1736-4008-9bcc-41669382f70a-kube-api-access-ztsv5\") pod \"network-check-target-fnfhc\" (UID: \"a2b57a5c-1736-4008-9bcc-41669382f70a\") " pod="openshift-network-diagnostics/network-check-target-fnfhc"
Mar 12 13:38:18.956257 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.956180 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/327ec7eb-a5de-4709-9fbc-6bbe58d87674-registry-tls\") pod \"image-registry-658bf4bccb-bwzwj\" (UID: \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\") " pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj"
Mar 12 13:38:18.956257 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.956198 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/327ec7eb-a5de-4709-9fbc-6bbe58d87674-ca-trust-extracted\") pod \"image-registry-658bf4bccb-bwzwj\" (UID: \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\") " pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj"
Mar 12 13:38:18.956257 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.956213 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c11ab84b-7d66-4e49-a456-66c0ee53f865-metrics-certs\") pod \"router-default-f747dc4bb-qfvq7\" (UID: \"c11ab84b-7d66-4e49-a456-66c0ee53f865\") " pod="openshift-ingress/router-default-f747dc4bb-qfvq7"
Mar 12 13:38:18.956440 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.956274 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/327ec7eb-a5de-4709-9fbc-6bbe58d87674-installation-pull-secrets\") pod \"image-registry-658bf4bccb-bwzwj\" (UID: \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\") " pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj"
Mar 12 13:38:18.956440 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:18.956291 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 12 13:38:18.956440 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:18.956312 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 12 13:38:18.956440 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:18.956324 2576 projected.go:194] Error preparing data for projected volume kube-api-access-ztsv5 for pod openshift-network-diagnostics/network-check-target-fnfhc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 12 13:38:18.956440 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.956341 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/327ec7eb-a5de-4709-9fbc-6bbe58d87674-bound-sa-token\") pod \"image-registry-658bf4bccb-bwzwj\" (UID: \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\") " pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj"
Mar 12 13:38:18.956440 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.956369 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c11ab84b-7d66-4e49-a456-66c0ee53f865-stats-auth\") pod \"router-default-f747dc4bb-qfvq7\" (UID: \"c11ab84b-7d66-4e49-a456-66c0ee53f865\") " pod="openshift-ingress/router-default-f747dc4bb-qfvq7"
Mar 12 13:38:18.956440 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:18.956379 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a2b57a5c-1736-4008-9bcc-41669382f70a-kube-api-access-ztsv5 podName:a2b57a5c-1736-4008-9bcc-41669382f70a nodeName:}" failed. No retries permitted until 2026-03-12 13:38:50.956366356 +0000 UTC m=+66.468300364 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-ztsv5" (UniqueName: "kubernetes.io/projected/a2b57a5c-1736-4008-9bcc-41669382f70a-kube-api-access-ztsv5") pod "network-check-target-fnfhc" (UID: "a2b57a5c-1736-4008-9bcc-41669382f70a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 12 13:38:18.956440 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.956437 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/327ec7eb-a5de-4709-9fbc-6bbe58d87674-registry-certificates\") pod \"image-registry-658bf4bccb-bwzwj\" (UID: \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\") " pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj"
Mar 12 13:38:18.956810 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.956491 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/327ec7eb-a5de-4709-9fbc-6bbe58d87674-image-registry-private-configuration\") pod \"image-registry-658bf4bccb-bwzwj\" (UID: \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\") " pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj"
Mar 12 13:38:18.956810 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.956586 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cngd\" (UniqueName: \"kubernetes.io/projected/327ec7eb-a5de-4709-9fbc-6bbe58d87674-kube-api-access-6cngd\") pod \"image-registry-658bf4bccb-bwzwj\" (UID: \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\") " pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj"
Mar 12 13:38:18.956810 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.956609 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c11ab84b-7d66-4e49-a456-66c0ee53f865-default-certificate\") pod \"router-default-f747dc4bb-qfvq7\" (UID: \"c11ab84b-7d66-4e49-a456-66c0ee53f865\") " pod="openshift-ingress/router-default-f747dc4bb-qfvq7"
Mar 12 13:38:18.991153 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.991126 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-76b8565867-8c2br"]
Mar 12 13:38:18.994674 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:18.994648 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-76b8565867-8c2br"
Mar 12 13:38:19.001525 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.001497 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Mar 12 13:38:19.001525 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.001510 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Mar 12 13:38:19.001761 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.001533 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Mar 12 13:38:19.001761 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.001565 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Mar 12 13:38:19.002867 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.002848 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-55b77584bb-dx5pp"]
Mar 12 13:38:19.003707 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.003674 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Mar 12 13:38:19.003819 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.003734 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Mar 12 13:38:19.003921 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.003907 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Mar 12 13:38:19.003959 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.003918 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Mar 12 13:38:19.004260 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.004246 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Mar 12 13:38:19.004586 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.004528 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Mar 12 13:38:19.004736 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.004723 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-g78qz\""
Mar 12 13:38:19.004978 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.004961 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Mar 12 13:38:19.005218 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.005144 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Mar 12 13:38:19.005218 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.005202 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-9dtlb\""
Mar 12 13:38:19.005218 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.005214 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Mar 12 13:38:19.006012 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.005813 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-j6pgg\""
Mar 12 13:38:19.006012 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.005827 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Mar 12 13:38:19.006012 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.005916 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Mar 12 13:38:19.006012 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.005907 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Mar 12 13:38:19.006262 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.006100 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Mar 12 13:38:19.006262 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.006120 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Mar 12 13:38:19.006262 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.006164 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-5v7ck\""
Mar 12 13:38:19.006262 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.006176 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Mar 12 13:38:19.006262 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.006240 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-zncm5\""
Mar 12 13:38:19.006510 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.006290 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Mar 12 13:38:19.006510 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.006178 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Mar 12 13:38:19.006510 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.006373 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-kxzct\""
Mar 12 13:38:19.007131 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.007115 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Mar 12 13:38:19.008301 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.008277 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-clm4m"]
Mar 12 13:38:19.008718 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.008697 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-55b77584bb-dx5pp"
Mar 12 13:38:19.015022 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.015001 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-cc88fdd44-z42dm"]
Mar 12 13:38:19.015181 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.015162 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-clm4m"
Mar 12 13:38:19.019282 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.018488 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-j7ncg\""
Mar 12 13:38:19.019282 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.018516 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-npwgp\""
Mar 12 13:38:19.019282 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.018532 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Mar 12 13:38:19.019282 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.018532 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Mar 12 13:38:19.019282 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.018747 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-zklgq"]
Mar 12 13:38:19.019282 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.018768 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-658bf4bccb-bwzwj"]
Mar 12 13:38:19.019282 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.018869 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-cc88fdd44-z42dm"
Mar 12 13:38:19.020117 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.019673 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-vw6kv"]
Mar 12 13:38:19.020190 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.020168 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-76bdd9f478-q8flt"]
Mar 12 13:38:19.020406 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.020365 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Mar 12 13:38:19.020505 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.020365 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Mar 12 13:38:19.020675 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.020659 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Mar 12 13:38:19.021622 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.021602 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Mar 12 13:38:19.021784 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.021720 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-f747dc4bb-qfvq7"]
Mar 12 13:38:19.021994 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.021967 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Mar 12 13:38:19.022774 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.022744 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Mar 12 13:38:19.025496 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.024882 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Mar 12 13:38:19.025496 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.025094 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Mar 12 13:38:19.025496 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.025306 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-kt5ln\""
Mar 12 13:38:19.034862 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.034833 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-cc88fdd44-z42dm"]
Mar 12 13:38:19.034997 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.034877 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-55b77584bb-dx5pp"]
Mar 12 13:38:19.034997 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.034890 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-76b8565867-8c2br"]
Mar 12 13:38:19.035816 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.035795 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-v9hjg"]
Mar 12 13:38:19.039723 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.039703 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-v9hjg"
Mar 12 13:38:19.041017 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.040999 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-wlnh5"]
Mar 12 13:38:19.048453 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.048422 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-4j9p7"]
Mar 12 13:38:19.050857 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.050839 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Mar 12 13:38:19.051954 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.051936 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Mar 12 13:38:19.052304 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.052287 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Mar 12 13:38:19.054561 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.054541 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-v9hjg"]
Mar 12 13:38:19.054721 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.054709 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-ddks7"]
Mar 12 13:38:19.060582 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.060520 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/327ec7eb-a5de-4709-9fbc-6bbe58d87674-registry-certificates\") pod \"image-registry-658bf4bccb-bwzwj\" (UID: \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\") " pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj"
Mar 12 13:38:19.060766 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.060742 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/327ec7eb-a5de-4709-9fbc-6bbe58d87674-image-registry-private-configuration\") pod \"image-registry-658bf4bccb-bwzwj\" (UID: \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\") " pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj" Mar 12 13:38:19.060887 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.060778 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6cngd\" (UniqueName: \"kubernetes.io/projected/327ec7eb-a5de-4709-9fbc-6bbe58d87674-kube-api-access-6cngd\") pod \"image-registry-658bf4bccb-bwzwj\" (UID: \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\") " pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj" Mar 12 13:38:19.060887 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.060804 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c11ab84b-7d66-4e49-a456-66c0ee53f865-default-certificate\") pod \"router-default-f747dc4bb-qfvq7\" (UID: \"c11ab84b-7d66-4e49-a456-66c0ee53f865\") " pod="openshift-ingress/router-default-f747dc4bb-qfvq7" Mar 12 13:38:19.060887 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.060593 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-6llcj\"" Mar 12 13:38:19.060887 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.060834 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/327ec7eb-a5de-4709-9fbc-6bbe58d87674-trusted-ca\") pod \"image-registry-658bf4bccb-bwzwj\" (UID: \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\") " pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj" Mar 12 13:38:19.060887 
ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.060857 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vs8hw\" (UniqueName: \"kubernetes.io/projected/c11ab84b-7d66-4e49-a456-66c0ee53f865-kube-api-access-vs8hw\") pod \"router-default-f747dc4bb-qfvq7\" (UID: \"c11ab84b-7d66-4e49-a456-66c0ee53f865\") " pod="openshift-ingress/router-default-f747dc4bb-qfvq7" Mar 12 13:38:19.060887 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.060881 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c11ab84b-7d66-4e49-a456-66c0ee53f865-service-ca-bundle\") pod \"router-default-f747dc4bb-qfvq7\" (UID: \"c11ab84b-7d66-4e49-a456-66c0ee53f865\") " pod="openshift-ingress/router-default-f747dc4bb-qfvq7" Mar 12 13:38:19.061121 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.060904 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20ecc6d8-57fe-48c1-bbcf-886289ff3155-serving-cert\") pod \"kube-storage-version-migrator-operator-866f46547-wlnh5\" (UID: \"20ecc6d8-57fe-48c1-bbcf-886289ff3155\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-wlnh5" Mar 12 13:38:19.061121 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.060920 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2666fac-29e4-411f-a8eb-22c3b7d96cbf-config\") pod \"service-ca-operator-56f6f4cbcb-zklgq\" (UID: \"c2666fac-29e4-411f-a8eb-22c3b7d96cbf\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-zklgq" Mar 12 13:38:19.061121 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.060937 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: 
\"kubernetes.io/empty-dir/16e1b1af-b744-44fd-a3d5-c817249f6767-snapshots\") pod \"insights-operator-76bdd9f478-q8flt\" (UID: \"16e1b1af-b744-44fd-a3d5-c817249f6767\") " pod="openshift-insights/insights-operator-76bdd9f478-q8flt" Mar 12 13:38:19.061121 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.060951 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2666fac-29e4-411f-a8eb-22c3b7d96cbf-serving-cert\") pod \"service-ca-operator-56f6f4cbcb-zklgq\" (UID: \"c2666fac-29e4-411f-a8eb-22c3b7d96cbf\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-zklgq" Mar 12 13:38:19.061121 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.060968 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrft2\" (UniqueName: \"kubernetes.io/projected/16e1b1af-b744-44fd-a3d5-c817249f6767-kube-api-access-rrft2\") pod \"insights-operator-76bdd9f478-q8flt\" (UID: \"16e1b1af-b744-44fd-a3d5-c817249f6767\") " pod="openshift-insights/insights-operator-76bdd9f478-q8flt" Mar 12 13:38:19.061121 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.061005 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx4c5\" (UniqueName: \"kubernetes.io/projected/8571836d-0cec-490b-a68b-8ecee11112d1-kube-api-access-wx4c5\") pod \"volume-data-source-validator-67fdcb5769-4j9p7\" (UID: \"8571836d-0cec-490b-a68b-8ecee11112d1\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-4j9p7" Mar 12 13:38:19.061121 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.061025 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79l77\" (UniqueName: \"kubernetes.io/projected/20ecc6d8-57fe-48c1-bbcf-886289ff3155-kube-api-access-79l77\") pod 
\"kube-storage-version-migrator-operator-866f46547-wlnh5\" (UID: \"20ecc6d8-57fe-48c1-bbcf-886289ff3155\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-wlnh5" Mar 12 13:38:19.061121 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.061043 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frc5l\" (UniqueName: \"kubernetes.io/projected/c2666fac-29e4-411f-a8eb-22c3b7d96cbf-kube-api-access-frc5l\") pod \"service-ca-operator-56f6f4cbcb-zklgq\" (UID: \"c2666fac-29e4-411f-a8eb-22c3b7d96cbf\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-zklgq" Mar 12 13:38:19.061121 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.061070 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/327ec7eb-a5de-4709-9fbc-6bbe58d87674-registry-tls\") pod \"image-registry-658bf4bccb-bwzwj\" (UID: \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\") " pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj" Mar 12 13:38:19.061121 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.061089 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/327ec7eb-a5de-4709-9fbc-6bbe58d87674-ca-trust-extracted\") pod \"image-registry-658bf4bccb-bwzwj\" (UID: \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\") " pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj" Mar 12 13:38:19.061121 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:19.061120 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c11ab84b-7d66-4e49-a456-66c0ee53f865-service-ca-bundle podName:c11ab84b-7d66-4e49-a456-66c0ee53f865 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:19.561098301 +0000 UTC m=+35.073032308 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c11ab84b-7d66-4e49-a456-66c0ee53f865-service-ca-bundle") pod "router-default-f747dc4bb-qfvq7" (UID: "c11ab84b-7d66-4e49-a456-66c0ee53f865") : configmap references non-existent config key: service-ca.crt Mar 12 13:38:19.061543 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.061148 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c11ab84b-7d66-4e49-a456-66c0ee53f865-metrics-certs\") pod \"router-default-f747dc4bb-qfvq7\" (UID: \"c11ab84b-7d66-4e49-a456-66c0ee53f865\") " pod="openshift-ingress/router-default-f747dc4bb-qfvq7" Mar 12 13:38:19.061543 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.061185 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/327ec7eb-a5de-4709-9fbc-6bbe58d87674-installation-pull-secrets\") pod \"image-registry-658bf4bccb-bwzwj\" (UID: \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\") " pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj" Mar 12 13:38:19.061543 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.061219 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/327ec7eb-a5de-4709-9fbc-6bbe58d87674-registry-certificates\") pod \"image-registry-658bf4bccb-bwzwj\" (UID: \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\") " pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj" Mar 12 13:38:19.061543 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.061219 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16e1b1af-b744-44fd-a3d5-c817249f6767-trusted-ca-bundle\") pod \"insights-operator-76bdd9f478-q8flt\" (UID: \"16e1b1af-b744-44fd-a3d5-c817249f6767\") " 
pod="openshift-insights/insights-operator-76bdd9f478-q8flt" Mar 12 13:38:19.061543 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.061263 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh5jb\" (UniqueName: \"kubernetes.io/projected/f25d8f1b-e18e-4880-899f-71ae7bff54be-kube-api-access-nh5jb\") pod \"cluster-monitoring-operator-b58cd5d8d-vw6kv\" (UID: \"f25d8f1b-e18e-4880-899f-71ae7bff54be\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-vw6kv" Mar 12 13:38:19.061543 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.061314 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/327ec7eb-a5de-4709-9fbc-6bbe58d87674-ca-trust-extracted\") pod \"image-registry-658bf4bccb-bwzwj\" (UID: \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\") " pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj" Mar 12 13:38:19.061543 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.061335 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20ecc6d8-57fe-48c1-bbcf-886289ff3155-config\") pod \"kube-storage-version-migrator-operator-866f46547-wlnh5\" (UID: \"20ecc6d8-57fe-48c1-bbcf-886289ff3155\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-wlnh5" Mar 12 13:38:19.061543 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.061366 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16e1b1af-b744-44fd-a3d5-c817249f6767-service-ca-bundle\") pod \"insights-operator-76bdd9f478-q8flt\" (UID: \"16e1b1af-b744-44fd-a3d5-c817249f6767\") " pod="openshift-insights/insights-operator-76bdd9f478-q8flt" Mar 12 13:38:19.061543 ip-10-0-142-16 kubenswrapper[2576]: I0312 
13:38:19.061390 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16e1b1af-b744-44fd-a3d5-c817249f6767-serving-cert\") pod \"insights-operator-76bdd9f478-q8flt\" (UID: \"16e1b1af-b744-44fd-a3d5-c817249f6767\") " pod="openshift-insights/insights-operator-76bdd9f478-q8flt" Mar 12 13:38:19.061543 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:19.061408 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Mar 12 13:38:19.061543 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:19.061418 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-658bf4bccb-bwzwj: secret "image-registry-tls" not found Mar 12 13:38:19.061543 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:19.061490 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/327ec7eb-a5de-4709-9fbc-6bbe58d87674-registry-tls podName:327ec7eb-a5de-4709-9fbc-6bbe58d87674 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:19.561450673 +0000 UTC m=+35.073384678 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/327ec7eb-a5de-4709-9fbc-6bbe58d87674-registry-tls") pod "image-registry-658bf4bccb-bwzwj" (UID: "327ec7eb-a5de-4709-9fbc-6bbe58d87674") : secret "image-registry-tls" not found Mar 12 13:38:19.061543 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.061413 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f25d8f1b-e18e-4880-899f-71ae7bff54be-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-vw6kv\" (UID: \"f25d8f1b-e18e-4880-899f-71ae7bff54be\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-vw6kv" Mar 12 13:38:19.061543 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.061535 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/327ec7eb-a5de-4709-9fbc-6bbe58d87674-bound-sa-token\") pod \"image-registry-658bf4bccb-bwzwj\" (UID: \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\") " pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj" Mar 12 13:38:19.062040 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.061562 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c11ab84b-7d66-4e49-a456-66c0ee53f865-stats-auth\") pod \"router-default-f747dc4bb-qfvq7\" (UID: \"c11ab84b-7d66-4e49-a456-66c0ee53f865\") " pod="openshift-ingress/router-default-f747dc4bb-qfvq7" Mar 12 13:38:19.062040 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.061587 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/16e1b1af-b744-44fd-a3d5-c817249f6767-tmp\") pod \"insights-operator-76bdd9f478-q8flt\" (UID: \"16e1b1af-b744-44fd-a3d5-c817249f6767\") " 
pod="openshift-insights/insights-operator-76bdd9f478-q8flt" Mar 12 13:38:19.062040 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.061602 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ddks7" Mar 12 13:38:19.062040 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.061613 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/f25d8f1b-e18e-4880-899f-71ae7bff54be-telemetry-config\") pod \"cluster-monitoring-operator-b58cd5d8d-vw6kv\" (UID: \"f25d8f1b-e18e-4880-899f-71ae7bff54be\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-vw6kv" Mar 12 13:38:19.062190 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.060528 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-clm4m"] Mar 12 13:38:19.062430 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:19.062407 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Mar 12 13:38:19.062527 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:19.062493 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c11ab84b-7d66-4e49-a456-66c0ee53f865-metrics-certs podName:c11ab84b-7d66-4e49-a456-66c0ee53f865 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:19.562476758 +0000 UTC m=+35.074410762 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c11ab84b-7d66-4e49-a456-66c0ee53f865-metrics-certs") pod "router-default-f747dc4bb-qfvq7" (UID: "c11ab84b-7d66-4e49-a456-66c0ee53f865") : secret "router-metrics-certs-default" not found Mar 12 13:38:19.062938 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.062914 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/327ec7eb-a5de-4709-9fbc-6bbe58d87674-trusted-ca\") pod \"image-registry-658bf4bccb-bwzwj\" (UID: \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\") " pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj" Mar 12 13:38:19.066949 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.066921 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Mar 12 13:38:19.067173 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.067155 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Mar 12 13:38:19.067527 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.067511 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-4fw8j\"" Mar 12 13:38:19.067791 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.067520 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/327ec7eb-a5de-4709-9fbc-6bbe58d87674-image-registry-private-configuration\") pod \"image-registry-658bf4bccb-bwzwj\" (UID: \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\") " pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj" Mar 12 13:38:19.067900 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.067546 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/c11ab84b-7d66-4e49-a456-66c0ee53f865-default-certificate\") pod \"router-default-f747dc4bb-qfvq7\" (UID: \"c11ab84b-7d66-4e49-a456-66c0ee53f865\") " pod="openshift-ingress/router-default-f747dc4bb-qfvq7" Mar 12 13:38:19.068154 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.068141 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Mar 12 13:38:19.068506 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.068336 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Mar 12 13:38:19.068904 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.068878 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/327ec7eb-a5de-4709-9fbc-6bbe58d87674-installation-pull-secrets\") pod \"image-registry-658bf4bccb-bwzwj\" (UID: \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\") " pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj" Mar 12 13:38:19.073974 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.073736 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c11ab84b-7d66-4e49-a456-66c0ee53f865-stats-auth\") pod \"router-default-f747dc4bb-qfvq7\" (UID: \"c11ab84b-7d66-4e49-a456-66c0ee53f865\") " pod="openshift-ingress/router-default-f747dc4bb-qfvq7" Mar 12 13:38:19.076328 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.076296 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ddks7"] Mar 12 13:38:19.081380 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.081347 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cngd\" (UniqueName: \"kubernetes.io/projected/327ec7eb-a5de-4709-9fbc-6bbe58d87674-kube-api-access-6cngd\") pod \"image-registry-658bf4bccb-bwzwj\" (UID: 
\"327ec7eb-a5de-4709-9fbc-6bbe58d87674\") " pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj" Mar 12 13:38:19.087764 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.087740 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs8hw\" (UniqueName: \"kubernetes.io/projected/c11ab84b-7d66-4e49-a456-66c0ee53f865-kube-api-access-vs8hw\") pod \"router-default-f747dc4bb-qfvq7\" (UID: \"c11ab84b-7d66-4e49-a456-66c0ee53f865\") " pod="openshift-ingress/router-default-f747dc4bb-qfvq7" Mar 12 13:38:19.089396 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.088854 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/327ec7eb-a5de-4709-9fbc-6bbe58d87674-bound-sa-token\") pod \"image-registry-658bf4bccb-bwzwj\" (UID: \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\") " pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj" Mar 12 13:38:19.162185 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.162153 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99423593-d38b-4662-bacc-b1f6bb3d0d90-metrics-tls\") pod \"dns-default-ddks7\" (UID: \"99423593-d38b-4662-bacc-b1f6bb3d0d90\") " pod="openshift-dns/dns-default-ddks7" Mar 12 13:38:19.162185 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.162200 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wx4c5\" (UniqueName: \"kubernetes.io/projected/8571836d-0cec-490b-a68b-8ecee11112d1-kube-api-access-wx4c5\") pod \"volume-data-source-validator-67fdcb5769-4j9p7\" (UID: \"8571836d-0cec-490b-a68b-8ecee11112d1\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-4j9p7" Mar 12 13:38:19.163182 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.162232 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-nh5jb\" (UniqueName: \"kubernetes.io/projected/f25d8f1b-e18e-4880-899f-71ae7bff54be-kube-api-access-nh5jb\") pod \"cluster-monitoring-operator-b58cd5d8d-vw6kv\" (UID: \"f25d8f1b-e18e-4880-899f-71ae7bff54be\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-vw6kv" Mar 12 13:38:19.163182 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.162287 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/12cccf55-aade-4101-9250-96d5b498a6af-nginx-conf\") pod \"networking-console-plugin-55b77584bb-dx5pp\" (UID: \"12cccf55-aade-4101-9250-96d5b498a6af\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-dx5pp" Mar 12 13:38:19.163182 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.162330 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thmlz\" (UniqueName: \"kubernetes.io/projected/f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca-kube-api-access-thmlz\") pod \"cluster-samples-operator-d5df4776c-clm4m\" (UID: \"f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-clm4m" Mar 12 13:38:19.163182 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.162368 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f25d8f1b-e18e-4880-899f-71ae7bff54be-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-vw6kv\" (UID: \"f25d8f1b-e18e-4880-899f-71ae7bff54be\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-vw6kv" Mar 12 13:38:19.163182 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.162402 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20ecc6d8-57fe-48c1-bbcf-886289ff3155-config\") pod 
\"kube-storage-version-migrator-operator-866f46547-wlnh5\" (UID: \"20ecc6d8-57fe-48c1-bbcf-886289ff3155\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-wlnh5" Mar 12 13:38:19.163182 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.162431 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-clm4m\" (UID: \"f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-clm4m" Mar 12 13:38:19.163182 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.162482 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/16e1b1af-b744-44fd-a3d5-c817249f6767-tmp\") pod \"insights-operator-76bdd9f478-q8flt\" (UID: \"16e1b1af-b744-44fd-a3d5-c817249f6767\") " pod="openshift-insights/insights-operator-76bdd9f478-q8flt" Mar 12 13:38:19.163182 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.162513 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx6pj\" (UniqueName: \"kubernetes.io/projected/b87ad3f0-c310-4833-bfac-cd1e3b15a7d9-kube-api-access-rx6pj\") pod \"console-operator-76b8565867-8c2br\" (UID: \"b87ad3f0-c310-4833-bfac-cd1e3b15a7d9\") " pod="openshift-console-operator/console-operator-76b8565867-8c2br" Mar 12 13:38:19.163182 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:19.162565 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 12 13:38:19.163182 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.162572 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-drstb\" (UniqueName: \"kubernetes.io/projected/92914013-06d0-470c-b843-48a641e447a6-kube-api-access-drstb\") pod \"ingress-canary-v9hjg\" (UID: \"92914013-06d0-470c-b843-48a641e447a6\") " pod="openshift-ingress-canary/ingress-canary-v9hjg" Mar 12 13:38:19.163182 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.162597 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/99423593-d38b-4662-bacc-b1f6bb3d0d90-tmp-dir\") pod \"dns-default-ddks7\" (UID: \"99423593-d38b-4662-bacc-b1f6bb3d0d90\") " pod="openshift-dns/dns-default-ddks7" Mar 12 13:38:19.163182 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:19.162622 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f25d8f1b-e18e-4880-899f-71ae7bff54be-cluster-monitoring-operator-tls podName:f25d8f1b-e18e-4880-899f-71ae7bff54be nodeName:}" failed. No retries permitted until 2026-03-12 13:38:19.66260676 +0000 UTC m=+35.174540778 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/f25d8f1b-e18e-4880-899f-71ae7bff54be-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-b58cd5d8d-vw6kv" (UID: "f25d8f1b-e18e-4880-899f-71ae7bff54be") : secret "cluster-monitoring-operator-tls" not found Mar 12 13:38:19.163182 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.162771 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/16e1b1af-b744-44fd-a3d5-c817249f6767-snapshots\") pod \"insights-operator-76bdd9f478-q8flt\" (UID: \"16e1b1af-b744-44fd-a3d5-c817249f6767\") " pod="openshift-insights/insights-operator-76bdd9f478-q8flt" Mar 12 13:38:19.163182 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.162798 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rrft2\" (UniqueName: \"kubernetes.io/projected/16e1b1af-b744-44fd-a3d5-c817249f6767-kube-api-access-rrft2\") pod \"insights-operator-76bdd9f478-q8flt\" (UID: \"16e1b1af-b744-44fd-a3d5-c817249f6767\") " pod="openshift-insights/insights-operator-76bdd9f478-q8flt" Mar 12 13:38:19.163182 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.162822 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b87ad3f0-c310-4833-bfac-cd1e3b15a7d9-config\") pod \"console-operator-76b8565867-8c2br\" (UID: \"b87ad3f0-c310-4833-bfac-cd1e3b15a7d9\") " pod="openshift-console-operator/console-operator-76b8565867-8c2br" Mar 12 13:38:19.163923 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.162843 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-79l77\" (UniqueName: \"kubernetes.io/projected/20ecc6d8-57fe-48c1-bbcf-886289ff3155-kube-api-access-79l77\") pod \"kube-storage-version-migrator-operator-866f46547-wlnh5\" (UID: 
\"20ecc6d8-57fe-48c1-bbcf-886289ff3155\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-wlnh5" Mar 12 13:38:19.163923 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.162862 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frc5l\" (UniqueName: \"kubernetes.io/projected/c2666fac-29e4-411f-a8eb-22c3b7d96cbf-kube-api-access-frc5l\") pod \"service-ca-operator-56f6f4cbcb-zklgq\" (UID: \"c2666fac-29e4-411f-a8eb-22c3b7d96cbf\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-zklgq" Mar 12 13:38:19.163923 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.162910 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/99423593-d38b-4662-bacc-b1f6bb3d0d90-config-volume\") pod \"dns-default-ddks7\" (UID: \"99423593-d38b-4662-bacc-b1f6bb3d0d90\") " pod="openshift-dns/dns-default-ddks7" Mar 12 13:38:19.163923 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.162929 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16e1b1af-b744-44fd-a3d5-c817249f6767-trusted-ca-bundle\") pod \"insights-operator-76bdd9f478-q8flt\" (UID: \"16e1b1af-b744-44fd-a3d5-c817249f6767\") " pod="openshift-insights/insights-operator-76bdd9f478-q8flt" Mar 12 13:38:19.163923 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.162944 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92914013-06d0-470c-b843-48a641e447a6-cert\") pod \"ingress-canary-v9hjg\" (UID: \"92914013-06d0-470c-b843-48a641e447a6\") " pod="openshift-ingress-canary/ingress-canary-v9hjg" Mar 12 13:38:19.163923 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.162952 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/16e1b1af-b744-44fd-a3d5-c817249f6767-tmp\") pod \"insights-operator-76bdd9f478-q8flt\" (UID: \"16e1b1af-b744-44fd-a3d5-c817249f6767\") " pod="openshift-insights/insights-operator-76bdd9f478-q8flt" Mar 12 13:38:19.163923 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.162966 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjjkq\" (UniqueName: \"kubernetes.io/projected/99423593-d38b-4662-bacc-b1f6bb3d0d90-kube-api-access-xjjkq\") pod \"dns-default-ddks7\" (UID: \"99423593-d38b-4662-bacc-b1f6bb3d0d90\") " pod="openshift-dns/dns-default-ddks7" Mar 12 13:38:19.163923 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.163254 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20ecc6d8-57fe-48c1-bbcf-886289ff3155-config\") pod \"kube-storage-version-migrator-operator-866f46547-wlnh5\" (UID: \"20ecc6d8-57fe-48c1-bbcf-886289ff3155\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-wlnh5" Mar 12 13:38:19.163923 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.163550 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/16e1b1af-b744-44fd-a3d5-c817249f6767-snapshots\") pod \"insights-operator-76bdd9f478-q8flt\" (UID: \"16e1b1af-b744-44fd-a3d5-c817249f6767\") " pod="openshift-insights/insights-operator-76bdd9f478-q8flt" Mar 12 13:38:19.163923 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.163637 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16e1b1af-b744-44fd-a3d5-c817249f6767-serving-cert\") pod \"insights-operator-76bdd9f478-q8flt\" (UID: \"16e1b1af-b744-44fd-a3d5-c817249f6767\") " pod="openshift-insights/insights-operator-76bdd9f478-q8flt" Mar 12 
13:38:19.163923 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.163923 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16e1b1af-b744-44fd-a3d5-c817249f6767-service-ca-bundle\") pod \"insights-operator-76bdd9f478-q8flt\" (UID: \"16e1b1af-b744-44fd-a3d5-c817249f6767\") " pod="openshift-insights/insights-operator-76bdd9f478-q8flt" Mar 12 13:38:19.164387 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.163978 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/12cccf55-aade-4101-9250-96d5b498a6af-networking-console-plugin-cert\") pod \"networking-console-plugin-55b77584bb-dx5pp\" (UID: \"12cccf55-aade-4101-9250-96d5b498a6af\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-dx5pp" Mar 12 13:38:19.164387 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.164121 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/f25d8f1b-e18e-4880-899f-71ae7bff54be-telemetry-config\") pod \"cluster-monitoring-operator-b58cd5d8d-vw6kv\" (UID: \"f25d8f1b-e18e-4880-899f-71ae7bff54be\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-vw6kv" Mar 12 13:38:19.164387 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.164270 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2666fac-29e4-411f-a8eb-22c3b7d96cbf-serving-cert\") pod \"service-ca-operator-56f6f4cbcb-zklgq\" (UID: \"c2666fac-29e4-411f-a8eb-22c3b7d96cbf\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-zklgq" Mar 12 13:38:19.164387 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.164310 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/20ecc6d8-57fe-48c1-bbcf-886289ff3155-serving-cert\") pod \"kube-storage-version-migrator-operator-866f46547-wlnh5\" (UID: \"20ecc6d8-57fe-48c1-bbcf-886289ff3155\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-wlnh5" Mar 12 13:38:19.164387 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.164342 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b87ad3f0-c310-4833-bfac-cd1e3b15a7d9-serving-cert\") pod \"console-operator-76b8565867-8c2br\" (UID: \"b87ad3f0-c310-4833-bfac-cd1e3b15a7d9\") " pod="openshift-console-operator/console-operator-76b8565867-8c2br" Mar 12 13:38:19.164387 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.164374 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kskzl\" (UniqueName: \"kubernetes.io/projected/44b94cdd-3ad4-4d6b-a6a2-2eedc6664c77-kube-api-access-kskzl\") pod \"network-check-source-cc88fdd44-z42dm\" (UID: \"44b94cdd-3ad4-4d6b-a6a2-2eedc6664c77\") " pod="openshift-network-diagnostics/network-check-source-cc88fdd44-z42dm" Mar 12 13:38:19.164698 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.164407 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2666fac-29e4-411f-a8eb-22c3b7d96cbf-config\") pod \"service-ca-operator-56f6f4cbcb-zklgq\" (UID: \"c2666fac-29e4-411f-a8eb-22c3b7d96cbf\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-zklgq" Mar 12 13:38:19.164698 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.164436 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b87ad3f0-c310-4833-bfac-cd1e3b15a7d9-trusted-ca\") pod \"console-operator-76b8565867-8c2br\" (UID: 
\"b87ad3f0-c310-4833-bfac-cd1e3b15a7d9\") " pod="openshift-console-operator/console-operator-76b8565867-8c2br" Mar 12 13:38:19.164698 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.164668 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16e1b1af-b744-44fd-a3d5-c817249f6767-service-ca-bundle\") pod \"insights-operator-76bdd9f478-q8flt\" (UID: \"16e1b1af-b744-44fd-a3d5-c817249f6767\") " pod="openshift-insights/insights-operator-76bdd9f478-q8flt" Mar 12 13:38:19.165071 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.165032 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2666fac-29e4-411f-a8eb-22c3b7d96cbf-config\") pod \"service-ca-operator-56f6f4cbcb-zklgq\" (UID: \"c2666fac-29e4-411f-a8eb-22c3b7d96cbf\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-zklgq" Mar 12 13:38:19.165643 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.165614 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16e1b1af-b744-44fd-a3d5-c817249f6767-trusted-ca-bundle\") pod \"insights-operator-76bdd9f478-q8flt\" (UID: \"16e1b1af-b744-44fd-a3d5-c817249f6767\") " pod="openshift-insights/insights-operator-76bdd9f478-q8flt" Mar 12 13:38:19.165743 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.165702 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/f25d8f1b-e18e-4880-899f-71ae7bff54be-telemetry-config\") pod \"cluster-monitoring-operator-b58cd5d8d-vw6kv\" (UID: \"f25d8f1b-e18e-4880-899f-71ae7bff54be\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-vw6kv" Mar 12 13:38:19.166675 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.166648 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/16e1b1af-b744-44fd-a3d5-c817249f6767-serving-cert\") pod \"insights-operator-76bdd9f478-q8flt\" (UID: \"16e1b1af-b744-44fd-a3d5-c817249f6767\") " pod="openshift-insights/insights-operator-76bdd9f478-q8flt" Mar 12 13:38:19.166929 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.166902 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20ecc6d8-57fe-48c1-bbcf-886289ff3155-serving-cert\") pod \"kube-storage-version-migrator-operator-866f46547-wlnh5\" (UID: \"20ecc6d8-57fe-48c1-bbcf-886289ff3155\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-wlnh5" Mar 12 13:38:19.167058 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.167041 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2666fac-29e4-411f-a8eb-22c3b7d96cbf-serving-cert\") pod \"service-ca-operator-56f6f4cbcb-zklgq\" (UID: \"c2666fac-29e4-411f-a8eb-22c3b7d96cbf\") " pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-zklgq" Mar 12 13:38:19.172341 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.172316 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx4c5\" (UniqueName: \"kubernetes.io/projected/8571836d-0cec-490b-a68b-8ecee11112d1-kube-api-access-wx4c5\") pod \"volume-data-source-validator-67fdcb5769-4j9p7\" (UID: \"8571836d-0cec-490b-a68b-8ecee11112d1\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-4j9p7" Mar 12 13:38:19.172711 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.172691 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-frc5l\" (UniqueName: \"kubernetes.io/projected/c2666fac-29e4-411f-a8eb-22c3b7d96cbf-kube-api-access-frc5l\") pod \"service-ca-operator-56f6f4cbcb-zklgq\" (UID: \"c2666fac-29e4-411f-a8eb-22c3b7d96cbf\") 
" pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-zklgq" Mar 12 13:38:19.173390 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.173361 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-79l77\" (UniqueName: \"kubernetes.io/projected/20ecc6d8-57fe-48c1-bbcf-886289ff3155-kube-api-access-79l77\") pod \"kube-storage-version-migrator-operator-866f46547-wlnh5\" (UID: \"20ecc6d8-57fe-48c1-bbcf-886289ff3155\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-wlnh5" Mar 12 13:38:19.174405 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.174382 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh5jb\" (UniqueName: \"kubernetes.io/projected/f25d8f1b-e18e-4880-899f-71ae7bff54be-kube-api-access-nh5jb\") pod \"cluster-monitoring-operator-b58cd5d8d-vw6kv\" (UID: \"f25d8f1b-e18e-4880-899f-71ae7bff54be\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-vw6kv" Mar 12 13:38:19.175920 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.175898 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrft2\" (UniqueName: \"kubernetes.io/projected/16e1b1af-b744-44fd-a3d5-c817249f6767-kube-api-access-rrft2\") pod \"insights-operator-76bdd9f478-q8flt\" (UID: \"16e1b1af-b744-44fd-a3d5-c817249f6767\") " pod="openshift-insights/insights-operator-76bdd9f478-q8flt" Mar 12 13:38:19.236687 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.236651 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-4j9p7" Mar 12 13:38:19.248230 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.248205 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-zklgq" Mar 12 13:38:19.253306 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.253277 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-76bdd9f478-q8flt" Mar 12 13:38:19.262782 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.262479 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-wlnh5" Mar 12 13:38:19.265769 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.265747 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99423593-d38b-4662-bacc-b1f6bb3d0d90-metrics-tls\") pod \"dns-default-ddks7\" (UID: \"99423593-d38b-4662-bacc-b1f6bb3d0d90\") " pod="openshift-dns/dns-default-ddks7" Mar 12 13:38:19.265884 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.265790 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/12cccf55-aade-4101-9250-96d5b498a6af-nginx-conf\") pod \"networking-console-plugin-55b77584bb-dx5pp\" (UID: \"12cccf55-aade-4101-9250-96d5b498a6af\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-dx5pp" Mar 12 13:38:19.265884 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.265818 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thmlz\" (UniqueName: \"kubernetes.io/projected/f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca-kube-api-access-thmlz\") pod \"cluster-samples-operator-d5df4776c-clm4m\" (UID: \"f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-clm4m" Mar 12 13:38:19.265884 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.265863 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-clm4m\" (UID: \"f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-clm4m" Mar 12 13:38:19.266081 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.265895 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rx6pj\" (UniqueName: \"kubernetes.io/projected/b87ad3f0-c310-4833-bfac-cd1e3b15a7d9-kube-api-access-rx6pj\") pod \"console-operator-76b8565867-8c2br\" (UID: \"b87ad3f0-c310-4833-bfac-cd1e3b15a7d9\") " pod="openshift-console-operator/console-operator-76b8565867-8c2br" Mar 12 13:38:19.266081 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:19.265916 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 12 13:38:19.266081 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.265949 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-drstb\" (UniqueName: \"kubernetes.io/projected/92914013-06d0-470c-b843-48a641e447a6-kube-api-access-drstb\") pod \"ingress-canary-v9hjg\" (UID: \"92914013-06d0-470c-b843-48a641e447a6\") " pod="openshift-ingress-canary/ingress-canary-v9hjg" Mar 12 13:38:19.266081 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.265975 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/99423593-d38b-4662-bacc-b1f6bb3d0d90-tmp-dir\") pod \"dns-default-ddks7\" (UID: \"99423593-d38b-4662-bacc-b1f6bb3d0d90\") " pod="openshift-dns/dns-default-ddks7" Mar 12 13:38:19.266081 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:19.265997 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/99423593-d38b-4662-bacc-b1f6bb3d0d90-metrics-tls podName:99423593-d38b-4662-bacc-b1f6bb3d0d90 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:19.765975404 +0000 UTC m=+35.277909424 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/99423593-d38b-4662-bacc-b1f6bb3d0d90-metrics-tls") pod "dns-default-ddks7" (UID: "99423593-d38b-4662-bacc-b1f6bb3d0d90") : secret "dns-default-metrics-tls" not found Mar 12 13:38:19.266081 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.266043 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b87ad3f0-c310-4833-bfac-cd1e3b15a7d9-config\") pod \"console-operator-76b8565867-8c2br\" (UID: \"b87ad3f0-c310-4833-bfac-cd1e3b15a7d9\") " pod="openshift-console-operator/console-operator-76b8565867-8c2br" Mar 12 13:38:19.266515 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.266117 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/99423593-d38b-4662-bacc-b1f6bb3d0d90-config-volume\") pod \"dns-default-ddks7\" (UID: \"99423593-d38b-4662-bacc-b1f6bb3d0d90\") " pod="openshift-dns/dns-default-ddks7" Mar 12 13:38:19.266515 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.266147 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92914013-06d0-470c-b843-48a641e447a6-cert\") pod \"ingress-canary-v9hjg\" (UID: \"92914013-06d0-470c-b843-48a641e447a6\") " pod="openshift-ingress-canary/ingress-canary-v9hjg" Mar 12 13:38:19.266515 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.266172 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xjjkq\" (UniqueName: \"kubernetes.io/projected/99423593-d38b-4662-bacc-b1f6bb3d0d90-kube-api-access-xjjkq\") pod 
\"dns-default-ddks7\" (UID: \"99423593-d38b-4662-bacc-b1f6bb3d0d90\") " pod="openshift-dns/dns-default-ddks7" Mar 12 13:38:19.267536 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.267513 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/12cccf55-aade-4101-9250-96d5b498a6af-networking-console-plugin-cert\") pod \"networking-console-plugin-55b77584bb-dx5pp\" (UID: \"12cccf55-aade-4101-9250-96d5b498a6af\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-dx5pp" Mar 12 13:38:19.267604 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.267579 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b87ad3f0-c310-4833-bfac-cd1e3b15a7d9-serving-cert\") pod \"console-operator-76b8565867-8c2br\" (UID: \"b87ad3f0-c310-4833-bfac-cd1e3b15a7d9\") " pod="openshift-console-operator/console-operator-76b8565867-8c2br" Mar 12 13:38:19.267604 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.267584 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/99423593-d38b-4662-bacc-b1f6bb3d0d90-config-volume\") pod \"dns-default-ddks7\" (UID: \"99423593-d38b-4662-bacc-b1f6bb3d0d90\") " pod="openshift-dns/dns-default-ddks7" Mar 12 13:38:19.267692 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.267618 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kskzl\" (UniqueName: \"kubernetes.io/projected/44b94cdd-3ad4-4d6b-a6a2-2eedc6664c77-kube-api-access-kskzl\") pod \"network-check-source-cc88fdd44-z42dm\" (UID: \"44b94cdd-3ad4-4d6b-a6a2-2eedc6664c77\") " pod="openshift-network-diagnostics/network-check-source-cc88fdd44-z42dm" Mar 12 13:38:19.267692 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.267115 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/b87ad3f0-c310-4833-bfac-cd1e3b15a7d9-config\") pod \"console-operator-76b8565867-8c2br\" (UID: \"b87ad3f0-c310-4833-bfac-cd1e3b15a7d9\") " pod="openshift-console-operator/console-operator-76b8565867-8c2br" Mar 12 13:38:19.267692 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:19.266754 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Mar 12 13:38:19.267692 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.267648 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b87ad3f0-c310-4833-bfac-cd1e3b15a7d9-trusted-ca\") pod \"console-operator-76b8565867-8c2br\" (UID: \"b87ad3f0-c310-4833-bfac-cd1e3b15a7d9\") " pod="openshift-console-operator/console-operator-76b8565867-8c2br" Mar 12 13:38:19.267843 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:19.267700 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca-samples-operator-tls podName:f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca nodeName:}" failed. No retries permitted until 2026-03-12 13:38:19.767682294 +0000 UTC m=+35.279616319 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca-samples-operator-tls") pod "cluster-samples-operator-d5df4776c-clm4m" (UID: "f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca") : secret "samples-operator-tls" not found Mar 12 13:38:19.267843 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:19.267180 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 12 13:38:19.267843 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:19.267743 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92914013-06d0-470c-b843-48a641e447a6-cert podName:92914013-06d0-470c-b843-48a641e447a6 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:19.767732838 +0000 UTC m=+35.279666848 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/92914013-06d0-470c-b843-48a641e447a6-cert") pod "ingress-canary-v9hjg" (UID: "92914013-06d0-470c-b843-48a641e447a6") : secret "canary-serving-cert" not found Mar 12 13:38:19.267843 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.266415 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/99423593-d38b-4662-bacc-b1f6bb3d0d90-tmp-dir\") pod \"dns-default-ddks7\" (UID: \"99423593-d38b-4662-bacc-b1f6bb3d0d90\") " pod="openshift-dns/dns-default-ddks7" Mar 12 13:38:19.267843 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.267046 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/12cccf55-aade-4101-9250-96d5b498a6af-nginx-conf\") pod \"networking-console-plugin-55b77584bb-dx5pp\" (UID: \"12cccf55-aade-4101-9250-96d5b498a6af\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-dx5pp" Mar 12 13:38:19.267843 ip-10-0-142-16 kubenswrapper[2576]: E0312 
13:38:19.267786 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Mar 12 13:38:19.267843 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:19.267825 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12cccf55-aade-4101-9250-96d5b498a6af-networking-console-plugin-cert podName:12cccf55-aade-4101-9250-96d5b498a6af nodeName:}" failed. No retries permitted until 2026-03-12 13:38:19.767813049 +0000 UTC m=+35.279747070 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/12cccf55-aade-4101-9250-96d5b498a6af-networking-console-plugin-cert") pod "networking-console-plugin-55b77584bb-dx5pp" (UID: "12cccf55-aade-4101-9250-96d5b498a6af") : secret "networking-console-plugin-cert" not found Mar 12 13:38:19.268416 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.268369 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b87ad3f0-c310-4833-bfac-cd1e3b15a7d9-trusted-ca\") pod \"console-operator-76b8565867-8c2br\" (UID: \"b87ad3f0-c310-4833-bfac-cd1e3b15a7d9\") " pod="openshift-console-operator/console-operator-76b8565867-8c2br" Mar 12 13:38:19.271908 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.271725 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b87ad3f0-c310-4833-bfac-cd1e3b15a7d9-serving-cert\") pod \"console-operator-76b8565867-8c2br\" (UID: \"b87ad3f0-c310-4833-bfac-cd1e3b15a7d9\") " pod="openshift-console-operator/console-operator-76b8565867-8c2br" Mar 12 13:38:19.296599 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.296528 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-drstb\" (UniqueName: 
\"kubernetes.io/projected/92914013-06d0-470c-b843-48a641e447a6-kube-api-access-drstb\") pod \"ingress-canary-v9hjg\" (UID: \"92914013-06d0-470c-b843-48a641e447a6\") " pod="openshift-ingress-canary/ingress-canary-v9hjg" Mar 12 13:38:19.301725 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.301662 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kskzl\" (UniqueName: \"kubernetes.io/projected/44b94cdd-3ad4-4d6b-a6a2-2eedc6664c77-kube-api-access-kskzl\") pod \"network-check-source-cc88fdd44-z42dm\" (UID: \"44b94cdd-3ad4-4d6b-a6a2-2eedc6664c77\") " pod="openshift-network-diagnostics/network-check-source-cc88fdd44-z42dm" Mar 12 13:38:19.301894 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.301802 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-thmlz\" (UniqueName: \"kubernetes.io/projected/f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca-kube-api-access-thmlz\") pod \"cluster-samples-operator-d5df4776c-clm4m\" (UID: \"f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-clm4m" Mar 12 13:38:19.303677 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.303551 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjjkq\" (UniqueName: \"kubernetes.io/projected/99423593-d38b-4662-bacc-b1f6bb3d0d90-kube-api-access-xjjkq\") pod \"dns-default-ddks7\" (UID: \"99423593-d38b-4662-bacc-b1f6bb3d0d90\") " pod="openshift-dns/dns-default-ddks7" Mar 12 13:38:19.306130 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.306100 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx6pj\" (UniqueName: \"kubernetes.io/projected/b87ad3f0-c310-4833-bfac-cd1e3b15a7d9-kube-api-access-rx6pj\") pod \"console-operator-76b8565867-8c2br\" (UID: \"b87ad3f0-c310-4833-bfac-cd1e3b15a7d9\") " pod="openshift-console-operator/console-operator-76b8565867-8c2br" Mar 12 13:38:19.391958 
ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.390540 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-cc88fdd44-z42dm"
Mar 12 13:38:19.435693 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.435669 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-4j9p7"]
Mar 12 13:38:19.436829 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.436789 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-zklgq"]
Mar 12 13:38:19.437544 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:38:19.437514 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8571836d_0cec_490b_a68b_8ecee11112d1.slice/crio-a5804fa2cfef1e83db13e6e122ab7cb35db059ea655e7f952630881a534f40ea WatchSource:0}: Error finding container a5804fa2cfef1e83db13e6e122ab7cb35db059ea655e7f952630881a534f40ea: Status 404 returned error can't find the container with id a5804fa2cfef1e83db13e6e122ab7cb35db059ea655e7f952630881a534f40ea
Mar 12 13:38:19.438562 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:38:19.438529 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2666fac_29e4_411f_a8eb_22c3b7d96cbf.slice/crio-1733e900414c4d879e008a85ed3aa0a6bf3a8588fef1e376863e7d41afd5c5e5 WatchSource:0}: Error finding container 1733e900414c4d879e008a85ed3aa0a6bf3a8588fef1e376863e7d41afd5c5e5: Status 404 returned error can't find the container with id 1733e900414c4d879e008a85ed3aa0a6bf3a8588fef1e376863e7d41afd5c5e5
Mar 12 13:38:19.456631 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.456517 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-76bdd9f478-q8flt"]
Mar 12 13:38:19.456748 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.456658 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-wlnh5"]
Mar 12 13:38:19.501175 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.500833 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-lc7zn"]
Mar 12 13:38:19.506427 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.506356 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-lc7zn"
Mar 12 13:38:19.509138 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.509104 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-lmr5h\""
Mar 12 13:38:19.552990 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.552950 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-cc88fdd44-z42dm"]
Mar 12 13:38:19.556581 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:38:19.556550 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44b94cdd_3ad4_4d6b_a6a2_2eedc6664c77.slice/crio-6647ba0f8aad9e5166b796c8b72155c6ecc7f13a01a3a3a2ada9d8b7920b8a92 WatchSource:0}: Error finding container 6647ba0f8aad9e5166b796c8b72155c6ecc7f13a01a3a3a2ada9d8b7920b8a92: Status 404 returned error can't find the container with id 6647ba0f8aad9e5166b796c8b72155c6ecc7f13a01a3a3a2ada9d8b7920b8a92
Mar 12 13:38:19.571190 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.571161 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/327ec7eb-a5de-4709-9fbc-6bbe58d87674-registry-tls\") pod \"image-registry-658bf4bccb-bwzwj\" (UID: \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\") " pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj"
Mar 12 13:38:19.571301 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.571219 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c11ab84b-7d66-4e49-a456-66c0ee53f865-metrics-certs\") pod \"router-default-f747dc4bb-qfvq7\" (UID: \"c11ab84b-7d66-4e49-a456-66c0ee53f865\") " pod="openshift-ingress/router-default-f747dc4bb-qfvq7"
Mar 12 13:38:19.571371 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:19.571352 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Mar 12 13:38:19.571429 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.571387 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c11ab84b-7d66-4e49-a456-66c0ee53f865-service-ca-bundle\") pod \"router-default-f747dc4bb-qfvq7\" (UID: \"c11ab84b-7d66-4e49-a456-66c0ee53f865\") " pod="openshift-ingress/router-default-f747dc4bb-qfvq7"
Mar 12 13:38:19.571429 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:19.571418 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c11ab84b-7d66-4e49-a456-66c0ee53f865-metrics-certs podName:c11ab84b-7d66-4e49-a456-66c0ee53f865 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:20.571400376 +0000 UTC m=+36.083334382 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c11ab84b-7d66-4e49-a456-66c0ee53f865-metrics-certs") pod "router-default-f747dc4bb-qfvq7" (UID: "c11ab84b-7d66-4e49-a456-66c0ee53f865") : secret "router-metrics-certs-default" not found
Mar 12 13:38:19.571560 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:19.571446 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Mar 12 13:38:19.571560 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:19.571486 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-658bf4bccb-bwzwj: secret "image-registry-tls" not found
Mar 12 13:38:19.571560 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:19.571496 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c11ab84b-7d66-4e49-a456-66c0ee53f865-service-ca-bundle podName:c11ab84b-7d66-4e49-a456-66c0ee53f865 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:20.571484246 +0000 UTC m=+36.083418256 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c11ab84b-7d66-4e49-a456-66c0ee53f865-service-ca-bundle") pod "router-default-f747dc4bb-qfvq7" (UID: "c11ab84b-7d66-4e49-a456-66c0ee53f865") : configmap references non-existent config key: service-ca.crt
Mar 12 13:38:19.571560 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:19.571540 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/327ec7eb-a5de-4709-9fbc-6bbe58d87674-registry-tls podName:327ec7eb-a5de-4709-9fbc-6bbe58d87674 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:20.571519983 +0000 UTC m=+36.083454003 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/327ec7eb-a5de-4709-9fbc-6bbe58d87674-registry-tls") pod "image-registry-658bf4bccb-bwzwj" (UID: "327ec7eb-a5de-4709-9fbc-6bbe58d87674") : secret "image-registry-tls" not found
Mar 12 13:38:19.603678 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.603627 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-76b8565867-8c2br"
Mar 12 13:38:19.672053 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.672024 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f25d8f1b-e18e-4880-899f-71ae7bff54be-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-vw6kv\" (UID: \"f25d8f1b-e18e-4880-899f-71ae7bff54be\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-vw6kv"
Mar 12 13:38:19.672195 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.672075 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8a746e42-a15c-4c61-b754-446d49beeae9-tmp-dir\") pod \"node-resolver-lc7zn\" (UID: \"8a746e42-a15c-4c61-b754-446d49beeae9\") " pod="openshift-dns/node-resolver-lc7zn"
Mar 12 13:38:19.672195 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.672098 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-294m5\" (UniqueName: \"kubernetes.io/projected/8a746e42-a15c-4c61-b754-446d49beeae9-kube-api-access-294m5\") pod \"node-resolver-lc7zn\" (UID: \"8a746e42-a15c-4c61-b754-446d49beeae9\") " pod="openshift-dns/node-resolver-lc7zn"
Mar 12 13:38:19.672288 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:19.672226 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 12 13:38:19.672372 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:19.672299 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f25d8f1b-e18e-4880-899f-71ae7bff54be-cluster-monitoring-operator-tls podName:f25d8f1b-e18e-4880-899f-71ae7bff54be nodeName:}" failed. No retries permitted until 2026-03-12 13:38:20.672278246 +0000 UTC m=+36.184212267 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/f25d8f1b-e18e-4880-899f-71ae7bff54be-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-b58cd5d8d-vw6kv" (UID: "f25d8f1b-e18e-4880-899f-71ae7bff54be") : secret "cluster-monitoring-operator-tls" not found
Mar 12 13:38:19.672372 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.672357 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8a746e42-a15c-4c61-b754-446d49beeae9-hosts-file\") pod \"node-resolver-lc7zn\" (UID: \"8a746e42-a15c-4c61-b754-446d49beeae9\") " pod="openshift-dns/node-resolver-lc7zn"
Mar 12 13:38:19.748145 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.747968 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-76b8565867-8c2br"]
Mar 12 13:38:19.750928 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:38:19.750898 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb87ad3f0_c310_4833_bfac_cd1e3b15a7d9.slice/crio-996cf000729fc91d0b5374bd266bbf8e203071d612be375f2eada4a677ac762d WatchSource:0}: Error finding container 996cf000729fc91d0b5374bd266bbf8e203071d612be375f2eada4a677ac762d: Status 404 returned error can't find the container with id 996cf000729fc91d0b5374bd266bbf8e203071d612be375f2eada4a677ac762d
Mar 12 13:38:19.773124 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.773049 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92914013-06d0-470c-b843-48a641e447a6-cert\") pod \"ingress-canary-v9hjg\" (UID: \"92914013-06d0-470c-b843-48a641e447a6\") " pod="openshift-ingress-canary/ingress-canary-v9hjg"
Mar 12 13:38:19.773124 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.773096 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/12cccf55-aade-4101-9250-96d5b498a6af-networking-console-plugin-cert\") pod \"networking-console-plugin-55b77584bb-dx5pp\" (UID: \"12cccf55-aade-4101-9250-96d5b498a6af\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-dx5pp"
Mar 12 13:38:19.773330 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.773146 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8a746e42-a15c-4c61-b754-446d49beeae9-hosts-file\") pod \"node-resolver-lc7zn\" (UID: \"8a746e42-a15c-4c61-b754-446d49beeae9\") " pod="openshift-dns/node-resolver-lc7zn"
Mar 12 13:38:19.773330 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.773171 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99423593-d38b-4662-bacc-b1f6bb3d0d90-metrics-tls\") pod \"dns-default-ddks7\" (UID: \"99423593-d38b-4662-bacc-b1f6bb3d0d90\") " pod="openshift-dns/dns-default-ddks7"
Mar 12 13:38:19.773330 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.773219 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-clm4m\" (UID: \"f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-clm4m"
Mar 12 13:38:19.773330 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.773246 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8a746e42-a15c-4c61-b754-446d49beeae9-tmp-dir\") pod \"node-resolver-lc7zn\" (UID: \"8a746e42-a15c-4c61-b754-446d49beeae9\") " pod="openshift-dns/node-resolver-lc7zn"
Mar 12 13:38:19.773330 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.773274 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-294m5\" (UniqueName: \"kubernetes.io/projected/8a746e42-a15c-4c61-b754-446d49beeae9-kube-api-access-294m5\") pod \"node-resolver-lc7zn\" (UID: \"8a746e42-a15c-4c61-b754-446d49beeae9\") " pod="openshift-dns/node-resolver-lc7zn"
Mar 12 13:38:19.773330 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:19.773218 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 12 13:38:19.773330 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:19.773294 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 12 13:38:19.773330 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:19.773327 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Mar 12 13:38:19.773708 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.773295 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8a746e42-a15c-4c61-b754-446d49beeae9-hosts-file\") pod \"node-resolver-lc7zn\" (UID: \"8a746e42-a15c-4c61-b754-446d49beeae9\") " pod="openshift-dns/node-resolver-lc7zn"
Mar 12 13:38:19.773708 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:19.773352 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92914013-06d0-470c-b843-48a641e447a6-cert podName:92914013-06d0-470c-b843-48a641e447a6 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:20.773332548 +0000 UTC m=+36.285266574 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/92914013-06d0-470c-b843-48a641e447a6-cert") pod "ingress-canary-v9hjg" (UID: "92914013-06d0-470c-b843-48a641e447a6") : secret "canary-serving-cert" not found
Mar 12 13:38:19.773708 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:19.773283 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Mar 12 13:38:19.773708 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:19.773381 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99423593-d38b-4662-bacc-b1f6bb3d0d90-metrics-tls podName:99423593-d38b-4662-bacc-b1f6bb3d0d90 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:20.773363999 +0000 UTC m=+36.285298012 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/99423593-d38b-4662-bacc-b1f6bb3d0d90-metrics-tls") pod "dns-default-ddks7" (UID: "99423593-d38b-4662-bacc-b1f6bb3d0d90") : secret "dns-default-metrics-tls" not found
Mar 12 13:38:19.773708 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:19.773398 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca-samples-operator-tls podName:f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca nodeName:}" failed. No retries permitted until 2026-03-12 13:38:20.773389512 +0000 UTC m=+36.285323522 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca-samples-operator-tls") pod "cluster-samples-operator-d5df4776c-clm4m" (UID: "f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca") : secret "samples-operator-tls" not found
Mar 12 13:38:19.773708 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:19.773414 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12cccf55-aade-4101-9250-96d5b498a6af-networking-console-plugin-cert podName:12cccf55-aade-4101-9250-96d5b498a6af nodeName:}" failed. No retries permitted until 2026-03-12 13:38:20.773405665 +0000 UTC m=+36.285339689 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/12cccf55-aade-4101-9250-96d5b498a6af-networking-console-plugin-cert") pod "networking-console-plugin-55b77584bb-dx5pp" (UID: "12cccf55-aade-4101-9250-96d5b498a6af") : secret "networking-console-plugin-cert" not found
Mar 12 13:38:19.773708 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.773580 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8a746e42-a15c-4c61-b754-446d49beeae9-tmp-dir\") pod \"node-resolver-lc7zn\" (UID: \"8a746e42-a15c-4c61-b754-446d49beeae9\") " pod="openshift-dns/node-resolver-lc7zn"
Mar 12 13:38:19.785895 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.785867 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-294m5\" (UniqueName: \"kubernetes.io/projected/8a746e42-a15c-4c61-b754-446d49beeae9-kube-api-access-294m5\") pod \"node-resolver-lc7zn\" (UID: \"8a746e42-a15c-4c61-b754-446d49beeae9\") " pod="openshift-dns/node-resolver-lc7zn"
Mar 12 13:38:19.819507 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:19.819470 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-lc7zn"
Mar 12 13:38:19.827824 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:38:19.827788 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a746e42_a15c_4c61_b754_446d49beeae9.slice/crio-af65c8cbfa54ef9dc3e2047c47c040aedfc31382101e1080d6af09f90cc22e1b WatchSource:0}: Error finding container af65c8cbfa54ef9dc3e2047c47c040aedfc31382101e1080d6af09f90cc22e1b: Status 404 returned error can't find the container with id af65c8cbfa54ef9dc3e2047c47c040aedfc31382101e1080d6af09f90cc22e1b
Mar 12 13:38:20.081581 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:20.081230 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fnfhc"
Mar 12 13:38:20.082640 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:20.081260 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wbtg8"
Mar 12 13:38:20.082640 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:20.081288 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x4rlt"
Mar 12 13:38:20.085450 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:20.085242 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Mar 12 13:38:20.085450 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:20.085345 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-sj9kz\""
Mar 12 13:38:20.086720 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:20.086699 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Mar 12 13:38:20.086843 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:20.086827 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wbttq\""
Mar 12 13:38:20.206319 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:20.206263 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-76bdd9f478-q8flt" event={"ID":"16e1b1af-b744-44fd-a3d5-c817249f6767","Type":"ContainerStarted","Data":"cde72614b886a5c9ff6bb2237f9a61967b9dc61d1cdbf3a5c438f0cd14605237"}
Mar 12 13:38:20.207571 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:20.207488 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-4j9p7" event={"ID":"8571836d-0cec-490b-a68b-8ecee11112d1","Type":"ContainerStarted","Data":"a5804fa2cfef1e83db13e6e122ab7cb35db059ea655e7f952630881a534f40ea"}
Mar 12 13:38:20.209443 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:20.209413 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lc7zn" event={"ID":"8a746e42-a15c-4c61-b754-446d49beeae9","Type":"ContainerStarted","Data":"49bf013896dd0d9662b20c36d37ef8898f0ece4a5b1c801c29cde9eeced18fca"}
Mar 12 13:38:20.209558 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:20.209447 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lc7zn" event={"ID":"8a746e42-a15c-4c61-b754-446d49beeae9","Type":"ContainerStarted","Data":"af65c8cbfa54ef9dc3e2047c47c040aedfc31382101e1080d6af09f90cc22e1b"}
Mar 12 13:38:20.211293 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:20.211260 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-76b8565867-8c2br" event={"ID":"b87ad3f0-c310-4833-bfac-cd1e3b15a7d9","Type":"ContainerStarted","Data":"996cf000729fc91d0b5374bd266bbf8e203071d612be375f2eada4a677ac762d"}
Mar 12 13:38:20.212504 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:20.212437 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-cc88fdd44-z42dm" event={"ID":"44b94cdd-3ad4-4d6b-a6a2-2eedc6664c77","Type":"ContainerStarted","Data":"6647ba0f8aad9e5166b796c8b72155c6ecc7f13a01a3a3a2ada9d8b7920b8a92"}
Mar 12 13:38:20.213652 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:20.213629 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-wlnh5" event={"ID":"20ecc6d8-57fe-48c1-bbcf-886289ff3155","Type":"ContainerStarted","Data":"7586bdd66912cb91c50f59e34c53c0126359b0654cbb516371f66c973051c543"}
Mar 12 13:38:20.214766 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:20.214742 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-zklgq" event={"ID":"c2666fac-29e4-411f-a8eb-22c3b7d96cbf","Type":"ContainerStarted","Data":"1733e900414c4d879e008a85ed3aa0a6bf3a8588fef1e376863e7d41afd5c5e5"}
Mar 12 13:38:20.227170 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:20.227131 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-lc7zn" podStartSLOduration=1.227116234 podStartE2EDuration="1.227116234s" podCreationTimestamp="2026-03-12 13:38:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:38:20.226950836 +0000 UTC m=+35.738884864" watchObservedRunningTime="2026-03-12 13:38:20.227116234 +0000 UTC m=+35.739050258"
Mar 12 13:38:20.581810 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:20.581771 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/327ec7eb-a5de-4709-9fbc-6bbe58d87674-registry-tls\") pod \"image-registry-658bf4bccb-bwzwj\" (UID: \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\") " pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj"
Mar 12 13:38:20.582020 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:20.581830 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c11ab84b-7d66-4e49-a456-66c0ee53f865-metrics-certs\") pod \"router-default-f747dc4bb-qfvq7\" (UID: \"c11ab84b-7d66-4e49-a456-66c0ee53f865\") " pod="openshift-ingress/router-default-f747dc4bb-qfvq7"
Mar 12 13:38:20.582020 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:20.582002 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c11ab84b-7d66-4e49-a456-66c0ee53f865-service-ca-bundle\") pod \"router-default-f747dc4bb-qfvq7\" (UID: \"c11ab84b-7d66-4e49-a456-66c0ee53f865\") " pod="openshift-ingress/router-default-f747dc4bb-qfvq7"
Mar 12 13:38:20.582830 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:20.582180 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c11ab84b-7d66-4e49-a456-66c0ee53f865-service-ca-bundle podName:c11ab84b-7d66-4e49-a456-66c0ee53f865 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:22.58215993 +0000 UTC m=+38.094093955 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c11ab84b-7d66-4e49-a456-66c0ee53f865-service-ca-bundle") pod "router-default-f747dc4bb-qfvq7" (UID: "c11ab84b-7d66-4e49-a456-66c0ee53f865") : configmap references non-existent config key: service-ca.crt
Mar 12 13:38:20.582830 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:20.582631 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Mar 12 13:38:20.582830 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:20.582644 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-658bf4bccb-bwzwj: secret "image-registry-tls" not found
Mar 12 13:38:20.582830 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:20.582685 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/327ec7eb-a5de-4709-9fbc-6bbe58d87674-registry-tls podName:327ec7eb-a5de-4709-9fbc-6bbe58d87674 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:22.582665681 +0000 UTC m=+38.094599686 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/327ec7eb-a5de-4709-9fbc-6bbe58d87674-registry-tls") pod "image-registry-658bf4bccb-bwzwj" (UID: "327ec7eb-a5de-4709-9fbc-6bbe58d87674") : secret "image-registry-tls" not found
Mar 12 13:38:20.582830 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:20.582728 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Mar 12 13:38:20.582830 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:20.582749 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c11ab84b-7d66-4e49-a456-66c0ee53f865-metrics-certs podName:c11ab84b-7d66-4e49-a456-66c0ee53f865 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:22.582742995 +0000 UTC m=+38.094676999 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c11ab84b-7d66-4e49-a456-66c0ee53f865-metrics-certs") pod "router-default-f747dc4bb-qfvq7" (UID: "c11ab84b-7d66-4e49-a456-66c0ee53f865") : secret "router-metrics-certs-default" not found
Mar 12 13:38:20.683007 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:20.682878 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f25d8f1b-e18e-4880-899f-71ae7bff54be-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-vw6kv\" (UID: \"f25d8f1b-e18e-4880-899f-71ae7bff54be\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-vw6kv"
Mar 12 13:38:20.683479 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:20.683301 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 12 13:38:20.684001 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:20.683923 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f25d8f1b-e18e-4880-899f-71ae7bff54be-cluster-monitoring-operator-tls podName:f25d8f1b-e18e-4880-899f-71ae7bff54be nodeName:}" failed. No retries permitted until 2026-03-12 13:38:22.683892131 +0000 UTC m=+38.195826142 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/f25d8f1b-e18e-4880-899f-71ae7bff54be-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-b58cd5d8d-vw6kv" (UID: "f25d8f1b-e18e-4880-899f-71ae7bff54be") : secret "cluster-monitoring-operator-tls" not found
Mar 12 13:38:20.785610 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:20.784502 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99423593-d38b-4662-bacc-b1f6bb3d0d90-metrics-tls\") pod \"dns-default-ddks7\" (UID: \"99423593-d38b-4662-bacc-b1f6bb3d0d90\") " pod="openshift-dns/dns-default-ddks7"
Mar 12 13:38:20.785610 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:20.784581 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-clm4m\" (UID: \"f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-clm4m"
Mar 12 13:38:20.785610 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:20.784704 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92914013-06d0-470c-b843-48a641e447a6-cert\") pod \"ingress-canary-v9hjg\" (UID: \"92914013-06d0-470c-b843-48a641e447a6\") " pod="openshift-ingress-canary/ingress-canary-v9hjg"
Mar 12 13:38:20.785610 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:20.784755 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/12cccf55-aade-4101-9250-96d5b498a6af-networking-console-plugin-cert\") pod \"networking-console-plugin-55b77584bb-dx5pp\" (UID: \"12cccf55-aade-4101-9250-96d5b498a6af\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-dx5pp"
Mar 12 13:38:20.785610 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:20.784930 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Mar 12 13:38:20.785610 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:20.784995 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12cccf55-aade-4101-9250-96d5b498a6af-networking-console-plugin-cert podName:12cccf55-aade-4101-9250-96d5b498a6af nodeName:}" failed. No retries permitted until 2026-03-12 13:38:22.7849759 +0000 UTC m=+38.296909911 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/12cccf55-aade-4101-9250-96d5b498a6af-networking-console-plugin-cert") pod "networking-console-plugin-55b77584bb-dx5pp" (UID: "12cccf55-aade-4101-9250-96d5b498a6af") : secret "networking-console-plugin-cert" not found
Mar 12 13:38:20.785610 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:20.785355 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 12 13:38:20.785610 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:20.785390 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99423593-d38b-4662-bacc-b1f6bb3d0d90-metrics-tls podName:99423593-d38b-4662-bacc-b1f6bb3d0d90 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:22.785380004 +0000 UTC m=+38.297314008 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/99423593-d38b-4662-bacc-b1f6bb3d0d90-metrics-tls") pod "dns-default-ddks7" (UID: "99423593-d38b-4662-bacc-b1f6bb3d0d90") : secret "dns-default-metrics-tls" not found
Mar 12 13:38:20.785610 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:20.785435 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Mar 12 13:38:20.785610 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:20.785476 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca-samples-operator-tls podName:f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca nodeName:}" failed. No retries permitted until 2026-03-12 13:38:22.785452486 +0000 UTC m=+38.297386491 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca-samples-operator-tls") pod "cluster-samples-operator-d5df4776c-clm4m" (UID: "f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca") : secret "samples-operator-tls" not found
Mar 12 13:38:20.785610 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:20.785524 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 12 13:38:20.785610 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:20.785564 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92914013-06d0-470c-b843-48a641e447a6-cert podName:92914013-06d0-470c-b843-48a641e447a6 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:22.785553862 +0000 UTC m=+38.297487868 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/92914013-06d0-470c-b843-48a641e447a6-cert") pod "ingress-canary-v9hjg" (UID: "92914013-06d0-470c-b843-48a641e447a6") : secret "canary-serving-cert" not found
Mar 12 13:38:22.605716 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:22.605669 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c11ab84b-7d66-4e49-a456-66c0ee53f865-service-ca-bundle\") pod \"router-default-f747dc4bb-qfvq7\" (UID: \"c11ab84b-7d66-4e49-a456-66c0ee53f865\") " pod="openshift-ingress/router-default-f747dc4bb-qfvq7"
Mar 12 13:38:22.606162 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:22.605753 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/327ec7eb-a5de-4709-9fbc-6bbe58d87674-registry-tls\") pod \"image-registry-658bf4bccb-bwzwj\" (UID: \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\") " pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj"
Mar 12 13:38:22.606162 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:22.605793 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c11ab84b-7d66-4e49-a456-66c0ee53f865-metrics-certs\") pod \"router-default-f747dc4bb-qfvq7\" (UID: \"c11ab84b-7d66-4e49-a456-66c0ee53f865\") " pod="openshift-ingress/router-default-f747dc4bb-qfvq7"
Mar 12 13:38:22.606162 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:22.605937 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Mar 12 13:38:22.606162 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:22.605961 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-658bf4bccb-bwzwj: secret "image-registry-tls" not found
Mar 12 13:38:22.606162 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:22.605986 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c11ab84b-7d66-4e49-a456-66c0ee53f865-service-ca-bundle podName:c11ab84b-7d66-4e49-a456-66c0ee53f865 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:26.605967083 +0000 UTC m=+42.117901092 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c11ab84b-7d66-4e49-a456-66c0ee53f865-service-ca-bundle") pod "router-default-f747dc4bb-qfvq7" (UID: "c11ab84b-7d66-4e49-a456-66c0ee53f865") : configmap references non-existent config key: service-ca.crt
Mar 12 13:38:22.606162 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:22.606015 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/327ec7eb-a5de-4709-9fbc-6bbe58d87674-registry-tls podName:327ec7eb-a5de-4709-9fbc-6bbe58d87674 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:26.605999643 +0000 UTC m=+42.117933649 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/327ec7eb-a5de-4709-9fbc-6bbe58d87674-registry-tls") pod "image-registry-658bf4bccb-bwzwj" (UID: "327ec7eb-a5de-4709-9fbc-6bbe58d87674") : secret "image-registry-tls" not found
Mar 12 13:38:22.606162 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:22.606055 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Mar 12 13:38:22.606162 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:22.606093 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c11ab84b-7d66-4e49-a456-66c0ee53f865-metrics-certs podName:c11ab84b-7d66-4e49-a456-66c0ee53f865 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:26.606083388 +0000 UTC m=+42.118017393 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c11ab84b-7d66-4e49-a456-66c0ee53f865-metrics-certs") pod "router-default-f747dc4bb-qfvq7" (UID: "c11ab84b-7d66-4e49-a456-66c0ee53f865") : secret "router-metrics-certs-default" not found
Mar 12 13:38:22.707345 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:22.707297 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f25d8f1b-e18e-4880-899f-71ae7bff54be-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-vw6kv\" (UID: \"f25d8f1b-e18e-4880-899f-71ae7bff54be\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-vw6kv"
Mar 12 13:38:22.707540 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:22.707486 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 12 13:38:22.707598 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:22.707553 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f25d8f1b-e18e-4880-899f-71ae7bff54be-cluster-monitoring-operator-tls podName:f25d8f1b-e18e-4880-899f-71ae7bff54be nodeName:}" failed. No retries permitted until 2026-03-12 13:38:26.707537669 +0000 UTC m=+42.219471677 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/f25d8f1b-e18e-4880-899f-71ae7bff54be-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-b58cd5d8d-vw6kv" (UID: "f25d8f1b-e18e-4880-899f-71ae7bff54be") : secret "cluster-monitoring-operator-tls" not found Mar 12 13:38:22.808244 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:22.808211 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92914013-06d0-470c-b843-48a641e447a6-cert\") pod \"ingress-canary-v9hjg\" (UID: \"92914013-06d0-470c-b843-48a641e447a6\") " pod="openshift-ingress-canary/ingress-canary-v9hjg" Mar 12 13:38:22.808426 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:22.808265 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/12cccf55-aade-4101-9250-96d5b498a6af-networking-console-plugin-cert\") pod \"networking-console-plugin-55b77584bb-dx5pp\" (UID: \"12cccf55-aade-4101-9250-96d5b498a6af\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-dx5pp" Mar 12 13:38:22.808426 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:22.808349 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99423593-d38b-4662-bacc-b1f6bb3d0d90-metrics-tls\") pod \"dns-default-ddks7\" (UID: \"99423593-d38b-4662-bacc-b1f6bb3d0d90\") " pod="openshift-dns/dns-default-ddks7" Mar 12 13:38:22.808426 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:22.808362 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 12 13:38:22.808426 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:22.808403 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-clm4m\" (UID: \"f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-clm4m" Mar 12 13:38:22.808426 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:22.808426 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92914013-06d0-470c-b843-48a641e447a6-cert podName:92914013-06d0-470c-b843-48a641e447a6 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:26.808406379 +0000 UTC m=+42.320340398 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/92914013-06d0-470c-b843-48a641e447a6-cert") pod "ingress-canary-v9hjg" (UID: "92914013-06d0-470c-b843-48a641e447a6") : secret "canary-serving-cert" not found Mar 12 13:38:22.808689 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:22.808434 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Mar 12 13:38:22.808689 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:22.808487 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Mar 12 13:38:22.808689 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:22.808511 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12cccf55-aade-4101-9250-96d5b498a6af-networking-console-plugin-cert podName:12cccf55-aade-4101-9250-96d5b498a6af nodeName:}" failed. No retries permitted until 2026-03-12 13:38:26.808498227 +0000 UTC m=+42.320432233 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/12cccf55-aade-4101-9250-96d5b498a6af-networking-console-plugin-cert") pod "networking-console-plugin-55b77584bb-dx5pp" (UID: "12cccf55-aade-4101-9250-96d5b498a6af") : secret "networking-console-plugin-cert" not found Mar 12 13:38:22.808689 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:22.808532 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca-samples-operator-tls podName:f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca nodeName:}" failed. No retries permitted until 2026-03-12 13:38:26.808526011 +0000 UTC m=+42.320460016 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca-samples-operator-tls") pod "cluster-samples-operator-d5df4776c-clm4m" (UID: "f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca") : secret "samples-operator-tls" not found Mar 12 13:38:22.808689 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:22.808558 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 12 13:38:22.808689 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:22.808587 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99423593-d38b-4662-bacc-b1f6bb3d0d90-metrics-tls podName:99423593-d38b-4662-bacc-b1f6bb3d0d90 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:26.808575344 +0000 UTC m=+42.320509371 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/99423593-d38b-4662-bacc-b1f6bb3d0d90-metrics-tls") pod "dns-default-ddks7" (UID: "99423593-d38b-4662-bacc-b1f6bb3d0d90") : secret "dns-default-metrics-tls" not found Mar 12 13:38:26.643762 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:26.643720 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c11ab84b-7d66-4e49-a456-66c0ee53f865-service-ca-bundle\") pod \"router-default-f747dc4bb-qfvq7\" (UID: \"c11ab84b-7d66-4e49-a456-66c0ee53f865\") " pod="openshift-ingress/router-default-f747dc4bb-qfvq7" Mar 12 13:38:26.644222 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:26.643796 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/327ec7eb-a5de-4709-9fbc-6bbe58d87674-registry-tls\") pod \"image-registry-658bf4bccb-bwzwj\" (UID: \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\") " pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj" Mar 12 13:38:26.644222 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:26.643827 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c11ab84b-7d66-4e49-a456-66c0ee53f865-metrics-certs\") pod \"router-default-f747dc4bb-qfvq7\" (UID: \"c11ab84b-7d66-4e49-a456-66c0ee53f865\") " pod="openshift-ingress/router-default-f747dc4bb-qfvq7" Mar 12 13:38:26.644222 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:26.643873 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c11ab84b-7d66-4e49-a456-66c0ee53f865-service-ca-bundle podName:c11ab84b-7d66-4e49-a456-66c0ee53f865 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:34.643851197 +0000 UTC m=+50.155785295 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c11ab84b-7d66-4e49-a456-66c0ee53f865-service-ca-bundle") pod "router-default-f747dc4bb-qfvq7" (UID: "c11ab84b-7d66-4e49-a456-66c0ee53f865") : configmap references non-existent config key: service-ca.crt Mar 12 13:38:26.644222 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:26.643943 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Mar 12 13:38:26.644222 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:26.643957 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Mar 12 13:38:26.644222 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:26.643978 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-658bf4bccb-bwzwj: secret "image-registry-tls" not found Mar 12 13:38:26.644222 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:26.643982 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c11ab84b-7d66-4e49-a456-66c0ee53f865-metrics-certs podName:c11ab84b-7d66-4e49-a456-66c0ee53f865 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:34.643969534 +0000 UTC m=+50.155903540 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c11ab84b-7d66-4e49-a456-66c0ee53f865-metrics-certs") pod "router-default-f747dc4bb-qfvq7" (UID: "c11ab84b-7d66-4e49-a456-66c0ee53f865") : secret "router-metrics-certs-default" not found Mar 12 13:38:26.644222 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:26.644029 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/327ec7eb-a5de-4709-9fbc-6bbe58d87674-registry-tls podName:327ec7eb-a5de-4709-9fbc-6bbe58d87674 nodeName:}" failed. 
No retries permitted until 2026-03-12 13:38:34.644013716 +0000 UTC m=+50.155947721 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/327ec7eb-a5de-4709-9fbc-6bbe58d87674-registry-tls") pod "image-registry-658bf4bccb-bwzwj" (UID: "327ec7eb-a5de-4709-9fbc-6bbe58d87674") : secret "image-registry-tls" not found Mar 12 13:38:26.744754 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:26.744710 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f25d8f1b-e18e-4880-899f-71ae7bff54be-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-vw6kv\" (UID: \"f25d8f1b-e18e-4880-899f-71ae7bff54be\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-vw6kv" Mar 12 13:38:26.744754 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:26.744748 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 12 13:38:26.745013 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:26.744830 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f25d8f1b-e18e-4880-899f-71ae7bff54be-cluster-monitoring-operator-tls podName:f25d8f1b-e18e-4880-899f-71ae7bff54be nodeName:}" failed. No retries permitted until 2026-03-12 13:38:34.744809214 +0000 UTC m=+50.256743233 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/f25d8f1b-e18e-4880-899f-71ae7bff54be-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-b58cd5d8d-vw6kv" (UID: "f25d8f1b-e18e-4880-899f-71ae7bff54be") : secret "cluster-monitoring-operator-tls" not found Mar 12 13:38:26.845998 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:26.845956 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-clm4m\" (UID: \"f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-clm4m" Mar 12 13:38:26.846201 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:26.846089 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92914013-06d0-470c-b843-48a641e447a6-cert\") pod \"ingress-canary-v9hjg\" (UID: \"92914013-06d0-470c-b843-48a641e447a6\") " pod="openshift-ingress-canary/ingress-canary-v9hjg" Mar 12 13:38:26.846201 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:26.846121 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/12cccf55-aade-4101-9250-96d5b498a6af-networking-console-plugin-cert\") pod \"networking-console-plugin-55b77584bb-dx5pp\" (UID: \"12cccf55-aade-4101-9250-96d5b498a6af\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-dx5pp" Mar 12 13:38:26.846201 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:26.846128 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Mar 12 13:38:26.846201 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:26.846192 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99423593-d38b-4662-bacc-b1f6bb3d0d90-metrics-tls\") pod \"dns-default-ddks7\" (UID: \"99423593-d38b-4662-bacc-b1f6bb3d0d90\") " pod="openshift-dns/dns-default-ddks7" Mar 12 13:38:26.846399 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:26.846220 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca-samples-operator-tls podName:f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca nodeName:}" failed. No retries permitted until 2026-03-12 13:38:34.84620062 +0000 UTC m=+50.358134625 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca-samples-operator-tls") pod "cluster-samples-operator-d5df4776c-clm4m" (UID: "f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca") : secret "samples-operator-tls" not found Mar 12 13:38:26.846399 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:26.846294 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 12 13:38:26.846399 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:26.846346 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99423593-d38b-4662-bacc-b1f6bb3d0d90-metrics-tls podName:99423593-d38b-4662-bacc-b1f6bb3d0d90 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:34.846329342 +0000 UTC m=+50.358263352 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/99423593-d38b-4662-bacc-b1f6bb3d0d90-metrics-tls") pod "dns-default-ddks7" (UID: "99423593-d38b-4662-bacc-b1f6bb3d0d90") : secret "dns-default-metrics-tls" not found Mar 12 13:38:26.846593 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:26.846402 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 12 13:38:26.846593 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:26.846433 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92914013-06d0-470c-b843-48a641e447a6-cert podName:92914013-06d0-470c-b843-48a641e447a6 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:34.846422969 +0000 UTC m=+50.358356976 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/92914013-06d0-470c-b843-48a641e447a6-cert") pod "ingress-canary-v9hjg" (UID: "92914013-06d0-470c-b843-48a641e447a6") : secret "canary-serving-cert" not found Mar 12 13:38:26.846593 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:26.846509 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Mar 12 13:38:26.846593 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:26.846545 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12cccf55-aade-4101-9250-96d5b498a6af-networking-console-plugin-cert podName:12cccf55-aade-4101-9250-96d5b498a6af nodeName:}" failed. No retries permitted until 2026-03-12 13:38:34.846533782 +0000 UTC m=+50.358467801 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/12cccf55-aade-4101-9250-96d5b498a6af-networking-console-plugin-cert") pod "networking-console-plugin-55b77584bb-dx5pp" (UID: "12cccf55-aade-4101-9250-96d5b498a6af") : secret "networking-console-plugin-cert" not found Mar 12 13:38:29.238491 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:29.237908 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-8c2br_b87ad3f0-c310-4833-bfac-cd1e3b15a7d9/console-operator/0.log" Mar 12 13:38:29.238491 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:29.237953 2576 generic.go:358] "Generic (PLEG): container finished" podID="b87ad3f0-c310-4833-bfac-cd1e3b15a7d9" containerID="64f40cf0918b903c588810df345695e314b3e335d5c7591a8f8754723ff8d481" exitCode=255 Mar 12 13:38:29.238491 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:29.238016 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-76b8565867-8c2br" event={"ID":"b87ad3f0-c310-4833-bfac-cd1e3b15a7d9","Type":"ContainerDied","Data":"64f40cf0918b903c588810df345695e314b3e335d5c7591a8f8754723ff8d481"} Mar 12 13:38:29.238491 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:29.238371 2576 scope.go:117] "RemoveContainer" containerID="64f40cf0918b903c588810df345695e314b3e335d5c7591a8f8754723ff8d481" Mar 12 13:38:29.240122 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:29.240084 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-cc88fdd44-z42dm" event={"ID":"44b94cdd-3ad4-4d6b-a6a2-2eedc6664c77","Type":"ContainerStarted","Data":"90d7bb62d1631fb978d09a7cb77fe3a7d85ddc44bb2625fc78a4f401f4c3bff8"} Mar 12 13:38:29.242653 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:29.242625 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-wlnh5" event={"ID":"20ecc6d8-57fe-48c1-bbcf-886289ff3155","Type":"ContainerStarted","Data":"b2ccf4ecbe41c6550b3914f1aaa5845b022a8ea3e90c617ff7c4f50efff2e121"} Mar 12 13:38:29.244040 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:29.244016 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-zklgq" event={"ID":"c2666fac-29e4-411f-a8eb-22c3b7d96cbf","Type":"ContainerStarted","Data":"5e3ae8dc965c47594038e41dc486439c180e32c02165fb1a0f8258ba07b89ffc"} Mar 12 13:38:29.245795 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:29.245754 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-76bdd9f478-q8flt" event={"ID":"16e1b1af-b744-44fd-a3d5-c817249f6767","Type":"ContainerStarted","Data":"03b5be3f7da3c73290aeeb86c3976e9b87b970da5782e24dc82f797141d12785"} Mar 12 13:38:29.250769 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:29.249545 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-4j9p7" event={"ID":"8571836d-0cec-490b-a68b-8ecee11112d1","Type":"ContainerStarted","Data":"555da0ddb1c1f735a552711c83fec753e2845c85957ae81350e4bf51ae4835fb"} Mar 12 13:38:29.273617 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:29.273573 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-zklgq" podStartSLOduration=16.737470868 podStartE2EDuration="26.273557425s" podCreationTimestamp="2026-03-12 13:38:03 +0000 UTC" firstStartedPulling="2026-03-12 13:38:19.441005501 +0000 UTC m=+34.952939505" lastFinishedPulling="2026-03-12 13:38:28.977092042 +0000 UTC m=+44.489026062" observedRunningTime="2026-03-12 13:38:29.272300917 +0000 UTC m=+44.784234944" watchObservedRunningTime="2026-03-12 13:38:29.273557425 +0000 UTC 
m=+44.785491451" Mar 12 13:38:29.313072 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:29.311928 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-cc88fdd44-z42dm" podStartSLOduration=16.899789755 podStartE2EDuration="26.311908712s" podCreationTimestamp="2026-03-12 13:38:03 +0000 UTC" firstStartedPulling="2026-03-12 13:38:19.559117439 +0000 UTC m=+35.071051444" lastFinishedPulling="2026-03-12 13:38:28.971236393 +0000 UTC m=+44.483170401" observedRunningTime="2026-03-12 13:38:29.289649804 +0000 UTC m=+44.801583832" watchObservedRunningTime="2026-03-12 13:38:29.311908712 +0000 UTC m=+44.823842741" Mar 12 13:38:29.313072 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:29.312248 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-67fdcb5769-4j9p7" podStartSLOduration=16.781036792 podStartE2EDuration="26.312238613s" podCreationTimestamp="2026-03-12 13:38:03 +0000 UTC" firstStartedPulling="2026-03-12 13:38:19.439992415 +0000 UTC m=+34.951926419" lastFinishedPulling="2026-03-12 13:38:28.971194232 +0000 UTC m=+44.483128240" observedRunningTime="2026-03-12 13:38:29.310907445 +0000 UTC m=+44.822841473" watchObservedRunningTime="2026-03-12 13:38:29.312238613 +0000 UTC m=+44.824172642" Mar 12 13:38:29.333245 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:29.333192 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-76bdd9f478-q8flt" podStartSLOduration=16.820654939 podStartE2EDuration="26.333172308s" podCreationTimestamp="2026-03-12 13:38:03 +0000 UTC" firstStartedPulling="2026-03-12 13:38:19.458388435 +0000 UTC m=+34.970322442" lastFinishedPulling="2026-03-12 13:38:28.970905805 +0000 UTC m=+44.482839811" observedRunningTime="2026-03-12 13:38:29.331811877 +0000 UTC m=+44.843745905" watchObservedRunningTime="2026-03-12 13:38:29.333172308 +0000 UTC 
m=+44.845106337" Mar 12 13:38:29.361987 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:29.360409 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-wlnh5" podStartSLOduration=16.849917253 podStartE2EDuration="26.360394178s" podCreationTimestamp="2026-03-12 13:38:03 +0000 UTC" firstStartedPulling="2026-03-12 13:38:19.460752325 +0000 UTC m=+34.972686333" lastFinishedPulling="2026-03-12 13:38:28.971229253 +0000 UTC m=+44.483163258" observedRunningTime="2026-03-12 13:38:29.360157764 +0000 UTC m=+44.872091792" watchObservedRunningTime="2026-03-12 13:38:29.360394178 +0000 UTC m=+44.872328206" Mar 12 13:38:29.604687 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:29.604648 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-76b8565867-8c2br" Mar 12 13:38:29.604687 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:29.604691 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-76b8565867-8c2br" Mar 12 13:38:30.255341 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:30.255305 2576 generic.go:358] "Generic (PLEG): container finished" podID="9288d2b3-d104-4c0d-8417-6d70d1c70098" containerID="dc084b4c417d5f71086b357c9133baea82c7651d9cf539c10242ec71f2ba7677" exitCode=0 Mar 12 13:38:30.255830 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:30.255390 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcbt" event={"ID":"9288d2b3-d104-4c0d-8417-6d70d1c70098","Type":"ContainerDied","Data":"dc084b4c417d5f71086b357c9133baea82c7651d9cf539c10242ec71f2ba7677"} Mar 12 13:38:30.257373 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:30.257354 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-8c2br_b87ad3f0-c310-4833-bfac-cd1e3b15a7d9/console-operator/1.log" Mar 12 13:38:30.257864 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:30.257847 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-8c2br_b87ad3f0-c310-4833-bfac-cd1e3b15a7d9/console-operator/0.log" Mar 12 13:38:30.257981 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:30.257879 2576 generic.go:358] "Generic (PLEG): container finished" podID="b87ad3f0-c310-4833-bfac-cd1e3b15a7d9" containerID="fb00c9c70d1d7d68efd5394da14835568db59ea5ee618ec184372feb6e0206ec" exitCode=255 Mar 12 13:38:30.258533 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:30.258517 2576 scope.go:117] "RemoveContainer" containerID="fb00c9c70d1d7d68efd5394da14835568db59ea5ee618ec184372feb6e0206ec" Mar 12 13:38:30.258728 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:30.258699 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-76b8565867-8c2br_openshift-console-operator(b87ad3f0-c310-4833-bfac-cd1e3b15a7d9)\"" pod="openshift-console-operator/console-operator-76b8565867-8c2br" podUID="b87ad3f0-c310-4833-bfac-cd1e3b15a7d9" Mar 12 13:38:30.258980 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:30.258945 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-76b8565867-8c2br" event={"ID":"b87ad3f0-c310-4833-bfac-cd1e3b15a7d9","Type":"ContainerDied","Data":"fb00c9c70d1d7d68efd5394da14835568db59ea5ee618ec184372feb6e0206ec"} Mar 12 13:38:30.259129 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:30.258999 2576 scope.go:117] "RemoveContainer" containerID="64f40cf0918b903c588810df345695e314b3e335d5c7591a8f8754723ff8d481" Mar 12 13:38:31.262046 ip-10-0-142-16 kubenswrapper[2576]: I0312 
13:38:31.262011 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-8c2br_b87ad3f0-c310-4833-bfac-cd1e3b15a7d9/console-operator/1.log"
Mar 12 13:38:31.262543 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:31.262432 2576 scope.go:117] "RemoveContainer" containerID="fb00c9c70d1d7d68efd5394da14835568db59ea5ee618ec184372feb6e0206ec"
Mar 12 13:38:31.262676 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:31.262655 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-76b8565867-8c2br_openshift-console-operator(b87ad3f0-c310-4833-bfac-cd1e3b15a7d9)\"" pod="openshift-console-operator/console-operator-76b8565867-8c2br" podUID="b87ad3f0-c310-4833-bfac-cd1e3b15a7d9"
Mar 12 13:38:31.264908 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:31.264881 2576 generic.go:358] "Generic (PLEG): container finished" podID="9288d2b3-d104-4c0d-8417-6d70d1c70098" containerID="004a1107d4432c0a69662a05a4243685d66b16e65ef823c9738114820fc1222e" exitCode=0
Mar 12 13:38:31.265093 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:31.264919 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcbt" event={"ID":"9288d2b3-d104-4c0d-8417-6d70d1c70098","Type":"ContainerDied","Data":"004a1107d4432c0a69662a05a4243685d66b16e65ef823c9738114820fc1222e"}
Mar 12 13:38:32.271296 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:32.271264 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6wcbt" event={"ID":"9288d2b3-d104-4c0d-8417-6d70d1c70098","Type":"ContainerStarted","Data":"d822624c4924b6e8a94edefd440a68313e7133a6fecaf130c6d2b5c4bab50729"}
Mar 12 13:38:32.271729 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:32.271668 2576 scope.go:117] "RemoveContainer" containerID="fb00c9c70d1d7d68efd5394da14835568db59ea5ee618ec184372feb6e0206ec"
Mar 12 13:38:32.271840 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:32.271822 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-76b8565867-8c2br_openshift-console-operator(b87ad3f0-c310-4833-bfac-cd1e3b15a7d9)\"" pod="openshift-console-operator/console-operator-76b8565867-8c2br" podUID="b87ad3f0-c310-4833-bfac-cd1e3b15a7d9"
Mar 12 13:38:32.296269 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:32.296206 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-6wcbt" podStartSLOduration=5.619264138 podStartE2EDuration="47.296187885s" podCreationTimestamp="2026-03-12 13:37:45 +0000 UTC" firstStartedPulling="2026-03-12 13:37:47.294100178 +0000 UTC m=+2.806034182" lastFinishedPulling="2026-03-12 13:38:28.97102391 +0000 UTC m=+44.482957929" observedRunningTime="2026-03-12 13:38:32.293685298 +0000 UTC m=+47.805619326" watchObservedRunningTime="2026-03-12 13:38:32.296187885 +0000 UTC m=+47.808121913"
Mar 12 13:38:33.265217 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:33.265188 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-lc7zn_8a746e42-a15c-4c61-b754-446d49beeae9/dns-node-resolver/0.log"
Mar 12 13:38:34.054346 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:34.054313 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-p28bf_1aca7d52-35bb-4dbf-8bb1-4a032145c336/node-ca/0.log"
Mar 12 13:38:34.725145 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:34.725103 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c11ab84b-7d66-4e49-a456-66c0ee53f865-service-ca-bundle\") pod \"router-default-f747dc4bb-qfvq7\" (UID: \"c11ab84b-7d66-4e49-a456-66c0ee53f865\") " pod="openshift-ingress/router-default-f747dc4bb-qfvq7"
Mar 12 13:38:34.725311 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:34.725162 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/327ec7eb-a5de-4709-9fbc-6bbe58d87674-registry-tls\") pod \"image-registry-658bf4bccb-bwzwj\" (UID: \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\") " pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj"
Mar 12 13:38:34.725311 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:34.725186 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c11ab84b-7d66-4e49-a456-66c0ee53f865-metrics-certs\") pod \"router-default-f747dc4bb-qfvq7\" (UID: \"c11ab84b-7d66-4e49-a456-66c0ee53f865\") " pod="openshift-ingress/router-default-f747dc4bb-qfvq7"
Mar 12 13:38:34.725311 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:34.725286 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c11ab84b-7d66-4e49-a456-66c0ee53f865-service-ca-bundle podName:c11ab84b-7d66-4e49-a456-66c0ee53f865 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:50.725267862 +0000 UTC m=+66.237201870 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c11ab84b-7d66-4e49-a456-66c0ee53f865-service-ca-bundle") pod "router-default-f747dc4bb-qfvq7" (UID: "c11ab84b-7d66-4e49-a456-66c0ee53f865") : configmap references non-existent config key: service-ca.crt
Mar 12 13:38:34.725311 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:34.725298 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Mar 12 13:38:34.725540 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:34.725317 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-658bf4bccb-bwzwj: secret "image-registry-tls" not found
Mar 12 13:38:34.725540 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:34.725295 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Mar 12 13:38:34.725540 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:34.725378 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/327ec7eb-a5de-4709-9fbc-6bbe58d87674-registry-tls podName:327ec7eb-a5de-4709-9fbc-6bbe58d87674 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:50.72536002 +0000 UTC m=+66.237294074 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/327ec7eb-a5de-4709-9fbc-6bbe58d87674-registry-tls") pod "image-registry-658bf4bccb-bwzwj" (UID: "327ec7eb-a5de-4709-9fbc-6bbe58d87674") : secret "image-registry-tls" not found
Mar 12 13:38:34.725540 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:34.725403 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c11ab84b-7d66-4e49-a456-66c0ee53f865-metrics-certs podName:c11ab84b-7d66-4e49-a456-66c0ee53f865 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:50.725396474 +0000 UTC m=+66.237330479 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c11ab84b-7d66-4e49-a456-66c0ee53f865-metrics-certs") pod "router-default-f747dc4bb-qfvq7" (UID: "c11ab84b-7d66-4e49-a456-66c0ee53f865") : secret "router-metrics-certs-default" not found
Mar 12 13:38:34.826533 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:34.826493 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f25d8f1b-e18e-4880-899f-71ae7bff54be-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-vw6kv\" (UID: \"f25d8f1b-e18e-4880-899f-71ae7bff54be\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-vw6kv"
Mar 12 13:38:34.826701 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:34.826638 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 12 13:38:34.826741 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:34.826705 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f25d8f1b-e18e-4880-899f-71ae7bff54be-cluster-monitoring-operator-tls podName:f25d8f1b-e18e-4880-899f-71ae7bff54be nodeName:}" failed. No retries permitted until 2026-03-12 13:38:50.826689115 +0000 UTC m=+66.338623124 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/f25d8f1b-e18e-4880-899f-71ae7bff54be-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-b58cd5d8d-vw6kv" (UID: "f25d8f1b-e18e-4880-899f-71ae7bff54be") : secret "cluster-monitoring-operator-tls" not found
Mar 12 13:38:34.928006 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:34.927965 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92914013-06d0-470c-b843-48a641e447a6-cert\") pod \"ingress-canary-v9hjg\" (UID: \"92914013-06d0-470c-b843-48a641e447a6\") " pod="openshift-ingress-canary/ingress-canary-v9hjg"
Mar 12 13:38:34.928209 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:34.928016 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/12cccf55-aade-4101-9250-96d5b498a6af-networking-console-plugin-cert\") pod \"networking-console-plugin-55b77584bb-dx5pp\" (UID: \"12cccf55-aade-4101-9250-96d5b498a6af\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-dx5pp"
Mar 12 13:38:34.928209 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:34.928121 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 12 13:38:34.928209 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:34.928132 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Mar 12 13:38:34.928209 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:34.928176 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99423593-d38b-4662-bacc-b1f6bb3d0d90-metrics-tls\") pod \"dns-default-ddks7\" (UID: \"99423593-d38b-4662-bacc-b1f6bb3d0d90\") " pod="openshift-dns/dns-default-ddks7"
Mar 12 13:38:34.928209 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:34.928191 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12cccf55-aade-4101-9250-96d5b498a6af-networking-console-plugin-cert podName:12cccf55-aade-4101-9250-96d5b498a6af nodeName:}" failed. No retries permitted until 2026-03-12 13:38:50.92817277 +0000 UTC m=+66.440106778 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/12cccf55-aade-4101-9250-96d5b498a6af-networking-console-plugin-cert") pod "networking-console-plugin-55b77584bb-dx5pp" (UID: "12cccf55-aade-4101-9250-96d5b498a6af") : secret "networking-console-plugin-cert" not found
Mar 12 13:38:34.928209 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:34.928212 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92914013-06d0-470c-b843-48a641e447a6-cert podName:92914013-06d0-470c-b843-48a641e447a6 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:50.928204001 +0000 UTC m=+66.440138005 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/92914013-06d0-470c-b843-48a641e447a6-cert") pod "ingress-canary-v9hjg" (UID: "92914013-06d0-470c-b843-48a641e447a6") : secret "canary-serving-cert" not found
Mar 12 13:38:34.928542 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:34.928249 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Mar 12 13:38:34.928542 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:34.928270 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-clm4m\" (UID: \"f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-clm4m"
Mar 12 13:38:34.928542 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:34.928284 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99423593-d38b-4662-bacc-b1f6bb3d0d90-metrics-tls podName:99423593-d38b-4662-bacc-b1f6bb3d0d90 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:50.928270614 +0000 UTC m=+66.440204618 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/99423593-d38b-4662-bacc-b1f6bb3d0d90-metrics-tls") pod "dns-default-ddks7" (UID: "99423593-d38b-4662-bacc-b1f6bb3d0d90") : secret "dns-default-metrics-tls" not found
Mar 12 13:38:34.928542 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:34.928365 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Mar 12 13:38:34.928542 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:34.928413 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca-samples-operator-tls podName:f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca nodeName:}" failed. No retries permitted until 2026-03-12 13:38:50.928400839 +0000 UTC m=+66.440334849 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca-samples-operator-tls") pod "cluster-samples-operator-d5df4776c-clm4m" (UID: "f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca") : secret "samples-operator-tls" not found
Mar 12 13:38:35.212966 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:35.212931 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-rpc6g"]
Mar 12 13:38:35.244548 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:35.244520 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rpc6g"]
Mar 12 13:38:35.244710 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:35.244670 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rpc6g"
Mar 12 13:38:35.247728 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:35.247703 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Mar 12 13:38:35.247885 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:35.247740 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Mar 12 13:38:35.247960 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:35.247948 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-4sgvv\""
Mar 12 13:38:35.332755 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:35.332716 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/035b3ed4-c514-4d2a-a1a9-e485a1c21535-crio-socket\") pod \"insights-runtime-extractor-rpc6g\" (UID: \"035b3ed4-c514-4d2a-a1a9-e485a1c21535\") " pod="openshift-insights/insights-runtime-extractor-rpc6g"
Mar 12 13:38:35.332912 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:35.332773 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/035b3ed4-c514-4d2a-a1a9-e485a1c21535-data-volume\") pod \"insights-runtime-extractor-rpc6g\" (UID: \"035b3ed4-c514-4d2a-a1a9-e485a1c21535\") " pod="openshift-insights/insights-runtime-extractor-rpc6g"
Mar 12 13:38:35.332912 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:35.332834 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/035b3ed4-c514-4d2a-a1a9-e485a1c21535-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rpc6g\" (UID: \"035b3ed4-c514-4d2a-a1a9-e485a1c21535\") " pod="openshift-insights/insights-runtime-extractor-rpc6g"
Mar 12 13:38:35.333008 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:35.332962 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/035b3ed4-c514-4d2a-a1a9-e485a1c21535-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rpc6g\" (UID: \"035b3ed4-c514-4d2a-a1a9-e485a1c21535\") " pod="openshift-insights/insights-runtime-extractor-rpc6g"
Mar 12 13:38:35.333043 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:35.333015 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4hs4\" (UniqueName: \"kubernetes.io/projected/035b3ed4-c514-4d2a-a1a9-e485a1c21535-kube-api-access-s4hs4\") pod \"insights-runtime-extractor-rpc6g\" (UID: \"035b3ed4-c514-4d2a-a1a9-e485a1c21535\") " pod="openshift-insights/insights-runtime-extractor-rpc6g"
Mar 12 13:38:35.434182 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:35.434143 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/035b3ed4-c514-4d2a-a1a9-e485a1c21535-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rpc6g\" (UID: \"035b3ed4-c514-4d2a-a1a9-e485a1c21535\") " pod="openshift-insights/insights-runtime-extractor-rpc6g"
Mar 12 13:38:35.434363 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:35.434197 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s4hs4\" (UniqueName: \"kubernetes.io/projected/035b3ed4-c514-4d2a-a1a9-e485a1c21535-kube-api-access-s4hs4\") pod \"insights-runtime-extractor-rpc6g\" (UID: \"035b3ed4-c514-4d2a-a1a9-e485a1c21535\") " pod="openshift-insights/insights-runtime-extractor-rpc6g"
Mar 12 13:38:35.434363 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:35.434258 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/035b3ed4-c514-4d2a-a1a9-e485a1c21535-crio-socket\") pod \"insights-runtime-extractor-rpc6g\" (UID: \"035b3ed4-c514-4d2a-a1a9-e485a1c21535\") " pod="openshift-insights/insights-runtime-extractor-rpc6g"
Mar 12 13:38:35.434363 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:35.434283 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/035b3ed4-c514-4d2a-a1a9-e485a1c21535-data-volume\") pod \"insights-runtime-extractor-rpc6g\" (UID: \"035b3ed4-c514-4d2a-a1a9-e485a1c21535\") " pod="openshift-insights/insights-runtime-extractor-rpc6g"
Mar 12 13:38:35.434363 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:35.434300 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/035b3ed4-c514-4d2a-a1a9-e485a1c21535-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rpc6g\" (UID: \"035b3ed4-c514-4d2a-a1a9-e485a1c21535\") " pod="openshift-insights/insights-runtime-extractor-rpc6g"
Mar 12 13:38:35.434363 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:35.434300 2576 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Mar 12 13:38:35.434615 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:35.434394 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/035b3ed4-c514-4d2a-a1a9-e485a1c21535-insights-runtime-extractor-tls podName:035b3ed4-c514-4d2a-a1a9-e485a1c21535 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:35.934373723 +0000 UTC m=+51.446307748 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/035b3ed4-c514-4d2a-a1a9-e485a1c21535-insights-runtime-extractor-tls") pod "insights-runtime-extractor-rpc6g" (UID: "035b3ed4-c514-4d2a-a1a9-e485a1c21535") : secret "insights-runtime-extractor-tls" not found
Mar 12 13:38:35.434615 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:35.434510 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/035b3ed4-c514-4d2a-a1a9-e485a1c21535-crio-socket\") pod \"insights-runtime-extractor-rpc6g\" (UID: \"035b3ed4-c514-4d2a-a1a9-e485a1c21535\") " pod="openshift-insights/insights-runtime-extractor-rpc6g"
Mar 12 13:38:35.434615 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:35.434604 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/035b3ed4-c514-4d2a-a1a9-e485a1c21535-data-volume\") pod \"insights-runtime-extractor-rpc6g\" (UID: \"035b3ed4-c514-4d2a-a1a9-e485a1c21535\") " pod="openshift-insights/insights-runtime-extractor-rpc6g"
Mar 12 13:38:35.434877 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:35.434856 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/035b3ed4-c514-4d2a-a1a9-e485a1c21535-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rpc6g\" (UID: \"035b3ed4-c514-4d2a-a1a9-e485a1c21535\") " pod="openshift-insights/insights-runtime-extractor-rpc6g"
Mar 12 13:38:35.448350 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:35.448324 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4hs4\" (UniqueName: \"kubernetes.io/projected/035b3ed4-c514-4d2a-a1a9-e485a1c21535-kube-api-access-s4hs4\") pod \"insights-runtime-extractor-rpc6g\" (UID: \"035b3ed4-c514-4d2a-a1a9-e485a1c21535\") " pod="openshift-insights/insights-runtime-extractor-rpc6g"
Mar 12 13:38:35.939799 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:35.939753 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/035b3ed4-c514-4d2a-a1a9-e485a1c21535-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rpc6g\" (UID: \"035b3ed4-c514-4d2a-a1a9-e485a1c21535\") " pod="openshift-insights/insights-runtime-extractor-rpc6g"
Mar 12 13:38:35.939991 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:35.939900 2576 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Mar 12 13:38:35.939991 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:35.939971 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/035b3ed4-c514-4d2a-a1a9-e485a1c21535-insights-runtime-extractor-tls podName:035b3ed4-c514-4d2a-a1a9-e485a1c21535 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:36.93995528 +0000 UTC m=+52.451889315 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/035b3ed4-c514-4d2a-a1a9-e485a1c21535-insights-runtime-extractor-tls") pod "insights-runtime-extractor-rpc6g" (UID: "035b3ed4-c514-4d2a-a1a9-e485a1c21535") : secret "insights-runtime-extractor-tls" not found
Mar 12 13:38:36.949064 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:36.949018 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/035b3ed4-c514-4d2a-a1a9-e485a1c21535-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rpc6g\" (UID: \"035b3ed4-c514-4d2a-a1a9-e485a1c21535\") " pod="openshift-insights/insights-runtime-extractor-rpc6g"
Mar 12 13:38:36.949546 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:36.949175 2576 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Mar 12 13:38:36.949546 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:36.949246 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/035b3ed4-c514-4d2a-a1a9-e485a1c21535-insights-runtime-extractor-tls podName:035b3ed4-c514-4d2a-a1a9-e485a1c21535 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:38.949229591 +0000 UTC m=+54.461163600 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/035b3ed4-c514-4d2a-a1a9-e485a1c21535-insights-runtime-extractor-tls") pod "insights-runtime-extractor-rpc6g" (UID: "035b3ed4-c514-4d2a-a1a9-e485a1c21535") : secret "insights-runtime-extractor-tls" not found
Mar 12 13:38:38.966658 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:38.966614 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/035b3ed4-c514-4d2a-a1a9-e485a1c21535-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rpc6g\" (UID: \"035b3ed4-c514-4d2a-a1a9-e485a1c21535\") " pod="openshift-insights/insights-runtime-extractor-rpc6g"
Mar 12 13:38:38.967123 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:38.966772 2576 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Mar 12 13:38:38.967123 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:38.966889 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/035b3ed4-c514-4d2a-a1a9-e485a1c21535-insights-runtime-extractor-tls podName:035b3ed4-c514-4d2a-a1a9-e485a1c21535 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:42.966869959 +0000 UTC m=+58.478803964 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/035b3ed4-c514-4d2a-a1a9-e485a1c21535-insights-runtime-extractor-tls") pod "insights-runtime-extractor-rpc6g" (UID: "035b3ed4-c514-4d2a-a1a9-e485a1c21535") : secret "insights-runtime-extractor-tls" not found
Mar 12 13:38:39.604734 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:39.604681 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-76b8565867-8c2br"
Mar 12 13:38:39.604734 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:39.604736 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-76b8565867-8c2br"
Mar 12 13:38:39.605141 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:39.605127 2576 scope.go:117] "RemoveContainer" containerID="fb00c9c70d1d7d68efd5394da14835568db59ea5ee618ec184372feb6e0206ec"
Mar 12 13:38:40.298030 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:40.298006 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-8c2br_b87ad3f0-c310-4833-bfac-cd1e3b15a7d9/console-operator/2.log"
Mar 12 13:38:40.298429 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:40.298353 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-8c2br_b87ad3f0-c310-4833-bfac-cd1e3b15a7d9/console-operator/1.log"
Mar 12 13:38:40.298429 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:40.298384 2576 generic.go:358] "Generic (PLEG): container finished" podID="b87ad3f0-c310-4833-bfac-cd1e3b15a7d9" containerID="0f4e642337fa5fad845cb0638c733f4d88216bea18b707838b95a6bd804179af" exitCode=255
Mar 12 13:38:40.298542 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:40.298473 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-76b8565867-8c2br" event={"ID":"b87ad3f0-c310-4833-bfac-cd1e3b15a7d9","Type":"ContainerDied","Data":"0f4e642337fa5fad845cb0638c733f4d88216bea18b707838b95a6bd804179af"}
Mar 12 13:38:40.298542 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:40.298512 2576 scope.go:117] "RemoveContainer" containerID="fb00c9c70d1d7d68efd5394da14835568db59ea5ee618ec184372feb6e0206ec"
Mar 12 13:38:40.298868 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:40.298853 2576 scope.go:117] "RemoveContainer" containerID="0f4e642337fa5fad845cb0638c733f4d88216bea18b707838b95a6bd804179af"
Mar 12 13:38:40.299071 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:40.299053 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-76b8565867-8c2br_openshift-console-operator(b87ad3f0-c310-4833-bfac-cd1e3b15a7d9)\"" pod="openshift-console-operator/console-operator-76b8565867-8c2br" podUID="b87ad3f0-c310-4833-bfac-cd1e3b15a7d9"
Mar 12 13:38:41.302817 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:41.302790 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-8c2br_b87ad3f0-c310-4833-bfac-cd1e3b15a7d9/console-operator/2.log"
Mar 12 13:38:43.006832 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:43.006786 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/035b3ed4-c514-4d2a-a1a9-e485a1c21535-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rpc6g\" (UID: \"035b3ed4-c514-4d2a-a1a9-e485a1c21535\") " pod="openshift-insights/insights-runtime-extractor-rpc6g"
Mar 12 13:38:43.007222 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:43.006946 2576 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Mar 12 13:38:43.007222 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:43.007018 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/035b3ed4-c514-4d2a-a1a9-e485a1c21535-insights-runtime-extractor-tls podName:035b3ed4-c514-4d2a-a1a9-e485a1c21535 nodeName:}" failed. No retries permitted until 2026-03-12 13:38:51.007002093 +0000 UTC m=+66.518936097 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/035b3ed4-c514-4d2a-a1a9-e485a1c21535-insights-runtime-extractor-tls") pod "insights-runtime-extractor-rpc6g" (UID: "035b3ed4-c514-4d2a-a1a9-e485a1c21535") : secret "insights-runtime-extractor-tls" not found
Mar 12 13:38:44.201801 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:44.201767 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8qtnl"
Mar 12 13:38:49.604434 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:49.604380 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-76b8565867-8c2br"
Mar 12 13:38:49.604434 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:49.604441 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-76b8565867-8c2br"
Mar 12 13:38:49.605002 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:49.604814 2576 scope.go:117] "RemoveContainer" containerID="0f4e642337fa5fad845cb0638c733f4d88216bea18b707838b95a6bd804179af"
Mar 12 13:38:49.605042 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:38:49.604999 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-76b8565867-8c2br_openshift-console-operator(b87ad3f0-c310-4833-bfac-cd1e3b15a7d9)\"" pod="openshift-console-operator/console-operator-76b8565867-8c2br" podUID="b87ad3f0-c310-4833-bfac-cd1e3b15a7d9"
Mar 12 13:38:50.783722 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:50.783673 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37e0d595-d2d4-4caf-8700-ffaeb5c3401f-metrics-certs\") pod \"network-metrics-daemon-wbtg8\" (UID: \"37e0d595-d2d4-4caf-8700-ffaeb5c3401f\") " pod="openshift-multus/network-metrics-daemon-wbtg8"
Mar 12 13:38:50.784210 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:50.783844 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c11ab84b-7d66-4e49-a456-66c0ee53f865-service-ca-bundle\") pod \"router-default-f747dc4bb-qfvq7\" (UID: \"c11ab84b-7d66-4e49-a456-66c0ee53f865\") " pod="openshift-ingress/router-default-f747dc4bb-qfvq7"
Mar 12 13:38:50.784210 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:50.783903 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/327ec7eb-a5de-4709-9fbc-6bbe58d87674-registry-tls\") pod \"image-registry-658bf4bccb-bwzwj\" (UID: \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\") " pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj"
Mar 12 13:38:50.784210 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:50.783936 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c11ab84b-7d66-4e49-a456-66c0ee53f865-metrics-certs\") pod \"router-default-f747dc4bb-qfvq7\" (UID: \"c11ab84b-7d66-4e49-a456-66c0ee53f865\") " pod="openshift-ingress/router-default-f747dc4bb-qfvq7"
Mar 12 13:38:50.784578 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:50.784557 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c11ab84b-7d66-4e49-a456-66c0ee53f865-service-ca-bundle\") pod \"router-default-f747dc4bb-qfvq7\" (UID: \"c11ab84b-7d66-4e49-a456-66c0ee53f865\") " pod="openshift-ingress/router-default-f747dc4bb-qfvq7"
Mar 12 13:38:50.786365 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:50.786342 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c11ab84b-7d66-4e49-a456-66c0ee53f865-metrics-certs\") pod \"router-default-f747dc4bb-qfvq7\" (UID: \"c11ab84b-7d66-4e49-a456-66c0ee53f865\") " pod="openshift-ingress/router-default-f747dc4bb-qfvq7"
Mar 12 13:38:50.786488 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:50.786449 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/327ec7eb-a5de-4709-9fbc-6bbe58d87674-registry-tls\") pod \"image-registry-658bf4bccb-bwzwj\" (UID: \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\") " pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj"
Mar 12 13:38:50.788291 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:50.788273 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Mar 12 13:38:50.796208 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:50.796187 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37e0d595-d2d4-4caf-8700-ffaeb5c3401f-metrics-certs\") pod \"network-metrics-daemon-wbtg8\" (UID: \"37e0d595-d2d4-4caf-8700-ffaeb5c3401f\") " pod="openshift-multus/network-metrics-daemon-wbtg8"
Mar 12 13:38:50.884715 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:50.884673 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/05421762-de84-4ca6-bedf-542c55460252-original-pull-secret\") pod \"global-pull-secret-syncer-x4rlt\" (UID: \"05421762-de84-4ca6-bedf-542c55460252\") " pod="kube-system/global-pull-secret-syncer-x4rlt"
Mar 12 13:38:50.884914 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:50.884728 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f25d8f1b-e18e-4880-899f-71ae7bff54be-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-vw6kv\" (UID: \"f25d8f1b-e18e-4880-899f-71ae7bff54be\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-vw6kv"
Mar 12 13:38:50.887732 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:50.887704 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Mar 12 13:38:50.888320 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:50.888301 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/f25d8f1b-e18e-4880-899f-71ae7bff54be-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-b58cd5d8d-vw6kv\" (UID: \"f25d8f1b-e18e-4880-899f-71ae7bff54be\") " pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-vw6kv"
Mar 12 13:38:50.897235 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:50.897212 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/05421762-de84-4ca6-bedf-542c55460252-original-pull-secret\") pod \"global-pull-secret-syncer-x4rlt\" (UID: \"05421762-de84-4ca6-bedf-542c55460252\") " pod="kube-system/global-pull-secret-syncer-x4rlt"
Mar 12 13:38:50.980705 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:50.980673 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-9d7nr\""
Mar 12 13:38:50.985047 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:50.985028 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92914013-06d0-470c-b843-48a641e447a6-cert\") pod \"ingress-canary-v9hjg\" (UID: \"92914013-06d0-470c-b843-48a641e447a6\") " pod="openshift-ingress-canary/ingress-canary-v9hjg"
Mar 12 13:38:50.985127 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:50.985060 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/12cccf55-aade-4101-9250-96d5b498a6af-networking-console-plugin-cert\") pod \"networking-console-plugin-55b77584bb-dx5pp\" (UID: \"12cccf55-aade-4101-9250-96d5b498a6af\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-dx5pp"
Mar 12 13:38:50.985127 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:50.985099 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99423593-d38b-4662-bacc-b1f6bb3d0d90-metrics-tls\") pod \"dns-default-ddks7\" (UID: \"99423593-d38b-4662-bacc-b1f6bb3d0d90\") " pod="openshift-dns/dns-default-ddks7"
Mar 12 13:38:50.985224 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:50.985162 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-clm4m\" (UID: \"f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-clm4m"
Mar 12 13:38:50.985224 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:50.985209 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztsv5\" (UniqueName: \"kubernetes.io/projected/a2b57a5c-1736-4008-9bcc-41669382f70a-kube-api-access-ztsv5\") pod \"network-check-target-fnfhc\" (UID: \"a2b57a5c-1736-4008-9bcc-41669382f70a\") "
pod="openshift-network-diagnostics/network-check-target-fnfhc" Mar 12 13:38:50.985562 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:50.985533 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-b6zd6\"" Mar 12 13:38:50.987812 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:50.987781 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca-samples-operator-tls\") pod \"cluster-samples-operator-d5df4776c-clm4m\" (UID: \"f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-clm4m" Mar 12 13:38:50.987935 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:50.987823 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj" Mar 12 13:38:50.988088 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:50.988066 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/99423593-d38b-4662-bacc-b1f6bb3d0d90-metrics-tls\") pod \"dns-default-ddks7\" (UID: \"99423593-d38b-4662-bacc-b1f6bb3d0d90\") " pod="openshift-dns/dns-default-ddks7" Mar 12 13:38:50.988196 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:50.988177 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/12cccf55-aade-4101-9250-96d5b498a6af-networking-console-plugin-cert\") pod \"networking-console-plugin-55b77584bb-dx5pp\" (UID: \"12cccf55-aade-4101-9250-96d5b498a6af\") " pod="openshift-network-console/networking-console-plugin-55b77584bb-dx5pp" Mar 12 13:38:50.988255 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:50.988181 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/92914013-06d0-470c-b843-48a641e447a6-cert\") pod \"ingress-canary-v9hjg\" (UID: \"92914013-06d0-470c-b843-48a641e447a6\") " pod="openshift-ingress-canary/ingress-canary-v9hjg" Mar 12 13:38:50.988344 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:50.988325 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztsv5\" (UniqueName: \"kubernetes.io/projected/a2b57a5c-1736-4008-9bcc-41669382f70a-kube-api-access-ztsv5\") pod \"network-check-target-fnfhc\" (UID: \"a2b57a5c-1736-4008-9bcc-41669382f70a\") " pod="openshift-network-diagnostics/network-check-target-fnfhc" Mar 12 13:38:50.992900 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:50.992878 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-f747dc4bb-qfvq7" Mar 12 13:38:51.025904 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:51.025872 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-sj9kz\"" Mar 12 13:38:51.034052 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:51.033959 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fnfhc" Mar 12 13:38:51.037150 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:51.037124 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wbttq\"" Mar 12 13:38:51.039854 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:51.038123 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x4rlt" Mar 12 13:38:51.042414 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:51.042277 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wbtg8" Mar 12 13:38:51.045932 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:51.045480 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-g78qz\"" Mar 12 13:38:51.053006 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:51.052587 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-vw6kv" Mar 12 13:38:51.086634 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:51.086130 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/035b3ed4-c514-4d2a-a1a9-e485a1c21535-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rpc6g\" (UID: \"035b3ed4-c514-4d2a-a1a9-e485a1c21535\") " pod="openshift-insights/insights-runtime-extractor-rpc6g" Mar 12 13:38:51.091143 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:51.091063 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/035b3ed4-c514-4d2a-a1a9-e485a1c21535-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rpc6g\" (UID: \"035b3ed4-c514-4d2a-a1a9-e485a1c21535\") " pod="openshift-insights/insights-runtime-extractor-rpc6g" Mar 12 13:38:51.131730 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:51.127252 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-j7ncg\"" Mar 12 13:38:51.133716 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:51.133625 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-55b77584bb-dx5pp" Mar 12 13:38:51.177560 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:51.175148 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-658bf4bccb-bwzwj"] Mar 12 13:38:51.177560 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:51.175356 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-4sgvv\"" Mar 12 13:38:51.177560 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:51.175745 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-npwgp\"" Mar 12 13:38:51.188722 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:51.183622 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-clm4m" Mar 12 13:38:51.188722 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:51.184086 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rpc6g" Mar 12 13:38:51.188722 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:51.187226 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-f747dc4bb-qfvq7"] Mar 12 13:38:51.207829 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:38:51.207773 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod327ec7eb_a5de_4709_9fbc_6bbe58d87674.slice/crio-9c04d00ee434bbee8c6ad1cfffc57e83fb459da0ab3a0570c97b080b2dc71ac1 WatchSource:0}: Error finding container 9c04d00ee434bbee8c6ad1cfffc57e83fb459da0ab3a0570c97b080b2dc71ac1: Status 404 returned error can't find the container with id 9c04d00ee434bbee8c6ad1cfffc57e83fb459da0ab3a0570c97b080b2dc71ac1 Mar 12 13:38:51.213742 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:51.210924 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-6llcj\"" Mar 12 13:38:51.217812 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:51.217139 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-4fw8j\"" Mar 12 13:38:51.217812 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:51.217367 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-v9hjg" Mar 12 13:38:51.222529 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:51.222243 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-ddks7" Mar 12 13:38:51.243472 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:38:51.243418 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc11ab84b_7d66_4e49_a456_66c0ee53f865.slice/crio-eec15a9aca488d9bac48a79116be58e7571522227e26dd642ce322d11c49a237 WatchSource:0}: Error finding container eec15a9aca488d9bac48a79116be58e7571522227e26dd642ce322d11c49a237: Status 404 returned error can't find the container with id eec15a9aca488d9bac48a79116be58e7571522227e26dd642ce322d11c49a237 Mar 12 13:38:51.306081 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:51.305690 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-fnfhc"] Mar 12 13:38:51.306081 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:51.305737 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-x4rlt"] Mar 12 13:38:51.308519 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:38:51.308488 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2b57a5c_1736_4008_9bcc_41669382f70a.slice/crio-313c138bb20a9eb1f3afbce1e211bd76651f2600fa6632726f74c6465622fc4b WatchSource:0}: Error finding container 313c138bb20a9eb1f3afbce1e211bd76651f2600fa6632726f74c6465622fc4b: Status 404 returned error can't find the container with id 313c138bb20a9eb1f3afbce1e211bd76651f2600fa6632726f74c6465622fc4b Mar 12 13:38:51.335298 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:51.333187 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-fnfhc" event={"ID":"a2b57a5c-1736-4008-9bcc-41669382f70a","Type":"ContainerStarted","Data":"313c138bb20a9eb1f3afbce1e211bd76651f2600fa6632726f74c6465622fc4b"} Mar 12 13:38:51.335298 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:51.334869 
2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-f747dc4bb-qfvq7" event={"ID":"c11ab84b-7d66-4e49-a456-66c0ee53f865","Type":"ContainerStarted","Data":"eec15a9aca488d9bac48a79116be58e7571522227e26dd642ce322d11c49a237"} Mar 12 13:38:51.344508 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:51.344478 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wbtg8"] Mar 12 13:38:51.361234 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:51.361130 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj" event={"ID":"327ec7eb-a5de-4709-9fbc-6bbe58d87674","Type":"ContainerStarted","Data":"9c04d00ee434bbee8c6ad1cfffc57e83fb459da0ab3a0570c97b080b2dc71ac1"} Mar 12 13:38:51.364480 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:38:51.363579 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37e0d595_d2d4_4caf_8700_ffaeb5c3401f.slice/crio-e1d09d91fe251c6c99ab0b86086e3a32168f2585c1d4b70cb696e671afee6ceb WatchSource:0}: Error finding container e1d09d91fe251c6c99ab0b86086e3a32168f2585c1d4b70cb696e671afee6ceb: Status 404 returned error can't find the container with id e1d09d91fe251c6c99ab0b86086e3a32168f2585c1d4b70cb696e671afee6ceb Mar 12 13:38:51.376093 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:51.376026 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-x4rlt" event={"ID":"05421762-de84-4ca6-bedf-542c55460252","Type":"ContainerStarted","Data":"d6ed758cb8021a12d4158e5faba43b29fb23a9703a4138e95bde1e80ea2a3cb7"} Mar 12 13:38:51.407379 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:51.407331 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-vw6kv"] Mar 12 13:38:51.423676 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:38:51.422577 2576 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf25d8f1b_e18e_4880_899f_71ae7bff54be.slice/crio-50c3d5228caad95b3477f9cfaed57da5c0367902123c57866ec11d3c7829a455 WatchSource:0}: Error finding container 50c3d5228caad95b3477f9cfaed57da5c0367902123c57866ec11d3c7829a455: Status 404 returned error can't find the container with id 50c3d5228caad95b3477f9cfaed57da5c0367902123c57866ec11d3c7829a455 Mar 12 13:38:51.434495 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:51.433854 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-55b77584bb-dx5pp"] Mar 12 13:38:51.445820 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:38:51.443767 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12cccf55_aade_4101_9250_96d5b498a6af.slice/crio-3c61963d83cf59f63fc636f0b07a3955de26bf2aab05b0c1f7ac20b9b80addca WatchSource:0}: Error finding container 3c61963d83cf59f63fc636f0b07a3955de26bf2aab05b0c1f7ac20b9b80addca: Status 404 returned error can't find the container with id 3c61963d83cf59f63fc636f0b07a3955de26bf2aab05b0c1f7ac20b9b80addca Mar 12 13:38:51.473970 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:51.473907 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-clm4m"] Mar 12 13:38:51.495781 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:51.495749 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rpc6g"] Mar 12 13:38:51.508409 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:51.508386 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-v9hjg"] Mar 12 13:38:51.512014 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:51.511977 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-dns/dns-default-ddks7"] Mar 12 13:38:51.512406 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:38:51.512376 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92914013_06d0_470c_b843_48a641e447a6.slice/crio-9e60069022f9d390ea503eefae2e9309120555995560feaf9407ee1a08e0cfe8 WatchSource:0}: Error finding container 9e60069022f9d390ea503eefae2e9309120555995560feaf9407ee1a08e0cfe8: Status 404 returned error can't find the container with id 9e60069022f9d390ea503eefae2e9309120555995560feaf9407ee1a08e0cfe8 Mar 12 13:38:51.516015 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:38:51.515989 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99423593_d38b_4662_bacc_b1f6bb3d0d90.slice/crio-5e10454520e781021034f6537fe7bbbc40f57c7edbef1293300a89ff0d8dd73f WatchSource:0}: Error finding container 5e10454520e781021034f6537fe7bbbc40f57c7edbef1293300a89ff0d8dd73f: Status 404 returned error can't find the container with id 5e10454520e781021034f6537fe7bbbc40f57c7edbef1293300a89ff0d8dd73f Mar 12 13:38:52.383051 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:52.383011 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj" event={"ID":"327ec7eb-a5de-4709-9fbc-6bbe58d87674","Type":"ContainerStarted","Data":"1bc363650ac690749a6bce2007c1f5b0a6d2c66a39178ec99d6454eacb9b2c74"} Mar 12 13:38:52.383930 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:52.383905 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj" Mar 12 13:38:52.391915 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:52.391850 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-55b77584bb-dx5pp" 
event={"ID":"12cccf55-aade-4101-9250-96d5b498a6af","Type":"ContainerStarted","Data":"3c61963d83cf59f63fc636f0b07a3955de26bf2aab05b0c1f7ac20b9b80addca"} Mar 12 13:38:52.393644 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:52.393587 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-v9hjg" event={"ID":"92914013-06d0-470c-b843-48a641e447a6","Type":"ContainerStarted","Data":"9e60069022f9d390ea503eefae2e9309120555995560feaf9407ee1a08e0cfe8"} Mar 12 13:38:52.395403 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:52.395373 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rpc6g" event={"ID":"035b3ed4-c514-4d2a-a1a9-e485a1c21535","Type":"ContainerStarted","Data":"7e45fa238deac0e22ddc0025c7bf278f5f2a73f1fdde1741322ecb45026af61d"} Mar 12 13:38:52.395548 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:52.395413 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rpc6g" event={"ID":"035b3ed4-c514-4d2a-a1a9-e485a1c21535","Type":"ContainerStarted","Data":"39d8276c4ebdb84f97814093bf87e0fd1cb052efefe73c28687944d4420f99ac"} Mar 12 13:38:52.397596 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:52.397274 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-fnfhc" event={"ID":"a2b57a5c-1736-4008-9bcc-41669382f70a","Type":"ContainerStarted","Data":"2bac0c3b38e759414ac14245af4e33fde90d3f3b3c9b6dd583fff97417bcdaa0"} Mar 12 13:38:52.397807 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:52.397756 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-fnfhc" Mar 12 13:38:52.401054 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:52.400560 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-f747dc4bb-qfvq7" 
event={"ID":"c11ab84b-7d66-4e49-a456-66c0ee53f865","Type":"ContainerStarted","Data":"0d2178158c86a25cf1c33fd153433f1be740bb815caf4cbfba21a1682e50ae81"} Mar 12 13:38:52.406082 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:52.405819 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj" podStartSLOduration=67.405803347 podStartE2EDuration="1m7.405803347s" podCreationTimestamp="2026-03-12 13:37:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:38:52.40458059 +0000 UTC m=+67.916514642" watchObservedRunningTime="2026-03-12 13:38:52.405803347 +0000 UTC m=+67.917737375" Mar 12 13:38:52.408331 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:52.408278 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-vw6kv" event={"ID":"f25d8f1b-e18e-4880-899f-71ae7bff54be","Type":"ContainerStarted","Data":"50c3d5228caad95b3477f9cfaed57da5c0367902123c57866ec11d3c7829a455"} Mar 12 13:38:52.421545 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:52.421492 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wbtg8" event={"ID":"37e0d595-d2d4-4caf-8700-ffaeb5c3401f","Type":"ContainerStarted","Data":"e1d09d91fe251c6c99ab0b86086e3a32168f2585c1d4b70cb696e671afee6ceb"} Mar 12 13:38:52.426357 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:52.426304 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-fnfhc" podStartSLOduration=67.42628813 podStartE2EDuration="1m7.42628813s" podCreationTimestamp="2026-03-12 13:37:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:38:52.424535632 +0000 UTC m=+67.936469660" 
watchObservedRunningTime="2026-03-12 13:38:52.42628813 +0000 UTC m=+67.938222157" Mar 12 13:38:52.430566 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:52.430495 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ddks7" event={"ID":"99423593-d38b-4662-bacc-b1f6bb3d0d90","Type":"ContainerStarted","Data":"5e10454520e781021034f6537fe7bbbc40f57c7edbef1293300a89ff0d8dd73f"} Mar 12 13:38:52.431915 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:52.431885 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-clm4m" event={"ID":"f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca","Type":"ContainerStarted","Data":"6d5986bd063e132fda457d758c19f522843ca4913a687c71ec9f1769386af676"} Mar 12 13:38:52.455685 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:52.454792 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-f747dc4bb-qfvq7" podStartSLOduration=49.454771516 podStartE2EDuration="49.454771516s" podCreationTimestamp="2026-03-12 13:38:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:38:52.453729602 +0000 UTC m=+67.965663631" watchObservedRunningTime="2026-03-12 13:38:52.454771516 +0000 UTC m=+67.966705544" Mar 12 13:38:52.993439 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:52.993376 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-f747dc4bb-qfvq7" Mar 12 13:38:52.996068 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:52.996033 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-f747dc4bb-qfvq7" Mar 12 13:38:53.438943 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:53.438853 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-ingress/router-default-f747dc4bb-qfvq7" Mar 12 13:38:53.440992 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:53.440812 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-f747dc4bb-qfvq7" Mar 12 13:38:59.295166 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:59.294163 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-8smw6"] Mar 12 13:38:59.311082 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:59.311049 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-8smw6"] Mar 12 13:38:59.311233 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:59.311134 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-8smw6" Mar 12 13:38:59.314328 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:59.314144 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Mar 12 13:38:59.314328 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:59.314282 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-7dww4\"" Mar 12 13:38:59.363891 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:59.363823 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/ef8299ba-434c-47bc-9ea7-ed35925f508a-tls-certificates\") pod \"prometheus-operator-admission-webhook-8444df798b-8smw6\" (UID: \"ef8299ba-434c-47bc-9ea7-ed35925f508a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-8smw6" Mar 12 13:38:59.458916 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:59.458877 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ddks7" event={"ID":"99423593-d38b-4662-bacc-b1f6bb3d0d90","Type":"ContainerStarted","Data":"5ed8ee40a29f110287bf44549a700cdc93eb5c074a684f9473ec3f740fd19f6f"} Mar 12 13:38:59.459069 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:59.458921 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ddks7" event={"ID":"99423593-d38b-4662-bacc-b1f6bb3d0d90","Type":"ContainerStarted","Data":"5c0030b223b295d04a88821282b6a7f79dd783abb18f94dfd21daef0e327b20a"} Mar 12 13:38:59.459069 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:59.459027 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-ddks7" Mar 12 13:38:59.461136 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:59.461098 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-clm4m" event={"ID":"f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca","Type":"ContainerStarted","Data":"d09c6045f8b393ef09af03b789a10cd8d6ca4aa3bf50a82ba3cd35b706bfe4c8"} Mar 12 13:38:59.461271 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:59.461139 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-clm4m" event={"ID":"f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca","Type":"ContainerStarted","Data":"dc78fc67a55f270e2fcca9d3380752b136293cf15c45de709ce8d9ff6e378b94"} Mar 12 13:38:59.463373 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:59.463345 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-55b77584bb-dx5pp" event={"ID":"12cccf55-aade-4101-9250-96d5b498a6af","Type":"ContainerStarted","Data":"7efd8bf99bf8d5753a40d4fbe4861c99e9db820ac2500296cc2f233b31c38d60"} Mar 12 13:38:59.464484 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:59.464435 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/ef8299ba-434c-47bc-9ea7-ed35925f508a-tls-certificates\") pod \"prometheus-operator-admission-webhook-8444df798b-8smw6\" (UID: \"ef8299ba-434c-47bc-9ea7-ed35925f508a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-8smw6" Mar 12 13:38:59.465087 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:59.465057 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-x4rlt" event={"ID":"05421762-de84-4ca6-bedf-542c55460252","Type":"ContainerStarted","Data":"6af3b4c5d4019030d205c5cf2f091fa7ce3f3d1b9580c39e25d9eaf7705d9e09"} Mar 12 13:38:59.467760 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:59.467736 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-v9hjg" event={"ID":"92914013-06d0-470c-b843-48a641e447a6","Type":"ContainerStarted","Data":"e9a247f58cf32ea6f21a6be3fd0de6cb790d9decc1a12611908dbc0a41e0319d"} Mar 12 13:38:59.469342 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:59.469318 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rpc6g" event={"ID":"035b3ed4-c514-4d2a-a1a9-e485a1c21535","Type":"ContainerStarted","Data":"caf8b26118d7d97ce90de1056c8a7d157655542eec85a4f22e71af7796ecb9c6"} Mar 12 13:38:59.469991 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:59.469971 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/ef8299ba-434c-47bc-9ea7-ed35925f508a-tls-certificates\") pod \"prometheus-operator-admission-webhook-8444df798b-8smw6\" (UID: \"ef8299ba-434c-47bc-9ea7-ed35925f508a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-8smw6" Mar 12 13:38:59.471334 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:59.471304 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-vw6kv" event={"ID":"f25d8f1b-e18e-4880-899f-71ae7bff54be","Type":"ContainerStarted","Data":"02bed2ad1c74337f5f1cbca9e2625311a368741f5b16c24c454f393616e0639c"} Mar 12 13:38:59.473174 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:59.473154 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wbtg8" event={"ID":"37e0d595-d2d4-4caf-8700-ffaeb5c3401f","Type":"ContainerStarted","Data":"d0a8dccf0da79f1cf6943e59bbb05751cd8870bdbe150ed6957cd21d218122c5"} Mar 12 13:38:59.473271 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:59.473179 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wbtg8" event={"ID":"37e0d595-d2d4-4caf-8700-ffaeb5c3401f","Type":"ContainerStarted","Data":"4a55334ba7318c24670bf1af29cce8d64aaa0242bead2eb90c6052f0338fa859"} Mar 12 13:38:59.479809 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:59.479765 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-ddks7" podStartSLOduration=33.614500881 podStartE2EDuration="40.479751604s" podCreationTimestamp="2026-03-12 13:38:19 +0000 UTC" firstStartedPulling="2026-03-12 13:38:51.518180475 +0000 UTC m=+67.030114480" lastFinishedPulling="2026-03-12 13:38:58.383431198 +0000 UTC m=+73.895365203" observedRunningTime="2026-03-12 13:38:59.477775182 +0000 UTC m=+74.989709209" watchObservedRunningTime="2026-03-12 13:38:59.479751604 +0000 UTC m=+74.991685632" Mar 12 13:38:59.494907 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:59.494866 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-v9hjg" podStartSLOduration=34.653039998 podStartE2EDuration="41.494854067s" podCreationTimestamp="2026-03-12 13:38:18 +0000 UTC" firstStartedPulling="2026-03-12 13:38:51.514516896 +0000 UTC m=+67.026450900" lastFinishedPulling="2026-03-12 13:38:58.356330957 +0000 
UTC m=+73.868264969" observedRunningTime="2026-03-12 13:38:59.493680153 +0000 UTC m=+75.005614180" watchObservedRunningTime="2026-03-12 13:38:59.494854067 +0000 UTC m=+75.006788124" Mar 12 13:38:59.511871 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:59.511822 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-d5df4776c-clm4m" podStartSLOduration=49.653476366 podStartE2EDuration="56.511807136s" podCreationTimestamp="2026-03-12 13:38:03 +0000 UTC" firstStartedPulling="2026-03-12 13:38:51.525036071 +0000 UTC m=+67.036970076" lastFinishedPulling="2026-03-12 13:38:58.383366839 +0000 UTC m=+73.895300846" observedRunningTime="2026-03-12 13:38:59.510823734 +0000 UTC m=+75.022757761" watchObservedRunningTime="2026-03-12 13:38:59.511807136 +0000 UTC m=+75.023741165" Mar 12 13:38:59.528384 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:59.528337 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-b58cd5d8d-vw6kv" podStartSLOduration=49.568143821 podStartE2EDuration="56.528321881s" podCreationTimestamp="2026-03-12 13:38:03 +0000 UTC" firstStartedPulling="2026-03-12 13:38:51.426421784 +0000 UTC m=+66.938355789" lastFinishedPulling="2026-03-12 13:38:58.386599844 +0000 UTC m=+73.898533849" observedRunningTime="2026-03-12 13:38:59.527247922 +0000 UTC m=+75.039181949" watchObservedRunningTime="2026-03-12 13:38:59.528321881 +0000 UTC m=+75.040255909" Mar 12 13:38:59.543683 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:59.543627 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-55b77584bb-dx5pp" podStartSLOduration=47.659626578 podStartE2EDuration="54.543610717s" podCreationTimestamp="2026-03-12 13:38:05 +0000 UTC" firstStartedPulling="2026-03-12 13:38:51.448249776 +0000 UTC m=+66.960183794" lastFinishedPulling="2026-03-12 
13:38:58.332233914 +0000 UTC m=+73.844167933" observedRunningTime="2026-03-12 13:38:59.542503941 +0000 UTC m=+75.054437968" watchObservedRunningTime="2026-03-12 13:38:59.543610717 +0000 UTC m=+75.055544748" Mar 12 13:38:59.558396 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:59.558346 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-wbtg8" podStartSLOduration=67.547674986 podStartE2EDuration="1m14.558331133s" podCreationTimestamp="2026-03-12 13:37:45 +0000 UTC" firstStartedPulling="2026-03-12 13:38:51.372433691 +0000 UTC m=+66.884367709" lastFinishedPulling="2026-03-12 13:38:58.383089833 +0000 UTC m=+73.895023856" observedRunningTime="2026-03-12 13:38:59.556914647 +0000 UTC m=+75.068848675" watchObservedRunningTime="2026-03-12 13:38:59.558331133 +0000 UTC m=+75.070265160" Mar 12 13:38:59.571807 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:59.571763 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-x4rlt" podStartSLOduration=66.174782665 podStartE2EDuration="1m13.571747591s" podCreationTimestamp="2026-03-12 13:37:46 +0000 UTC" firstStartedPulling="2026-03-12 13:38:51.319538029 +0000 UTC m=+66.831472037" lastFinishedPulling="2026-03-12 13:38:58.716502951 +0000 UTC m=+74.228436963" observedRunningTime="2026-03-12 13:38:59.571164282 +0000 UTC m=+75.083098310" watchObservedRunningTime="2026-03-12 13:38:59.571747591 +0000 UTC m=+75.083681622" Mar 12 13:38:59.623421 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:59.623351 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-8smw6" Mar 12 13:38:59.755032 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:38:59.755000 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-8smw6"] Mar 12 13:39:00.116352 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:39:00.116321 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef8299ba_434c_47bc_9ea7_ed35925f508a.slice/crio-3b13ecb4a309cdc23a412bd26202f8951e7c1b954f2e41cd452e916c8499f1d6 WatchSource:0}: Error finding container 3b13ecb4a309cdc23a412bd26202f8951e7c1b954f2e41cd452e916c8499f1d6: Status 404 returned error can't find the container with id 3b13ecb4a309cdc23a412bd26202f8951e7c1b954f2e41cd452e916c8499f1d6 Mar 12 13:39:00.477286 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:00.477234 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-8smw6" event={"ID":"ef8299ba-434c-47bc-9ea7-ed35925f508a","Type":"ContainerStarted","Data":"3b13ecb4a309cdc23a412bd26202f8951e7c1b954f2e41cd452e916c8499f1d6"} Mar 12 13:39:01.482352 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:01.482314 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rpc6g" event={"ID":"035b3ed4-c514-4d2a-a1a9-e485a1c21535","Type":"ContainerStarted","Data":"3742d39c47ff5483d8f84a4809f4cc3f44eadec65a4c86a37ee6f9b25410b497"} Mar 12 13:39:01.520569 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:01.520519 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-rpc6g" podStartSLOduration=17.71261867 podStartE2EDuration="26.520504415s" podCreationTimestamp="2026-03-12 13:38:35 +0000 UTC" firstStartedPulling="2026-03-12 13:38:51.56737525 +0000 UTC 
m=+67.079309262" lastFinishedPulling="2026-03-12 13:39:00.375261001 +0000 UTC m=+75.887195007" observedRunningTime="2026-03-12 13:39:01.519420318 +0000 UTC m=+77.031354345" watchObservedRunningTime="2026-03-12 13:39:01.520504415 +0000 UTC m=+77.032438465" Mar 12 13:39:02.081859 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:02.081713 2576 scope.go:117] "RemoveContainer" containerID="0f4e642337fa5fad845cb0638c733f4d88216bea18b707838b95a6bd804179af" Mar 12 13:39:02.486240 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:02.486207 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-8smw6" event={"ID":"ef8299ba-434c-47bc-9ea7-ed35925f508a","Type":"ContainerStarted","Data":"e86f40379fb7d16d8c16b61550bd365bcca0c0103cb6f575838e4cc144c47881"} Mar 12 13:39:02.486740 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:02.486387 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-8smw6" Mar 12 13:39:02.487985 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:02.487963 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-8c2br_b87ad3f0-c310-4833-bfac-cd1e3b15a7d9/console-operator/2.log" Mar 12 13:39:02.488111 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:02.488082 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-76b8565867-8c2br" event={"ID":"b87ad3f0-c310-4833-bfac-cd1e3b15a7d9","Type":"ContainerStarted","Data":"2575a251f33c8df462c25d06141d7039e800b2ead8369d56ac7c1e167c50412b"} Mar 12 13:39:02.488517 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:02.488490 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-76b8565867-8c2br" Mar 12 13:39:02.491770 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:02.491753 2576 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-8smw6" Mar 12 13:39:02.506150 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:02.506104 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-8444df798b-8smw6" podStartSLOduration=1.890514943 podStartE2EDuration="3.506092319s" podCreationTimestamp="2026-03-12 13:38:59 +0000 UTC" firstStartedPulling="2026-03-12 13:39:00.118230358 +0000 UTC m=+75.630164363" lastFinishedPulling="2026-03-12 13:39:01.733807728 +0000 UTC m=+77.245741739" observedRunningTime="2026-03-12 13:39:02.504710388 +0000 UTC m=+78.016644416" watchObservedRunningTime="2026-03-12 13:39:02.506092319 +0000 UTC m=+78.018026375" Mar 12 13:39:02.531189 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:02.531126 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-76b8565867-8c2br" podStartSLOduration=50.313482969 podStartE2EDuration="59.531105968s" podCreationTimestamp="2026-03-12 13:38:03 +0000 UTC" firstStartedPulling="2026-03-12 13:38:19.752905779 +0000 UTC m=+35.264839797" lastFinishedPulling="2026-03-12 13:38:28.970528777 +0000 UTC m=+44.482462796" observedRunningTime="2026-03-12 13:39:02.529216133 +0000 UTC m=+78.041150159" watchObservedRunningTime="2026-03-12 13:39:02.531105968 +0000 UTC m=+78.043040000" Mar 12 13:39:02.692250 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:02.692222 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-76b8565867-8c2br" Mar 12 13:39:02.891760 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:02.891711 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-5b85974fd6-sqphd"] Mar 12 13:39:02.932479 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:02.932431 2576 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-5b85974fd6-sqphd"] Mar 12 13:39:02.932632 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:02.932581 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-5b85974fd6-sqphd" Mar 12 13:39:02.935396 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:02.935363 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-hfrmn\"" Mar 12 13:39:02.935573 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:02.935427 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Mar 12 13:39:02.935697 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:02.935675 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Mar 12 13:39:02.996529 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:02.996480 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc9kq\" (UniqueName: \"kubernetes.io/projected/4dac3f26-1539-4a57-8570-dba478ffb63f-kube-api-access-qc9kq\") pod \"downloads-5b85974fd6-sqphd\" (UID: \"4dac3f26-1539-4a57-8570-dba478ffb63f\") " pod="openshift-console/downloads-5b85974fd6-sqphd" Mar 12 13:39:03.097537 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:03.097502 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qc9kq\" (UniqueName: \"kubernetes.io/projected/4dac3f26-1539-4a57-8570-dba478ffb63f-kube-api-access-qc9kq\") pod \"downloads-5b85974fd6-sqphd\" (UID: \"4dac3f26-1539-4a57-8570-dba478ffb63f\") " pod="openshift-console/downloads-5b85974fd6-sqphd" Mar 12 13:39:03.106236 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:03.106213 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc9kq\" (UniqueName: 
\"kubernetes.io/projected/4dac3f26-1539-4a57-8570-dba478ffb63f-kube-api-access-qc9kq\") pod \"downloads-5b85974fd6-sqphd\" (UID: \"4dac3f26-1539-4a57-8570-dba478ffb63f\") " pod="openshift-console/downloads-5b85974fd6-sqphd" Mar 12 13:39:03.241834 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:03.241747 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-5b85974fd6-sqphd" Mar 12 13:39:03.382529 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:03.382495 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-6b948c769-rbppk"] Mar 12 13:39:03.396507 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:39:03.396450 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dac3f26_1539_4a57_8570_dba478ffb63f.slice/crio-05a5363c0d5808252b035d67dc2ea170a0c687a4d961b9277247c65787a9d024 WatchSource:0}: Error finding container 05a5363c0d5808252b035d67dc2ea170a0c687a4d961b9277247c65787a9d024: Status 404 returned error can't find the container with id 05a5363c0d5808252b035d67dc2ea170a0c687a4d961b9277247c65787a9d024 Mar 12 13:39:03.418200 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:03.418170 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-5b85974fd6-sqphd"] Mar 12 13:39:03.418200 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:03.418198 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-6b948c769-rbppk"] Mar 12 13:39:03.418346 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:03.418304 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-6b948c769-rbppk" Mar 12 13:39:03.423028 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:03.423007 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Mar 12 13:39:03.423583 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:03.423568 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Mar 12 13:39:03.428076 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:03.428056 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-v5r4p\"" Mar 12 13:39:03.431265 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:03.431247 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Mar 12 13:39:03.496389 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:03.496306 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-5b85974fd6-sqphd" event={"ID":"4dac3f26-1539-4a57-8570-dba478ffb63f","Type":"ContainerStarted","Data":"05a5363c0d5808252b035d67dc2ea170a0c687a4d961b9277247c65787a9d024"} Mar 12 13:39:03.501110 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:03.501086 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/799696f8-16a2-4084-b56e-ea433f73f682-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6b948c769-rbppk\" (UID: \"799696f8-16a2-4084-b56e-ea433f73f682\") " pod="openshift-monitoring/prometheus-operator-6b948c769-rbppk" Mar 12 13:39:03.501199 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:03.501126 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/799696f8-16a2-4084-b56e-ea433f73f682-metrics-client-ca\") pod \"prometheus-operator-6b948c769-rbppk\" (UID: \"799696f8-16a2-4084-b56e-ea433f73f682\") " pod="openshift-monitoring/prometheus-operator-6b948c769-rbppk" Mar 12 13:39:03.501199 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:03.501144 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/799696f8-16a2-4084-b56e-ea433f73f682-prometheus-operator-tls\") pod \"prometheus-operator-6b948c769-rbppk\" (UID: \"799696f8-16a2-4084-b56e-ea433f73f682\") " pod="openshift-monitoring/prometheus-operator-6b948c769-rbppk" Mar 12 13:39:03.501199 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:03.501175 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6rf7\" (UniqueName: \"kubernetes.io/projected/799696f8-16a2-4084-b56e-ea433f73f682-kube-api-access-x6rf7\") pod \"prometheus-operator-6b948c769-rbppk\" (UID: \"799696f8-16a2-4084-b56e-ea433f73f682\") " pod="openshift-monitoring/prometheus-operator-6b948c769-rbppk" Mar 12 13:39:03.602235 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:03.602201 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/799696f8-16a2-4084-b56e-ea433f73f682-metrics-client-ca\") pod \"prometheus-operator-6b948c769-rbppk\" (UID: \"799696f8-16a2-4084-b56e-ea433f73f682\") " pod="openshift-monitoring/prometheus-operator-6b948c769-rbppk" Mar 12 13:39:03.602235 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:03.602238 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/799696f8-16a2-4084-b56e-ea433f73f682-prometheus-operator-tls\") pod \"prometheus-operator-6b948c769-rbppk\" (UID: 
\"799696f8-16a2-4084-b56e-ea433f73f682\") " pod="openshift-monitoring/prometheus-operator-6b948c769-rbppk" Mar 12 13:39:03.602439 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:03.602259 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x6rf7\" (UniqueName: \"kubernetes.io/projected/799696f8-16a2-4084-b56e-ea433f73f682-kube-api-access-x6rf7\") pod \"prometheus-operator-6b948c769-rbppk\" (UID: \"799696f8-16a2-4084-b56e-ea433f73f682\") " pod="openshift-monitoring/prometheus-operator-6b948c769-rbppk" Mar 12 13:39:03.602439 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:39:03.602385 2576 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Mar 12 13:39:03.602547 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:39:03.602452 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/799696f8-16a2-4084-b56e-ea433f73f682-prometheus-operator-tls podName:799696f8-16a2-4084-b56e-ea433f73f682 nodeName:}" failed. No retries permitted until 2026-03-12 13:39:04.102431281 +0000 UTC m=+79.614365300 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/799696f8-16a2-4084-b56e-ea433f73f682-prometheus-operator-tls") pod "prometheus-operator-6b948c769-rbppk" (UID: "799696f8-16a2-4084-b56e-ea433f73f682") : secret "prometheus-operator-tls" not found Mar 12 13:39:03.602589 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:03.602548 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/799696f8-16a2-4084-b56e-ea433f73f682-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6b948c769-rbppk\" (UID: \"799696f8-16a2-4084-b56e-ea433f73f682\") " pod="openshift-monitoring/prometheus-operator-6b948c769-rbppk" Mar 12 13:39:03.603065 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:03.603040 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/799696f8-16a2-4084-b56e-ea433f73f682-metrics-client-ca\") pod \"prometheus-operator-6b948c769-rbppk\" (UID: \"799696f8-16a2-4084-b56e-ea433f73f682\") " pod="openshift-monitoring/prometheus-operator-6b948c769-rbppk" Mar 12 13:39:03.605155 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:03.605133 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/799696f8-16a2-4084-b56e-ea433f73f682-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-6b948c769-rbppk\" (UID: \"799696f8-16a2-4084-b56e-ea433f73f682\") " pod="openshift-monitoring/prometheus-operator-6b948c769-rbppk" Mar 12 13:39:03.613006 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:03.612984 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6rf7\" (UniqueName: \"kubernetes.io/projected/799696f8-16a2-4084-b56e-ea433f73f682-kube-api-access-x6rf7\") pod 
\"prometheus-operator-6b948c769-rbppk\" (UID: \"799696f8-16a2-4084-b56e-ea433f73f682\") " pod="openshift-monitoring/prometheus-operator-6b948c769-rbppk" Mar 12 13:39:04.107213 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:04.107175 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/799696f8-16a2-4084-b56e-ea433f73f682-prometheus-operator-tls\") pod \"prometheus-operator-6b948c769-rbppk\" (UID: \"799696f8-16a2-4084-b56e-ea433f73f682\") " pod="openshift-monitoring/prometheus-operator-6b948c769-rbppk" Mar 12 13:39:04.109986 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:04.109961 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/799696f8-16a2-4084-b56e-ea433f73f682-prometheus-operator-tls\") pod \"prometheus-operator-6b948c769-rbppk\" (UID: \"799696f8-16a2-4084-b56e-ea433f73f682\") " pod="openshift-monitoring/prometheus-operator-6b948c769-rbppk" Mar 12 13:39:04.327635 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:04.327589 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-6b948c769-rbppk" Mar 12 13:39:04.472679 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:04.472638 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-6b948c769-rbppk"] Mar 12 13:39:04.476407 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:39:04.476375 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod799696f8_16a2_4084_b56e_ea433f73f682.slice/crio-dc28881ec5813d335b8daaa9b2af1a1d96339107e895f795d274b2221d2a6973 WatchSource:0}: Error finding container dc28881ec5813d335b8daaa9b2af1a1d96339107e895f795d274b2221d2a6973: Status 404 returned error can't find the container with id dc28881ec5813d335b8daaa9b2af1a1d96339107e895f795d274b2221d2a6973 Mar 12 13:39:04.500240 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:04.500202 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6b948c769-rbppk" event={"ID":"799696f8-16a2-4084-b56e-ea433f73f682","Type":"ContainerStarted","Data":"dc28881ec5813d335b8daaa9b2af1a1d96339107e895f795d274b2221d2a6973"} Mar 12 13:39:06.509964 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:06.509919 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6b948c769-rbppk" event={"ID":"799696f8-16a2-4084-b56e-ea433f73f682","Type":"ContainerStarted","Data":"680c1a7d204ea8c511ba0708c2f080e3c68b1f478546f7c5163e93cd70d24f71"} Mar 12 13:39:06.510513 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:06.509971 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-6b948c769-rbppk" event={"ID":"799696f8-16a2-4084-b56e-ea433f73f682","Type":"ContainerStarted","Data":"7e5aa7c4de8c293e75c940654c98c67bfbfd7d9db8d08580289c12a05f0b57ef"} Mar 12 13:39:06.529151 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:06.529093 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-6b948c769-rbppk" podStartSLOduration=1.886358714 podStartE2EDuration="3.529074715s" podCreationTimestamp="2026-03-12 13:39:03 +0000 UTC" firstStartedPulling="2026-03-12 13:39:04.478566933 +0000 UTC m=+79.990500937" lastFinishedPulling="2026-03-12 13:39:06.121282916 +0000 UTC m=+81.633216938" observedRunningTime="2026-03-12 13:39:06.526898974 +0000 UTC m=+82.038833003" watchObservedRunningTime="2026-03-12 13:39:06.529074715 +0000 UTC m=+82.041008745" Mar 12 13:39:08.776343 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.776309 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-68b5d5d464-8bgkd"] Mar 12 13:39:08.781453 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.781430 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-8bgkd" Mar 12 13:39:08.789903 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.789883 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-sqcsn\"" Mar 12 13:39:08.792380 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.792360 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Mar 12 13:39:08.799326 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.799305 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Mar 12 13:39:08.808264 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.808233 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-6df7999c47-f2ksk"] Mar 12 13:39:08.811951 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.811930 2576 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-monitoring/openshift-state-metrics-68b5d5d464-8bgkd"] Mar 12 13:39:08.812078 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.812070 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-6df7999c47-f2ksk" Mar 12 13:39:08.825539 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.825516 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Mar 12 13:39:08.826417 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.826396 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Mar 12 13:39:08.832516 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.832437 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Mar 12 13:39:08.844611 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.844584 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-pxhhh\"" Mar 12 13:39:08.849926 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.849898 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/35520300-1852-426c-b98c-0d7a5de8f5d1-kube-state-metrics-tls\") pod \"kube-state-metrics-6df7999c47-f2ksk\" (UID: \"35520300-1852-426c-b98c-0d7a5de8f5d1\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-f2ksk" Mar 12 13:39:08.850040 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.849943 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/35520300-1852-426c-b98c-0d7a5de8f5d1-metrics-client-ca\") pod 
\"kube-state-metrics-6df7999c47-f2ksk\" (UID: \"35520300-1852-426c-b98c-0d7a5de8f5d1\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-f2ksk"
Mar 12 13:39:08.850040 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.849979 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8a686a49-082d-4561-b5c9-4bf4c4cc0943-openshift-state-metrics-tls\") pod \"openshift-state-metrics-68b5d5d464-8bgkd\" (UID: \"8a686a49-082d-4561-b5c9-4bf4c4cc0943\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-8bgkd"
Mar 12 13:39:08.850040 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.850013 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8a686a49-082d-4561-b5c9-4bf4c4cc0943-metrics-client-ca\") pod \"openshift-state-metrics-68b5d5d464-8bgkd\" (UID: \"8a686a49-082d-4561-b5c9-4bf4c4cc0943\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-8bgkd"
Mar 12 13:39:08.850183 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.850089 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfcjc\" (UniqueName: \"kubernetes.io/projected/35520300-1852-426c-b98c-0d7a5de8f5d1-kube-api-access-nfcjc\") pod \"kube-state-metrics-6df7999c47-f2ksk\" (UID: \"35520300-1852-426c-b98c-0d7a5de8f5d1\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-f2ksk"
Mar 12 13:39:08.850183 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.850128 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/35520300-1852-426c-b98c-0d7a5de8f5d1-volume-directive-shadow\") pod \"kube-state-metrics-6df7999c47-f2ksk\" (UID: \"35520300-1852-426c-b98c-0d7a5de8f5d1\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-f2ksk"
Mar 12 13:39:08.850183 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.850165 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjwnv\" (UniqueName: \"kubernetes.io/projected/8a686a49-082d-4561-b5c9-4bf4c4cc0943-kube-api-access-rjwnv\") pod \"openshift-state-metrics-68b5d5d464-8bgkd\" (UID: \"8a686a49-082d-4561-b5c9-4bf4c4cc0943\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-8bgkd"
Mar 12 13:39:08.850296 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.850229 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8a686a49-082d-4561-b5c9-4bf4c4cc0943-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-68b5d5d464-8bgkd\" (UID: \"8a686a49-082d-4561-b5c9-4bf4c4cc0943\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-8bgkd"
Mar 12 13:39:08.850296 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.850259 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/35520300-1852-426c-b98c-0d7a5de8f5d1-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-6df7999c47-f2ksk\" (UID: \"35520300-1852-426c-b98c-0d7a5de8f5d1\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-f2ksk"
Mar 12 13:39:08.850391 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.850311 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/35520300-1852-426c-b98c-0d7a5de8f5d1-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-6df7999c47-f2ksk\" (UID: \"35520300-1852-426c-b98c-0d7a5de8f5d1\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-f2ksk"
Mar 12 13:39:08.853081 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.853051 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-6df7999c47-f2ksk"]
Mar 12 13:39:08.893192 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.893151 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-22tl6"]
Mar 12 13:39:08.914198 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.914168 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-22tl6"
Mar 12 13:39:08.916896 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.916872 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Mar 12 13:39:08.917234 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.917216 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Mar 12 13:39:08.917504 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.917490 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Mar 12 13:39:08.917719 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.917704 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-cf95r\""
Mar 12 13:39:08.951008 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.950971 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/35520300-1852-426c-b98c-0d7a5de8f5d1-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-6df7999c47-f2ksk\" (UID: \"35520300-1852-426c-b98c-0d7a5de8f5d1\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-f2ksk"
Mar 12 13:39:08.951208 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.951030 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0c2c8159-b212-44a8-bcfa-0dd001c47b97-root\") pod \"node-exporter-22tl6\" (UID: \"0c2c8159-b212-44a8-bcfa-0dd001c47b97\") " pod="openshift-monitoring/node-exporter-22tl6"
Mar 12 13:39:08.951208 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.951093 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/35520300-1852-426c-b98c-0d7a5de8f5d1-kube-state-metrics-tls\") pod \"kube-state-metrics-6df7999c47-f2ksk\" (UID: \"35520300-1852-426c-b98c-0d7a5de8f5d1\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-f2ksk"
Mar 12 13:39:08.951208 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.951122 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/35520300-1852-426c-b98c-0d7a5de8f5d1-metrics-client-ca\") pod \"kube-state-metrics-6df7999c47-f2ksk\" (UID: \"35520300-1852-426c-b98c-0d7a5de8f5d1\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-f2ksk"
Mar 12 13:39:08.951208 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.951149 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0c2c8159-b212-44a8-bcfa-0dd001c47b97-node-exporter-wtmp\") pod \"node-exporter-22tl6\" (UID: \"0c2c8159-b212-44a8-bcfa-0dd001c47b97\") " pod="openshift-monitoring/node-exporter-22tl6"
Mar 12 13:39:08.951208 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.951178 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnt5v\" (UniqueName: \"kubernetes.io/projected/0c2c8159-b212-44a8-bcfa-0dd001c47b97-kube-api-access-cnt5v\") pod \"node-exporter-22tl6\" (UID: \"0c2c8159-b212-44a8-bcfa-0dd001c47b97\") " pod="openshift-monitoring/node-exporter-22tl6"
Mar 12 13:39:08.951208 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:39:08.951205 2576 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Mar 12 13:39:08.951573 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:39:08.951268 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35520300-1852-426c-b98c-0d7a5de8f5d1-kube-state-metrics-tls podName:35520300-1852-426c-b98c-0d7a5de8f5d1 nodeName:}" failed. No retries permitted until 2026-03-12 13:39:09.451248038 +0000 UTC m=+84.963182060 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/35520300-1852-426c-b98c-0d7a5de8f5d1-kube-state-metrics-tls") pod "kube-state-metrics-6df7999c47-f2ksk" (UID: "35520300-1852-426c-b98c-0d7a5de8f5d1") : secret "kube-state-metrics-tls" not found
Mar 12 13:39:08.951573 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.951207 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8a686a49-082d-4561-b5c9-4bf4c4cc0943-openshift-state-metrics-tls\") pod \"openshift-state-metrics-68b5d5d464-8bgkd\" (UID: \"8a686a49-082d-4561-b5c9-4bf4c4cc0943\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-8bgkd"
Mar 12 13:39:08.951573 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.951498 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0c2c8159-b212-44a8-bcfa-0dd001c47b97-metrics-client-ca\") pod \"node-exporter-22tl6\" (UID: \"0c2c8159-b212-44a8-bcfa-0dd001c47b97\") " pod="openshift-monitoring/node-exporter-22tl6"
Mar 12 13:39:08.951573 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.951547 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8a686a49-082d-4561-b5c9-4bf4c4cc0943-metrics-client-ca\") pod \"openshift-state-metrics-68b5d5d464-8bgkd\" (UID: \"8a686a49-082d-4561-b5c9-4bf4c4cc0943\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-8bgkd"
Mar 12 13:39:08.951573 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.951581 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0c2c8159-b212-44a8-bcfa-0dd001c47b97-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-22tl6\" (UID: \"0c2c8159-b212-44a8-bcfa-0dd001c47b97\") " pod="openshift-monitoring/node-exporter-22tl6"
Mar 12 13:39:08.952406 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.951613 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0c2c8159-b212-44a8-bcfa-0dd001c47b97-node-exporter-tls\") pod \"node-exporter-22tl6\" (UID: \"0c2c8159-b212-44a8-bcfa-0dd001c47b97\") " pod="openshift-monitoring/node-exporter-22tl6"
Mar 12 13:39:08.952406 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.951663 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nfcjc\" (UniqueName: \"kubernetes.io/projected/35520300-1852-426c-b98c-0d7a5de8f5d1-kube-api-access-nfcjc\") pod \"kube-state-metrics-6df7999c47-f2ksk\" (UID: \"35520300-1852-426c-b98c-0d7a5de8f5d1\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-f2ksk"
Mar 12 13:39:08.952406 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.951711 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/35520300-1852-426c-b98c-0d7a5de8f5d1-volume-directive-shadow\") pod \"kube-state-metrics-6df7999c47-f2ksk\" (UID: \"35520300-1852-426c-b98c-0d7a5de8f5d1\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-f2ksk"
Mar 12 13:39:08.952406 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.951748 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rjwnv\" (UniqueName: \"kubernetes.io/projected/8a686a49-082d-4561-b5c9-4bf4c4cc0943-kube-api-access-rjwnv\") pod \"openshift-state-metrics-68b5d5d464-8bgkd\" (UID: \"8a686a49-082d-4561-b5c9-4bf4c4cc0943\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-8bgkd"
Mar 12 13:39:08.952406 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.951781 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0c2c8159-b212-44a8-bcfa-0dd001c47b97-sys\") pod \"node-exporter-22tl6\" (UID: \"0c2c8159-b212-44a8-bcfa-0dd001c47b97\") " pod="openshift-monitoring/node-exporter-22tl6"
Mar 12 13:39:08.952406 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.951822 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0c2c8159-b212-44a8-bcfa-0dd001c47b97-node-exporter-textfile\") pod \"node-exporter-22tl6\" (UID: \"0c2c8159-b212-44a8-bcfa-0dd001c47b97\") " pod="openshift-monitoring/node-exporter-22tl6"
Mar 12 13:39:08.952406 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.951981 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8a686a49-082d-4561-b5c9-4bf4c4cc0943-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-68b5d5d464-8bgkd\" (UID: \"8a686a49-082d-4561-b5c9-4bf4c4cc0943\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-8bgkd"
Mar 12 13:39:08.952406 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.952031 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0c2c8159-b212-44a8-bcfa-0dd001c47b97-node-exporter-accelerators-collector-config\") pod \"node-exporter-22tl6\" (UID: \"0c2c8159-b212-44a8-bcfa-0dd001c47b97\") " pod="openshift-monitoring/node-exporter-22tl6"
Mar 12 13:39:08.952406 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.952090 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/35520300-1852-426c-b98c-0d7a5de8f5d1-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-6df7999c47-f2ksk\" (UID: \"35520300-1852-426c-b98c-0d7a5de8f5d1\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-f2ksk"
Mar 12 13:39:08.952406 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.952240 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/35520300-1852-426c-b98c-0d7a5de8f5d1-volume-directive-shadow\") pod \"kube-state-metrics-6df7999c47-f2ksk\" (UID: \"35520300-1852-426c-b98c-0d7a5de8f5d1\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-f2ksk"
Mar 12 13:39:08.952874 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.952834 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8a686a49-082d-4561-b5c9-4bf4c4cc0943-metrics-client-ca\") pod \"openshift-state-metrics-68b5d5d464-8bgkd\" (UID: \"8a686a49-082d-4561-b5c9-4bf4c4cc0943\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-8bgkd"
Mar 12 13:39:08.953110 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.953000 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/35520300-1852-426c-b98c-0d7a5de8f5d1-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-6df7999c47-f2ksk\" (UID: \"35520300-1852-426c-b98c-0d7a5de8f5d1\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-f2ksk"
Mar 12 13:39:08.953401 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.953369 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/35520300-1852-426c-b98c-0d7a5de8f5d1-metrics-client-ca\") pod \"kube-state-metrics-6df7999c47-f2ksk\" (UID: \"35520300-1852-426c-b98c-0d7a5de8f5d1\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-f2ksk"
Mar 12 13:39:08.954401 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.954376 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/8a686a49-082d-4561-b5c9-4bf4c4cc0943-openshift-state-metrics-tls\") pod \"openshift-state-metrics-68b5d5d464-8bgkd\" (UID: \"8a686a49-082d-4561-b5c9-4bf4c4cc0943\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-8bgkd"
Mar 12 13:39:08.955057 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.955032 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/35520300-1852-426c-b98c-0d7a5de8f5d1-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-6df7999c47-f2ksk\" (UID: \"35520300-1852-426c-b98c-0d7a5de8f5d1\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-f2ksk"
Mar 12 13:39:08.956105 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.956082 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8a686a49-082d-4561-b5c9-4bf4c4cc0943-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-68b5d5d464-8bgkd\" (UID: \"8a686a49-082d-4561-b5c9-4bf4c4cc0943\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-8bgkd"
Mar 12 13:39:08.975155 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.975127 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfcjc\" (UniqueName: \"kubernetes.io/projected/35520300-1852-426c-b98c-0d7a5de8f5d1-kube-api-access-nfcjc\") pod \"kube-state-metrics-6df7999c47-f2ksk\" (UID: \"35520300-1852-426c-b98c-0d7a5de8f5d1\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-f2ksk"
Mar 12 13:39:08.975402 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:08.975378 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjwnv\" (UniqueName: \"kubernetes.io/projected/8a686a49-082d-4561-b5c9-4bf4c4cc0943-kube-api-access-rjwnv\") pod \"openshift-state-metrics-68b5d5d464-8bgkd\" (UID: \"8a686a49-082d-4561-b5c9-4bf4c4cc0943\") " pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-8bgkd"
Mar 12 13:39:09.053063 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.052965 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0c2c8159-b212-44a8-bcfa-0dd001c47b97-root\") pod \"node-exporter-22tl6\" (UID: \"0c2c8159-b212-44a8-bcfa-0dd001c47b97\") " pod="openshift-monitoring/node-exporter-22tl6"
Mar 12 13:39:09.053063 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.053031 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0c2c8159-b212-44a8-bcfa-0dd001c47b97-node-exporter-wtmp\") pod \"node-exporter-22tl6\" (UID: \"0c2c8159-b212-44a8-bcfa-0dd001c47b97\") " pod="openshift-monitoring/node-exporter-22tl6"
Mar 12 13:39:09.053063 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.053060 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cnt5v\" (UniqueName: \"kubernetes.io/projected/0c2c8159-b212-44a8-bcfa-0dd001c47b97-kube-api-access-cnt5v\") pod \"node-exporter-22tl6\" (UID: \"0c2c8159-b212-44a8-bcfa-0dd001c47b97\") " pod="openshift-monitoring/node-exporter-22tl6"
Mar 12 13:39:09.053340 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.053091 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0c2c8159-b212-44a8-bcfa-0dd001c47b97-metrics-client-ca\") pod \"node-exporter-22tl6\" (UID: \"0c2c8159-b212-44a8-bcfa-0dd001c47b97\") " pod="openshift-monitoring/node-exporter-22tl6"
Mar 12 13:39:09.053340 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.053089 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0c2c8159-b212-44a8-bcfa-0dd001c47b97-root\") pod \"node-exporter-22tl6\" (UID: \"0c2c8159-b212-44a8-bcfa-0dd001c47b97\") " pod="openshift-monitoring/node-exporter-22tl6"
Mar 12 13:39:09.053340 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.053119 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0c2c8159-b212-44a8-bcfa-0dd001c47b97-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-22tl6\" (UID: \"0c2c8159-b212-44a8-bcfa-0dd001c47b97\") " pod="openshift-monitoring/node-exporter-22tl6"
Mar 12 13:39:09.053340 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.053145 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0c2c8159-b212-44a8-bcfa-0dd001c47b97-node-exporter-tls\") pod \"node-exporter-22tl6\" (UID: \"0c2c8159-b212-44a8-bcfa-0dd001c47b97\") " pod="openshift-monitoring/node-exporter-22tl6"
Mar 12 13:39:09.053340 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.053193 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0c2c8159-b212-44a8-bcfa-0dd001c47b97-sys\") pod \"node-exporter-22tl6\" (UID: \"0c2c8159-b212-44a8-bcfa-0dd001c47b97\") " pod="openshift-monitoring/node-exporter-22tl6"
Mar 12 13:39:09.053340 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.053227 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0c2c8159-b212-44a8-bcfa-0dd001c47b97-node-exporter-textfile\") pod \"node-exporter-22tl6\" (UID: \"0c2c8159-b212-44a8-bcfa-0dd001c47b97\") " pod="openshift-monitoring/node-exporter-22tl6"
Mar 12 13:39:09.053340 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.053250 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0c2c8159-b212-44a8-bcfa-0dd001c47b97-node-exporter-wtmp\") pod \"node-exporter-22tl6\" (UID: \"0c2c8159-b212-44a8-bcfa-0dd001c47b97\") " pod="openshift-monitoring/node-exporter-22tl6"
Mar 12 13:39:09.053340 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.053274 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0c2c8159-b212-44a8-bcfa-0dd001c47b97-node-exporter-accelerators-collector-config\") pod \"node-exporter-22tl6\" (UID: \"0c2c8159-b212-44a8-bcfa-0dd001c47b97\") " pod="openshift-monitoring/node-exporter-22tl6"
Mar 12 13:39:09.053739 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.053421 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0c2c8159-b212-44a8-bcfa-0dd001c47b97-sys\") pod \"node-exporter-22tl6\" (UID: \"0c2c8159-b212-44a8-bcfa-0dd001c47b97\") " pod="openshift-monitoring/node-exporter-22tl6"
Mar 12 13:39:09.053813 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.053790 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0c2c8159-b212-44a8-bcfa-0dd001c47b97-node-exporter-textfile\") pod \"node-exporter-22tl6\" (UID: \"0c2c8159-b212-44a8-bcfa-0dd001c47b97\") " pod="openshift-monitoring/node-exporter-22tl6"
Mar 12 13:39:09.054085 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.054060 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0c2c8159-b212-44a8-bcfa-0dd001c47b97-node-exporter-accelerators-collector-config\") pod \"node-exporter-22tl6\" (UID: \"0c2c8159-b212-44a8-bcfa-0dd001c47b97\") " pod="openshift-monitoring/node-exporter-22tl6"
Mar 12 13:39:09.054182 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.054129 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0c2c8159-b212-44a8-bcfa-0dd001c47b97-metrics-client-ca\") pod \"node-exporter-22tl6\" (UID: \"0c2c8159-b212-44a8-bcfa-0dd001c47b97\") " pod="openshift-monitoring/node-exporter-22tl6"
Mar 12 13:39:09.056152 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.056130 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0c2c8159-b212-44a8-bcfa-0dd001c47b97-node-exporter-tls\") pod \"node-exporter-22tl6\" (UID: \"0c2c8159-b212-44a8-bcfa-0dd001c47b97\") " pod="openshift-monitoring/node-exporter-22tl6"
Mar 12 13:39:09.056253 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.056225 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0c2c8159-b212-44a8-bcfa-0dd001c47b97-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-22tl6\" (UID: \"0c2c8159-b212-44a8-bcfa-0dd001c47b97\") " pod="openshift-monitoring/node-exporter-22tl6"
Mar 12 13:39:09.062323 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.062300 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnt5v\" (UniqueName: \"kubernetes.io/projected/0c2c8159-b212-44a8-bcfa-0dd001c47b97-kube-api-access-cnt5v\") pod \"node-exporter-22tl6\" (UID: \"0c2c8159-b212-44a8-bcfa-0dd001c47b97\") " pod="openshift-monitoring/node-exporter-22tl6"
Mar 12 13:39:09.092338 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.092306 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-8bgkd"
Mar 12 13:39:09.226919 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.226885 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-22tl6"
Mar 12 13:39:09.238428 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.238402 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-68b5d5d464-8bgkd"]
Mar 12 13:39:09.239953 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:39:09.239922 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c2c8159_b212_44a8_bcfa_0dd001c47b97.slice/crio-7f7cb4f3609f2c61fd8e55cbce84cc715b4ea99072dd91924845c9d9d312461c WatchSource:0}: Error finding container 7f7cb4f3609f2c61fd8e55cbce84cc715b4ea99072dd91924845c9d9d312461c: Status 404 returned error can't find the container with id 7f7cb4f3609f2c61fd8e55cbce84cc715b4ea99072dd91924845c9d9d312461c
Mar 12 13:39:09.240960 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:39:09.240928 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a686a49_082d_4561_b5c9_4bf4c4cc0943.slice/crio-75f8617f4efe8abbb8a30f45607c9896b2197327fb808f72b45425020751e053 WatchSource:0}: Error finding container 75f8617f4efe8abbb8a30f45607c9896b2197327fb808f72b45425020751e053: Status 404 returned error can't find the container with id 75f8617f4efe8abbb8a30f45607c9896b2197327fb808f72b45425020751e053
Mar 12 13:39:09.456516 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.456473 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/35520300-1852-426c-b98c-0d7a5de8f5d1-kube-state-metrics-tls\") pod \"kube-state-metrics-6df7999c47-f2ksk\" (UID: \"35520300-1852-426c-b98c-0d7a5de8f5d1\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-f2ksk"
Mar 12 13:39:09.456705 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:39:09.456666 2576 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Mar 12 13:39:09.456762 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:39:09.456743 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35520300-1852-426c-b98c-0d7a5de8f5d1-kube-state-metrics-tls podName:35520300-1852-426c-b98c-0d7a5de8f5d1 nodeName:}" failed. No retries permitted until 2026-03-12 13:39:10.456720047 +0000 UTC m=+85.968654072 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/35520300-1852-426c-b98c-0d7a5de8f5d1-kube-state-metrics-tls") pod "kube-state-metrics-6df7999c47-f2ksk" (UID: "35520300-1852-426c-b98c-0d7a5de8f5d1") : secret "kube-state-metrics-tls" not found
Mar 12 13:39:09.480669 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.480638 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-ddks7"
Mar 12 13:39:09.522540 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.522502 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-22tl6" event={"ID":"0c2c8159-b212-44a8-bcfa-0dd001c47b97","Type":"ContainerStarted","Data":"7f7cb4f3609f2c61fd8e55cbce84cc715b4ea99072dd91924845c9d9d312461c"}
Mar 12 13:39:09.524535 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.524441 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-8bgkd" event={"ID":"8a686a49-082d-4561-b5c9-4bf4c4cc0943","Type":"ContainerStarted","Data":"ac30be3a5b57eff65b7ad560fa0af600f337bc1493b9fb4988ffad179460a90c"}
Mar 12 13:39:09.524535 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.524498 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-8bgkd" event={"ID":"8a686a49-082d-4561-b5c9-4bf4c4cc0943","Type":"ContainerStarted","Data":"813aad0d04bcfeda2bd72d6ebee06b558cd9030f110ebdb779dc86adbafd100d"}
Mar 12 13:39:09.524535 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.524514 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-8bgkd" event={"ID":"8a686a49-082d-4561-b5c9-4bf4c4cc0943","Type":"ContainerStarted","Data":"75f8617f4efe8abbb8a30f45607c9896b2197327fb808f72b45425020751e053"}
Mar 12 13:39:09.802521 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.802485 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Mar 12 13:39:09.807037 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.807008 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Mar 12 13:39:09.814313 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.814286 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Mar 12 13:39:09.814752 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.814728 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Mar 12 13:39:09.815036 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.815017 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Mar 12 13:39:09.815240 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.815227 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-55rp7\""
Mar 12 13:39:09.815749 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.815400 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Mar 12 13:39:09.815749 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.815605 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Mar 12 13:39:09.819081 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.819047 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Mar 12 13:39:09.819811 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.819794 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Mar 12 13:39:09.819898 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.819868 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Mar 12 13:39:09.826535 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.826112 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Mar 12 13:39:09.826535 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.826232 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Mar 12 13:39:09.860184 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.860150 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 12 13:39:09.860367 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.860208 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad5bfa9a-7696-439c-943a-08df7f56f525-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 12 13:39:09.860367 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.860240 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67hzp\" (UniqueName: \"kubernetes.io/projected/ad5bfa9a-7696-439c-943a-08df7f56f525-kube-api-access-67hzp\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 12 13:39:09.860367 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.860302 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 12 13:39:09.860367 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.860327 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ad5bfa9a-7696-439c-943a-08df7f56f525-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 12 13:39:09.860604 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.860400 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-config-volume\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 12 13:39:09.860604 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.860415 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-web-config\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 12 13:39:09.860604 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.860432 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ad5bfa9a-7696-439c-943a-08df7f56f525-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 12 13:39:09.860604 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.860448 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ad5bfa9a-7696-439c-943a-08df7f56f525-config-out\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 12 13:39:09.860937 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.860907 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 12 13:39:09.861065 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.861005 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 12 13:39:09.861065 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.861034 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 12 13:39:09.861182 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.861144 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ad5bfa9a-7696-439c-943a-08df7f56f525-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 12 13:39:09.964318 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.962768 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-config-volume\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 12 13:39:09.964318 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.962823 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-web-config\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 12 13:39:09.964318 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.962854 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ad5bfa9a-7696-439c-943a-08df7f56f525-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 12 13:39:09.964318 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.962879 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ad5bfa9a-7696-439c-943a-08df7f56f525-config-out\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") "
pod="openshift-monitoring/alertmanager-main-0" Mar 12 13:39:09.964318 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.962909 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 13:39:09.964318 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.962942 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 13:39:09.964318 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.962971 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 13:39:09.964318 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.963033 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ad5bfa9a-7696-439c-943a-08df7f56f525-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 13:39:09.964318 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.963071 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 13:39:09.964318 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.963107 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad5bfa9a-7696-439c-943a-08df7f56f525-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 13:39:09.964318 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.963143 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-67hzp\" (UniqueName: \"kubernetes.io/projected/ad5bfa9a-7696-439c-943a-08df7f56f525-kube-api-access-67hzp\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 13:39:09.964318 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.963172 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 13:39:09.964318 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.963203 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ad5bfa9a-7696-439c-943a-08df7f56f525-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 13:39:09.964318 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.963605 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ad5bfa9a-7696-439c-943a-08df7f56f525-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 13:39:09.964318 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:39:09.963757 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ad5bfa9a-7696-439c-943a-08df7f56f525-alertmanager-trusted-ca-bundle podName:ad5bfa9a-7696-439c-943a-08df7f56f525 nodeName:}" failed. No retries permitted until 2026-03-12 13:39:10.463734572 +0000 UTC m=+85.975668589 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/ad5bfa9a-7696-439c-943a-08df7f56f525-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "ad5bfa9a-7696-439c-943a-08df7f56f525") : configmap references non-existent config key: ca-bundle.crt Mar 12 13:39:09.971124 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.970872 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-web-config\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 13:39:09.971990 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.971936 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ad5bfa9a-7696-439c-943a-08df7f56f525-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 13:39:09.972442 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.972397 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 13:39:09.972990 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.972797 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 13:39:09.973354 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.973310 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 13:39:09.973449 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.973417 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 13:39:09.974040 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.973996 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-config-volume\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 13:39:09.974373 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.974350 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ad5bfa9a-7696-439c-943a-08df7f56f525-config-out\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 13:39:09.975002 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.974978 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-67hzp\" (UniqueName: \"kubernetes.io/projected/ad5bfa9a-7696-439c-943a-08df7f56f525-kube-api-access-67hzp\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 13:39:09.976553 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.976510 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 13:39:09.977082 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:09.977060 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ad5bfa9a-7696-439c-943a-08df7f56f525-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 13:39:10.469608 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:10.469542 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/35520300-1852-426c-b98c-0d7a5de8f5d1-kube-state-metrics-tls\") pod \"kube-state-metrics-6df7999c47-f2ksk\" (UID: \"35520300-1852-426c-b98c-0d7a5de8f5d1\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-f2ksk" Mar 12 13:39:10.469807 ip-10-0-142-16 
kubenswrapper[2576]: I0312 13:39:10.469625 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad5bfa9a-7696-439c-943a-08df7f56f525-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 13:39:10.470959 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:10.470926 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad5bfa9a-7696-439c-943a-08df7f56f525-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 13:39:10.473081 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:10.473051 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/35520300-1852-426c-b98c-0d7a5de8f5d1-kube-state-metrics-tls\") pod \"kube-state-metrics-6df7999c47-f2ksk\" (UID: \"35520300-1852-426c-b98c-0d7a5de8f5d1\") " pod="openshift-monitoring/kube-state-metrics-6df7999c47-f2ksk" Mar 12 13:39:10.530804 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:10.530770 2576 generic.go:358] "Generic (PLEG): container finished" podID="0c2c8159-b212-44a8-bcfa-0dd001c47b97" containerID="f3b8f2694cd7b091efe54adf989e4af228343fe7bea3566670434084f44fba58" exitCode=0 Mar 12 13:39:10.530977 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:10.530903 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-22tl6" event={"ID":"0c2c8159-b212-44a8-bcfa-0dd001c47b97","Type":"ContainerDied","Data":"f3b8f2694cd7b091efe54adf989e4af228343fe7bea3566670434084f44fba58"} Mar 12 13:39:10.624025 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:10.623988 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-6df7999c47-f2ksk" Mar 12 13:39:10.720825 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:10.720738 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 12 13:39:10.994670 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:10.994632 2576 patch_prober.go:28] interesting pod/image-registry-658bf4bccb-bwzwj container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Mar 12 13:39:10.995148 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:10.994702 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj" podUID="327ec7eb-a5de-4709-9fbc-6bbe58d87674" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 12 13:39:10.999390 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:10.999362 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 12 13:39:11.010697 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:39:11.010669 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad5bfa9a_7696_439c_943a_08df7f56f525.slice/crio-ede5329b273c9c683b304dda6e70fb747d2dbabb23a49dfdd767b45dfafcf7cf WatchSource:0}: Error finding container ede5329b273c9c683b304dda6e70fb747d2dbabb23a49dfdd767b45dfafcf7cf: Status 404 returned error can't find the container with id ede5329b273c9c683b304dda6e70fb747d2dbabb23a49dfdd767b45dfafcf7cf Mar 12 13:39:11.020567 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:11.020419 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-6df7999c47-f2ksk"] 
Mar 12 13:39:11.025449 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:39:11.025293 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35520300_1852_426c_b98c_0d7a5de8f5d1.slice/crio-dabd4ef7001ae55489d8038d40cf3bc1cda5c24e39ab7d153243d3db20b29c81 WatchSource:0}: Error finding container dabd4ef7001ae55489d8038d40cf3bc1cda5c24e39ab7d153243d3db20b29c81: Status 404 returned error can't find the container with id dabd4ef7001ae55489d8038d40cf3bc1cda5c24e39ab7d153243d3db20b29c81 Mar 12 13:39:11.535338 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:11.535291 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-6df7999c47-f2ksk" event={"ID":"35520300-1852-426c-b98c-0d7a5de8f5d1","Type":"ContainerStarted","Data":"dabd4ef7001ae55489d8038d40cf3bc1cda5c24e39ab7d153243d3db20b29c81"} Mar 12 13:39:11.537306 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:11.537271 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-22tl6" event={"ID":"0c2c8159-b212-44a8-bcfa-0dd001c47b97","Type":"ContainerStarted","Data":"59b6eb78366cb128c1ef3c23005fb8cb580f6e4008b7e495f420930f051e2a8a"} Mar 12 13:39:11.537488 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:11.537311 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-22tl6" event={"ID":"0c2c8159-b212-44a8-bcfa-0dd001c47b97","Type":"ContainerStarted","Data":"0e1bc6b9fd8670cb1913148d7a9afe7c87380354b98d9c6ede9716407c976531"} Mar 12 13:39:11.538521 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:11.538484 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ad5bfa9a-7696-439c-943a-08df7f56f525","Type":"ContainerStarted","Data":"ede5329b273c9c683b304dda6e70fb747d2dbabb23a49dfdd767b45dfafcf7cf"} Mar 12 13:39:11.540744 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:11.540721 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-8bgkd" event={"ID":"8a686a49-082d-4561-b5c9-4bf4c4cc0943","Type":"ContainerStarted","Data":"79fb71be223581cc8ab85480ae39b6015c81965402d8bf8c766ed34c88b8f61c"} Mar 12 13:39:11.567917 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:11.567855 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-22tl6" podStartSLOduration=2.676259407 podStartE2EDuration="3.567838041s" podCreationTimestamp="2026-03-12 13:39:08 +0000 UTC" firstStartedPulling="2026-03-12 13:39:09.242364791 +0000 UTC m=+84.754298796" lastFinishedPulling="2026-03-12 13:39:10.133943426 +0000 UTC m=+85.645877430" observedRunningTime="2026-03-12 13:39:11.56501609 +0000 UTC m=+87.076950118" watchObservedRunningTime="2026-03-12 13:39:11.567838041 +0000 UTC m=+87.079772069" Mar 12 13:39:11.597187 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:11.597125 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-68b5d5d464-8bgkd" podStartSLOduration=2.112507233 podStartE2EDuration="3.597102798s" podCreationTimestamp="2026-03-12 13:39:08 +0000 UTC" firstStartedPulling="2026-03-12 13:39:09.391420988 +0000 UTC m=+84.903354996" lastFinishedPulling="2026-03-12 13:39:10.87601655 +0000 UTC m=+86.387950561" observedRunningTime="2026-03-12 13:39:11.596683417 +0000 UTC m=+87.108617445" watchObservedRunningTime="2026-03-12 13:39:11.597102798 +0000 UTC m=+87.109036826" Mar 12 13:39:11.863849 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:11.861829 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9"] Mar 12 13:39:11.865952 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:11.865913 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9" Mar 12 13:39:11.869287 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:11.869264 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-2mq8a94anlp25\"" Mar 12 13:39:11.869735 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:11.869713 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Mar 12 13:39:11.869838 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:11.869717 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Mar 12 13:39:11.870096 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:11.870076 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-gcfmd\"" Mar 12 13:39:11.870213 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:11.870084 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Mar 12 13:39:11.870422 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:11.870282 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Mar 12 13:39:11.870422 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:11.870331 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Mar 12 13:39:11.894616 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:11.894593 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9"] Mar 12 13:39:11.985125 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:11.985078 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/053e32cf-a4f5-4fbb-a8f3-df217ce3e68d-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7b6c584c7d-tmkl9\" (UID: \"053e32cf-a4f5-4fbb-a8f3-df217ce3e68d\") " pod="openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9" Mar 12 13:39:11.985354 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:11.985164 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/053e32cf-a4f5-4fbb-a8f3-df217ce3e68d-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7b6c584c7d-tmkl9\" (UID: \"053e32cf-a4f5-4fbb-a8f3-df217ce3e68d\") " pod="openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9" Mar 12 13:39:11.985354 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:11.985227 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/053e32cf-a4f5-4fbb-a8f3-df217ce3e68d-secret-thanos-querier-tls\") pod \"thanos-querier-7b6c584c7d-tmkl9\" (UID: \"053e32cf-a4f5-4fbb-a8f3-df217ce3e68d\") " pod="openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9" Mar 12 13:39:11.985354 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:11.985253 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/053e32cf-a4f5-4fbb-a8f3-df217ce3e68d-metrics-client-ca\") pod \"thanos-querier-7b6c584c7d-tmkl9\" (UID: \"053e32cf-a4f5-4fbb-a8f3-df217ce3e68d\") " pod="openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9" Mar 12 13:39:11.985354 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:11.985283 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd28m\" (UniqueName: 
\"kubernetes.io/projected/053e32cf-a4f5-4fbb-a8f3-df217ce3e68d-kube-api-access-dd28m\") pod \"thanos-querier-7b6c584c7d-tmkl9\" (UID: \"053e32cf-a4f5-4fbb-a8f3-df217ce3e68d\") " pod="openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9" Mar 12 13:39:11.985354 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:11.985311 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/053e32cf-a4f5-4fbb-a8f3-df217ce3e68d-secret-grpc-tls\") pod \"thanos-querier-7b6c584c7d-tmkl9\" (UID: \"053e32cf-a4f5-4fbb-a8f3-df217ce3e68d\") " pod="openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9" Mar 12 13:39:11.985664 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:11.985364 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/053e32cf-a4f5-4fbb-a8f3-df217ce3e68d-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7b6c584c7d-tmkl9\" (UID: \"053e32cf-a4f5-4fbb-a8f3-df217ce3e68d\") " pod="openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9" Mar 12 13:39:11.985664 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:11.985389 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/053e32cf-a4f5-4fbb-a8f3-df217ce3e68d-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7b6c584c7d-tmkl9\" (UID: \"053e32cf-a4f5-4fbb-a8f3-df217ce3e68d\") " pod="openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9" Mar 12 13:39:12.085958 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:12.085924 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: 
\"kubernetes.io/secret/053e32cf-a4f5-4fbb-a8f3-df217ce3e68d-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7b6c584c7d-tmkl9\" (UID: \"053e32cf-a4f5-4fbb-a8f3-df217ce3e68d\") " pod="openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9" Mar 12 13:39:12.086377 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:12.085981 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/053e32cf-a4f5-4fbb-a8f3-df217ce3e68d-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7b6c584c7d-tmkl9\" (UID: \"053e32cf-a4f5-4fbb-a8f3-df217ce3e68d\") " pod="openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9" Mar 12 13:39:12.086377 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:12.086032 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/053e32cf-a4f5-4fbb-a8f3-df217ce3e68d-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7b6c584c7d-tmkl9\" (UID: \"053e32cf-a4f5-4fbb-a8f3-df217ce3e68d\") " pod="openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9" Mar 12 13:39:12.086377 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:12.086074 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/053e32cf-a4f5-4fbb-a8f3-df217ce3e68d-secret-thanos-querier-tls\") pod \"thanos-querier-7b6c584c7d-tmkl9\" (UID: \"053e32cf-a4f5-4fbb-a8f3-df217ce3e68d\") " pod="openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9" Mar 12 13:39:12.086377 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:12.086092 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/053e32cf-a4f5-4fbb-a8f3-df217ce3e68d-metrics-client-ca\") pod \"thanos-querier-7b6c584c7d-tmkl9\" (UID: 
\"053e32cf-a4f5-4fbb-a8f3-df217ce3e68d\") " pod="openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9" Mar 12 13:39:12.086377 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:12.086110 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dd28m\" (UniqueName: \"kubernetes.io/projected/053e32cf-a4f5-4fbb-a8f3-df217ce3e68d-kube-api-access-dd28m\") pod \"thanos-querier-7b6c584c7d-tmkl9\" (UID: \"053e32cf-a4f5-4fbb-a8f3-df217ce3e68d\") " pod="openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9" Mar 12 13:39:12.086377 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:12.086129 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/053e32cf-a4f5-4fbb-a8f3-df217ce3e68d-secret-grpc-tls\") pod \"thanos-querier-7b6c584c7d-tmkl9\" (UID: \"053e32cf-a4f5-4fbb-a8f3-df217ce3e68d\") " pod="openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9" Mar 12 13:39:12.086377 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:12.086166 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/053e32cf-a4f5-4fbb-a8f3-df217ce3e68d-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7b6c584c7d-tmkl9\" (UID: \"053e32cf-a4f5-4fbb-a8f3-df217ce3e68d\") " pod="openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9" Mar 12 13:39:12.088632 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:12.088565 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/053e32cf-a4f5-4fbb-a8f3-df217ce3e68d-metrics-client-ca\") pod \"thanos-querier-7b6c584c7d-tmkl9\" (UID: \"053e32cf-a4f5-4fbb-a8f3-df217ce3e68d\") " pod="openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9" Mar 12 13:39:12.093070 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:12.093040 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/053e32cf-a4f5-4fbb-a8f3-df217ce3e68d-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7b6c584c7d-tmkl9\" (UID: \"053e32cf-a4f5-4fbb-a8f3-df217ce3e68d\") " pod="openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9" Mar 12 13:39:12.094011 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:12.093984 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/053e32cf-a4f5-4fbb-a8f3-df217ce3e68d-secret-grpc-tls\") pod \"thanos-querier-7b6c584c7d-tmkl9\" (UID: \"053e32cf-a4f5-4fbb-a8f3-df217ce3e68d\") " pod="openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9" Mar 12 13:39:12.096424 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:12.096377 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/053e32cf-a4f5-4fbb-a8f3-df217ce3e68d-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7b6c584c7d-tmkl9\" (UID: \"053e32cf-a4f5-4fbb-a8f3-df217ce3e68d\") " pod="openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9" Mar 12 13:39:12.099159 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:12.099112 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd28m\" (UniqueName: \"kubernetes.io/projected/053e32cf-a4f5-4fbb-a8f3-df217ce3e68d-kube-api-access-dd28m\") pod \"thanos-querier-7b6c584c7d-tmkl9\" (UID: \"053e32cf-a4f5-4fbb-a8f3-df217ce3e68d\") " pod="openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9" Mar 12 13:39:12.100086 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:12.100045 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/053e32cf-a4f5-4fbb-a8f3-df217ce3e68d-secret-thanos-querier-kube-rbac-proxy-web\") pod 
\"thanos-querier-7b6c584c7d-tmkl9\" (UID: \"053e32cf-a4f5-4fbb-a8f3-df217ce3e68d\") " pod="openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9" Mar 12 13:39:12.100442 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:12.100423 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/053e32cf-a4f5-4fbb-a8f3-df217ce3e68d-secret-thanos-querier-tls\") pod \"thanos-querier-7b6c584c7d-tmkl9\" (UID: \"053e32cf-a4f5-4fbb-a8f3-df217ce3e68d\") " pod="openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9" Mar 12 13:39:12.102019 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:12.101981 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/053e32cf-a4f5-4fbb-a8f3-df217ce3e68d-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7b6c584c7d-tmkl9\" (UID: \"053e32cf-a4f5-4fbb-a8f3-df217ce3e68d\") " pod="openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9" Mar 12 13:39:12.180885 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:12.180831 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9" Mar 12 13:39:13.018869 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.018836 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9"] Mar 12 13:39:13.049974 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:39:13.049943 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod053e32cf_a4f5_4fbb_a8f3_df217ce3e68d.slice/crio-709adba67fbc114f01e30f9d0a19fbf68e8f2bd2a718cc5151afa90a3d845ade WatchSource:0}: Error finding container 709adba67fbc114f01e30f9d0a19fbf68e8f2bd2a718cc5151afa90a3d845ade: Status 404 returned error can't find the container with id 709adba67fbc114f01e30f9d0a19fbf68e8f2bd2a718cc5151afa90a3d845ade Mar 12 13:39:13.244124 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.244090 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-649779b44-9xxmf"] Mar 12 13:39:13.248238 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.248215 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-649779b44-9xxmf" Mar 12 13:39:13.251319 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.251135 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Mar 12 13:39:13.251319 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.251188 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Mar 12 13:39:13.251907 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.251686 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-gmwg5\"" Mar 12 13:39:13.251907 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.251764 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Mar 12 13:39:13.251907 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.251812 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Mar 12 13:39:13.252290 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.252267 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dbas421macbvv\"" Mar 12 13:39:13.256750 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.256726 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-649779b44-9xxmf"] Mar 12 13:39:13.298349 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.298306 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e434420d-6db4-40f3-8792-88d3b03b0e31-client-ca-bundle\") pod \"metrics-server-649779b44-9xxmf\" (UID: \"e434420d-6db4-40f3-8792-88d3b03b0e31\") " 
pod="openshift-monitoring/metrics-server-649779b44-9xxmf" Mar 12 13:39:13.298546 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.298370 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/e434420d-6db4-40f3-8792-88d3b03b0e31-secret-metrics-server-tls\") pod \"metrics-server-649779b44-9xxmf\" (UID: \"e434420d-6db4-40f3-8792-88d3b03b0e31\") " pod="openshift-monitoring/metrics-server-649779b44-9xxmf" Mar 12 13:39:13.298546 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.298405 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/e434420d-6db4-40f3-8792-88d3b03b0e31-secret-metrics-server-client-certs\") pod \"metrics-server-649779b44-9xxmf\" (UID: \"e434420d-6db4-40f3-8792-88d3b03b0e31\") " pod="openshift-monitoring/metrics-server-649779b44-9xxmf" Mar 12 13:39:13.298546 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.298485 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l428q\" (UniqueName: \"kubernetes.io/projected/e434420d-6db4-40f3-8792-88d3b03b0e31-kube-api-access-l428q\") pod \"metrics-server-649779b44-9xxmf\" (UID: \"e434420d-6db4-40f3-8792-88d3b03b0e31\") " pod="openshift-monitoring/metrics-server-649779b44-9xxmf" Mar 12 13:39:13.298692 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.298569 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/e434420d-6db4-40f3-8792-88d3b03b0e31-metrics-server-audit-profiles\") pod \"metrics-server-649779b44-9xxmf\" (UID: \"e434420d-6db4-40f3-8792-88d3b03b0e31\") " pod="openshift-monitoring/metrics-server-649779b44-9xxmf" Mar 12 13:39:13.298692 ip-10-0-142-16 kubenswrapper[2576]: I0312 
13:39:13.298615 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e434420d-6db4-40f3-8792-88d3b03b0e31-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-649779b44-9xxmf\" (UID: \"e434420d-6db4-40f3-8792-88d3b03b0e31\") " pod="openshift-monitoring/metrics-server-649779b44-9xxmf" Mar 12 13:39:13.298692 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.298637 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/e434420d-6db4-40f3-8792-88d3b03b0e31-audit-log\") pod \"metrics-server-649779b44-9xxmf\" (UID: \"e434420d-6db4-40f3-8792-88d3b03b0e31\") " pod="openshift-monitoring/metrics-server-649779b44-9xxmf" Mar 12 13:39:13.399506 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.399444 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e434420d-6db4-40f3-8792-88d3b03b0e31-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-649779b44-9xxmf\" (UID: \"e434420d-6db4-40f3-8792-88d3b03b0e31\") " pod="openshift-monitoring/metrics-server-649779b44-9xxmf" Mar 12 13:39:13.399693 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.399528 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/e434420d-6db4-40f3-8792-88d3b03b0e31-audit-log\") pod \"metrics-server-649779b44-9xxmf\" (UID: \"e434420d-6db4-40f3-8792-88d3b03b0e31\") " pod="openshift-monitoring/metrics-server-649779b44-9xxmf" Mar 12 13:39:13.399693 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.399589 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e434420d-6db4-40f3-8792-88d3b03b0e31-client-ca-bundle\") pod \"metrics-server-649779b44-9xxmf\" (UID: \"e434420d-6db4-40f3-8792-88d3b03b0e31\") " pod="openshift-monitoring/metrics-server-649779b44-9xxmf" Mar 12 13:39:13.399693 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.399628 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/e434420d-6db4-40f3-8792-88d3b03b0e31-secret-metrics-server-tls\") pod \"metrics-server-649779b44-9xxmf\" (UID: \"e434420d-6db4-40f3-8792-88d3b03b0e31\") " pod="openshift-monitoring/metrics-server-649779b44-9xxmf" Mar 12 13:39:13.399693 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.399675 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/e434420d-6db4-40f3-8792-88d3b03b0e31-secret-metrics-server-client-certs\") pod \"metrics-server-649779b44-9xxmf\" (UID: \"e434420d-6db4-40f3-8792-88d3b03b0e31\") " pod="openshift-monitoring/metrics-server-649779b44-9xxmf" Mar 12 13:39:13.399921 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.399726 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l428q\" (UniqueName: \"kubernetes.io/projected/e434420d-6db4-40f3-8792-88d3b03b0e31-kube-api-access-l428q\") pod \"metrics-server-649779b44-9xxmf\" (UID: \"e434420d-6db4-40f3-8792-88d3b03b0e31\") " pod="openshift-monitoring/metrics-server-649779b44-9xxmf" Mar 12 13:39:13.399921 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.399763 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/e434420d-6db4-40f3-8792-88d3b03b0e31-metrics-server-audit-profiles\") pod \"metrics-server-649779b44-9xxmf\" (UID: \"e434420d-6db4-40f3-8792-88d3b03b0e31\") " 
pod="openshift-monitoring/metrics-server-649779b44-9xxmf" Mar 12 13:39:13.400768 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.400317 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e434420d-6db4-40f3-8792-88d3b03b0e31-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-649779b44-9xxmf\" (UID: \"e434420d-6db4-40f3-8792-88d3b03b0e31\") " pod="openshift-monitoring/metrics-server-649779b44-9xxmf" Mar 12 13:39:13.400768 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.400396 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/e434420d-6db4-40f3-8792-88d3b03b0e31-audit-log\") pod \"metrics-server-649779b44-9xxmf\" (UID: \"e434420d-6db4-40f3-8792-88d3b03b0e31\") " pod="openshift-monitoring/metrics-server-649779b44-9xxmf" Mar 12 13:39:13.400945 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.400807 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/e434420d-6db4-40f3-8792-88d3b03b0e31-metrics-server-audit-profiles\") pod \"metrics-server-649779b44-9xxmf\" (UID: \"e434420d-6db4-40f3-8792-88d3b03b0e31\") " pod="openshift-monitoring/metrics-server-649779b44-9xxmf" Mar 12 13:39:13.403192 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.403169 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/e434420d-6db4-40f3-8792-88d3b03b0e31-secret-metrics-server-tls\") pod \"metrics-server-649779b44-9xxmf\" (UID: \"e434420d-6db4-40f3-8792-88d3b03b0e31\") " pod="openshift-monitoring/metrics-server-649779b44-9xxmf" Mar 12 13:39:13.403444 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.403427 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" 
(UniqueName: \"kubernetes.io/secret/e434420d-6db4-40f3-8792-88d3b03b0e31-secret-metrics-server-client-certs\") pod \"metrics-server-649779b44-9xxmf\" (UID: \"e434420d-6db4-40f3-8792-88d3b03b0e31\") " pod="openshift-monitoring/metrics-server-649779b44-9xxmf" Mar 12 13:39:13.403536 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.403440 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e434420d-6db4-40f3-8792-88d3b03b0e31-client-ca-bundle\") pod \"metrics-server-649779b44-9xxmf\" (UID: \"e434420d-6db4-40f3-8792-88d3b03b0e31\") " pod="openshift-monitoring/metrics-server-649779b44-9xxmf" Mar 12 13:39:13.411037 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.411013 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l428q\" (UniqueName: \"kubernetes.io/projected/e434420d-6db4-40f3-8792-88d3b03b0e31-kube-api-access-l428q\") pod \"metrics-server-649779b44-9xxmf\" (UID: \"e434420d-6db4-40f3-8792-88d3b03b0e31\") " pod="openshift-monitoring/metrics-server-649779b44-9xxmf" Mar 12 13:39:13.549471 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.549369 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-6d47bdb78d-jvgbr"] Mar 12 13:39:13.552148 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.552123 2576 generic.go:358] "Generic (PLEG): container finished" podID="ad5bfa9a-7696-439c-943a-08df7f56f525" containerID="b83fbecdc061138b496862fbbddaf5e95863f8a7ef72ec3ca33eb8339942af1c" exitCode=0 Mar 12 13:39:13.554216 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.554195 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9" event={"ID":"053e32cf-a4f5-4fbb-a8f3-df217ce3e68d","Type":"ContainerStarted","Data":"709adba67fbc114f01e30f9d0a19fbf68e8f2bd2a718cc5151afa90a3d845ade"} Mar 12 13:39:13.554320 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.554219 
2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ad5bfa9a-7696-439c-943a-08df7f56f525","Type":"ContainerDied","Data":"b83fbecdc061138b496862fbbddaf5e95863f8a7ef72ec3ca33eb8339942af1c"} Mar 12 13:39:13.554320 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.554233 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-6df7999c47-f2ksk" event={"ID":"35520300-1852-426c-b98c-0d7a5de8f5d1","Type":"ContainerStarted","Data":"e7e294c341d241b23dcff9c133c33c620ebca42071e30e28c125a968eedce59d"} Mar 12 13:39:13.554320 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.554247 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-6df7999c47-f2ksk" event={"ID":"35520300-1852-426c-b98c-0d7a5de8f5d1","Type":"ContainerStarted","Data":"46e2f0d9e619ac0fb9e16b18755e75a9a3b7aa4f19b059aa8bbfc4a42d527e10"} Mar 12 13:39:13.554515 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.554499 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-jvgbr" Mar 12 13:39:13.557826 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.557571 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-fsccl\"" Mar 12 13:39:13.557826 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.557571 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Mar 12 13:39:13.562312 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.561904 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-649779b44-9xxmf" Mar 12 13:39:13.564446 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.564408 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-6d47bdb78d-jvgbr"] Mar 12 13:39:13.601799 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.601742 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/56935d74-a7ce-4152-8c4e-752ce08b5343-monitoring-plugin-cert\") pod \"monitoring-plugin-6d47bdb78d-jvgbr\" (UID: \"56935d74-a7ce-4152-8c4e-752ce08b5343\") " pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-jvgbr" Mar 12 13:39:13.628659 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.628623 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-78566ffc85-qq6b4"] Mar 12 13:39:13.632665 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.632640 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-78566ffc85-qq6b4" Mar 12 13:39:13.638569 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.637453 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Mar 12 13:39:13.638569 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.637453 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Mar 12 13:39:13.638569 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.637557 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Mar 12 13:39:13.638569 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.637555 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Mar 12 13:39:13.638569 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.637693 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-mksw2\"" Mar 12 13:39:13.638569 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.638010 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Mar 12 13:39:13.649302 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.649262 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-78566ffc85-qq6b4"] Mar 12 13:39:13.703363 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.703326 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbl27\" (UniqueName: \"kubernetes.io/projected/437239e7-0e09-41f9-8d00-4d1271f0db28-kube-api-access-tbl27\") pod \"console-78566ffc85-qq6b4\" (UID: \"437239e7-0e09-41f9-8d00-4d1271f0db28\") " pod="openshift-console/console-78566ffc85-qq6b4" Mar 12 13:39:13.703588 ip-10-0-142-16 kubenswrapper[2576]: 
I0312 13:39:13.703386 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/437239e7-0e09-41f9-8d00-4d1271f0db28-console-oauth-config\") pod \"console-78566ffc85-qq6b4\" (UID: \"437239e7-0e09-41f9-8d00-4d1271f0db28\") " pod="openshift-console/console-78566ffc85-qq6b4" Mar 12 13:39:13.703588 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.703452 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/437239e7-0e09-41f9-8d00-4d1271f0db28-service-ca\") pod \"console-78566ffc85-qq6b4\" (UID: \"437239e7-0e09-41f9-8d00-4d1271f0db28\") " pod="openshift-console/console-78566ffc85-qq6b4" Mar 12 13:39:13.703713 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.703594 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/56935d74-a7ce-4152-8c4e-752ce08b5343-monitoring-plugin-cert\") pod \"monitoring-plugin-6d47bdb78d-jvgbr\" (UID: \"56935d74-a7ce-4152-8c4e-752ce08b5343\") " pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-jvgbr" Mar 12 13:39:13.703713 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.703633 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/437239e7-0e09-41f9-8d00-4d1271f0db28-console-config\") pod \"console-78566ffc85-qq6b4\" (UID: \"437239e7-0e09-41f9-8d00-4d1271f0db28\") " pod="openshift-console/console-78566ffc85-qq6b4" Mar 12 13:39:13.703713 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.703673 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/437239e7-0e09-41f9-8d00-4d1271f0db28-console-serving-cert\") pod \"console-78566ffc85-qq6b4\" 
(UID: \"437239e7-0e09-41f9-8d00-4d1271f0db28\") " pod="openshift-console/console-78566ffc85-qq6b4" Mar 12 13:39:13.703833 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.703744 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/437239e7-0e09-41f9-8d00-4d1271f0db28-oauth-serving-cert\") pod \"console-78566ffc85-qq6b4\" (UID: \"437239e7-0e09-41f9-8d00-4d1271f0db28\") " pod="openshift-console/console-78566ffc85-qq6b4" Mar 12 13:39:13.706710 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.706683 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/56935d74-a7ce-4152-8c4e-752ce08b5343-monitoring-plugin-cert\") pod \"monitoring-plugin-6d47bdb78d-jvgbr\" (UID: \"56935d74-a7ce-4152-8c4e-752ce08b5343\") " pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-jvgbr" Mar 12 13:39:13.805307 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.805221 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/437239e7-0e09-41f9-8d00-4d1271f0db28-console-config\") pod \"console-78566ffc85-qq6b4\" (UID: \"437239e7-0e09-41f9-8d00-4d1271f0db28\") " pod="openshift-console/console-78566ffc85-qq6b4" Mar 12 13:39:13.805307 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.805273 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/437239e7-0e09-41f9-8d00-4d1271f0db28-console-serving-cert\") pod \"console-78566ffc85-qq6b4\" (UID: \"437239e7-0e09-41f9-8d00-4d1271f0db28\") " pod="openshift-console/console-78566ffc85-qq6b4" Mar 12 13:39:13.805576 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.805332 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/437239e7-0e09-41f9-8d00-4d1271f0db28-oauth-serving-cert\") pod \"console-78566ffc85-qq6b4\" (UID: \"437239e7-0e09-41f9-8d00-4d1271f0db28\") " pod="openshift-console/console-78566ffc85-qq6b4" Mar 12 13:39:13.805576 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.805375 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tbl27\" (UniqueName: \"kubernetes.io/projected/437239e7-0e09-41f9-8d00-4d1271f0db28-kube-api-access-tbl27\") pod \"console-78566ffc85-qq6b4\" (UID: \"437239e7-0e09-41f9-8d00-4d1271f0db28\") " pod="openshift-console/console-78566ffc85-qq6b4" Mar 12 13:39:13.805576 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.805421 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/437239e7-0e09-41f9-8d00-4d1271f0db28-console-oauth-config\") pod \"console-78566ffc85-qq6b4\" (UID: \"437239e7-0e09-41f9-8d00-4d1271f0db28\") " pod="openshift-console/console-78566ffc85-qq6b4" Mar 12 13:39:13.805576 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.805445 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/437239e7-0e09-41f9-8d00-4d1271f0db28-service-ca\") pod \"console-78566ffc85-qq6b4\" (UID: \"437239e7-0e09-41f9-8d00-4d1271f0db28\") " pod="openshift-console/console-78566ffc85-qq6b4" Mar 12 13:39:13.806050 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.806002 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/437239e7-0e09-41f9-8d00-4d1271f0db28-console-config\") pod \"console-78566ffc85-qq6b4\" (UID: \"437239e7-0e09-41f9-8d00-4d1271f0db28\") " pod="openshift-console/console-78566ffc85-qq6b4" Mar 12 13:39:13.806512 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.806254 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/437239e7-0e09-41f9-8d00-4d1271f0db28-service-ca\") pod \"console-78566ffc85-qq6b4\" (UID: \"437239e7-0e09-41f9-8d00-4d1271f0db28\") " pod="openshift-console/console-78566ffc85-qq6b4" Mar 12 13:39:13.806813 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.806792 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/437239e7-0e09-41f9-8d00-4d1271f0db28-oauth-serving-cert\") pod \"console-78566ffc85-qq6b4\" (UID: \"437239e7-0e09-41f9-8d00-4d1271f0db28\") " pod="openshift-console/console-78566ffc85-qq6b4" Mar 12 13:39:13.808408 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.808383 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/437239e7-0e09-41f9-8d00-4d1271f0db28-console-oauth-config\") pod \"console-78566ffc85-qq6b4\" (UID: \"437239e7-0e09-41f9-8d00-4d1271f0db28\") " pod="openshift-console/console-78566ffc85-qq6b4" Mar 12 13:39:13.808587 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.808552 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/437239e7-0e09-41f9-8d00-4d1271f0db28-console-serving-cert\") pod \"console-78566ffc85-qq6b4\" (UID: \"437239e7-0e09-41f9-8d00-4d1271f0db28\") " pod="openshift-console/console-78566ffc85-qq6b4" Mar 12 13:39:13.814231 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.814209 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbl27\" (UniqueName: \"kubernetes.io/projected/437239e7-0e09-41f9-8d00-4d1271f0db28-kube-api-access-tbl27\") pod \"console-78566ffc85-qq6b4\" (UID: \"437239e7-0e09-41f9-8d00-4d1271f0db28\") " pod="openshift-console/console-78566ffc85-qq6b4" Mar 12 13:39:13.869446 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.869407 2576 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-jvgbr" Mar 12 13:39:13.947703 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:13.947667 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-78566ffc85-qq6b4" Mar 12 13:39:14.161226 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:14.161188 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-5d66bcc7f4-l7sjt"] Mar 12 13:39:14.166489 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:14.166424 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5d66bcc7f4-l7sjt" Mar 12 13:39:14.169319 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:14.169142 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Mar 12 13:39:14.169319 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:14.169148 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Mar 12 13:39:14.169769 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:14.169750 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Mar 12 13:39:14.169870 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:14.169754 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Mar 12 13:39:14.170514 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:14.170494 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-j2wdc\"" Mar 12 13:39:14.170619 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:14.170501 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Mar 12 13:39:14.180741 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:14.180718 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Mar 12 13:39:14.186057 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:14.186036 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5d66bcc7f4-l7sjt"] Mar 12 13:39:14.209274 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:14.209245 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8ce11cd6-c7dc-489b-8f06-caf49da9ae97-metrics-client-ca\") pod \"telemeter-client-5d66bcc7f4-l7sjt\" (UID: \"8ce11cd6-c7dc-489b-8f06-caf49da9ae97\") " pod="openshift-monitoring/telemeter-client-5d66bcc7f4-l7sjt" Mar 12 13:39:14.209417 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:14.209292 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/8ce11cd6-c7dc-489b-8f06-caf49da9ae97-secret-telemeter-client\") pod \"telemeter-client-5d66bcc7f4-l7sjt\" (UID: \"8ce11cd6-c7dc-489b-8f06-caf49da9ae97\") " pod="openshift-monitoring/telemeter-client-5d66bcc7f4-l7sjt" Mar 12 13:39:14.209509 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:14.209487 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8ce11cd6-c7dc-489b-8f06-caf49da9ae97-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5d66bcc7f4-l7sjt\" (UID: \"8ce11cd6-c7dc-489b-8f06-caf49da9ae97\") " pod="openshift-monitoring/telemeter-client-5d66bcc7f4-l7sjt" Mar 12 13:39:14.209565 ip-10-0-142-16 
kubenswrapper[2576]: I0312 13:39:14.209530 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gllx\" (UniqueName: \"kubernetes.io/projected/8ce11cd6-c7dc-489b-8f06-caf49da9ae97-kube-api-access-7gllx\") pod \"telemeter-client-5d66bcc7f4-l7sjt\" (UID: \"8ce11cd6-c7dc-489b-8f06-caf49da9ae97\") " pod="openshift-monitoring/telemeter-client-5d66bcc7f4-l7sjt" Mar 12 13:39:14.209615 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:14.209567 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/8ce11cd6-c7dc-489b-8f06-caf49da9ae97-telemeter-client-tls\") pod \"telemeter-client-5d66bcc7f4-l7sjt\" (UID: \"8ce11cd6-c7dc-489b-8f06-caf49da9ae97\") " pod="openshift-monitoring/telemeter-client-5d66bcc7f4-l7sjt" Mar 12 13:39:14.209615 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:14.209605 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ce11cd6-c7dc-489b-8f06-caf49da9ae97-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5d66bcc7f4-l7sjt\" (UID: \"8ce11cd6-c7dc-489b-8f06-caf49da9ae97\") " pod="openshift-monitoring/telemeter-client-5d66bcc7f4-l7sjt" Mar 12 13:39:14.209714 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:14.209638 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ce11cd6-c7dc-489b-8f06-caf49da9ae97-serving-certs-ca-bundle\") pod \"telemeter-client-5d66bcc7f4-l7sjt\" (UID: \"8ce11cd6-c7dc-489b-8f06-caf49da9ae97\") " pod="openshift-monitoring/telemeter-client-5d66bcc7f4-l7sjt" Mar 12 13:39:14.209714 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:14.209663 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/8ce11cd6-c7dc-489b-8f06-caf49da9ae97-federate-client-tls\") pod \"telemeter-client-5d66bcc7f4-l7sjt\" (UID: \"8ce11cd6-c7dc-489b-8f06-caf49da9ae97\") " pod="openshift-monitoring/telemeter-client-5d66bcc7f4-l7sjt" Mar 12 13:39:14.310361 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:14.310323 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8ce11cd6-c7dc-489b-8f06-caf49da9ae97-metrics-client-ca\") pod \"telemeter-client-5d66bcc7f4-l7sjt\" (UID: \"8ce11cd6-c7dc-489b-8f06-caf49da9ae97\") " pod="openshift-monitoring/telemeter-client-5d66bcc7f4-l7sjt" Mar 12 13:39:14.310808 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:14.310375 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/8ce11cd6-c7dc-489b-8f06-caf49da9ae97-secret-telemeter-client\") pod \"telemeter-client-5d66bcc7f4-l7sjt\" (UID: \"8ce11cd6-c7dc-489b-8f06-caf49da9ae97\") " pod="openshift-monitoring/telemeter-client-5d66bcc7f4-l7sjt" Mar 12 13:39:14.310808 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:14.310478 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8ce11cd6-c7dc-489b-8f06-caf49da9ae97-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5d66bcc7f4-l7sjt\" (UID: \"8ce11cd6-c7dc-489b-8f06-caf49da9ae97\") " pod="openshift-monitoring/telemeter-client-5d66bcc7f4-l7sjt" Mar 12 13:39:14.310808 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:14.310507 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7gllx\" (UniqueName: \"kubernetes.io/projected/8ce11cd6-c7dc-489b-8f06-caf49da9ae97-kube-api-access-7gllx\") pod \"telemeter-client-5d66bcc7f4-l7sjt\" (UID: 
\"8ce11cd6-c7dc-489b-8f06-caf49da9ae97\") " pod="openshift-monitoring/telemeter-client-5d66bcc7f4-l7sjt" Mar 12 13:39:14.310808 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:14.310539 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/8ce11cd6-c7dc-489b-8f06-caf49da9ae97-telemeter-client-tls\") pod \"telemeter-client-5d66bcc7f4-l7sjt\" (UID: \"8ce11cd6-c7dc-489b-8f06-caf49da9ae97\") " pod="openshift-monitoring/telemeter-client-5d66bcc7f4-l7sjt" Mar 12 13:39:14.310808 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:14.310565 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ce11cd6-c7dc-489b-8f06-caf49da9ae97-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5d66bcc7f4-l7sjt\" (UID: \"8ce11cd6-c7dc-489b-8f06-caf49da9ae97\") " pod="openshift-monitoring/telemeter-client-5d66bcc7f4-l7sjt" Mar 12 13:39:14.310808 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:14.310585 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ce11cd6-c7dc-489b-8f06-caf49da9ae97-serving-certs-ca-bundle\") pod \"telemeter-client-5d66bcc7f4-l7sjt\" (UID: \"8ce11cd6-c7dc-489b-8f06-caf49da9ae97\") " pod="openshift-monitoring/telemeter-client-5d66bcc7f4-l7sjt" Mar 12 13:39:14.310808 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:14.310608 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/8ce11cd6-c7dc-489b-8f06-caf49da9ae97-federate-client-tls\") pod \"telemeter-client-5d66bcc7f4-l7sjt\" (UID: \"8ce11cd6-c7dc-489b-8f06-caf49da9ae97\") " pod="openshift-monitoring/telemeter-client-5d66bcc7f4-l7sjt" Mar 12 13:39:14.311414 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:14.311384 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8ce11cd6-c7dc-489b-8f06-caf49da9ae97-metrics-client-ca\") pod \"telemeter-client-5d66bcc7f4-l7sjt\" (UID: \"8ce11cd6-c7dc-489b-8f06-caf49da9ae97\") " pod="openshift-monitoring/telemeter-client-5d66bcc7f4-l7sjt" Mar 12 13:39:14.311561 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:14.311420 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ce11cd6-c7dc-489b-8f06-caf49da9ae97-serving-certs-ca-bundle\") pod \"telemeter-client-5d66bcc7f4-l7sjt\" (UID: \"8ce11cd6-c7dc-489b-8f06-caf49da9ae97\") " pod="openshift-monitoring/telemeter-client-5d66bcc7f4-l7sjt" Mar 12 13:39:14.312038 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:14.311992 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ce11cd6-c7dc-489b-8f06-caf49da9ae97-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5d66bcc7f4-l7sjt\" (UID: \"8ce11cd6-c7dc-489b-8f06-caf49da9ae97\") " pod="openshift-monitoring/telemeter-client-5d66bcc7f4-l7sjt" Mar 12 13:39:14.314729 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:14.314710 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/8ce11cd6-c7dc-489b-8f06-caf49da9ae97-secret-telemeter-client\") pod \"telemeter-client-5d66bcc7f4-l7sjt\" (UID: \"8ce11cd6-c7dc-489b-8f06-caf49da9ae97\") " pod="openshift-monitoring/telemeter-client-5d66bcc7f4-l7sjt" Mar 12 13:39:14.314812 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:14.314733 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/8ce11cd6-c7dc-489b-8f06-caf49da9ae97-telemeter-client-tls\") pod \"telemeter-client-5d66bcc7f4-l7sjt\" (UID: \"8ce11cd6-c7dc-489b-8f06-caf49da9ae97\") " 
pod="openshift-monitoring/telemeter-client-5d66bcc7f4-l7sjt" Mar 12 13:39:14.314946 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:14.314918 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8ce11cd6-c7dc-489b-8f06-caf49da9ae97-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5d66bcc7f4-l7sjt\" (UID: \"8ce11cd6-c7dc-489b-8f06-caf49da9ae97\") " pod="openshift-monitoring/telemeter-client-5d66bcc7f4-l7sjt" Mar 12 13:39:14.316248 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:14.316228 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/8ce11cd6-c7dc-489b-8f06-caf49da9ae97-federate-client-tls\") pod \"telemeter-client-5d66bcc7f4-l7sjt\" (UID: \"8ce11cd6-c7dc-489b-8f06-caf49da9ae97\") " pod="openshift-monitoring/telemeter-client-5d66bcc7f4-l7sjt" Mar 12 13:39:14.320044 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:14.320021 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gllx\" (UniqueName: \"kubernetes.io/projected/8ce11cd6-c7dc-489b-8f06-caf49da9ae97-kube-api-access-7gllx\") pod \"telemeter-client-5d66bcc7f4-l7sjt\" (UID: \"8ce11cd6-c7dc-489b-8f06-caf49da9ae97\") " pod="openshift-monitoring/telemeter-client-5d66bcc7f4-l7sjt" Mar 12 13:39:14.445765 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:14.445677 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj" Mar 12 13:39:14.479510 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:14.479450 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-5d66bcc7f4-l7sjt" Mar 12 13:39:15.077004 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.075871 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 12 13:39:15.087948 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.087681 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.095552 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.091623 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Mar 12 13:39:15.095552 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.091808 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 12 13:39:15.095552 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.091897 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Mar 12 13:39:15.095552 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.091993 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Mar 12 13:39:15.095552 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.092153 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Mar 12 13:39:15.095552 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.092363 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Mar 12 13:39:15.095552 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.092662 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Mar 12 13:39:15.095552 ip-10-0-142-16 kubenswrapper[2576]: I0312 
13:39:15.092782 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Mar 12 13:39:15.095552 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.093582 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-gmcz9\"" Mar 12 13:39:15.095552 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.094708 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Mar 12 13:39:15.095552 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.095095 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Mar 12 13:39:15.096079 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.095898 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Mar 12 13:39:15.096135 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.096103 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-3fq79u52touef\"" Mar 12 13:39:15.098089 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.098065 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Mar 12 13:39:15.099598 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.099556 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Mar 12 13:39:15.220026 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.219989 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/507c80a4-ea62-42d8-a41a-850073f8db0c-config-out\") pod 
\"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.220192 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.220042 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkps5\" (UniqueName: \"kubernetes.io/projected/507c80a4-ea62-42d8-a41a-850073f8db0c-kube-api-access-nkps5\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.220192 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.220103 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-config\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.220192 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.220150 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/507c80a4-ea62-42d8-a41a-850073f8db0c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.220367 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.220239 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.220367 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.220336 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.220510 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.220367 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.220510 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.220387 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/507c80a4-ea62-42d8-a41a-850073f8db0c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.220510 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.220405 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.220510 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.220438 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-web-config\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.220510 ip-10-0-142-16 kubenswrapper[2576]: I0312 
13:39:15.220488 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/507c80a4-ea62-42d8-a41a-850073f8db0c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.220737 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.220520 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/507c80a4-ea62-42d8-a41a-850073f8db0c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.220737 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.220554 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/507c80a4-ea62-42d8-a41a-850073f8db0c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.220737 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.220661 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/507c80a4-ea62-42d8-a41a-850073f8db0c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.220737 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.220685 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.220737 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.220709 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.220949 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.220745 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.220949 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.220774 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/507c80a4-ea62-42d8-a41a-850073f8db0c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.321404 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.321362 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/507c80a4-ea62-42d8-a41a-850073f8db0c-config-out\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.321859 ip-10-0-142-16 
kubenswrapper[2576]: I0312 13:39:15.321415 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nkps5\" (UniqueName: \"kubernetes.io/projected/507c80a4-ea62-42d8-a41a-850073f8db0c-kube-api-access-nkps5\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.321859 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.321482 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-config\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.321859 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.321502 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/507c80a4-ea62-42d8-a41a-850073f8db0c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.321859 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.321833 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.322088 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.321906 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 
13:39:15.322493 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.322344 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.322493 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.322384 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/507c80a4-ea62-42d8-a41a-850073f8db0c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.322493 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.322411 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.322493 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.322445 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-web-config\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.324849 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.323283 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/507c80a4-ea62-42d8-a41a-850073f8db0c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.324849 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.323320 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/507c80a4-ea62-42d8-a41a-850073f8db0c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.324849 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.323355 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/507c80a4-ea62-42d8-a41a-850073f8db0c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.324849 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.323406 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/507c80a4-ea62-42d8-a41a-850073f8db0c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.324849 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.323433 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.324849 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.323483 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.324849 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.323534 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.324849 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.323564 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/507c80a4-ea62-42d8-a41a-850073f8db0c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.324849 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.324538 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/507c80a4-ea62-42d8-a41a-850073f8db0c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.324849 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.324559 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/507c80a4-ea62-42d8-a41a-850073f8db0c-config-out\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.329801 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.326123 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/507c80a4-ea62-42d8-a41a-850073f8db0c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.329801 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.326655 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/507c80a4-ea62-42d8-a41a-850073f8db0c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.329801 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.326693 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/507c80a4-ea62-42d8-a41a-850073f8db0c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.329801 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.327273 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/507c80a4-ea62-42d8-a41a-850073f8db0c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.329801 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.327514 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/507c80a4-ea62-42d8-a41a-850073f8db0c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:39:15.329801 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.327549 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 13:39:15.329801 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.328072 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/507c80a4-ea62-42d8-a41a-850073f8db0c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 13:39:15.329801 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.328780 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 13:39:15.330589 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.330564 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-config\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 13:39:15.330825 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.330800 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 13:39:15.331115 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.331055 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 13:39:15.331636 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.331597 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 13:39:15.332863 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.332822 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-web-config\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 13:39:15.334018 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.333996 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 13:39:15.334178 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.334152 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 13:39:15.340065 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.340045 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkps5\" (UniqueName: \"kubernetes.io/projected/507c80a4-ea62-42d8-a41a-850073f8db0c-kube-api-access-nkps5\") pod \"prometheus-k8s-0\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 13:39:15.407036 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:15.406998 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 13:39:18.811305 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:18.811271 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-658bf4bccb-bwzwj"]
Mar 12 13:39:20.553754 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:20.553722 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5d66bcc7f4-l7sjt"]
Mar 12 13:39:20.555554 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:39:20.555487 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ce11cd6_c7dc_489b_8f06_caf49da9ae97.slice/crio-abbe946576d81cf8c5946e8fb59d82e6b7238bd6406ef7031e2992a83fc696bd WatchSource:0}: Error finding container abbe946576d81cf8c5946e8fb59d82e6b7238bd6406ef7031e2992a83fc696bd: Status 404 returned error can't find the container with id abbe946576d81cf8c5946e8fb59d82e6b7238bd6406ef7031e2992a83fc696bd
Mar 12 13:39:20.581854 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:20.581816 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5d66bcc7f4-l7sjt" event={"ID":"8ce11cd6-c7dc-489b-8f06-caf49da9ae97","Type":"ContainerStarted","Data":"abbe946576d81cf8c5946e8fb59d82e6b7238bd6406ef7031e2992a83fc696bd"}
Mar 12 13:39:20.583967 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:20.583876 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-6df7999c47-f2ksk" event={"ID":"35520300-1852-426c-b98c-0d7a5de8f5d1","Type":"ContainerStarted","Data":"9731e9c408ab2017dc23464a6addb2b909a75367de1ca0231f2436bb8b3d9a6f"}
Mar 12 13:39:20.605866 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:20.605807 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-6df7999c47-f2ksk" podStartSLOduration=10.77406329 podStartE2EDuration="12.605790943s" podCreationTimestamp="2026-03-12 13:39:08 +0000 UTC" firstStartedPulling="2026-03-12 13:39:11.028340719 +0000 UTC m=+86.540274728" lastFinishedPulling="2026-03-12 13:39:12.860068375 +0000 UTC m=+88.372002381" observedRunningTime="2026-03-12 13:39:20.604953804 +0000 UTC m=+96.116887832" watchObservedRunningTime="2026-03-12 13:39:20.605790943 +0000 UTC m=+96.117725004"
Mar 12 13:39:20.774208 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:20.774136 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-6d47bdb78d-jvgbr"]
Mar 12 13:39:20.776749 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:39:20.776713 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56935d74_a7ce_4152_8c4e_752ce08b5343.slice/crio-03b31bc1539e25babae8c9e1181ed31d501005f0fe21060b7a3c5c43805b4c29 WatchSource:0}: Error finding container 03b31bc1539e25babae8c9e1181ed31d501005f0fe21060b7a3c5c43805b4c29: Status 404 returned error can't find the container with id 03b31bc1539e25babae8c9e1181ed31d501005f0fe21060b7a3c5c43805b4c29
Mar 12 13:39:20.788400 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:20.788371 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 12 13:39:20.793532 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:39:20.793502 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod507c80a4_ea62_42d8_a41a_850073f8db0c.slice/crio-ce7adcae58e11af6c6326b178efe7407bb99b268456ab1f23c155dbb4539f614 WatchSource:0}: Error finding container ce7adcae58e11af6c6326b178efe7407bb99b268456ab1f23c155dbb4539f614: Status 404 returned error can't find the container with id ce7adcae58e11af6c6326b178efe7407bb99b268456ab1f23c155dbb4539f614
Mar 12 13:39:20.811125 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:20.811079 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-649779b44-9xxmf"]
Mar 12 13:39:20.814874 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:20.814607 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-78566ffc85-qq6b4"]
Mar 12 13:39:20.818579 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:39:20.818538 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode434420d_6db4_40f3_8792_88d3b03b0e31.slice/crio-a5e167c06113489c7e5649f5c1a80a465c32fff1a1a7ca2d7daa17d9a1d51928 WatchSource:0}: Error finding container a5e167c06113489c7e5649f5c1a80a465c32fff1a1a7ca2d7daa17d9a1d51928: Status 404 returned error can't find the container with id a5e167c06113489c7e5649f5c1a80a465c32fff1a1a7ca2d7daa17d9a1d51928
Mar 12 13:39:20.819969 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:39:20.819903 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod437239e7_0e09_41f9_8d00_4d1271f0db28.slice/crio-45e03b5a53ea296398eea12ebb8c30f39ce3947025615f5b7c1b77b565ea52f4 WatchSource:0}: Error finding container 45e03b5a53ea296398eea12ebb8c30f39ce3947025615f5b7c1b77b565ea52f4: Status 404 returned error can't find the container with id 45e03b5a53ea296398eea12ebb8c30f39ce3947025615f5b7c1b77b565ea52f4
Mar 12 13:39:21.594114 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:21.592790 2576 generic.go:358] "Generic (PLEG): container finished" podID="507c80a4-ea62-42d8-a41a-850073f8db0c" containerID="9fc55385db5d52f47dc3c904a404ec477016c6082c156f2139e444fa46f10ef6" exitCode=0
Mar 12 13:39:21.594114 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:21.592896 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"507c80a4-ea62-42d8-a41a-850073f8db0c","Type":"ContainerDied","Data":"9fc55385db5d52f47dc3c904a404ec477016c6082c156f2139e444fa46f10ef6"}
Mar 12 13:39:21.594114 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:21.592929 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"507c80a4-ea62-42d8-a41a-850073f8db0c","Type":"ContainerStarted","Data":"ce7adcae58e11af6c6326b178efe7407bb99b268456ab1f23c155dbb4539f614"}
Mar 12 13:39:21.602976 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:21.602916 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-5b85974fd6-sqphd" event={"ID":"4dac3f26-1539-4a57-8570-dba478ffb63f","Type":"ContainerStarted","Data":"2e81a7029c2cca99c5333ad67880879ead162288410f12c692d5393354eb57f1"}
Mar 12 13:39:21.604372 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:21.604171 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-5b85974fd6-sqphd"
Mar 12 13:39:21.606078 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:21.606014 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-649779b44-9xxmf" event={"ID":"e434420d-6db4-40f3-8792-88d3b03b0e31","Type":"ContainerStarted","Data":"a5e167c06113489c7e5649f5c1a80a465c32fff1a1a7ca2d7daa17d9a1d51928"}
Mar 12 13:39:21.608655 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:21.608625 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-jvgbr" event={"ID":"56935d74-a7ce-4152-8c4e-752ce08b5343","Type":"ContainerStarted","Data":"03b31bc1539e25babae8c9e1181ed31d501005f0fe21060b7a3c5c43805b4c29"}
Mar 12 13:39:21.611735 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:21.611703 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78566ffc85-qq6b4" event={"ID":"437239e7-0e09-41f9-8d00-4d1271f0db28","Type":"ContainerStarted","Data":"45e03b5a53ea296398eea12ebb8c30f39ce3947025615f5b7c1b77b565ea52f4"}
Mar 12 13:39:21.617868 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:21.617827 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-5b85974fd6-sqphd"
Mar 12 13:39:23.442970 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:23.442682 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-fnfhc"
Mar 12 13:39:23.463535 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:23.463449 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-5b85974fd6-sqphd" podStartSLOduration=3.94557455 podStartE2EDuration="21.463427869s" podCreationTimestamp="2026-03-12 13:39:02 +0000 UTC" firstStartedPulling="2026-03-12 13:39:03.399430383 +0000 UTC m=+78.911364393" lastFinishedPulling="2026-03-12 13:39:20.917283702 +0000 UTC m=+96.429217712" observedRunningTime="2026-03-12 13:39:21.691933475 +0000 UTC m=+97.203867502" watchObservedRunningTime="2026-03-12 13:39:23.463427869 +0000 UTC m=+98.975361897"
Mar 12 13:39:26.911030 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:26.910962 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-78566ffc85-qq6b4"]
Mar 12 13:39:29.654327 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:29.654299 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9" event={"ID":"053e32cf-a4f5-4fbb-a8f3-df217ce3e68d","Type":"ContainerStarted","Data":"3c9eba4a12543d1fa7ce147f12460986f1bb47536e53f65e5fb0bc24efc9a6c4"}
Mar 12 13:39:29.656281 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:29.656257 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ad5bfa9a-7696-439c-943a-08df7f56f525","Type":"ContainerStarted","Data":"ae9488a0bdbdb7ac5b6bdee5f9010cedad4535d2c56e827630c1429b8b9119b3"}
Mar 12 13:39:29.657429 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:29.657405 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-jvgbr" event={"ID":"56935d74-a7ce-4152-8c4e-752ce08b5343","Type":"ContainerStarted","Data":"70f58943c413237ae7ff240afd4d9a7af17515dca1ccf57183775fd388723490"}
Mar 12 13:39:29.658675 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:29.658655 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5d66bcc7f4-l7sjt" event={"ID":"8ce11cd6-c7dc-489b-8f06-caf49da9ae97","Type":"ContainerStarted","Data":"8742c6ca11feb01c374678ce4e3c272ad98edf5751fb47e8e6c443d011ced2bc"}
Mar 12 13:39:29.665415 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:29.665298 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-jvgbr"
Mar 12 13:39:29.670766 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:29.670743 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-jvgbr"
Mar 12 13:39:29.687655 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:29.687611 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-6d47bdb78d-jvgbr" podStartSLOduration=8.168808101 podStartE2EDuration="16.687594746s" podCreationTimestamp="2026-03-12 13:39:13 +0000 UTC" firstStartedPulling="2026-03-12 13:39:20.781820638 +0000 UTC m=+96.293754665" lastFinishedPulling="2026-03-12 13:39:29.300607306 +0000 UTC m=+104.812541310" observedRunningTime="2026-03-12 13:39:29.685582983 +0000 UTC m=+105.197517010" watchObservedRunningTime="2026-03-12 13:39:29.687594746 +0000 UTC m=+105.199528773"
Mar 12 13:39:30.668400 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:30.668364 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78566ffc85-qq6b4" event={"ID":"437239e7-0e09-41f9-8d00-4d1271f0db28","Type":"ContainerStarted","Data":"8349f0a4fd6aa072321284814d90c76c585fdf3b8c14c1e4c78471187dbf91eb"}
Mar 12 13:39:30.670539 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:30.670439 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5d66bcc7f4-l7sjt" event={"ID":"8ce11cd6-c7dc-489b-8f06-caf49da9ae97","Type":"ContainerStarted","Data":"c2abb20bac055ae98a7e539b2fc2dc3af4977c6ee7c9988ad3b1eef903ac7490"}
Mar 12 13:39:30.670539 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:30.670515 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5d66bcc7f4-l7sjt" event={"ID":"8ce11cd6-c7dc-489b-8f06-caf49da9ae97","Type":"ContainerStarted","Data":"506747053c9608ce04edaa16c701f726db983230143d8c3f014be5da008a8aa0"}
Mar 12 13:39:30.674820 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:30.674793 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"507c80a4-ea62-42d8-a41a-850073f8db0c","Type":"ContainerStarted","Data":"352970a7121c5cb870f80b3e15c56bf05f8bae4222f11a8c2a5ba9ee59a44e78"}
Mar 12 13:39:30.674820 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:30.674822 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"507c80a4-ea62-42d8-a41a-850073f8db0c","Type":"ContainerStarted","Data":"58a59418fc3bb8137dfe00cbee2c48a28975d99df23fcd30979cc6663326feb0"}
Mar 12 13:39:30.675004 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:30.674831 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"507c80a4-ea62-42d8-a41a-850073f8db0c","Type":"ContainerStarted","Data":"b975258bc586b985b133d96237cb5adb5d72957069e9cfaa61049865832d7754"}
Mar 12 13:39:30.675004 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:30.674839 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"507c80a4-ea62-42d8-a41a-850073f8db0c","Type":"ContainerStarted","Data":"1eb1318bde03529e95a548c4626321074c8ab8643d4ac75b4e860239745f831a"}
Mar 12 13:39:30.675004 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:30.674849 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"507c80a4-ea62-42d8-a41a-850073f8db0c","Type":"ContainerStarted","Data":"450f4d49f6618fdd8f479b1b102d292ecc363d986429eb532d8d4eb22aa172ce"}
Mar 12 13:39:30.675004 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:30.674857 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"507c80a4-ea62-42d8-a41a-850073f8db0c","Type":"ContainerStarted","Data":"868eb4559cf2b8ff329afe0028ba3b29c7a1a6a850d8b2f0895803f2f755865f"}
Mar 12 13:39:30.676852 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:30.676808 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9" event={"ID":"053e32cf-a4f5-4fbb-a8f3-df217ce3e68d","Type":"ContainerStarted","Data":"e52a2bbd20aea6a376895d81a9dcad8549c0937f1ef6fd55db9fb4257c674dd2"}
Mar 12 13:39:30.676852 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:30.676837 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9" event={"ID":"053e32cf-a4f5-4fbb-a8f3-df217ce3e68d","Type":"ContainerStarted","Data":"8fe7be5eabb2c62e33d8ee80203ba1cff98b827480bed71ec921c8224c70b72c"}
Mar 12 13:39:30.679708 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:30.679670 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ad5bfa9a-7696-439c-943a-08df7f56f525","Type":"ContainerStarted","Data":"d6b3e6832c3c46880c589e64e4755e2da9c3d200a9e538dc1779c7c7ec6c45d4"}
Mar 12 13:39:30.679826 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:30.679711 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ad5bfa9a-7696-439c-943a-08df7f56f525","Type":"ContainerStarted","Data":"e8d0112effd7363deb06937fe7b03547c5fc4014bacdfd4fd6555f270db17dbc"}
Mar 12 13:39:30.679826 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:30.679723 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ad5bfa9a-7696-439c-943a-08df7f56f525","Type":"ContainerStarted","Data":"adf3570c8799b178a009891f42d315d201793a1cce3ebd9bec551813d4b4334c"}
Mar 12 13:39:30.679826 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:30.679736 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ad5bfa9a-7696-439c-943a-08df7f56f525","Type":"ContainerStarted","Data":"b3084481ad7103ab65efbfd05416d08781fb98487e56a9cf78393f6fc87f0e69"}
Mar 12 13:39:30.681298 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:30.681275 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-649779b44-9xxmf" event={"ID":"e434420d-6db4-40f3-8792-88d3b03b0e31","Type":"ContainerStarted","Data":"c9bba697d05b98749b46b4b7e7e44949c20f95bd34be4fe6ff4eddd65c1fb3f1"}
Mar 12 13:39:30.689991 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:30.689943 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-78566ffc85-qq6b4" podStartSLOduration=9.213176216 podStartE2EDuration="17.689924553s" podCreationTimestamp="2026-03-12 13:39:13 +0000 UTC" firstStartedPulling="2026-03-12 13:39:20.825618455 +0000 UTC m=+96.337552460" lastFinishedPulling="2026-03-12 13:39:29.302366778 +0000 UTC m=+104.814300797" observedRunningTime="2026-03-12 13:39:30.686903219 +0000 UTC m=+106.198837247" watchObservedRunningTime="2026-03-12 13:39:30.689924553 +0000 UTC m=+106.201858580"
Mar 12 13:39:30.706031 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:30.705985 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-649779b44-9xxmf" podStartSLOduration=9.224672576 podStartE2EDuration="17.705968676s" podCreationTimestamp="2026-03-12 13:39:13 +0000 UTC" firstStartedPulling="2026-03-12 13:39:20.823800286 +0000 UTC m=+96.335734294" lastFinishedPulling="2026-03-12 13:39:29.305096374 +0000 UTC m=+104.817030394" observedRunningTime="2026-03-12 13:39:30.703791429 +0000 UTC m=+106.215725457" watchObservedRunningTime="2026-03-12 13:39:30.705968676 +0000 UTC m=+106.217902705"
Mar 12 13:39:30.735309 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:30.735244 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=7.682651258 podStartE2EDuration="15.735227067s" podCreationTimestamp="2026-03-12 13:39:15 +0000 UTC" firstStartedPulling="2026-03-12 13:39:21.595868025 +0000 UTC m=+97.107802035" lastFinishedPulling="2026-03-12 13:39:29.648443829 +0000 UTC m=+105.160377844" observedRunningTime="2026-03-12 13:39:30.732239452 +0000 UTC m=+106.244173479" watchObservedRunningTime="2026-03-12 13:39:30.735227067 +0000 UTC m=+106.247161095"
Mar 12 13:39:30.754078 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:30.754003 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-5d66bcc7f4-l7sjt" podStartSLOduration=8.022646605 podStartE2EDuration="16.75398757s" podCreationTimestamp="2026-03-12 13:39:14 +0000 UTC" firstStartedPulling="2026-03-12 13:39:20.56926767 +0000 UTC m=+96.081201678" lastFinishedPulling="2026-03-12 13:39:29.300608634 +0000 UTC m=+104.812542643" observedRunningTime="2026-03-12 13:39:30.753332256 +0000 UTC m=+106.265266284" watchObservedRunningTime="2026-03-12 13:39:30.75398757 +0000 UTC m=+106.265921596"
Mar 12 13:39:31.687523 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:31.687486 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9" event={"ID":"053e32cf-a4f5-4fbb-a8f3-df217ce3e68d","Type":"ContainerStarted","Data":"970f00e1f8e7f3c47360babdfebb0653a43c935961934298c79c320ad8547255"}
Mar 12 13:39:31.687523 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:31.687527 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9" event={"ID":"053e32cf-a4f5-4fbb-a8f3-df217ce3e68d","Type":"ContainerStarted","Data":"f3f6878dbb4d4a936863cb6614da489e7b3cf0dc964ba015426d327b18256f7b"}
Mar 12 13:39:31.688102 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:31.687540 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9" event={"ID":"053e32cf-a4f5-4fbb-a8f3-df217ce3e68d","Type":"ContainerStarted","Data":"4cd72ad057927479f35560757d67950eaa12ba2f299ec64697fff5b96ccfd992"}
Mar 12 13:39:31.688102 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:31.687696 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9"
Mar 12 13:39:31.690607 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:31.690581 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ad5bfa9a-7696-439c-943a-08df7f56f525","Type":"ContainerStarted","Data":"b8e87edd71d984f803d5bea61004abaaa26344d41cb2841a9d90e2160b0ba347"}
Mar 12 13:39:31.720355 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:31.720304 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9" podStartSLOduration=2.645747264 podStartE2EDuration="20.720286079s" podCreationTimestamp="2026-03-12 13:39:11 +0000 UTC" firstStartedPulling="2026-03-12 13:39:13.052389002 +0000 UTC m=+88.564323010" lastFinishedPulling="2026-03-12 13:39:31.126927817 +0000 UTC m=+106.638861825" observedRunningTime="2026-03-12 13:39:31.717866595 +0000 UTC m=+107.229800621" watchObservedRunningTime="2026-03-12 13:39:31.720286079 +0000 UTC m=+107.232220106"
Mar 12 13:39:31.763101 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:31.763045 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.528335487 podStartE2EDuration="22.763027319s" podCreationTimestamp="2026-03-12 13:39:09 +0000 UTC" firstStartedPulling="2026-03-12 13:39:11.013201144 +0000 UTC m=+86.525135148" lastFinishedPulling="2026-03-12 13:39:31.24789297 +0000 UTC m=+106.759826980" observedRunningTime="2026-03-12 13:39:31.762232689 +0000 UTC m=+107.274166753" watchObservedRunningTime="2026-03-12 13:39:31.763027319 +0000 UTC m=+107.274961347"
Mar 12 13:39:33.562550 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:33.562515 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-649779b44-9xxmf"
Mar 12 13:39:33.562550 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:33.562558 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-649779b44-9xxmf"
Mar 12 13:39:33.948295 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:33.948255 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-78566ffc85-qq6b4"
Mar 12 13:39:35.408002 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:35.407966 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 13:39:35.706271 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:35.706183 2576 generic.go:358] "Generic (PLEG): container finished" podID="20ecc6d8-57fe-48c1-bbcf-886289ff3155" containerID="b2ccf4ecbe41c6550b3914f1aaa5845b022a8ea3e90c617ff7c4f50efff2e121" exitCode=0
Mar 12 13:39:35.706417 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:35.706284 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-wlnh5" event={"ID":"20ecc6d8-57fe-48c1-bbcf-886289ff3155","Type":"ContainerDied","Data":"b2ccf4ecbe41c6550b3914f1aaa5845b022a8ea3e90c617ff7c4f50efff2e121"}
Mar 12 13:39:35.706699 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:35.706686 2576 scope.go:117] "RemoveContainer" containerID="b2ccf4ecbe41c6550b3914f1aaa5845b022a8ea3e90c617ff7c4f50efff2e121"
Mar 12 13:39:36.719527 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:36.719492 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-866f46547-wlnh5" event={"ID":"20ecc6d8-57fe-48c1-bbcf-886289ff3155","Type":"ContainerStarted","Data":"2853e4486831290c981a1051efe1d44014f747f27e96a0eaa14e8969604868fb"}
Mar 12 13:39:37.700510 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:37.700475 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7b6c584c7d-tmkl9"
Mar 12 13:39:43.833808 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:43.833664 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj" podUID="327ec7eb-a5de-4709-9fbc-6bbe58d87674" containerName="registry" containerID="cri-o://1bc363650ac690749a6bce2007c1f5b0a6d2c66a39178ec99d6454eacb9b2c74" gracePeriod=30
Mar 12 13:39:44.111962 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:44.111939 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj"
Mar 12 13:39:44.241957 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:44.241920 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/327ec7eb-a5de-4709-9fbc-6bbe58d87674-installation-pull-secrets\") pod \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\" (UID: \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\") "
Mar 12 13:39:44.242137 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:44.241986 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/327ec7eb-a5de-4709-9fbc-6bbe58d87674-registry-tls\") pod \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\" (UID: \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\") "
Mar 12 13:39:44.242137 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:44.242041 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/327ec7eb-a5de-4709-9fbc-6bbe58d87674-ca-trust-extracted\") pod \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\" (UID: \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\") "
Mar 12 13:39:44.242137 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:44.242075 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/327ec7eb-a5de-4709-9fbc-6bbe58d87674-bound-sa-token\") pod \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\" (UID: \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\") "
Mar 12 13:39:44.242137 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:44.242127 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cngd\" (UniqueName: \"kubernetes.io/projected/327ec7eb-a5de-4709-9fbc-6bbe58d87674-kube-api-access-6cngd\") pod \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\" (UID: \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\") "
Mar 12 13:39:44.242403 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:44.242175 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/327ec7eb-a5de-4709-9fbc-6bbe58d87674-image-registry-private-configuration\") pod \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\" (UID: \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\") "
Mar 12 13:39:44.242403 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:44.242215 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/327ec7eb-a5de-4709-9fbc-6bbe58d87674-registry-certificates\") pod \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\" (UID: \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\") "
Mar 12 13:39:44.242403 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:44.242271 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/327ec7eb-a5de-4709-9fbc-6bbe58d87674-trusted-ca\") pod \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\" (UID: \"327ec7eb-a5de-4709-9fbc-6bbe58d87674\") "
Mar 12 13:39:44.243040 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:44.242969 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/327ec7eb-a5de-4709-9fbc-6bbe58d87674-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "327ec7eb-a5de-4709-9fbc-6bbe58d87674" (UID: "327ec7eb-a5de-4709-9fbc-6bbe58d87674"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 13:39:44.243504 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:44.243336 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/327ec7eb-a5de-4709-9fbc-6bbe58d87674-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "327ec7eb-a5de-4709-9fbc-6bbe58d87674" (UID: "327ec7eb-a5de-4709-9fbc-6bbe58d87674"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 13:39:44.244864 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:44.244833 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/327ec7eb-a5de-4709-9fbc-6bbe58d87674-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "327ec7eb-a5de-4709-9fbc-6bbe58d87674" (UID: "327ec7eb-a5de-4709-9fbc-6bbe58d87674"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 13:39:44.244998 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:44.244968 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/327ec7eb-a5de-4709-9fbc-6bbe58d87674-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "327ec7eb-a5de-4709-9fbc-6bbe58d87674" (UID: "327ec7eb-a5de-4709-9fbc-6bbe58d87674"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 13:39:44.245058 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:44.245027 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/327ec7eb-a5de-4709-9fbc-6bbe58d87674-kube-api-access-6cngd" (OuterVolumeSpecName: "kube-api-access-6cngd") pod "327ec7eb-a5de-4709-9fbc-6bbe58d87674" (UID: "327ec7eb-a5de-4709-9fbc-6bbe58d87674"). InnerVolumeSpecName "kube-api-access-6cngd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 13:39:44.245145 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:44.245112 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/327ec7eb-a5de-4709-9fbc-6bbe58d87674-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "327ec7eb-a5de-4709-9fbc-6bbe58d87674" (UID: "327ec7eb-a5de-4709-9fbc-6bbe58d87674"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 13:39:44.245297 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:44.245280 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/327ec7eb-a5de-4709-9fbc-6bbe58d87674-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "327ec7eb-a5de-4709-9fbc-6bbe58d87674" (UID: "327ec7eb-a5de-4709-9fbc-6bbe58d87674"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 13:39:44.253750 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:44.253720 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/327ec7eb-a5de-4709-9fbc-6bbe58d87674-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "327ec7eb-a5de-4709-9fbc-6bbe58d87674" (UID: "327ec7eb-a5de-4709-9fbc-6bbe58d87674"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 12 13:39:44.344075 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:44.343978 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6cngd\" (UniqueName: \"kubernetes.io/projected/327ec7eb-a5de-4709-9fbc-6bbe58d87674-kube-api-access-6cngd\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\""
Mar 12 13:39:44.344075 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:44.344016 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/327ec7eb-a5de-4709-9fbc-6bbe58d87674-image-registry-private-configuration\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\""
Mar 12 13:39:44.344075 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:44.344029 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/327ec7eb-a5de-4709-9fbc-6bbe58d87674-registry-certificates\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\""
Mar 12 13:39:44.344075 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:44.344040 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/327ec7eb-a5de-4709-9fbc-6bbe58d87674-trusted-ca\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\""
Mar 12 13:39:44.344075 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:44.344050 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/327ec7eb-a5de-4709-9fbc-6bbe58d87674-installation-pull-secrets\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\""
Mar 12 13:39:44.344075 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:44.344060 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/327ec7eb-a5de-4709-9fbc-6bbe58d87674-registry-tls\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\""
Mar 12
13:39:44.344075 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:44.344069 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/327ec7eb-a5de-4709-9fbc-6bbe58d87674-ca-trust-extracted\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\"" Mar 12 13:39:44.344075 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:44.344079 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/327ec7eb-a5de-4709-9fbc-6bbe58d87674-bound-sa-token\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\"" Mar 12 13:39:44.749148 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:44.749111 2576 generic.go:358] "Generic (PLEG): container finished" podID="327ec7eb-a5de-4709-9fbc-6bbe58d87674" containerID="1bc363650ac690749a6bce2007c1f5b0a6d2c66a39178ec99d6454eacb9b2c74" exitCode=0 Mar 12 13:39:44.749361 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:44.749174 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj" Mar 12 13:39:44.749361 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:44.749196 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj" event={"ID":"327ec7eb-a5de-4709-9fbc-6bbe58d87674","Type":"ContainerDied","Data":"1bc363650ac690749a6bce2007c1f5b0a6d2c66a39178ec99d6454eacb9b2c74"} Mar 12 13:39:44.749361 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:44.749235 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-658bf4bccb-bwzwj" event={"ID":"327ec7eb-a5de-4709-9fbc-6bbe58d87674","Type":"ContainerDied","Data":"9c04d00ee434bbee8c6ad1cfffc57e83fb459da0ab3a0570c97b080b2dc71ac1"} Mar 12 13:39:44.749361 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:44.749253 2576 scope.go:117] "RemoveContainer" containerID="1bc363650ac690749a6bce2007c1f5b0a6d2c66a39178ec99d6454eacb9b2c74" Mar 12 13:39:44.758715 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:44.758696 2576 scope.go:117] "RemoveContainer" containerID="1bc363650ac690749a6bce2007c1f5b0a6d2c66a39178ec99d6454eacb9b2c74" Mar 12 13:39:44.758977 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:39:44.758957 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bc363650ac690749a6bce2007c1f5b0a6d2c66a39178ec99d6454eacb9b2c74\": container with ID starting with 1bc363650ac690749a6bce2007c1f5b0a6d2c66a39178ec99d6454eacb9b2c74 not found: ID does not exist" containerID="1bc363650ac690749a6bce2007c1f5b0a6d2c66a39178ec99d6454eacb9b2c74" Mar 12 13:39:44.759036 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:44.758987 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bc363650ac690749a6bce2007c1f5b0a6d2c66a39178ec99d6454eacb9b2c74"} err="failed to get container status 
\"1bc363650ac690749a6bce2007c1f5b0a6d2c66a39178ec99d6454eacb9b2c74\": rpc error: code = NotFound desc = could not find container \"1bc363650ac690749a6bce2007c1f5b0a6d2c66a39178ec99d6454eacb9b2c74\": container with ID starting with 1bc363650ac690749a6bce2007c1f5b0a6d2c66a39178ec99d6454eacb9b2c74 not found: ID does not exist" Mar 12 13:39:44.772510 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:44.772484 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-658bf4bccb-bwzwj"] Mar 12 13:39:44.777143 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:44.777120 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-658bf4bccb-bwzwj"] Mar 12 13:39:45.092008 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:45.091971 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="327ec7eb-a5de-4709-9fbc-6bbe58d87674" path="/var/lib/kubelet/pods/327ec7eb-a5de-4709-9fbc-6bbe58d87674/volumes" Mar 12 13:39:45.754084 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:45.754047 2576 generic.go:358] "Generic (PLEG): container finished" podID="16e1b1af-b744-44fd-a3d5-c817249f6767" containerID="03b5be3f7da3c73290aeeb86c3976e9b87b970da5782e24dc82f797141d12785" exitCode=0 Mar 12 13:39:45.754287 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:45.754123 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-76bdd9f478-q8flt" event={"ID":"16e1b1af-b744-44fd-a3d5-c817249f6767","Type":"ContainerDied","Data":"03b5be3f7da3c73290aeeb86c3976e9b87b970da5782e24dc82f797141d12785"} Mar 12 13:39:45.754593 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:45.754576 2576 scope.go:117] "RemoveContainer" containerID="03b5be3f7da3c73290aeeb86c3976e9b87b970da5782e24dc82f797141d12785" Mar 12 13:39:46.287709 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:46.287679 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_router-default-f747dc4bb-qfvq7_c11ab84b-7d66-4e49-a456-66c0ee53f865/router/0.log" Mar 12 13:39:46.328254 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:46.328225 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-v9hjg_92914013-06d0-470c-b843-48a641e447a6/serve-healthcheck-canary/0.log" Mar 12 13:39:46.759798 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:46.759762 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-76bdd9f478-q8flt" event={"ID":"16e1b1af-b744-44fd-a3d5-c817249f6767","Type":"ContainerStarted","Data":"a19605f34cdf9e13e44d435b0db96ffb9becb6325d78ee3a25bffd6ff077b52f"} Mar 12 13:39:53.567775 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:53.567741 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-649779b44-9xxmf" Mar 12 13:39:53.571856 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:53.571830 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-649779b44-9xxmf" Mar 12 13:39:55.693064 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:55.692997 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-78566ffc85-qq6b4" podUID="437239e7-0e09-41f9-8d00-4d1271f0db28" containerName="console" containerID="cri-o://8349f0a4fd6aa072321284814d90c76c585fdf3b8c14c1e4c78471187dbf91eb" gracePeriod=15 Mar 12 13:39:55.802758 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:55.802721 2576 generic.go:358] "Generic (PLEG): container finished" podID="c2666fac-29e4-411f-a8eb-22c3b7d96cbf" containerID="5e3ae8dc965c47594038e41dc486439c180e32c02165fb1a0f8258ba07b89ffc" exitCode=0 Mar 12 13:39:55.802926 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:55.802795 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-zklgq" event={"ID":"c2666fac-29e4-411f-a8eb-22c3b7d96cbf","Type":"ContainerDied","Data":"5e3ae8dc965c47594038e41dc486439c180e32c02165fb1a0f8258ba07b89ffc"} Mar 12 13:39:55.803261 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:55.803245 2576 scope.go:117] "RemoveContainer" containerID="5e3ae8dc965c47594038e41dc486439c180e32c02165fb1a0f8258ba07b89ffc" Mar 12 13:39:55.985978 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:55.985949 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-78566ffc85-qq6b4_437239e7-0e09-41f9-8d00-4d1271f0db28/console/0.log" Mar 12 13:39:55.986129 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:55.986030 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-78566ffc85-qq6b4" Mar 12 13:39:56.053870 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:56.053824 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/437239e7-0e09-41f9-8d00-4d1271f0db28-console-config\") pod \"437239e7-0e09-41f9-8d00-4d1271f0db28\" (UID: \"437239e7-0e09-41f9-8d00-4d1271f0db28\") " Mar 12 13:39:56.054082 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:56.053910 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/437239e7-0e09-41f9-8d00-4d1271f0db28-console-oauth-config\") pod \"437239e7-0e09-41f9-8d00-4d1271f0db28\" (UID: \"437239e7-0e09-41f9-8d00-4d1271f0db28\") " Mar 12 13:39:56.054082 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:56.053938 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/437239e7-0e09-41f9-8d00-4d1271f0db28-service-ca\") pod \"437239e7-0e09-41f9-8d00-4d1271f0db28\" (UID: \"437239e7-0e09-41f9-8d00-4d1271f0db28\") " Mar 12 
13:39:56.054082 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:56.053969 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbl27\" (UniqueName: \"kubernetes.io/projected/437239e7-0e09-41f9-8d00-4d1271f0db28-kube-api-access-tbl27\") pod \"437239e7-0e09-41f9-8d00-4d1271f0db28\" (UID: \"437239e7-0e09-41f9-8d00-4d1271f0db28\") " Mar 12 13:39:56.054082 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:56.054036 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/437239e7-0e09-41f9-8d00-4d1271f0db28-oauth-serving-cert\") pod \"437239e7-0e09-41f9-8d00-4d1271f0db28\" (UID: \"437239e7-0e09-41f9-8d00-4d1271f0db28\") " Mar 12 13:39:56.054082 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:56.054071 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/437239e7-0e09-41f9-8d00-4d1271f0db28-console-serving-cert\") pod \"437239e7-0e09-41f9-8d00-4d1271f0db28\" (UID: \"437239e7-0e09-41f9-8d00-4d1271f0db28\") " Mar 12 13:39:56.054425 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:56.054394 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/437239e7-0e09-41f9-8d00-4d1271f0db28-console-config" (OuterVolumeSpecName: "console-config") pod "437239e7-0e09-41f9-8d00-4d1271f0db28" (UID: "437239e7-0e09-41f9-8d00-4d1271f0db28"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 13:39:56.054535 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:56.054485 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/437239e7-0e09-41f9-8d00-4d1271f0db28-service-ca" (OuterVolumeSpecName: "service-ca") pod "437239e7-0e09-41f9-8d00-4d1271f0db28" (UID: "437239e7-0e09-41f9-8d00-4d1271f0db28"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 13:39:56.054535 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:56.054486 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/437239e7-0e09-41f9-8d00-4d1271f0db28-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "437239e7-0e09-41f9-8d00-4d1271f0db28" (UID: "437239e7-0e09-41f9-8d00-4d1271f0db28"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 13:39:56.056785 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:56.056699 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/437239e7-0e09-41f9-8d00-4d1271f0db28-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "437239e7-0e09-41f9-8d00-4d1271f0db28" (UID: "437239e7-0e09-41f9-8d00-4d1271f0db28"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 13:39:56.056785 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:56.056725 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/437239e7-0e09-41f9-8d00-4d1271f0db28-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "437239e7-0e09-41f9-8d00-4d1271f0db28" (UID: "437239e7-0e09-41f9-8d00-4d1271f0db28"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 13:39:56.056785 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:56.056741 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/437239e7-0e09-41f9-8d00-4d1271f0db28-kube-api-access-tbl27" (OuterVolumeSpecName: "kube-api-access-tbl27") pod "437239e7-0e09-41f9-8d00-4d1271f0db28" (UID: "437239e7-0e09-41f9-8d00-4d1271f0db28"). InnerVolumeSpecName "kube-api-access-tbl27". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 13:39:56.155911 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:56.155866 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/437239e7-0e09-41f9-8d00-4d1271f0db28-console-config\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\"" Mar 12 13:39:56.155911 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:56.155903 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/437239e7-0e09-41f9-8d00-4d1271f0db28-console-oauth-config\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\"" Mar 12 13:39:56.155911 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:56.155914 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/437239e7-0e09-41f9-8d00-4d1271f0db28-service-ca\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\"" Mar 12 13:39:56.155911 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:56.155923 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tbl27\" (UniqueName: \"kubernetes.io/projected/437239e7-0e09-41f9-8d00-4d1271f0db28-kube-api-access-tbl27\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\"" Mar 12 13:39:56.156186 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:56.155935 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/437239e7-0e09-41f9-8d00-4d1271f0db28-oauth-serving-cert\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\"" Mar 12 13:39:56.156186 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:56.155944 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/437239e7-0e09-41f9-8d00-4d1271f0db28-console-serving-cert\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\"" Mar 12 13:39:56.808743 ip-10-0-142-16 
kubenswrapper[2576]: I0312 13:39:56.808718 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-78566ffc85-qq6b4_437239e7-0e09-41f9-8d00-4d1271f0db28/console/0.log" Mar 12 13:39:56.809163 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:56.808762 2576 generic.go:358] "Generic (PLEG): container finished" podID="437239e7-0e09-41f9-8d00-4d1271f0db28" containerID="8349f0a4fd6aa072321284814d90c76c585fdf3b8c14c1e4c78471187dbf91eb" exitCode=2 Mar 12 13:39:56.809163 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:56.808836 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-78566ffc85-qq6b4" Mar 12 13:39:56.809163 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:56.808850 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78566ffc85-qq6b4" event={"ID":"437239e7-0e09-41f9-8d00-4d1271f0db28","Type":"ContainerDied","Data":"8349f0a4fd6aa072321284814d90c76c585fdf3b8c14c1e4c78471187dbf91eb"} Mar 12 13:39:56.809163 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:56.808891 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78566ffc85-qq6b4" event={"ID":"437239e7-0e09-41f9-8d00-4d1271f0db28","Type":"ContainerDied","Data":"45e03b5a53ea296398eea12ebb8c30f39ce3947025615f5b7c1b77b565ea52f4"} Mar 12 13:39:56.809163 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:56.808911 2576 scope.go:117] "RemoveContainer" containerID="8349f0a4fd6aa072321284814d90c76c585fdf3b8c14c1e4c78471187dbf91eb" Mar 12 13:39:56.811061 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:56.811037 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-56f6f4cbcb-zklgq" event={"ID":"c2666fac-29e4-411f-a8eb-22c3b7d96cbf","Type":"ContainerStarted","Data":"3de31ba8d2a38e433f7df7de305c2496d3b9fdf8eca2c0b5e2fbfaa58e55dc67"} Mar 12 13:39:56.821538 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:56.821507 2576 
scope.go:117] "RemoveContainer" containerID="8349f0a4fd6aa072321284814d90c76c585fdf3b8c14c1e4c78471187dbf91eb" Mar 12 13:39:56.821896 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:39:56.821871 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8349f0a4fd6aa072321284814d90c76c585fdf3b8c14c1e4c78471187dbf91eb\": container with ID starting with 8349f0a4fd6aa072321284814d90c76c585fdf3b8c14c1e4c78471187dbf91eb not found: ID does not exist" containerID="8349f0a4fd6aa072321284814d90c76c585fdf3b8c14c1e4c78471187dbf91eb" Mar 12 13:39:56.821979 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:56.821906 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8349f0a4fd6aa072321284814d90c76c585fdf3b8c14c1e4c78471187dbf91eb"} err="failed to get container status \"8349f0a4fd6aa072321284814d90c76c585fdf3b8c14c1e4c78471187dbf91eb\": rpc error: code = NotFound desc = could not find container \"8349f0a4fd6aa072321284814d90c76c585fdf3b8c14c1e4c78471187dbf91eb\": container with ID starting with 8349f0a4fd6aa072321284814d90c76c585fdf3b8c14c1e4c78471187dbf91eb not found: ID does not exist" Mar 12 13:39:56.842204 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:56.842164 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-78566ffc85-qq6b4"] Mar 12 13:39:56.847354 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:56.847317 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-78566ffc85-qq6b4"] Mar 12 13:39:57.087003 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:39:57.086918 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="437239e7-0e09-41f9-8d00-4d1271f0db28" path="/var/lib/kubelet/pods/437239e7-0e09-41f9-8d00-4d1271f0db28/volumes" Mar 12 13:40:15.407825 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:15.407777 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:15.425654 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:15.425622 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:15.910943 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:15.910910 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:29.249301 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:29.249258 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 12 13:40:29.249781 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:29.249724 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ad5bfa9a-7696-439c-943a-08df7f56f525" containerName="alertmanager" containerID="cri-o://ae9488a0bdbdb7ac5b6bdee5f9010cedad4535d2c56e827630c1429b8b9119b3" gracePeriod=120 Mar 12 13:40:29.249845 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:29.249790 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ad5bfa9a-7696-439c-943a-08df7f56f525" containerName="kube-rbac-proxy-metric" containerID="cri-o://d6b3e6832c3c46880c589e64e4755e2da9c3d200a9e538dc1779c7c7ec6c45d4" gracePeriod=120 Mar 12 13:40:29.249900 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:29.249837 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ad5bfa9a-7696-439c-943a-08df7f56f525" containerName="kube-rbac-proxy-web" containerID="cri-o://adf3570c8799b178a009891f42d315d201793a1cce3ebd9bec551813d4b4334c" gracePeriod=120 Mar 12 13:40:29.249900 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:29.249861 2576 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-monitoring/alertmanager-main-0" podUID="ad5bfa9a-7696-439c-943a-08df7f56f525" containerName="config-reloader" containerID="cri-o://b3084481ad7103ab65efbfd05416d08781fb98487e56a9cf78393f6fc87f0e69" gracePeriod=120 Mar 12 13:40:29.249997 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:29.249904 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ad5bfa9a-7696-439c-943a-08df7f56f525" containerName="kube-rbac-proxy" containerID="cri-o://e8d0112effd7363deb06937fe7b03547c5fc4014bacdfd4fd6555f270db17dbc" gracePeriod=120 Mar 12 13:40:29.250056 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:29.249831 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="ad5bfa9a-7696-439c-943a-08df7f56f525" containerName="prom-label-proxy" containerID="cri-o://b8e87edd71d984f803d5bea61004abaaa26344d41cb2841a9d90e2160b0ba347" gracePeriod=120 Mar 12 13:40:29.947847 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:29.947812 2576 generic.go:358] "Generic (PLEG): container finished" podID="ad5bfa9a-7696-439c-943a-08df7f56f525" containerID="b8e87edd71d984f803d5bea61004abaaa26344d41cb2841a9d90e2160b0ba347" exitCode=0 Mar 12 13:40:29.947847 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:29.947839 2576 generic.go:358] "Generic (PLEG): container finished" podID="ad5bfa9a-7696-439c-943a-08df7f56f525" containerID="d6b3e6832c3c46880c589e64e4755e2da9c3d200a9e538dc1779c7c7ec6c45d4" exitCode=0 Mar 12 13:40:29.947847 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:29.947845 2576 generic.go:358] "Generic (PLEG): container finished" podID="ad5bfa9a-7696-439c-943a-08df7f56f525" containerID="e8d0112effd7363deb06937fe7b03547c5fc4014bacdfd4fd6555f270db17dbc" exitCode=0 Mar 12 13:40:29.947847 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:29.947851 2576 generic.go:358] "Generic (PLEG): container finished" 
podID="ad5bfa9a-7696-439c-943a-08df7f56f525" containerID="b3084481ad7103ab65efbfd05416d08781fb98487e56a9cf78393f6fc87f0e69" exitCode=0 Mar 12 13:40:29.948125 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:29.947858 2576 generic.go:358] "Generic (PLEG): container finished" podID="ad5bfa9a-7696-439c-943a-08df7f56f525" containerID="ae9488a0bdbdb7ac5b6bdee5f9010cedad4535d2c56e827630c1429b8b9119b3" exitCode=0 Mar 12 13:40:29.948125 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:29.947885 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ad5bfa9a-7696-439c-943a-08df7f56f525","Type":"ContainerDied","Data":"b8e87edd71d984f803d5bea61004abaaa26344d41cb2841a9d90e2160b0ba347"} Mar 12 13:40:29.948125 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:29.947919 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ad5bfa9a-7696-439c-943a-08df7f56f525","Type":"ContainerDied","Data":"d6b3e6832c3c46880c589e64e4755e2da9c3d200a9e538dc1779c7c7ec6c45d4"} Mar 12 13:40:29.948125 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:29.947931 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ad5bfa9a-7696-439c-943a-08df7f56f525","Type":"ContainerDied","Data":"e8d0112effd7363deb06937fe7b03547c5fc4014bacdfd4fd6555f270db17dbc"} Mar 12 13:40:29.948125 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:29.947943 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ad5bfa9a-7696-439c-943a-08df7f56f525","Type":"ContainerDied","Data":"b3084481ad7103ab65efbfd05416d08781fb98487e56a9cf78393f6fc87f0e69"} Mar 12 13:40:29.948125 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:29.947958 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"ad5bfa9a-7696-439c-943a-08df7f56f525","Type":"ContainerDied","Data":"ae9488a0bdbdb7ac5b6bdee5f9010cedad4535d2c56e827630c1429b8b9119b3"} Mar 12 13:40:30.515574 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.515549 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 12 13:40:30.593984 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.593877 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-config-volume\") pod \"ad5bfa9a-7696-439c-943a-08df7f56f525\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " Mar 12 13:40:30.593984 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.593937 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-secret-alertmanager-kube-rbac-proxy-web\") pod \"ad5bfa9a-7696-439c-943a-08df7f56f525\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " Mar 12 13:40:30.593984 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.593969 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-secret-alertmanager-kube-rbac-proxy\") pod \"ad5bfa9a-7696-439c-943a-08df7f56f525\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " Mar 12 13:40:30.594292 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.593998 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-cluster-tls-config\") pod \"ad5bfa9a-7696-439c-943a-08df7f56f525\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " Mar 12 13:40:30.594292 ip-10-0-142-16 kubenswrapper[2576]: I0312 
13:40:30.594028 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-secret-alertmanager-main-tls\") pod \"ad5bfa9a-7696-439c-943a-08df7f56f525\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " Mar 12 13:40:30.596026 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.594480 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-web-config\") pod \"ad5bfa9a-7696-439c-943a-08df7f56f525\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " Mar 12 13:40:30.596026 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.594557 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ad5bfa9a-7696-439c-943a-08df7f56f525-alertmanager-main-db\") pod \"ad5bfa9a-7696-439c-943a-08df7f56f525\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " Mar 12 13:40:30.596026 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.594643 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67hzp\" (UniqueName: \"kubernetes.io/projected/ad5bfa9a-7696-439c-943a-08df7f56f525-kube-api-access-67hzp\") pod \"ad5bfa9a-7696-439c-943a-08df7f56f525\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " Mar 12 13:40:30.596026 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.594672 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-secret-alertmanager-kube-rbac-proxy-metric\") pod \"ad5bfa9a-7696-439c-943a-08df7f56f525\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " Mar 12 13:40:30.596026 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.594704 2576 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ad5bfa9a-7696-439c-943a-08df7f56f525-config-out\") pod \"ad5bfa9a-7696-439c-943a-08df7f56f525\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " Mar 12 13:40:30.596026 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.594765 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ad5bfa9a-7696-439c-943a-08df7f56f525-tls-assets\") pod \"ad5bfa9a-7696-439c-943a-08df7f56f525\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " Mar 12 13:40:30.596026 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.594809 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad5bfa9a-7696-439c-943a-08df7f56f525-alertmanager-trusted-ca-bundle\") pod \"ad5bfa9a-7696-439c-943a-08df7f56f525\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " Mar 12 13:40:30.596026 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.594833 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ad5bfa9a-7696-439c-943a-08df7f56f525-metrics-client-ca\") pod \"ad5bfa9a-7696-439c-943a-08df7f56f525\" (UID: \"ad5bfa9a-7696-439c-943a-08df7f56f525\") " Mar 12 13:40:30.596026 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.595772 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad5bfa9a-7696-439c-943a-08df7f56f525-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "ad5bfa9a-7696-439c-943a-08df7f56f525" (UID: "ad5bfa9a-7696-439c-943a-08df7f56f525"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 13:40:30.597053 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.596751 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad5bfa9a-7696-439c-943a-08df7f56f525-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "ad5bfa9a-7696-439c-943a-08df7f56f525" (UID: "ad5bfa9a-7696-439c-943a-08df7f56f525"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 13:40:30.598424 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.598218 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "ad5bfa9a-7696-439c-943a-08df7f56f525" (UID: "ad5bfa9a-7696-439c-943a-08df7f56f525"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 13:40:30.599307 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.598843 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "ad5bfa9a-7696-439c-943a-08df7f56f525" (UID: "ad5bfa9a-7696-439c-943a-08df7f56f525"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 13:40:30.601245 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.601218 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "ad5bfa9a-7696-439c-943a-08df7f56f525" (UID: "ad5bfa9a-7696-439c-943a-08df7f56f525"). 
InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 13:40:30.601512 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.601491 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad5bfa9a-7696-439c-943a-08df7f56f525-kube-api-access-67hzp" (OuterVolumeSpecName: "kube-api-access-67hzp") pod "ad5bfa9a-7696-439c-943a-08df7f56f525" (UID: "ad5bfa9a-7696-439c-943a-08df7f56f525"). InnerVolumeSpecName "kube-api-access-67hzp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 13:40:30.601698 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.601663 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-config-volume" (OuterVolumeSpecName: "config-volume") pod "ad5bfa9a-7696-439c-943a-08df7f56f525" (UID: "ad5bfa9a-7696-439c-943a-08df7f56f525"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 13:40:30.602100 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.602069 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "ad5bfa9a-7696-439c-943a-08df7f56f525" (UID: "ad5bfa9a-7696-439c-943a-08df7f56f525"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 13:40:30.605121 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.605086 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad5bfa9a-7696-439c-943a-08df7f56f525-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "ad5bfa9a-7696-439c-943a-08df7f56f525" (UID: "ad5bfa9a-7696-439c-943a-08df7f56f525"). 
InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 13:40:30.605245 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.605203 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad5bfa9a-7696-439c-943a-08df7f56f525-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "ad5bfa9a-7696-439c-943a-08df7f56f525" (UID: "ad5bfa9a-7696-439c-943a-08df7f56f525"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 13:40:30.605245 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.605205 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad5bfa9a-7696-439c-943a-08df7f56f525-config-out" (OuterVolumeSpecName: "config-out") pod "ad5bfa9a-7696-439c-943a-08df7f56f525" (UID: "ad5bfa9a-7696-439c-943a-08df7f56f525"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Mar 12 13:40:30.607639 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.607610 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "ad5bfa9a-7696-439c-943a-08df7f56f525" (UID: "ad5bfa9a-7696-439c-943a-08df7f56f525"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 13:40:30.616019 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.615989 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-web-config" (OuterVolumeSpecName: "web-config") pod "ad5bfa9a-7696-439c-943a-08df7f56f525" (UID: "ad5bfa9a-7696-439c-943a-08df7f56f525"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 12 13:40:30.696738 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.696696 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\"" Mar 12 13:40:30.696738 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.696738 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\"" Mar 12 13:40:30.696738 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.696749 2576 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-cluster-tls-config\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\"" Mar 12 13:40:30.697033 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.696763 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-secret-alertmanager-main-tls\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\"" Mar 12 13:40:30.697033 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.696779 2576 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-web-config\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\"" Mar 12 13:40:30.697033 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.696790 2576 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ad5bfa9a-7696-439c-943a-08df7f56f525-alertmanager-main-db\") on node 
\"ip-10-0-142-16.ec2.internal\" DevicePath \"\"" Mar 12 13:40:30.697033 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.696799 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-67hzp\" (UniqueName: \"kubernetes.io/projected/ad5bfa9a-7696-439c-943a-08df7f56f525-kube-api-access-67hzp\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\"" Mar 12 13:40:30.697033 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.696809 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\"" Mar 12 13:40:30.697033 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.696819 2576 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ad5bfa9a-7696-439c-943a-08df7f56f525-config-out\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\"" Mar 12 13:40:30.697033 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.696827 2576 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ad5bfa9a-7696-439c-943a-08df7f56f525-tls-assets\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\"" Mar 12 13:40:30.697033 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.696839 2576 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad5bfa9a-7696-439c-943a-08df7f56f525-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\"" Mar 12 13:40:30.697033 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.696851 2576 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ad5bfa9a-7696-439c-943a-08df7f56f525-metrics-client-ca\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\"" 
Mar 12 13:40:30.697033 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.696861 2576 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ad5bfa9a-7696-439c-943a-08df7f56f525-config-volume\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\"" Mar 12 13:40:30.955020 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.954986 2576 generic.go:358] "Generic (PLEG): container finished" podID="ad5bfa9a-7696-439c-943a-08df7f56f525" containerID="adf3570c8799b178a009891f42d315d201793a1cce3ebd9bec551813d4b4334c" exitCode=0 Mar 12 13:40:30.955214 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.955074 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ad5bfa9a-7696-439c-943a-08df7f56f525","Type":"ContainerDied","Data":"adf3570c8799b178a009891f42d315d201793a1cce3ebd9bec551813d4b4334c"} Mar 12 13:40:30.955214 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.955096 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 12 13:40:30.955214 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.955122 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ad5bfa9a-7696-439c-943a-08df7f56f525","Type":"ContainerDied","Data":"ede5329b273c9c683b304dda6e70fb747d2dbabb23a49dfdd767b45dfafcf7cf"} Mar 12 13:40:30.955214 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.955142 2576 scope.go:117] "RemoveContainer" containerID="b8e87edd71d984f803d5bea61004abaaa26344d41cb2841a9d90e2160b0ba347" Mar 12 13:40:30.965014 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.964992 2576 scope.go:117] "RemoveContainer" containerID="d6b3e6832c3c46880c589e64e4755e2da9c3d200a9e538dc1779c7c7ec6c45d4" Mar 12 13:40:30.973336 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.973317 2576 scope.go:117] "RemoveContainer" containerID="e8d0112effd7363deb06937fe7b03547c5fc4014bacdfd4fd6555f270db17dbc" Mar 12 13:40:30.980884 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.980853 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 12 13:40:30.982674 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.982659 2576 scope.go:117] "RemoveContainer" containerID="adf3570c8799b178a009891f42d315d201793a1cce3ebd9bec551813d4b4334c" Mar 12 13:40:30.987658 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.987628 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 12 13:40:30.991340 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.991306 2576 scope.go:117] "RemoveContainer" containerID="b3084481ad7103ab65efbfd05416d08781fb98487e56a9cf78393f6fc87f0e69" Mar 12 13:40:30.999990 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:30.999968 2576 scope.go:117] "RemoveContainer" containerID="ae9488a0bdbdb7ac5b6bdee5f9010cedad4535d2c56e827630c1429b8b9119b3" Mar 12 13:40:31.008686 ip-10-0-142-16 
kubenswrapper[2576]: I0312 13:40:31.008664 2576 scope.go:117] "RemoveContainer" containerID="b83fbecdc061138b496862fbbddaf5e95863f8a7ef72ec3ca33eb8339942af1c" Mar 12 13:40:31.017783 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:31.017756 2576 scope.go:117] "RemoveContainer" containerID="b8e87edd71d984f803d5bea61004abaaa26344d41cb2841a9d90e2160b0ba347" Mar 12 13:40:31.018086 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:40:31.018065 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8e87edd71d984f803d5bea61004abaaa26344d41cb2841a9d90e2160b0ba347\": container with ID starting with b8e87edd71d984f803d5bea61004abaaa26344d41cb2841a9d90e2160b0ba347 not found: ID does not exist" containerID="b8e87edd71d984f803d5bea61004abaaa26344d41cb2841a9d90e2160b0ba347" Mar 12 13:40:31.018134 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:31.018098 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8e87edd71d984f803d5bea61004abaaa26344d41cb2841a9d90e2160b0ba347"} err="failed to get container status \"b8e87edd71d984f803d5bea61004abaaa26344d41cb2841a9d90e2160b0ba347\": rpc error: code = NotFound desc = could not find container \"b8e87edd71d984f803d5bea61004abaaa26344d41cb2841a9d90e2160b0ba347\": container with ID starting with b8e87edd71d984f803d5bea61004abaaa26344d41cb2841a9d90e2160b0ba347 not found: ID does not exist" Mar 12 13:40:31.018134 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:31.018122 2576 scope.go:117] "RemoveContainer" containerID="d6b3e6832c3c46880c589e64e4755e2da9c3d200a9e538dc1779c7c7ec6c45d4" Mar 12 13:40:31.018389 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:40:31.018373 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6b3e6832c3c46880c589e64e4755e2da9c3d200a9e538dc1779c7c7ec6c45d4\": container with ID starting with 
d6b3e6832c3c46880c589e64e4755e2da9c3d200a9e538dc1779c7c7ec6c45d4 not found: ID does not exist" containerID="d6b3e6832c3c46880c589e64e4755e2da9c3d200a9e538dc1779c7c7ec6c45d4" Mar 12 13:40:31.018438 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:31.018393 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6b3e6832c3c46880c589e64e4755e2da9c3d200a9e538dc1779c7c7ec6c45d4"} err="failed to get container status \"d6b3e6832c3c46880c589e64e4755e2da9c3d200a9e538dc1779c7c7ec6c45d4\": rpc error: code = NotFound desc = could not find container \"d6b3e6832c3c46880c589e64e4755e2da9c3d200a9e538dc1779c7c7ec6c45d4\": container with ID starting with d6b3e6832c3c46880c589e64e4755e2da9c3d200a9e538dc1779c7c7ec6c45d4 not found: ID does not exist" Mar 12 13:40:31.018438 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:31.018407 2576 scope.go:117] "RemoveContainer" containerID="e8d0112effd7363deb06937fe7b03547c5fc4014bacdfd4fd6555f270db17dbc" Mar 12 13:40:31.018710 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:40:31.018693 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8d0112effd7363deb06937fe7b03547c5fc4014bacdfd4fd6555f270db17dbc\": container with ID starting with e8d0112effd7363deb06937fe7b03547c5fc4014bacdfd4fd6555f270db17dbc not found: ID does not exist" containerID="e8d0112effd7363deb06937fe7b03547c5fc4014bacdfd4fd6555f270db17dbc" Mar 12 13:40:31.018770 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:31.018715 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8d0112effd7363deb06937fe7b03547c5fc4014bacdfd4fd6555f270db17dbc"} err="failed to get container status \"e8d0112effd7363deb06937fe7b03547c5fc4014bacdfd4fd6555f270db17dbc\": rpc error: code = NotFound desc = could not find container \"e8d0112effd7363deb06937fe7b03547c5fc4014bacdfd4fd6555f270db17dbc\": container with ID starting with 
e8d0112effd7363deb06937fe7b03547c5fc4014bacdfd4fd6555f270db17dbc not found: ID does not exist" Mar 12 13:40:31.018770 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:31.018729 2576 scope.go:117] "RemoveContainer" containerID="adf3570c8799b178a009891f42d315d201793a1cce3ebd9bec551813d4b4334c" Mar 12 13:40:31.018956 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:40:31.018941 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adf3570c8799b178a009891f42d315d201793a1cce3ebd9bec551813d4b4334c\": container with ID starting with adf3570c8799b178a009891f42d315d201793a1cce3ebd9bec551813d4b4334c not found: ID does not exist" containerID="adf3570c8799b178a009891f42d315d201793a1cce3ebd9bec551813d4b4334c" Mar 12 13:40:31.018992 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:31.018960 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adf3570c8799b178a009891f42d315d201793a1cce3ebd9bec551813d4b4334c"} err="failed to get container status \"adf3570c8799b178a009891f42d315d201793a1cce3ebd9bec551813d4b4334c\": rpc error: code = NotFound desc = could not find container \"adf3570c8799b178a009891f42d315d201793a1cce3ebd9bec551813d4b4334c\": container with ID starting with adf3570c8799b178a009891f42d315d201793a1cce3ebd9bec551813d4b4334c not found: ID does not exist" Mar 12 13:40:31.018992 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:31.018974 2576 scope.go:117] "RemoveContainer" containerID="b3084481ad7103ab65efbfd05416d08781fb98487e56a9cf78393f6fc87f0e69" Mar 12 13:40:31.019166 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:40:31.019152 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3084481ad7103ab65efbfd05416d08781fb98487e56a9cf78393f6fc87f0e69\": container with ID starting with b3084481ad7103ab65efbfd05416d08781fb98487e56a9cf78393f6fc87f0e69 not found: ID does not exist" 
containerID="b3084481ad7103ab65efbfd05416d08781fb98487e56a9cf78393f6fc87f0e69" Mar 12 13:40:31.019199 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:31.019169 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3084481ad7103ab65efbfd05416d08781fb98487e56a9cf78393f6fc87f0e69"} err="failed to get container status \"b3084481ad7103ab65efbfd05416d08781fb98487e56a9cf78393f6fc87f0e69\": rpc error: code = NotFound desc = could not find container \"b3084481ad7103ab65efbfd05416d08781fb98487e56a9cf78393f6fc87f0e69\": container with ID starting with b3084481ad7103ab65efbfd05416d08781fb98487e56a9cf78393f6fc87f0e69 not found: ID does not exist" Mar 12 13:40:31.019199 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:31.019182 2576 scope.go:117] "RemoveContainer" containerID="ae9488a0bdbdb7ac5b6bdee5f9010cedad4535d2c56e827630c1429b8b9119b3" Mar 12 13:40:31.019344 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:40:31.019331 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae9488a0bdbdb7ac5b6bdee5f9010cedad4535d2c56e827630c1429b8b9119b3\": container with ID starting with ae9488a0bdbdb7ac5b6bdee5f9010cedad4535d2c56e827630c1429b8b9119b3 not found: ID does not exist" containerID="ae9488a0bdbdb7ac5b6bdee5f9010cedad4535d2c56e827630c1429b8b9119b3" Mar 12 13:40:31.019391 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:31.019346 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae9488a0bdbdb7ac5b6bdee5f9010cedad4535d2c56e827630c1429b8b9119b3"} err="failed to get container status \"ae9488a0bdbdb7ac5b6bdee5f9010cedad4535d2c56e827630c1429b8b9119b3\": rpc error: code = NotFound desc = could not find container \"ae9488a0bdbdb7ac5b6bdee5f9010cedad4535d2c56e827630c1429b8b9119b3\": container with ID starting with ae9488a0bdbdb7ac5b6bdee5f9010cedad4535d2c56e827630c1429b8b9119b3 not found: ID does not exist" Mar 12 
13:40:31.019391 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:31.019357 2576 scope.go:117] "RemoveContainer" containerID="b83fbecdc061138b496862fbbddaf5e95863f8a7ef72ec3ca33eb8339942af1c" Mar 12 13:40:31.019586 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:40:31.019572 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b83fbecdc061138b496862fbbddaf5e95863f8a7ef72ec3ca33eb8339942af1c\": container with ID starting with b83fbecdc061138b496862fbbddaf5e95863f8a7ef72ec3ca33eb8339942af1c not found: ID does not exist" containerID="b83fbecdc061138b496862fbbddaf5e95863f8a7ef72ec3ca33eb8339942af1c" Mar 12 13:40:31.019622 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:31.019592 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b83fbecdc061138b496862fbbddaf5e95863f8a7ef72ec3ca33eb8339942af1c"} err="failed to get container status \"b83fbecdc061138b496862fbbddaf5e95863f8a7ef72ec3ca33eb8339942af1c\": rpc error: code = NotFound desc = could not find container \"b83fbecdc061138b496862fbbddaf5e95863f8a7ef72ec3ca33eb8339942af1c\": container with ID starting with b83fbecdc061138b496862fbbddaf5e95863f8a7ef72ec3ca33eb8339942af1c not found: ID does not exist" Mar 12 13:40:31.086771 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:31.086736 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad5bfa9a-7696-439c-943a-08df7f56f525" path="/var/lib/kubelet/pods/ad5bfa9a-7696-439c-943a-08df7f56f525/volumes" Mar 12 13:40:33.454798 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.454704 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 12 13:40:33.455249 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.455197 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="507c80a4-ea62-42d8-a41a-850073f8db0c" 
containerName="prometheus" containerID="cri-o://868eb4559cf2b8ff329afe0028ba3b29c7a1a6a850d8b2f0895803f2f755865f" gracePeriod=600 Mar 12 13:40:33.455337 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.455228 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="507c80a4-ea62-42d8-a41a-850073f8db0c" containerName="thanos-sidecar" containerID="cri-o://1eb1318bde03529e95a548c4626321074c8ab8643d4ac75b4e860239745f831a" gracePeriod=600 Mar 12 13:40:33.455337 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.455248 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="507c80a4-ea62-42d8-a41a-850073f8db0c" containerName="config-reloader" containerID="cri-o://450f4d49f6618fdd8f479b1b102d292ecc363d986429eb532d8d4eb22aa172ce" gracePeriod=600 Mar 12 13:40:33.455337 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.455278 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="507c80a4-ea62-42d8-a41a-850073f8db0c" containerName="kube-rbac-proxy-thanos" containerID="cri-o://352970a7121c5cb870f80b3e15c56bf05f8bae4222f11a8c2a5ba9ee59a44e78" gracePeriod=600 Mar 12 13:40:33.455513 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.455223 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="507c80a4-ea62-42d8-a41a-850073f8db0c" containerName="kube-rbac-proxy" containerID="cri-o://58a59418fc3bb8137dfe00cbee2c48a28975d99df23fcd30979cc6663326feb0" gracePeriod=600 Mar 12 13:40:33.455579 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.455489 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="507c80a4-ea62-42d8-a41a-850073f8db0c" containerName="kube-rbac-proxy-web" 
containerID="cri-o://b975258bc586b985b133d96237cb5adb5d72957069e9cfaa61049865832d7754" gracePeriod=600 Mar 12 13:40:33.758522 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.758128 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:33.827089 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.827052 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"507c80a4-ea62-42d8-a41a-850073f8db0c\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " Mar 12 13:40:33.827089 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.827099 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-secret-kube-rbac-proxy\") pod \"507c80a4-ea62-42d8-a41a-850073f8db0c\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " Mar 12 13:40:33.827340 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.827119 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/507c80a4-ea62-42d8-a41a-850073f8db0c-configmap-metrics-client-ca\") pod \"507c80a4-ea62-42d8-a41a-850073f8db0c\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " Mar 12 13:40:33.827340 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.827153 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/507c80a4-ea62-42d8-a41a-850073f8db0c-tls-assets\") pod \"507c80a4-ea62-42d8-a41a-850073f8db0c\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " Mar 12 13:40:33.827452 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.827334 2576 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"507c80a4-ea62-42d8-a41a-850073f8db0c\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " Mar 12 13:40:33.827452 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.827428 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/507c80a4-ea62-42d8-a41a-850073f8db0c-configmap-kubelet-serving-ca-bundle\") pod \"507c80a4-ea62-42d8-a41a-850073f8db0c\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " Mar 12 13:40:33.827581 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.827485 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkps5\" (UniqueName: \"kubernetes.io/projected/507c80a4-ea62-42d8-a41a-850073f8db0c-kube-api-access-nkps5\") pod \"507c80a4-ea62-42d8-a41a-850073f8db0c\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " Mar 12 13:40:33.827581 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.827521 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/507c80a4-ea62-42d8-a41a-850073f8db0c-prometheus-k8s-rulefiles-0\") pod \"507c80a4-ea62-42d8-a41a-850073f8db0c\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " Mar 12 13:40:33.827581 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.827548 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-thanos-prometheus-http-client-file\") pod \"507c80a4-ea62-42d8-a41a-850073f8db0c\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " Mar 12 13:40:33.827722 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.827564 
2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/507c80a4-ea62-42d8-a41a-850073f8db0c-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "507c80a4-ea62-42d8-a41a-850073f8db0c" (UID: "507c80a4-ea62-42d8-a41a-850073f8db0c"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 12 13:40:33.827722 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.827613 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/507c80a4-ea62-42d8-a41a-850073f8db0c-prometheus-trusted-ca-bundle\") pod \"507c80a4-ea62-42d8-a41a-850073f8db0c\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " Mar 12 13:40:33.827722 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.827644 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-secret-grpc-tls\") pod \"507c80a4-ea62-42d8-a41a-850073f8db0c\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " Mar 12 13:40:33.827871 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.827744 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/507c80a4-ea62-42d8-a41a-850073f8db0c-config-out\") pod \"507c80a4-ea62-42d8-a41a-850073f8db0c\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " Mar 12 13:40:33.827871 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.827773 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-web-config\") pod \"507c80a4-ea62-42d8-a41a-850073f8db0c\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") " Mar 12 13:40:33.827871 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.827799 2576 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-secret-prometheus-k8s-tls\") pod \"507c80a4-ea62-42d8-a41a-850073f8db0c\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") "
Mar 12 13:40:33.827871 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.827827 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-config\") pod \"507c80a4-ea62-42d8-a41a-850073f8db0c\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") "
Mar 12 13:40:33.827871 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.827825 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/507c80a4-ea62-42d8-a41a-850073f8db0c-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "507c80a4-ea62-42d8-a41a-850073f8db0c" (UID: "507c80a4-ea62-42d8-a41a-850073f8db0c"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 13:40:33.827871 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.827851 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/507c80a4-ea62-42d8-a41a-850073f8db0c-prometheus-k8s-db\") pod \"507c80a4-ea62-42d8-a41a-850073f8db0c\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") "
Mar 12 13:40:33.828139 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.827879 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/507c80a4-ea62-42d8-a41a-850073f8db0c-configmap-serving-certs-ca-bundle\") pod \"507c80a4-ea62-42d8-a41a-850073f8db0c\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") "
Mar 12 13:40:33.828139 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.827909 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-secret-metrics-client-certs\") pod \"507c80a4-ea62-42d8-a41a-850073f8db0c\" (UID: \"507c80a4-ea62-42d8-a41a-850073f8db0c\") "
Mar 12 13:40:33.828258 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.828228 2576 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/507c80a4-ea62-42d8-a41a-850073f8db0c-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\""
Mar 12 13:40:33.828258 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.828253 2576 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/507c80a4-ea62-42d8-a41a-850073f8db0c-configmap-metrics-client-ca\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\""
Mar 12 13:40:33.830824 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.830371 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/507c80a4-ea62-42d8-a41a-850073f8db0c-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "507c80a4-ea62-42d8-a41a-850073f8db0c" (UID: "507c80a4-ea62-42d8-a41a-850073f8db0c"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 13:40:33.830824 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.830387 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "507c80a4-ea62-42d8-a41a-850073f8db0c" (UID: "507c80a4-ea62-42d8-a41a-850073f8db0c"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 13:40:33.831347 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.831235 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/507c80a4-ea62-42d8-a41a-850073f8db0c-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "507c80a4-ea62-42d8-a41a-850073f8db0c" (UID: "507c80a4-ea62-42d8-a41a-850073f8db0c"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 13:40:33.831347 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.831274 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/507c80a4-ea62-42d8-a41a-850073f8db0c-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "507c80a4-ea62-42d8-a41a-850073f8db0c" (UID: "507c80a4-ea62-42d8-a41a-850073f8db0c"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 13:40:33.831548 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.831520 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/507c80a4-ea62-42d8-a41a-850073f8db0c-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "507c80a4-ea62-42d8-a41a-850073f8db0c" (UID: "507c80a4-ea62-42d8-a41a-850073f8db0c"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Mar 12 13:40:33.832159 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.832115 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/507c80a4-ea62-42d8-a41a-850073f8db0c-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "507c80a4-ea62-42d8-a41a-850073f8db0c" (UID: "507c80a4-ea62-42d8-a41a-850073f8db0c"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 12 13:40:33.832547 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.832501 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "507c80a4-ea62-42d8-a41a-850073f8db0c" (UID: "507c80a4-ea62-42d8-a41a-850073f8db0c"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 13:40:33.832649 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.832614 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "507c80a4-ea62-42d8-a41a-850073f8db0c" (UID: "507c80a4-ea62-42d8-a41a-850073f8db0c"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 13:40:33.832751 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.832730 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "507c80a4-ea62-42d8-a41a-850073f8db0c" (UID: "507c80a4-ea62-42d8-a41a-850073f8db0c"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 13:40:33.832834 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.832802 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "507c80a4-ea62-42d8-a41a-850073f8db0c" (UID: "507c80a4-ea62-42d8-a41a-850073f8db0c"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 13:40:33.833266 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.833246 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "507c80a4-ea62-42d8-a41a-850073f8db0c" (UID: "507c80a4-ea62-42d8-a41a-850073f8db0c"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 13:40:33.834692 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.834661 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/507c80a4-ea62-42d8-a41a-850073f8db0c-kube-api-access-nkps5" (OuterVolumeSpecName: "kube-api-access-nkps5") pod "507c80a4-ea62-42d8-a41a-850073f8db0c" (UID: "507c80a4-ea62-42d8-a41a-850073f8db0c"). InnerVolumeSpecName "kube-api-access-nkps5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 13:40:33.835066 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.835036 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-config" (OuterVolumeSpecName: "config") pod "507c80a4-ea62-42d8-a41a-850073f8db0c" (UID: "507c80a4-ea62-42d8-a41a-850073f8db0c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 13:40:33.835197 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.835176 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/507c80a4-ea62-42d8-a41a-850073f8db0c-config-out" (OuterVolumeSpecName: "config-out") pod "507c80a4-ea62-42d8-a41a-850073f8db0c" (UID: "507c80a4-ea62-42d8-a41a-850073f8db0c"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Mar 12 13:40:33.836218 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.836186 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "507c80a4-ea62-42d8-a41a-850073f8db0c" (UID: "507c80a4-ea62-42d8-a41a-850073f8db0c"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 13:40:33.848740 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.848708 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-web-config" (OuterVolumeSpecName: "web-config") pod "507c80a4-ea62-42d8-a41a-850073f8db0c" (UID: "507c80a4-ea62-42d8-a41a-850073f8db0c"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Mar 12 13:40:33.929062 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.929019 2576 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/507c80a4-ea62-42d8-a41a-850073f8db0c-tls-assets\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\""
Mar 12 13:40:33.929062 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.929056 2576 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\""
Mar 12 13:40:33.929062 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.929068 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nkps5\" (UniqueName: \"kubernetes.io/projected/507c80a4-ea62-42d8-a41a-850073f8db0c-kube-api-access-nkps5\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\""
Mar 12 13:40:33.929340 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.929080 2576 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/507c80a4-ea62-42d8-a41a-850073f8db0c-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\""
Mar 12 13:40:33.929340 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.929090 2576 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-thanos-prometheus-http-client-file\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\""
Mar 12 13:40:33.929340 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.929101 2576 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/507c80a4-ea62-42d8-a41a-850073f8db0c-prometheus-trusted-ca-bundle\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\""
Mar 12 13:40:33.929340 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.929110 2576 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-secret-grpc-tls\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\""
Mar 12 13:40:33.929340 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.929118 2576 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/507c80a4-ea62-42d8-a41a-850073f8db0c-config-out\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\""
Mar 12 13:40:33.929340 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.929126 2576 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-web-config\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\""
Mar 12 13:40:33.929340 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.929134 2576 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-secret-prometheus-k8s-tls\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\""
Mar 12 13:40:33.929340 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.929142 2576 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-config\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\""
Mar 12 13:40:33.929340 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.929151 2576 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/507c80a4-ea62-42d8-a41a-850073f8db0c-prometheus-k8s-db\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\""
Mar 12 13:40:33.929340 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.929162 2576 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/507c80a4-ea62-42d8-a41a-850073f8db0c-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\""
Mar 12 13:40:33.929340 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.929171 2576 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-secret-metrics-client-certs\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\""
Mar 12 13:40:33.929340 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.929180 2576 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\""
Mar 12 13:40:33.929340 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.929191 2576 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/507c80a4-ea62-42d8-a41a-850073f8db0c-secret-kube-rbac-proxy\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\""
Mar 12 13:40:33.973029 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.972989 2576 generic.go:358] "Generic (PLEG): container finished" podID="507c80a4-ea62-42d8-a41a-850073f8db0c" containerID="352970a7121c5cb870f80b3e15c56bf05f8bae4222f11a8c2a5ba9ee59a44e78" exitCode=0
Mar 12 13:40:33.973029 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.973021 2576 generic.go:358] "Generic (PLEG): container finished" podID="507c80a4-ea62-42d8-a41a-850073f8db0c" containerID="58a59418fc3bb8137dfe00cbee2c48a28975d99df23fcd30979cc6663326feb0" exitCode=0
Mar 12 13:40:33.973029 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.973030 2576 generic.go:358] "Generic (PLEG): container finished" podID="507c80a4-ea62-42d8-a41a-850073f8db0c" containerID="b975258bc586b985b133d96237cb5adb5d72957069e9cfaa61049865832d7754" exitCode=0
Mar 12 13:40:33.973029 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.973038 2576 generic.go:358] "Generic (PLEG): container finished" podID="507c80a4-ea62-42d8-a41a-850073f8db0c" containerID="1eb1318bde03529e95a548c4626321074c8ab8643d4ac75b4e860239745f831a" exitCode=0
Mar 12 13:40:33.973316 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.973046 2576 generic.go:358] "Generic (PLEG): container finished" podID="507c80a4-ea62-42d8-a41a-850073f8db0c" containerID="450f4d49f6618fdd8f479b1b102d292ecc363d986429eb532d8d4eb22aa172ce" exitCode=0
Mar 12 13:40:33.973316 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.973053 2576 generic.go:358] "Generic (PLEG): container finished" podID="507c80a4-ea62-42d8-a41a-850073f8db0c" containerID="868eb4559cf2b8ff329afe0028ba3b29c7a1a6a850d8b2f0895803f2f755865f" exitCode=0
Mar 12 13:40:33.973316 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.973065 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"507c80a4-ea62-42d8-a41a-850073f8db0c","Type":"ContainerDied","Data":"352970a7121c5cb870f80b3e15c56bf05f8bae4222f11a8c2a5ba9ee59a44e78"}
Mar 12 13:40:33.973316 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.973098 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 13:40:33.973316 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.973108 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"507c80a4-ea62-42d8-a41a-850073f8db0c","Type":"ContainerDied","Data":"58a59418fc3bb8137dfe00cbee2c48a28975d99df23fcd30979cc6663326feb0"}
Mar 12 13:40:33.973316 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.973120 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"507c80a4-ea62-42d8-a41a-850073f8db0c","Type":"ContainerDied","Data":"b975258bc586b985b133d96237cb5adb5d72957069e9cfaa61049865832d7754"}
Mar 12 13:40:33.973316 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.973129 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"507c80a4-ea62-42d8-a41a-850073f8db0c","Type":"ContainerDied","Data":"1eb1318bde03529e95a548c4626321074c8ab8643d4ac75b4e860239745f831a"}
Mar 12 13:40:33.973316 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.973138 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"507c80a4-ea62-42d8-a41a-850073f8db0c","Type":"ContainerDied","Data":"450f4d49f6618fdd8f479b1b102d292ecc363d986429eb532d8d4eb22aa172ce"}
Mar 12 13:40:33.973316 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.973147 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"507c80a4-ea62-42d8-a41a-850073f8db0c","Type":"ContainerDied","Data":"868eb4559cf2b8ff329afe0028ba3b29c7a1a6a850d8b2f0895803f2f755865f"}
Mar 12 13:40:33.973316 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.973156 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"507c80a4-ea62-42d8-a41a-850073f8db0c","Type":"ContainerDied","Data":"ce7adcae58e11af6c6326b178efe7407bb99b268456ab1f23c155dbb4539f614"}
Mar 12 13:40:33.973316 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.973179 2576 scope.go:117] "RemoveContainer" containerID="352970a7121c5cb870f80b3e15c56bf05f8bae4222f11a8c2a5ba9ee59a44e78"
Mar 12 13:40:33.982289 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.982259 2576 scope.go:117] "RemoveContainer" containerID="58a59418fc3bb8137dfe00cbee2c48a28975d99df23fcd30979cc6663326feb0"
Mar 12 13:40:33.990374 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.990354 2576 scope.go:117] "RemoveContainer" containerID="b975258bc586b985b133d96237cb5adb5d72957069e9cfaa61049865832d7754"
Mar 12 13:40:33.998417 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:33.998398 2576 scope.go:117] "RemoveContainer" containerID="1eb1318bde03529e95a548c4626321074c8ab8643d4ac75b4e860239745f831a"
Mar 12 13:40:34.007661 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.007474 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 12 13:40:34.007761 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.007654 2576 scope.go:117] "RemoveContainer" containerID="450f4d49f6618fdd8f479b1b102d292ecc363d986429eb532d8d4eb22aa172ce"
Mar 12 13:40:34.009620 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.009596 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 12 13:40:34.016019 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.016000 2576 scope.go:117] "RemoveContainer" containerID="868eb4559cf2b8ff329afe0028ba3b29c7a1a6a850d8b2f0895803f2f755865f"
Mar 12 13:40:34.024688 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.024665 2576 scope.go:117] "RemoveContainer" containerID="9fc55385db5d52f47dc3c904a404ec477016c6082c156f2139e444fa46f10ef6"
Mar 12 13:40:34.032929 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.032908 2576 scope.go:117] "RemoveContainer" containerID="352970a7121c5cb870f80b3e15c56bf05f8bae4222f11a8c2a5ba9ee59a44e78"
Mar 12 13:40:34.033220 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:40:34.033198 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"352970a7121c5cb870f80b3e15c56bf05f8bae4222f11a8c2a5ba9ee59a44e78\": container with ID starting with 352970a7121c5cb870f80b3e15c56bf05f8bae4222f11a8c2a5ba9ee59a44e78 not found: ID does not exist" containerID="352970a7121c5cb870f80b3e15c56bf05f8bae4222f11a8c2a5ba9ee59a44e78"
Mar 12 13:40:34.033279 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.033230 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"352970a7121c5cb870f80b3e15c56bf05f8bae4222f11a8c2a5ba9ee59a44e78"} err="failed to get container status \"352970a7121c5cb870f80b3e15c56bf05f8bae4222f11a8c2a5ba9ee59a44e78\": rpc error: code = NotFound desc = could not find container \"352970a7121c5cb870f80b3e15c56bf05f8bae4222f11a8c2a5ba9ee59a44e78\": container with ID starting with 352970a7121c5cb870f80b3e15c56bf05f8bae4222f11a8c2a5ba9ee59a44e78 not found: ID does not exist"
Mar 12 13:40:34.033279 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.033251 2576 scope.go:117] "RemoveContainer" containerID="58a59418fc3bb8137dfe00cbee2c48a28975d99df23fcd30979cc6663326feb0"
Mar 12 13:40:34.033530 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:40:34.033505 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58a59418fc3bb8137dfe00cbee2c48a28975d99df23fcd30979cc6663326feb0\": container with ID starting with 58a59418fc3bb8137dfe00cbee2c48a28975d99df23fcd30979cc6663326feb0 not found: ID does not exist" containerID="58a59418fc3bb8137dfe00cbee2c48a28975d99df23fcd30979cc6663326feb0"
Mar 12 13:40:34.033530 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.033527 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58a59418fc3bb8137dfe00cbee2c48a28975d99df23fcd30979cc6663326feb0"} err="failed to get container status \"58a59418fc3bb8137dfe00cbee2c48a28975d99df23fcd30979cc6663326feb0\": rpc error: code = NotFound desc = could not find container \"58a59418fc3bb8137dfe00cbee2c48a28975d99df23fcd30979cc6663326feb0\": container with ID starting with 58a59418fc3bb8137dfe00cbee2c48a28975d99df23fcd30979cc6663326feb0 not found: ID does not exist"
Mar 12 13:40:34.033658 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.033541 2576 scope.go:117] "RemoveContainer" containerID="b975258bc586b985b133d96237cb5adb5d72957069e9cfaa61049865832d7754"
Mar 12 13:40:34.033808 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:40:34.033791 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b975258bc586b985b133d96237cb5adb5d72957069e9cfaa61049865832d7754\": container with ID starting with b975258bc586b985b133d96237cb5adb5d72957069e9cfaa61049865832d7754 not found: ID does not exist" containerID="b975258bc586b985b133d96237cb5adb5d72957069e9cfaa61049865832d7754"
Mar 12 13:40:34.033876 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.033816 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b975258bc586b985b133d96237cb5adb5d72957069e9cfaa61049865832d7754"} err="failed to get container status \"b975258bc586b985b133d96237cb5adb5d72957069e9cfaa61049865832d7754\": rpc error: code = NotFound desc = could not find container \"b975258bc586b985b133d96237cb5adb5d72957069e9cfaa61049865832d7754\": container with ID starting with b975258bc586b985b133d96237cb5adb5d72957069e9cfaa61049865832d7754 not found: ID does not exist"
Mar 12 13:40:34.033876 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.033833 2576 scope.go:117] "RemoveContainer" containerID="1eb1318bde03529e95a548c4626321074c8ab8643d4ac75b4e860239745f831a"
Mar 12 13:40:34.034075 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:40:34.034057 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1eb1318bde03529e95a548c4626321074c8ab8643d4ac75b4e860239745f831a\": container with ID starting with 1eb1318bde03529e95a548c4626321074c8ab8643d4ac75b4e860239745f831a not found: ID does not exist" containerID="1eb1318bde03529e95a548c4626321074c8ab8643d4ac75b4e860239745f831a"
Mar 12 13:40:34.034119 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.034080 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eb1318bde03529e95a548c4626321074c8ab8643d4ac75b4e860239745f831a"} err="failed to get container status \"1eb1318bde03529e95a548c4626321074c8ab8643d4ac75b4e860239745f831a\": rpc error: code = NotFound desc = could not find container \"1eb1318bde03529e95a548c4626321074c8ab8643d4ac75b4e860239745f831a\": container with ID starting with 1eb1318bde03529e95a548c4626321074c8ab8643d4ac75b4e860239745f831a not found: ID does not exist"
Mar 12 13:40:34.034119 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.034094 2576 scope.go:117] "RemoveContainer" containerID="450f4d49f6618fdd8f479b1b102d292ecc363d986429eb532d8d4eb22aa172ce"
Mar 12 13:40:34.034307 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:40:34.034291 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"450f4d49f6618fdd8f479b1b102d292ecc363d986429eb532d8d4eb22aa172ce\": container with ID starting with 450f4d49f6618fdd8f479b1b102d292ecc363d986429eb532d8d4eb22aa172ce not found: ID does not exist" containerID="450f4d49f6618fdd8f479b1b102d292ecc363d986429eb532d8d4eb22aa172ce"
Mar 12 13:40:34.034496 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.034310 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"450f4d49f6618fdd8f479b1b102d292ecc363d986429eb532d8d4eb22aa172ce"} err="failed to get container status \"450f4d49f6618fdd8f479b1b102d292ecc363d986429eb532d8d4eb22aa172ce\": rpc error: code = NotFound desc = could not find container \"450f4d49f6618fdd8f479b1b102d292ecc363d986429eb532d8d4eb22aa172ce\": container with ID starting with 450f4d49f6618fdd8f479b1b102d292ecc363d986429eb532d8d4eb22aa172ce not found: ID does not exist"
Mar 12 13:40:34.034496 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.034322 2576 scope.go:117] "RemoveContainer" containerID="868eb4559cf2b8ff329afe0028ba3b29c7a1a6a850d8b2f0895803f2f755865f"
Mar 12 13:40:34.034620 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:40:34.034609 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"868eb4559cf2b8ff329afe0028ba3b29c7a1a6a850d8b2f0895803f2f755865f\": container with ID starting with 868eb4559cf2b8ff329afe0028ba3b29c7a1a6a850d8b2f0895803f2f755865f not found: ID does not exist" containerID="868eb4559cf2b8ff329afe0028ba3b29c7a1a6a850d8b2f0895803f2f755865f"
Mar 12 13:40:34.034662 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.034630 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"868eb4559cf2b8ff329afe0028ba3b29c7a1a6a850d8b2f0895803f2f755865f"} err="failed to get container status \"868eb4559cf2b8ff329afe0028ba3b29c7a1a6a850d8b2f0895803f2f755865f\": rpc error: code = NotFound desc = could not find container \"868eb4559cf2b8ff329afe0028ba3b29c7a1a6a850d8b2f0895803f2f755865f\": container with ID starting with 868eb4559cf2b8ff329afe0028ba3b29c7a1a6a850d8b2f0895803f2f755865f not found: ID does not exist"
Mar 12 13:40:34.034662 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.034643 2576 scope.go:117] "RemoveContainer" containerID="9fc55385db5d52f47dc3c904a404ec477016c6082c156f2139e444fa46f10ef6"
Mar 12 13:40:34.034848 ip-10-0-142-16 kubenswrapper[2576]: E0312 13:40:34.034832 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fc55385db5d52f47dc3c904a404ec477016c6082c156f2139e444fa46f10ef6\": container with ID starting with 9fc55385db5d52f47dc3c904a404ec477016c6082c156f2139e444fa46f10ef6 not found: ID does not exist" containerID="9fc55385db5d52f47dc3c904a404ec477016c6082c156f2139e444fa46f10ef6"
Mar 12 13:40:34.034888 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.034852 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fc55385db5d52f47dc3c904a404ec477016c6082c156f2139e444fa46f10ef6"} err="failed to get container status \"9fc55385db5d52f47dc3c904a404ec477016c6082c156f2139e444fa46f10ef6\": rpc error: code = NotFound desc = could not find container \"9fc55385db5d52f47dc3c904a404ec477016c6082c156f2139e444fa46f10ef6\": container with ID starting with 9fc55385db5d52f47dc3c904a404ec477016c6082c156f2139e444fa46f10ef6 not found: ID does not exist"
Mar 12 13:40:34.034888 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.034866 2576 scope.go:117] "RemoveContainer" containerID="352970a7121c5cb870f80b3e15c56bf05f8bae4222f11a8c2a5ba9ee59a44e78"
Mar 12 13:40:34.035068 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.035052 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"352970a7121c5cb870f80b3e15c56bf05f8bae4222f11a8c2a5ba9ee59a44e78"} err="failed to get container status \"352970a7121c5cb870f80b3e15c56bf05f8bae4222f11a8c2a5ba9ee59a44e78\": rpc error: code = NotFound desc = could not find container \"352970a7121c5cb870f80b3e15c56bf05f8bae4222f11a8c2a5ba9ee59a44e78\": container with ID starting with 352970a7121c5cb870f80b3e15c56bf05f8bae4222f11a8c2a5ba9ee59a44e78 not found: ID does not exist"
Mar 12 13:40:34.035110 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.035068 2576 scope.go:117] "RemoveContainer" containerID="58a59418fc3bb8137dfe00cbee2c48a28975d99df23fcd30979cc6663326feb0"
Mar 12 13:40:34.035249 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.035234 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58a59418fc3bb8137dfe00cbee2c48a28975d99df23fcd30979cc6663326feb0"} err="failed to get container status \"58a59418fc3bb8137dfe00cbee2c48a28975d99df23fcd30979cc6663326feb0\": rpc error: code = NotFound desc = could not find container \"58a59418fc3bb8137dfe00cbee2c48a28975d99df23fcd30979cc6663326feb0\": container with ID starting with 58a59418fc3bb8137dfe00cbee2c48a28975d99df23fcd30979cc6663326feb0 not found: ID does not exist"
Mar 12 13:40:34.035287 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.035249 2576 scope.go:117] "RemoveContainer" containerID="b975258bc586b985b133d96237cb5adb5d72957069e9cfaa61049865832d7754"
Mar 12 13:40:34.035435 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.035416 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b975258bc586b985b133d96237cb5adb5d72957069e9cfaa61049865832d7754"} err="failed to get container status \"b975258bc586b985b133d96237cb5adb5d72957069e9cfaa61049865832d7754\": rpc error: code = NotFound desc = could not find container \"b975258bc586b985b133d96237cb5adb5d72957069e9cfaa61049865832d7754\": container with ID starting with b975258bc586b985b133d96237cb5adb5d72957069e9cfaa61049865832d7754 not found: ID does not exist"
Mar 12 13:40:34.035490 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.035435 2576 scope.go:117] "RemoveContainer" containerID="1eb1318bde03529e95a548c4626321074c8ab8643d4ac75b4e860239745f831a"
Mar 12 13:40:34.035664 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.035649 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eb1318bde03529e95a548c4626321074c8ab8643d4ac75b4e860239745f831a"} err="failed to get container status \"1eb1318bde03529e95a548c4626321074c8ab8643d4ac75b4e860239745f831a\": rpc error: code = NotFound desc = could not find container \"1eb1318bde03529e95a548c4626321074c8ab8643d4ac75b4e860239745f831a\": container with ID starting with 1eb1318bde03529e95a548c4626321074c8ab8643d4ac75b4e860239745f831a not found: ID does not exist"
Mar 12 13:40:34.035705 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.035664 2576 scope.go:117] "RemoveContainer" containerID="450f4d49f6618fdd8f479b1b102d292ecc363d986429eb532d8d4eb22aa172ce"
Mar 12 13:40:34.035873 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.035854 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"450f4d49f6618fdd8f479b1b102d292ecc363d986429eb532d8d4eb22aa172ce"} err="failed to get container status \"450f4d49f6618fdd8f479b1b102d292ecc363d986429eb532d8d4eb22aa172ce\": rpc error: code = NotFound desc = could not find container \"450f4d49f6618fdd8f479b1b102d292ecc363d986429eb532d8d4eb22aa172ce\": container with ID starting with 450f4d49f6618fdd8f479b1b102d292ecc363d986429eb532d8d4eb22aa172ce not found: ID does not exist"
Mar 12 13:40:34.035946 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.035876 2576 scope.go:117] "RemoveContainer" containerID="868eb4559cf2b8ff329afe0028ba3b29c7a1a6a850d8b2f0895803f2f755865f"
Mar 12 13:40:34.036088 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.036072 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"868eb4559cf2b8ff329afe0028ba3b29c7a1a6a850d8b2f0895803f2f755865f"} err="failed to get container status \"868eb4559cf2b8ff329afe0028ba3b29c7a1a6a850d8b2f0895803f2f755865f\": rpc error: code = NotFound desc = could not find container \"868eb4559cf2b8ff329afe0028ba3b29c7a1a6a850d8b2f0895803f2f755865f\": container with ID starting with 868eb4559cf2b8ff329afe0028ba3b29c7a1a6a850d8b2f0895803f2f755865f not found: ID does not exist"
Mar 12 13:40:34.036135 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.036089 2576 scope.go:117] "RemoveContainer" containerID="9fc55385db5d52f47dc3c904a404ec477016c6082c156f2139e444fa46f10ef6"
Mar 12 13:40:34.036262 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.036248 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fc55385db5d52f47dc3c904a404ec477016c6082c156f2139e444fa46f10ef6"} err="failed to get container status \"9fc55385db5d52f47dc3c904a404ec477016c6082c156f2139e444fa46f10ef6\": rpc error: code = NotFound desc = could not find container \"9fc55385db5d52f47dc3c904a404ec477016c6082c156f2139e444fa46f10ef6\": container with ID starting with 9fc55385db5d52f47dc3c904a404ec477016c6082c156f2139e444fa46f10ef6 not found: ID does not exist"
Mar 12 13:40:34.036359 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.036262 2576 scope.go:117] "RemoveContainer" containerID="352970a7121c5cb870f80b3e15c56bf05f8bae4222f11a8c2a5ba9ee59a44e78"
Mar 12 13:40:34.036436 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.036421 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"352970a7121c5cb870f80b3e15c56bf05f8bae4222f11a8c2a5ba9ee59a44e78"} err="failed to get container status \"352970a7121c5cb870f80b3e15c56bf05f8bae4222f11a8c2a5ba9ee59a44e78\": rpc error: code = NotFound desc = could not find container \"352970a7121c5cb870f80b3e15c56bf05f8bae4222f11a8c2a5ba9ee59a44e78\": container with ID starting with 352970a7121c5cb870f80b3e15c56bf05f8bae4222f11a8c2a5ba9ee59a44e78 not found: ID does not exist"
Mar 12 13:40:34.036518 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.036437 2576 scope.go:117] "RemoveContainer" containerID="58a59418fc3bb8137dfe00cbee2c48a28975d99df23fcd30979cc6663326feb0"
Mar 12 13:40:34.036697 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.036680 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58a59418fc3bb8137dfe00cbee2c48a28975d99df23fcd30979cc6663326feb0"} err="failed to get container status \"58a59418fc3bb8137dfe00cbee2c48a28975d99df23fcd30979cc6663326feb0\": rpc error: code = NotFound desc = could not find container \"58a59418fc3bb8137dfe00cbee2c48a28975d99df23fcd30979cc6663326feb0\": container with ID starting with 58a59418fc3bb8137dfe00cbee2c48a28975d99df23fcd30979cc6663326feb0 not found: ID does not exist"
Mar 12 13:40:34.036738 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.036698 2576 scope.go:117] "RemoveContainer" containerID="b975258bc586b985b133d96237cb5adb5d72957069e9cfaa61049865832d7754"
Mar 12 13:40:34.036901 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.036877 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b975258bc586b985b133d96237cb5adb5d72957069e9cfaa61049865832d7754"} err="failed to get container status \"b975258bc586b985b133d96237cb5adb5d72957069e9cfaa61049865832d7754\": rpc error: code = NotFound desc = could not find container \"b975258bc586b985b133d96237cb5adb5d72957069e9cfaa61049865832d7754\": container with ID starting with b975258bc586b985b133d96237cb5adb5d72957069e9cfaa61049865832d7754 not found: ID does not exist"
Mar 12 13:40:34.036901 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.036900 2576 scope.go:117] "RemoveContainer" containerID="1eb1318bde03529e95a548c4626321074c8ab8643d4ac75b4e860239745f831a"
Mar 12 13:40:34.037120 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.037100 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eb1318bde03529e95a548c4626321074c8ab8643d4ac75b4e860239745f831a"} err="failed to get container status \"1eb1318bde03529e95a548c4626321074c8ab8643d4ac75b4e860239745f831a\": rpc error: code = NotFound desc = could not find container \"1eb1318bde03529e95a548c4626321074c8ab8643d4ac75b4e860239745f831a\": container with ID starting with 
1eb1318bde03529e95a548c4626321074c8ab8643d4ac75b4e860239745f831a not found: ID does not exist" Mar 12 13:40:34.037168 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.037121 2576 scope.go:117] "RemoveContainer" containerID="450f4d49f6618fdd8f479b1b102d292ecc363d986429eb532d8d4eb22aa172ce" Mar 12 13:40:34.037335 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.037319 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"450f4d49f6618fdd8f479b1b102d292ecc363d986429eb532d8d4eb22aa172ce"} err="failed to get container status \"450f4d49f6618fdd8f479b1b102d292ecc363d986429eb532d8d4eb22aa172ce\": rpc error: code = NotFound desc = could not find container \"450f4d49f6618fdd8f479b1b102d292ecc363d986429eb532d8d4eb22aa172ce\": container with ID starting with 450f4d49f6618fdd8f479b1b102d292ecc363d986429eb532d8d4eb22aa172ce not found: ID does not exist" Mar 12 13:40:34.037335 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.037334 2576 scope.go:117] "RemoveContainer" containerID="868eb4559cf2b8ff329afe0028ba3b29c7a1a6a850d8b2f0895803f2f755865f" Mar 12 13:40:34.037567 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.037551 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"868eb4559cf2b8ff329afe0028ba3b29c7a1a6a850d8b2f0895803f2f755865f"} err="failed to get container status \"868eb4559cf2b8ff329afe0028ba3b29c7a1a6a850d8b2f0895803f2f755865f\": rpc error: code = NotFound desc = could not find container \"868eb4559cf2b8ff329afe0028ba3b29c7a1a6a850d8b2f0895803f2f755865f\": container with ID starting with 868eb4559cf2b8ff329afe0028ba3b29c7a1a6a850d8b2f0895803f2f755865f not found: ID does not exist" Mar 12 13:40:34.037567 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.037567 2576 scope.go:117] "RemoveContainer" containerID="9fc55385db5d52f47dc3c904a404ec477016c6082c156f2139e444fa46f10ef6" Mar 12 13:40:34.037771 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.037751 2576 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fc55385db5d52f47dc3c904a404ec477016c6082c156f2139e444fa46f10ef6"} err="failed to get container status \"9fc55385db5d52f47dc3c904a404ec477016c6082c156f2139e444fa46f10ef6\": rpc error: code = NotFound desc = could not find container \"9fc55385db5d52f47dc3c904a404ec477016c6082c156f2139e444fa46f10ef6\": container with ID starting with 9fc55385db5d52f47dc3c904a404ec477016c6082c156f2139e444fa46f10ef6 not found: ID does not exist" Mar 12 13:40:34.037819 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.037772 2576 scope.go:117] "RemoveContainer" containerID="352970a7121c5cb870f80b3e15c56bf05f8bae4222f11a8c2a5ba9ee59a44e78" Mar 12 13:40:34.037982 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.037965 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"352970a7121c5cb870f80b3e15c56bf05f8bae4222f11a8c2a5ba9ee59a44e78"} err="failed to get container status \"352970a7121c5cb870f80b3e15c56bf05f8bae4222f11a8c2a5ba9ee59a44e78\": rpc error: code = NotFound desc = could not find container \"352970a7121c5cb870f80b3e15c56bf05f8bae4222f11a8c2a5ba9ee59a44e78\": container with ID starting with 352970a7121c5cb870f80b3e15c56bf05f8bae4222f11a8c2a5ba9ee59a44e78 not found: ID does not exist" Mar 12 13:40:34.038021 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.037983 2576 scope.go:117] "RemoveContainer" containerID="58a59418fc3bb8137dfe00cbee2c48a28975d99df23fcd30979cc6663326feb0" Mar 12 13:40:34.038154 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.038140 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58a59418fc3bb8137dfe00cbee2c48a28975d99df23fcd30979cc6663326feb0"} err="failed to get container status \"58a59418fc3bb8137dfe00cbee2c48a28975d99df23fcd30979cc6663326feb0\": rpc error: code = NotFound desc = could not find container 
\"58a59418fc3bb8137dfe00cbee2c48a28975d99df23fcd30979cc6663326feb0\": container with ID starting with 58a59418fc3bb8137dfe00cbee2c48a28975d99df23fcd30979cc6663326feb0 not found: ID does not exist" Mar 12 13:40:34.038192 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.038154 2576 scope.go:117] "RemoveContainer" containerID="b975258bc586b985b133d96237cb5adb5d72957069e9cfaa61049865832d7754" Mar 12 13:40:34.038304 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.038290 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b975258bc586b985b133d96237cb5adb5d72957069e9cfaa61049865832d7754"} err="failed to get container status \"b975258bc586b985b133d96237cb5adb5d72957069e9cfaa61049865832d7754\": rpc error: code = NotFound desc = could not find container \"b975258bc586b985b133d96237cb5adb5d72957069e9cfaa61049865832d7754\": container with ID starting with b975258bc586b985b133d96237cb5adb5d72957069e9cfaa61049865832d7754 not found: ID does not exist" Mar 12 13:40:34.038346 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.038303 2576 scope.go:117] "RemoveContainer" containerID="1eb1318bde03529e95a548c4626321074c8ab8643d4ac75b4e860239745f831a" Mar 12 13:40:34.038497 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.038475 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eb1318bde03529e95a548c4626321074c8ab8643d4ac75b4e860239745f831a"} err="failed to get container status \"1eb1318bde03529e95a548c4626321074c8ab8643d4ac75b4e860239745f831a\": rpc error: code = NotFound desc = could not find container \"1eb1318bde03529e95a548c4626321074c8ab8643d4ac75b4e860239745f831a\": container with ID starting with 1eb1318bde03529e95a548c4626321074c8ab8643d4ac75b4e860239745f831a not found: ID does not exist" Mar 12 13:40:34.038549 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.038499 2576 scope.go:117] "RemoveContainer" 
containerID="450f4d49f6618fdd8f479b1b102d292ecc363d986429eb532d8d4eb22aa172ce" Mar 12 13:40:34.038730 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.038709 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"450f4d49f6618fdd8f479b1b102d292ecc363d986429eb532d8d4eb22aa172ce"} err="failed to get container status \"450f4d49f6618fdd8f479b1b102d292ecc363d986429eb532d8d4eb22aa172ce\": rpc error: code = NotFound desc = could not find container \"450f4d49f6618fdd8f479b1b102d292ecc363d986429eb532d8d4eb22aa172ce\": container with ID starting with 450f4d49f6618fdd8f479b1b102d292ecc363d986429eb532d8d4eb22aa172ce not found: ID does not exist" Mar 12 13:40:34.038798 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.038732 2576 scope.go:117] "RemoveContainer" containerID="868eb4559cf2b8ff329afe0028ba3b29c7a1a6a850d8b2f0895803f2f755865f" Mar 12 13:40:34.038989 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.038972 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"868eb4559cf2b8ff329afe0028ba3b29c7a1a6a850d8b2f0895803f2f755865f"} err="failed to get container status \"868eb4559cf2b8ff329afe0028ba3b29c7a1a6a850d8b2f0895803f2f755865f\": rpc error: code = NotFound desc = could not find container \"868eb4559cf2b8ff329afe0028ba3b29c7a1a6a850d8b2f0895803f2f755865f\": container with ID starting with 868eb4559cf2b8ff329afe0028ba3b29c7a1a6a850d8b2f0895803f2f755865f not found: ID does not exist" Mar 12 13:40:34.039058 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.038991 2576 scope.go:117] "RemoveContainer" containerID="9fc55385db5d52f47dc3c904a404ec477016c6082c156f2139e444fa46f10ef6" Mar 12 13:40:34.039190 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.039169 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fc55385db5d52f47dc3c904a404ec477016c6082c156f2139e444fa46f10ef6"} err="failed to get container status 
\"9fc55385db5d52f47dc3c904a404ec477016c6082c156f2139e444fa46f10ef6\": rpc error: code = NotFound desc = could not find container \"9fc55385db5d52f47dc3c904a404ec477016c6082c156f2139e444fa46f10ef6\": container with ID starting with 9fc55385db5d52f47dc3c904a404ec477016c6082c156f2139e444fa46f10ef6 not found: ID does not exist" Mar 12 13:40:34.039239 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.039195 2576 scope.go:117] "RemoveContainer" containerID="352970a7121c5cb870f80b3e15c56bf05f8bae4222f11a8c2a5ba9ee59a44e78" Mar 12 13:40:34.039419 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.039402 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"352970a7121c5cb870f80b3e15c56bf05f8bae4222f11a8c2a5ba9ee59a44e78"} err="failed to get container status \"352970a7121c5cb870f80b3e15c56bf05f8bae4222f11a8c2a5ba9ee59a44e78\": rpc error: code = NotFound desc = could not find container \"352970a7121c5cb870f80b3e15c56bf05f8bae4222f11a8c2a5ba9ee59a44e78\": container with ID starting with 352970a7121c5cb870f80b3e15c56bf05f8bae4222f11a8c2a5ba9ee59a44e78 not found: ID does not exist" Mar 12 13:40:34.039511 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.039420 2576 scope.go:117] "RemoveContainer" containerID="58a59418fc3bb8137dfe00cbee2c48a28975d99df23fcd30979cc6663326feb0" Mar 12 13:40:34.039681 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.039663 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58a59418fc3bb8137dfe00cbee2c48a28975d99df23fcd30979cc6663326feb0"} err="failed to get container status \"58a59418fc3bb8137dfe00cbee2c48a28975d99df23fcd30979cc6663326feb0\": rpc error: code = NotFound desc = could not find container \"58a59418fc3bb8137dfe00cbee2c48a28975d99df23fcd30979cc6663326feb0\": container with ID starting with 58a59418fc3bb8137dfe00cbee2c48a28975d99df23fcd30979cc6663326feb0 not found: ID does not exist" Mar 12 13:40:34.039723 ip-10-0-142-16 
kubenswrapper[2576]: I0312 13:40:34.039682 2576 scope.go:117] "RemoveContainer" containerID="b975258bc586b985b133d96237cb5adb5d72957069e9cfaa61049865832d7754" Mar 12 13:40:34.039904 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.039887 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b975258bc586b985b133d96237cb5adb5d72957069e9cfaa61049865832d7754"} err="failed to get container status \"b975258bc586b985b133d96237cb5adb5d72957069e9cfaa61049865832d7754\": rpc error: code = NotFound desc = could not find container \"b975258bc586b985b133d96237cb5adb5d72957069e9cfaa61049865832d7754\": container with ID starting with b975258bc586b985b133d96237cb5adb5d72957069e9cfaa61049865832d7754 not found: ID does not exist" Mar 12 13:40:34.039973 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.039905 2576 scope.go:117] "RemoveContainer" containerID="1eb1318bde03529e95a548c4626321074c8ab8643d4ac75b4e860239745f831a" Mar 12 13:40:34.040116 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.040099 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eb1318bde03529e95a548c4626321074c8ab8643d4ac75b4e860239745f831a"} err="failed to get container status \"1eb1318bde03529e95a548c4626321074c8ab8643d4ac75b4e860239745f831a\": rpc error: code = NotFound desc = could not find container \"1eb1318bde03529e95a548c4626321074c8ab8643d4ac75b4e860239745f831a\": container with ID starting with 1eb1318bde03529e95a548c4626321074c8ab8643d4ac75b4e860239745f831a not found: ID does not exist" Mar 12 13:40:34.040159 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.040117 2576 scope.go:117] "RemoveContainer" containerID="450f4d49f6618fdd8f479b1b102d292ecc363d986429eb532d8d4eb22aa172ce" Mar 12 13:40:34.040314 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.040297 2576 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"450f4d49f6618fdd8f479b1b102d292ecc363d986429eb532d8d4eb22aa172ce"} err="failed to get container status \"450f4d49f6618fdd8f479b1b102d292ecc363d986429eb532d8d4eb22aa172ce\": rpc error: code = NotFound desc = could not find container \"450f4d49f6618fdd8f479b1b102d292ecc363d986429eb532d8d4eb22aa172ce\": container with ID starting with 450f4d49f6618fdd8f479b1b102d292ecc363d986429eb532d8d4eb22aa172ce not found: ID does not exist" Mar 12 13:40:34.040381 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.040316 2576 scope.go:117] "RemoveContainer" containerID="868eb4559cf2b8ff329afe0028ba3b29c7a1a6a850d8b2f0895803f2f755865f" Mar 12 13:40:34.040561 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.040541 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"868eb4559cf2b8ff329afe0028ba3b29c7a1a6a850d8b2f0895803f2f755865f"} err="failed to get container status \"868eb4559cf2b8ff329afe0028ba3b29c7a1a6a850d8b2f0895803f2f755865f\": rpc error: code = NotFound desc = could not find container \"868eb4559cf2b8ff329afe0028ba3b29c7a1a6a850d8b2f0895803f2f755865f\": container with ID starting with 868eb4559cf2b8ff329afe0028ba3b29c7a1a6a850d8b2f0895803f2f755865f not found: ID does not exist" Mar 12 13:40:34.040645 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.040561 2576 scope.go:117] "RemoveContainer" containerID="9fc55385db5d52f47dc3c904a404ec477016c6082c156f2139e444fa46f10ef6" Mar 12 13:40:34.040835 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.040815 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fc55385db5d52f47dc3c904a404ec477016c6082c156f2139e444fa46f10ef6"} err="failed to get container status \"9fc55385db5d52f47dc3c904a404ec477016c6082c156f2139e444fa46f10ef6\": rpc error: code = NotFound desc = could not find container \"9fc55385db5d52f47dc3c904a404ec477016c6082c156f2139e444fa46f10ef6\": container with ID starting with 
9fc55385db5d52f47dc3c904a404ec477016c6082c156f2139e444fa46f10ef6 not found: ID does not exist" Mar 12 13:40:34.040890 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.040835 2576 scope.go:117] "RemoveContainer" containerID="352970a7121c5cb870f80b3e15c56bf05f8bae4222f11a8c2a5ba9ee59a44e78" Mar 12 13:40:34.041029 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.041011 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"352970a7121c5cb870f80b3e15c56bf05f8bae4222f11a8c2a5ba9ee59a44e78"} err="failed to get container status \"352970a7121c5cb870f80b3e15c56bf05f8bae4222f11a8c2a5ba9ee59a44e78\": rpc error: code = NotFound desc = could not find container \"352970a7121c5cb870f80b3e15c56bf05f8bae4222f11a8c2a5ba9ee59a44e78\": container with ID starting with 352970a7121c5cb870f80b3e15c56bf05f8bae4222f11a8c2a5ba9ee59a44e78 not found: ID does not exist" Mar 12 13:40:34.041073 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.041029 2576 scope.go:117] "RemoveContainer" containerID="58a59418fc3bb8137dfe00cbee2c48a28975d99df23fcd30979cc6663326feb0" Mar 12 13:40:34.041249 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.041218 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58a59418fc3bb8137dfe00cbee2c48a28975d99df23fcd30979cc6663326feb0"} err="failed to get container status \"58a59418fc3bb8137dfe00cbee2c48a28975d99df23fcd30979cc6663326feb0\": rpc error: code = NotFound desc = could not find container \"58a59418fc3bb8137dfe00cbee2c48a28975d99df23fcd30979cc6663326feb0\": container with ID starting with 58a59418fc3bb8137dfe00cbee2c48a28975d99df23fcd30979cc6663326feb0 not found: ID does not exist" Mar 12 13:40:34.041249 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.041244 2576 scope.go:117] "RemoveContainer" containerID="b975258bc586b985b133d96237cb5adb5d72957069e9cfaa61049865832d7754" Mar 12 13:40:34.041446 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.041427 2576 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b975258bc586b985b133d96237cb5adb5d72957069e9cfaa61049865832d7754"} err="failed to get container status \"b975258bc586b985b133d96237cb5adb5d72957069e9cfaa61049865832d7754\": rpc error: code = NotFound desc = could not find container \"b975258bc586b985b133d96237cb5adb5d72957069e9cfaa61049865832d7754\": container with ID starting with b975258bc586b985b133d96237cb5adb5d72957069e9cfaa61049865832d7754 not found: ID does not exist" Mar 12 13:40:34.041527 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.041448 2576 scope.go:117] "RemoveContainer" containerID="1eb1318bde03529e95a548c4626321074c8ab8643d4ac75b4e860239745f831a" Mar 12 13:40:34.041654 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.041639 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eb1318bde03529e95a548c4626321074c8ab8643d4ac75b4e860239745f831a"} err="failed to get container status \"1eb1318bde03529e95a548c4626321074c8ab8643d4ac75b4e860239745f831a\": rpc error: code = NotFound desc = could not find container \"1eb1318bde03529e95a548c4626321074c8ab8643d4ac75b4e860239745f831a\": container with ID starting with 1eb1318bde03529e95a548c4626321074c8ab8643d4ac75b4e860239745f831a not found: ID does not exist" Mar 12 13:40:34.041700 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.041654 2576 scope.go:117] "RemoveContainer" containerID="450f4d49f6618fdd8f479b1b102d292ecc363d986429eb532d8d4eb22aa172ce" Mar 12 13:40:34.041862 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.041833 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"450f4d49f6618fdd8f479b1b102d292ecc363d986429eb532d8d4eb22aa172ce"} err="failed to get container status \"450f4d49f6618fdd8f479b1b102d292ecc363d986429eb532d8d4eb22aa172ce\": rpc error: code = NotFound desc = could not find container 
\"450f4d49f6618fdd8f479b1b102d292ecc363d986429eb532d8d4eb22aa172ce\": container with ID starting with 450f4d49f6618fdd8f479b1b102d292ecc363d986429eb532d8d4eb22aa172ce not found: ID does not exist" Mar 12 13:40:34.041862 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.041851 2576 scope.go:117] "RemoveContainer" containerID="868eb4559cf2b8ff329afe0028ba3b29c7a1a6a850d8b2f0895803f2f755865f" Mar 12 13:40:34.042077 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.042053 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"868eb4559cf2b8ff329afe0028ba3b29c7a1a6a850d8b2f0895803f2f755865f"} err="failed to get container status \"868eb4559cf2b8ff329afe0028ba3b29c7a1a6a850d8b2f0895803f2f755865f\": rpc error: code = NotFound desc = could not find container \"868eb4559cf2b8ff329afe0028ba3b29c7a1a6a850d8b2f0895803f2f755865f\": container with ID starting with 868eb4559cf2b8ff329afe0028ba3b29c7a1a6a850d8b2f0895803f2f755865f not found: ID does not exist" Mar 12 13:40:34.042077 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.042075 2576 scope.go:117] "RemoveContainer" containerID="9fc55385db5d52f47dc3c904a404ec477016c6082c156f2139e444fa46f10ef6" Mar 12 13:40:34.042368 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.042348 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fc55385db5d52f47dc3c904a404ec477016c6082c156f2139e444fa46f10ef6"} err="failed to get container status \"9fc55385db5d52f47dc3c904a404ec477016c6082c156f2139e444fa46f10ef6\": rpc error: code = NotFound desc = could not find container \"9fc55385db5d52f47dc3c904a404ec477016c6082c156f2139e444fa46f10ef6\": container with ID starting with 9fc55385db5d52f47dc3c904a404ec477016c6082c156f2139e444fa46f10ef6 not found: ID does not exist" Mar 12 13:40:34.046557 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.046535 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 12 
13:40:34.046902 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.046890 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="507c80a4-ea62-42d8-a41a-850073f8db0c" containerName="config-reloader" Mar 12 13:40:34.046939 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.046904 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="507c80a4-ea62-42d8-a41a-850073f8db0c" containerName="config-reloader" Mar 12 13:40:34.046939 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.046919 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="507c80a4-ea62-42d8-a41a-850073f8db0c" containerName="init-config-reloader" Mar 12 13:40:34.046939 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.046924 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="507c80a4-ea62-42d8-a41a-850073f8db0c" containerName="init-config-reloader" Mar 12 13:40:34.046939 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.046931 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad5bfa9a-7696-439c-943a-08df7f56f525" containerName="kube-rbac-proxy-web" Mar 12 13:40:34.046939 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.046937 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad5bfa9a-7696-439c-943a-08df7f56f525" containerName="kube-rbac-proxy-web" Mar 12 13:40:34.047092 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.046945 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="507c80a4-ea62-42d8-a41a-850073f8db0c" containerName="thanos-sidecar" Mar 12 13:40:34.047092 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.046951 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="507c80a4-ea62-42d8-a41a-850073f8db0c" containerName="thanos-sidecar" Mar 12 13:40:34.047092 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.046958 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="507c80a4-ea62-42d8-a41a-850073f8db0c" containerName="kube-rbac-proxy-thanos" Mar 12 13:40:34.047092 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.046963 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="507c80a4-ea62-42d8-a41a-850073f8db0c" containerName="kube-rbac-proxy-thanos" Mar 12 13:40:34.047092 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.046973 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad5bfa9a-7696-439c-943a-08df7f56f525" containerName="kube-rbac-proxy" Mar 12 13:40:34.047092 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.046978 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad5bfa9a-7696-439c-943a-08df7f56f525" containerName="kube-rbac-proxy" Mar 12 13:40:34.047092 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.046983 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="507c80a4-ea62-42d8-a41a-850073f8db0c" containerName="kube-rbac-proxy-web" Mar 12 13:40:34.047092 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.046988 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="507c80a4-ea62-42d8-a41a-850073f8db0c" containerName="kube-rbac-proxy-web" Mar 12 13:40:34.047092 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.046995 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad5bfa9a-7696-439c-943a-08df7f56f525" containerName="prom-label-proxy" Mar 12 13:40:34.047092 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.047001 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad5bfa9a-7696-439c-943a-08df7f56f525" containerName="prom-label-proxy" Mar 12 13:40:34.047092 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.047008 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad5bfa9a-7696-439c-943a-08df7f56f525" containerName="config-reloader" Mar 12 13:40:34.047092 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.047012 2576 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="ad5bfa9a-7696-439c-943a-08df7f56f525" containerName="config-reloader" Mar 12 13:40:34.047092 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.047017 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad5bfa9a-7696-439c-943a-08df7f56f525" containerName="kube-rbac-proxy-metric" Mar 12 13:40:34.047092 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.047022 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad5bfa9a-7696-439c-943a-08df7f56f525" containerName="kube-rbac-proxy-metric" Mar 12 13:40:34.047092 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.047028 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="507c80a4-ea62-42d8-a41a-850073f8db0c" containerName="kube-rbac-proxy" Mar 12 13:40:34.047092 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.047032 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="507c80a4-ea62-42d8-a41a-850073f8db0c" containerName="kube-rbac-proxy" Mar 12 13:40:34.047092 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.047039 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="327ec7eb-a5de-4709-9fbc-6bbe58d87674" containerName="registry" Mar 12 13:40:34.047092 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.047044 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="327ec7eb-a5de-4709-9fbc-6bbe58d87674" containerName="registry" Mar 12 13:40:34.047092 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.047077 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad5bfa9a-7696-439c-943a-08df7f56f525" containerName="alertmanager" Mar 12 13:40:34.047092 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.047082 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad5bfa9a-7696-439c-943a-08df7f56f525" containerName="alertmanager" Mar 12 13:40:34.047092 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.047089 2576 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="507c80a4-ea62-42d8-a41a-850073f8db0c" containerName="prometheus" Mar 12 13:40:34.047092 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.047095 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="507c80a4-ea62-42d8-a41a-850073f8db0c" containerName="prometheus" Mar 12 13:40:34.047741 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.047105 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad5bfa9a-7696-439c-943a-08df7f56f525" containerName="init-config-reloader" Mar 12 13:40:34.047741 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.047110 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad5bfa9a-7696-439c-943a-08df7f56f525" containerName="init-config-reloader" Mar 12 13:40:34.047741 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.047117 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="437239e7-0e09-41f9-8d00-4d1271f0db28" containerName="console" Mar 12 13:40:34.047741 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.047121 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="437239e7-0e09-41f9-8d00-4d1271f0db28" containerName="console" Mar 12 13:40:34.047741 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.047177 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad5bfa9a-7696-439c-943a-08df7f56f525" containerName="alertmanager" Mar 12 13:40:34.047741 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.047201 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad5bfa9a-7696-439c-943a-08df7f56f525" containerName="prom-label-proxy" Mar 12 13:40:34.047741 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.047208 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad5bfa9a-7696-439c-943a-08df7f56f525" containerName="kube-rbac-proxy-metric" Mar 12 13:40:34.047741 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.047216 2576 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="507c80a4-ea62-42d8-a41a-850073f8db0c" containerName="thanos-sidecar" Mar 12 13:40:34.047741 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.047223 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="507c80a4-ea62-42d8-a41a-850073f8db0c" containerName="kube-rbac-proxy-web" Mar 12 13:40:34.047741 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.047230 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="507c80a4-ea62-42d8-a41a-850073f8db0c" containerName="prometheus" Mar 12 13:40:34.047741 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.047235 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad5bfa9a-7696-439c-943a-08df7f56f525" containerName="kube-rbac-proxy" Mar 12 13:40:34.047741 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.047241 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="507c80a4-ea62-42d8-a41a-850073f8db0c" containerName="kube-rbac-proxy" Mar 12 13:40:34.047741 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.047247 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="327ec7eb-a5de-4709-9fbc-6bbe58d87674" containerName="registry" Mar 12 13:40:34.047741 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.047252 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="507c80a4-ea62-42d8-a41a-850073f8db0c" containerName="config-reloader" Mar 12 13:40:34.047741 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.047258 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="507c80a4-ea62-42d8-a41a-850073f8db0c" containerName="kube-rbac-proxy-thanos" Mar 12 13:40:34.047741 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.047263 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="437239e7-0e09-41f9-8d00-4d1271f0db28" containerName="console" Mar 12 13:40:34.047741 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.047268 2576 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="ad5bfa9a-7696-439c-943a-08df7f56f525" containerName="kube-rbac-proxy-web" Mar 12 13:40:34.047741 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.047275 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad5bfa9a-7696-439c-943a-08df7f56f525" containerName="config-reloader" Mar 12 13:40:34.053063 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.053033 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.059766 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.059741 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Mar 12 13:40:34.061185 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.061161 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Mar 12 13:40:34.061303 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.061282 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Mar 12 13:40:34.061418 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.061396 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-gmcz9\"" Mar 12 13:40:34.061548 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.061535 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-3fq79u52touef\"" Mar 12 13:40:34.061712 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.061697 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Mar 12 13:40:34.061754 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.061697 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Mar 12 13:40:34.062048 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.062036 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Mar 12 13:40:34.063945 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.063930 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Mar 12 13:40:34.064129 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.064116 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Mar 12 13:40:34.064661 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.064647 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Mar 12 13:40:34.064736 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.064721 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Mar 12 13:40:34.068697 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.068666 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Mar 12 13:40:34.069588 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.069567 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 12 13:40:34.073073 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.073053 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Mar 12 13:40:34.130994 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.130960 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/46214ae2-1cfa-409e-822c-adac5e4d0f1c-config\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.130994 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.130998 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46214ae2-1cfa-409e-822c-adac5e4d0f1c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.131252 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.131018 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/46214ae2-1cfa-409e-822c-adac5e4d0f1c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.131252 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.131040 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/46214ae2-1cfa-409e-822c-adac5e4d0f1c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.131252 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.131102 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/46214ae2-1cfa-409e-822c-adac5e4d0f1c-web-config\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.131252 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.131134 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/46214ae2-1cfa-409e-822c-adac5e4d0f1c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.131252 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.131149 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/46214ae2-1cfa-409e-822c-adac5e4d0f1c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.131252 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.131165 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/46214ae2-1cfa-409e-822c-adac5e4d0f1c-config-out\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.131252 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.131185 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/46214ae2-1cfa-409e-822c-adac5e4d0f1c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.131252 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.131203 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/46214ae2-1cfa-409e-822c-adac5e4d0f1c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: 
\"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.131252 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.131231 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46214ae2-1cfa-409e-822c-adac5e4d0f1c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.131252 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.131254 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/46214ae2-1cfa-409e-822c-adac5e4d0f1c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.131667 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.131268 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/46214ae2-1cfa-409e-822c-adac5e4d0f1c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.131667 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.131284 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr4ps\" (UniqueName: \"kubernetes.io/projected/46214ae2-1cfa-409e-822c-adac5e4d0f1c-kube-api-access-zr4ps\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.131667 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.131360 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/46214ae2-1cfa-409e-822c-adac5e4d0f1c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.131667 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.131390 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/46214ae2-1cfa-409e-822c-adac5e4d0f1c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.131667 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.131409 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46214ae2-1cfa-409e-822c-adac5e4d0f1c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.131667 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.131493 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/46214ae2-1cfa-409e-822c-adac5e4d0f1c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.232110 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.232065 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/46214ae2-1cfa-409e-822c-adac5e4d0f1c-config\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.232110 ip-10-0-142-16 
kubenswrapper[2576]: I0312 13:40:34.232108 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46214ae2-1cfa-409e-822c-adac5e4d0f1c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.232346 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.232128 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/46214ae2-1cfa-409e-822c-adac5e4d0f1c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.232346 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.232149 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/46214ae2-1cfa-409e-822c-adac5e4d0f1c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.232346 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.232282 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/46214ae2-1cfa-409e-822c-adac5e4d0f1c-web-config\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.232346 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.232324 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/46214ae2-1cfa-409e-822c-adac5e4d0f1c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 
13:40:34.232591 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.232349 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/46214ae2-1cfa-409e-822c-adac5e4d0f1c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.232591 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.232379 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/46214ae2-1cfa-409e-822c-adac5e4d0f1c-config-out\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.232591 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.232416 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/46214ae2-1cfa-409e-822c-adac5e4d0f1c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.232591 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.232449 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/46214ae2-1cfa-409e-822c-adac5e4d0f1c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.232591 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.232547 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46214ae2-1cfa-409e-822c-adac5e4d0f1c-prometheus-trusted-ca-bundle\") pod 
\"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.232591 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.232591 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/46214ae2-1cfa-409e-822c-adac5e4d0f1c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.232885 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.232615 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/46214ae2-1cfa-409e-822c-adac5e4d0f1c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.232885 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.232639 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zr4ps\" (UniqueName: \"kubernetes.io/projected/46214ae2-1cfa-409e-822c-adac5e4d0f1c-kube-api-access-zr4ps\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.232885 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.232671 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/46214ae2-1cfa-409e-822c-adac5e4d0f1c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.232885 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.232686 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/46214ae2-1cfa-409e-822c-adac5e4d0f1c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.232885 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.232727 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/46214ae2-1cfa-409e-822c-adac5e4d0f1c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.232885 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.232759 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46214ae2-1cfa-409e-822c-adac5e4d0f1c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.232885 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.232841 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/46214ae2-1cfa-409e-822c-adac5e4d0f1c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.233245 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.233052 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46214ae2-1cfa-409e-822c-adac5e4d0f1c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.233300 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.233262 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/46214ae2-1cfa-409e-822c-adac5e4d0f1c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.234287 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.234258 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46214ae2-1cfa-409e-822c-adac5e4d0f1c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.235630 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.235522 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/46214ae2-1cfa-409e-822c-adac5e4d0f1c-web-config\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.235630 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.235522 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/46214ae2-1cfa-409e-822c-adac5e4d0f1c-config\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.235840 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.235802 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/46214ae2-1cfa-409e-822c-adac5e4d0f1c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.235840 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.235802 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/46214ae2-1cfa-409e-822c-adac5e4d0f1c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.235947 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.235895 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/46214ae2-1cfa-409e-822c-adac5e4d0f1c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.236027 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.236004 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/46214ae2-1cfa-409e-822c-adac5e4d0f1c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.236223 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.236203 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46214ae2-1cfa-409e-822c-adac5e4d0f1c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.236702 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.236682 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/46214ae2-1cfa-409e-822c-adac5e4d0f1c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.238214 ip-10-0-142-16 
kubenswrapper[2576]: I0312 13:40:34.238187 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/46214ae2-1cfa-409e-822c-adac5e4d0f1c-config-out\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.238375 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.238354 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/46214ae2-1cfa-409e-822c-adac5e4d0f1c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.238549 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.238530 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/46214ae2-1cfa-409e-822c-adac5e4d0f1c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.238621 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.238583 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/46214ae2-1cfa-409e-822c-adac5e4d0f1c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.238694 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.238675 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/46214ae2-1cfa-409e-822c-adac5e4d0f1c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 
13:40:34.244623 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.244567 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr4ps\" (UniqueName: \"kubernetes.io/projected/46214ae2-1cfa-409e-822c-adac5e4d0f1c-kube-api-access-zr4ps\") pod \"prometheus-k8s-0\" (UID: \"46214ae2-1cfa-409e-822c-adac5e4d0f1c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.363903 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.363862 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 12 13:40:34.501712 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.501503 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 12 13:40:34.504882 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:40:34.504849 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46214ae2_1cfa_409e_822c_adac5e4d0f1c.slice/crio-5011679bf957c181e3866d9900f598f482e0d1bd10ef54f177f499d981fad5f6 WatchSource:0}: Error finding container 5011679bf957c181e3866d9900f598f482e0d1bd10ef54f177f499d981fad5f6: Status 404 returned error can't find the container with id 5011679bf957c181e3866d9900f598f482e0d1bd10ef54f177f499d981fad5f6 Mar 12 13:40:34.977955 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.977913 2576 generic.go:358] "Generic (PLEG): container finished" podID="46214ae2-1cfa-409e-822c-adac5e4d0f1c" containerID="755a52bdd6b00ffd58d5e01b42342c3b2ff90825a8991f4ad8ca2f63c424a4d6" exitCode=0 Mar 12 13:40:34.978162 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:34.977996 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"46214ae2-1cfa-409e-822c-adac5e4d0f1c","Type":"ContainerDied","Data":"755a52bdd6b00ffd58d5e01b42342c3b2ff90825a8991f4ad8ca2f63c424a4d6"} Mar 12 13:40:34.978162 ip-10-0-142-16 kubenswrapper[2576]: I0312 
13:40:34.978030 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"46214ae2-1cfa-409e-822c-adac5e4d0f1c","Type":"ContainerStarted","Data":"5011679bf957c181e3866d9900f598f482e0d1bd10ef54f177f499d981fad5f6"} Mar 12 13:40:35.087851 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:35.087817 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="507c80a4-ea62-42d8-a41a-850073f8db0c" path="/var/lib/kubelet/pods/507c80a4-ea62-42d8-a41a-850073f8db0c/volumes" Mar 12 13:40:35.985113 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:35.985079 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"46214ae2-1cfa-409e-822c-adac5e4d0f1c","Type":"ContainerStarted","Data":"45927055b59c64b6b66055bbf19cd2bc22dc46b4e6002b888fcc07183d0e0805"} Mar 12 13:40:35.985113 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:35.985114 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"46214ae2-1cfa-409e-822c-adac5e4d0f1c","Type":"ContainerStarted","Data":"2a6a44d797f3e2226cae162be67eb8d5e58085f9692f8d3fede539b30737eafb"} Mar 12 13:40:35.985608 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:35.985128 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"46214ae2-1cfa-409e-822c-adac5e4d0f1c","Type":"ContainerStarted","Data":"03647f6583634ddb9b97157566feb0d23248ef5f561ff144e1b43eba296f0228"} Mar 12 13:40:35.985608 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:35.985140 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"46214ae2-1cfa-409e-822c-adac5e4d0f1c","Type":"ContainerStarted","Data":"0c9b7053ee4254b34a1f6d431f4805da621e0e01f8d95db8fbdd1ec709e15b9b"} Mar 12 13:40:35.985608 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:35.985148 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"46214ae2-1cfa-409e-822c-adac5e4d0f1c","Type":"ContainerStarted","Data":"69c2d5f67ccccfde13dd4d854c44d9cc32bebb0c65cb959b7a33c1182d590927"}
Mar 12 13:40:35.985608 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:35.985156 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"46214ae2-1cfa-409e-822c-adac5e4d0f1c","Type":"ContainerStarted","Data":"8a8f0d705443761e5b10aa5ccfc9c75a7c295f2ec3e4edd3711360f5149ef442"}
Mar 12 13:40:36.016210 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:36.016147 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.016128114 podStartE2EDuration="2.016128114s" podCreationTimestamp="2026-03-12 13:40:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:40:36.013309245 +0000 UTC m=+171.525243273" watchObservedRunningTime="2026-03-12 13:40:36.016128114 +0000 UTC m=+171.528062159"
Mar 12 13:40:39.364587 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:40:39.364539 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 13:41:34.365078 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:41:34.365034 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 13:41:34.382430 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:41:34.382398 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 13:41:35.208637 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:41:35.208606 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 13:42:45.013776 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:42:45.013746 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-8c2br_b87ad3f0-c310-4833-bfac-cd1e3b15a7d9/console-operator/2.log"
Mar 12 13:42:45.014378 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:42:45.013746 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-8c2br_b87ad3f0-c310-4833-bfac-cd1e3b15a7d9/console-operator/2.log"
Mar 12 13:42:45.025887 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:42:45.025863 2576 kubelet.go:1628] "Image garbage collection succeeded"
Mar 12 13:47:45.041372 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:47:45.041340 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-8c2br_b87ad3f0-c310-4833-bfac-cd1e3b15a7d9/console-operator/2.log"
Mar 12 13:47:45.042453 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:47:45.042434 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-8c2br_b87ad3f0-c310-4833-bfac-cd1e3b15a7d9/console-operator/2.log"
Mar 12 13:47:48.842217 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:47:48.842181 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-785785fffb-8rn7p"]
Mar 12 13:47:48.845704 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:47:48.845685 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-785785fffb-8rn7p"
Mar 12 13:47:48.848432 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:47:48.848400 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kubeflow-trainer-config\""
Mar 12 13:47:48.849633 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:47:48.849606 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-webhook-cert\""
Mar 12 13:47:48.849760 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:47:48.849662 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-controller-manager-dockercfg-kphx5\""
Mar 12 13:47:48.849760 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:47:48.849611 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Mar 12 13:47:48.849760 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:47:48.849611 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Mar 12 13:47:48.855784 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:47:48.855759 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-785785fffb-8rn7p"]
Mar 12 13:47:48.894515 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:47:48.894480 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6lpr\" (UniqueName: \"kubernetes.io/projected/2cbd7640-6fcf-4b92-a260-6de0c73e1b78-kube-api-access-b6lpr\") pod \"kubeflow-trainer-controller-manager-785785fffb-8rn7p\" (UID: \"2cbd7640-6fcf-4b92-a260-6de0c73e1b78\") " pod="opendatahub/kubeflow-trainer-controller-manager-785785fffb-8rn7p"
Mar 12 13:47:48.894691 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:47:48.894528 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/2cbd7640-6fcf-4b92-a260-6de0c73e1b78-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-785785fffb-8rn7p\" (UID: \"2cbd7640-6fcf-4b92-a260-6de0c73e1b78\") " pod="opendatahub/kubeflow-trainer-controller-manager-785785fffb-8rn7p"
Mar 12 13:47:48.894691 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:47:48.894618 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2cbd7640-6fcf-4b92-a260-6de0c73e1b78-cert\") pod \"kubeflow-trainer-controller-manager-785785fffb-8rn7p\" (UID: \"2cbd7640-6fcf-4b92-a260-6de0c73e1b78\") " pod="opendatahub/kubeflow-trainer-controller-manager-785785fffb-8rn7p"
Mar 12 13:47:48.995820 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:47:48.995774 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b6lpr\" (UniqueName: \"kubernetes.io/projected/2cbd7640-6fcf-4b92-a260-6de0c73e1b78-kube-api-access-b6lpr\") pod \"kubeflow-trainer-controller-manager-785785fffb-8rn7p\" (UID: \"2cbd7640-6fcf-4b92-a260-6de0c73e1b78\") " pod="opendatahub/kubeflow-trainer-controller-manager-785785fffb-8rn7p"
Mar 12 13:47:48.995996 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:47:48.995831 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/2cbd7640-6fcf-4b92-a260-6de0c73e1b78-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-785785fffb-8rn7p\" (UID: \"2cbd7640-6fcf-4b92-a260-6de0c73e1b78\") " pod="opendatahub/kubeflow-trainer-controller-manager-785785fffb-8rn7p"
Mar 12 13:47:48.995996 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:47:48.995866 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2cbd7640-6fcf-4b92-a260-6de0c73e1b78-cert\") pod \"kubeflow-trainer-controller-manager-785785fffb-8rn7p\" (UID: \"2cbd7640-6fcf-4b92-a260-6de0c73e1b78\") " pod="opendatahub/kubeflow-trainer-controller-manager-785785fffb-8rn7p"
Mar 12 13:47:48.996725 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:47:48.996629 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/2cbd7640-6fcf-4b92-a260-6de0c73e1b78-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-785785fffb-8rn7p\" (UID: \"2cbd7640-6fcf-4b92-a260-6de0c73e1b78\") " pod="opendatahub/kubeflow-trainer-controller-manager-785785fffb-8rn7p"
Mar 12 13:47:48.998584 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:47:48.998553 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2cbd7640-6fcf-4b92-a260-6de0c73e1b78-cert\") pod \"kubeflow-trainer-controller-manager-785785fffb-8rn7p\" (UID: \"2cbd7640-6fcf-4b92-a260-6de0c73e1b78\") " pod="opendatahub/kubeflow-trainer-controller-manager-785785fffb-8rn7p"
Mar 12 13:47:49.005756 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:47:49.005722 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6lpr\" (UniqueName: \"kubernetes.io/projected/2cbd7640-6fcf-4b92-a260-6de0c73e1b78-kube-api-access-b6lpr\") pod \"kubeflow-trainer-controller-manager-785785fffb-8rn7p\" (UID: \"2cbd7640-6fcf-4b92-a260-6de0c73e1b78\") " pod="opendatahub/kubeflow-trainer-controller-manager-785785fffb-8rn7p"
Mar 12 13:47:49.170824 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:47:49.170777 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-785785fffb-8rn7p"
Mar 12 13:47:49.306880 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:47:49.306653 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-785785fffb-8rn7p"]
Mar 12 13:47:49.311275 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:47:49.311237 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2cbd7640_6fcf_4b92_a260_6de0c73e1b78.slice/crio-4e582de32d22e03cbbdaeb947ab0bb0ec7e20a9fa36f7b963312dc48a53b2508 WatchSource:0}: Error finding container 4e582de32d22e03cbbdaeb947ab0bb0ec7e20a9fa36f7b963312dc48a53b2508: Status 404 returned error can't find the container with id 4e582de32d22e03cbbdaeb947ab0bb0ec7e20a9fa36f7b963312dc48a53b2508
Mar 12 13:47:49.313328 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:47:49.313309 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 12 13:47:49.500709 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:47:49.500620 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-785785fffb-8rn7p" event={"ID":"2cbd7640-6fcf-4b92-a260-6de0c73e1b78","Type":"ContainerStarted","Data":"4e582de32d22e03cbbdaeb947ab0bb0ec7e20a9fa36f7b963312dc48a53b2508"}
Mar 12 13:47:52.515416 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:47:52.515379 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-785785fffb-8rn7p" event={"ID":"2cbd7640-6fcf-4b92-a260-6de0c73e1b78","Type":"ContainerStarted","Data":"b0d76f7f16463dc7e6e217cd0fad8cee17edef859071f3a71612f129e510b1b3"}
Mar 12 13:47:52.515873 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:47:52.515522 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kubeflow-trainer-controller-manager-785785fffb-8rn7p"
Mar 12 13:47:52.532922 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:47:52.532864 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kubeflow-trainer-controller-manager-785785fffb-8rn7p" podStartSLOduration=2.056182135 podStartE2EDuration="4.532843318s" podCreationTimestamp="2026-03-12 13:47:48 +0000 UTC" firstStartedPulling="2026-03-12 13:47:49.313512032 +0000 UTC m=+604.825446037" lastFinishedPulling="2026-03-12 13:47:51.790173212 +0000 UTC m=+607.302107220" observedRunningTime="2026-03-12 13:47:52.532395885 +0000 UTC m=+608.044329913" watchObservedRunningTime="2026-03-12 13:47:52.532843318 +0000 UTC m=+608.044777346"
Mar 12 13:48:08.525635 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:48:08.525545 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kubeflow-trainer-controller-manager-785785fffb-8rn7p"
Mar 12 13:49:54.149634 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:49:54.149552 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-nlrws/progression-enabled-node-0-0-rs8ch"]
Mar 12 13:49:54.153076 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:49:54.153056 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-nlrws/progression-enabled-node-0-0-rs8ch"
Mar 12 13:49:54.156805 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:49:54.156777 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-nlrws\"/\"kube-root-ca.crt\""
Mar 12 13:49:54.156969 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:49:54.156778 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-nlrws\"/\"openshift-service-ca.crt\""
Mar 12 13:49:54.165060 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:49:54.165038 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-nlrws\"/\"default-dockercfg-bn65w\""
Mar 12 13:49:54.206731 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:49:54.206692 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-nlrws/progression-enabled-node-0-0-rs8ch"]
Mar 12 13:49:54.252448 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:49:54.252410 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh4cp\" (UniqueName: \"kubernetes.io/projected/e342fb05-ff4e-4093-958f-ff1a18a0af01-kube-api-access-kh4cp\") pod \"progression-enabled-node-0-0-rs8ch\" (UID: \"e342fb05-ff4e-4093-958f-ff1a18a0af01\") " pod="rhai-e2e-progression-nlrws/progression-enabled-node-0-0-rs8ch"
Mar 12 13:49:54.353857 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:49:54.353815 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kh4cp\" (UniqueName: \"kubernetes.io/projected/e342fb05-ff4e-4093-958f-ff1a18a0af01-kube-api-access-kh4cp\") pod \"progression-enabled-node-0-0-rs8ch\" (UID: \"e342fb05-ff4e-4093-958f-ff1a18a0af01\") " pod="rhai-e2e-progression-nlrws/progression-enabled-node-0-0-rs8ch"
Mar 12 13:49:54.363555 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:49:54.363528 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh4cp\" (UniqueName: \"kubernetes.io/projected/e342fb05-ff4e-4093-958f-ff1a18a0af01-kube-api-access-kh4cp\") pod \"progression-enabled-node-0-0-rs8ch\" (UID: \"e342fb05-ff4e-4093-958f-ff1a18a0af01\") " pod="rhai-e2e-progression-nlrws/progression-enabled-node-0-0-rs8ch"
Mar 12 13:49:54.463952 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:49:54.463823 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-nlrws/progression-enabled-node-0-0-rs8ch"
Mar 12 13:49:54.596424 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:49:54.596396 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-nlrws/progression-enabled-node-0-0-rs8ch"]
Mar 12 13:49:54.598564 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:49:54.598527 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode342fb05_ff4e_4093_958f_ff1a18a0af01.slice/crio-9afec3f3e88348650ad1aa56474935360f3704b1a3f7a7ea0b353838a44706ce WatchSource:0}: Error finding container 9afec3f3e88348650ad1aa56474935360f3704b1a3f7a7ea0b353838a44706ce: Status 404 returned error can't find the container with id 9afec3f3e88348650ad1aa56474935360f3704b1a3f7a7ea0b353838a44706ce
Mar 12 13:49:54.952536 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:49:54.952503 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-nlrws/progression-enabled-node-0-0-rs8ch" event={"ID":"e342fb05-ff4e-4093-958f-ff1a18a0af01","Type":"ContainerStarted","Data":"9afec3f3e88348650ad1aa56474935360f3704b1a3f7a7ea0b353838a44706ce"}
Mar 12 13:51:42.372866 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:51:42.372825 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-nlrws/progression-enabled-node-0-0-rs8ch" event={"ID":"e342fb05-ff4e-4093-958f-ff1a18a0af01","Type":"ContainerStarted","Data":"1f68251bf82f4d24733f858b4ea119abc9389291f699d67ccf2dbe8ae6eb2831"}
Mar 12 13:51:42.373322 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:51:42.372933 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-nlrws/progression-enabled-node-0-0-rs8ch"
Mar 12 13:51:42.407831 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:51:42.407773 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-nlrws/progression-enabled-node-0-0-rs8ch" podStartSLOduration=1.237605643 podStartE2EDuration="1m48.407755349s" podCreationTimestamp="2026-03-12 13:49:54 +0000 UTC" firstStartedPulling="2026-03-12 13:49:54.600928321 +0000 UTC m=+730.112862325" lastFinishedPulling="2026-03-12 13:51:41.771078011 +0000 UTC m=+837.283012031" observedRunningTime="2026-03-12 13:51:42.405703654 +0000 UTC m=+837.917637694" watchObservedRunningTime="2026-03-12 13:51:42.407755349 +0000 UTC m=+837.919689376"
Mar 12 13:51:43.375052 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:51:43.375008 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-nlrws/progression-enabled-node-0-0-rs8ch" podUID="e342fb05-ff4e-4093-958f-ff1a18a0af01" containerName="node" probeResult="failure" output="Get \"http://10.132.0.34:28080/metrics\": dial tcp 10.132.0.34:28080: connect: connection refused"
Mar 12 13:51:43.376833 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:51:43.376794 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-nlrws/progression-enabled-node-0-0-rs8ch" podUID="e342fb05-ff4e-4093-958f-ff1a18a0af01" containerName="node" probeResult="failure" output="Get \"http://10.132.0.34:28080/metrics\": dial tcp 10.132.0.34:28080: connect: connection refused"
Mar 12 13:51:44.379145 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:51:44.379108 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="rhai-e2e-progression-nlrws/progression-enabled-node-0-0-rs8ch"
Mar 12 13:51:58.377063 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:51:58.377017 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-nlrws/progression-enabled-node-0-0-rs8ch" podUID="e342fb05-ff4e-4093-958f-ff1a18a0af01" containerName="node" probeResult="failure" output="Get \"http://10.132.0.34:28080/metrics\": dial tcp 10.132.0.34:28080: connect: connection refused"
Mar 12 13:51:59.377146 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:51:59.377095 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-nlrws/progression-enabled-node-0-0-rs8ch" podUID="e342fb05-ff4e-4093-958f-ff1a18a0af01" containerName="node" probeResult="failure" output="Get \"http://10.132.0.34:28080/metrics\": dial tcp 10.132.0.34:28080: connect: connection refused"
Mar 12 13:51:59.377672 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:51:59.377208 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-nlrws/progression-enabled-node-0-0-rs8ch"
Mar 12 13:51:59.377672 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:51:59.377655 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-nlrws/progression-enabled-node-0-0-rs8ch" podUID="e342fb05-ff4e-4093-958f-ff1a18a0af01" containerName="node" probeResult="failure" output="Get \"http://10.132.0.34:28080/metrics\": dial tcp 10.132.0.34:28080: connect: connection refused"
Mar 12 13:51:59.435307 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:51:59.435265 2576 generic.go:358] "Generic (PLEG): container finished" podID="e342fb05-ff4e-4093-958f-ff1a18a0af01" containerID="1f68251bf82f4d24733f858b4ea119abc9389291f699d67ccf2dbe8ae6eb2831" exitCode=0
Mar 12 13:51:59.435520 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:51:59.435336 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-nlrws/progression-enabled-node-0-0-rs8ch" event={"ID":"e342fb05-ff4e-4093-958f-ff1a18a0af01","Type":"ContainerDied","Data":"1f68251bf82f4d24733f858b4ea119abc9389291f699d67ccf2dbe8ae6eb2831"}
Mar 12 13:52:00.574288 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:52:00.574260 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-nlrws/progression-enabled-node-0-0-rs8ch"
Mar 12 13:52:00.675870 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:52:00.675837 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kh4cp\" (UniqueName: \"kubernetes.io/projected/e342fb05-ff4e-4093-958f-ff1a18a0af01-kube-api-access-kh4cp\") pod \"e342fb05-ff4e-4093-958f-ff1a18a0af01\" (UID: \"e342fb05-ff4e-4093-958f-ff1a18a0af01\") "
Mar 12 13:52:00.678415 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:52:00.678384 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e342fb05-ff4e-4093-958f-ff1a18a0af01-kube-api-access-kh4cp" (OuterVolumeSpecName: "kube-api-access-kh4cp") pod "e342fb05-ff4e-4093-958f-ff1a18a0af01" (UID: "e342fb05-ff4e-4093-958f-ff1a18a0af01"). InnerVolumeSpecName "kube-api-access-kh4cp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 13:52:00.777526 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:52:00.777424 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kh4cp\" (UniqueName: \"kubernetes.io/projected/e342fb05-ff4e-4093-958f-ff1a18a0af01-kube-api-access-kh4cp\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\""
Mar 12 13:52:01.450896 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:52:01.450858 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-nlrws/progression-enabled-node-0-0-rs8ch" event={"ID":"e342fb05-ff4e-4093-958f-ff1a18a0af01","Type":"ContainerDied","Data":"9afec3f3e88348650ad1aa56474935360f3704b1a3f7a7ea0b353838a44706ce"}
Mar 12 13:52:01.450896 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:52:01.450900 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9afec3f3e88348650ad1aa56474935360f3704b1a3f7a7ea0b353838a44706ce"
Mar 12 13:52:01.451103 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:52:01.450912 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-nlrws/progression-enabled-node-0-0-rs8ch"
Mar 12 13:52:45.068593 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:52:45.068508 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-8c2br_b87ad3f0-c310-4833-bfac-cd1e3b15a7d9/console-operator/2.log"
Mar 12 13:52:45.070875 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:52:45.070853 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-8c2br_b87ad3f0-c310-4833-bfac-cd1e3b15a7d9/console-operator/2.log"
Mar 12 13:56:44.485010 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:56:44.484968 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-nlrws/progression-disabled-node-0-0-44x4c"]
Mar 12 13:56:44.487440 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:56:44.485391 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e342fb05-ff4e-4093-958f-ff1a18a0af01" containerName="node"
Mar 12 13:56:44.487440 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:56:44.485403 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e342fb05-ff4e-4093-958f-ff1a18a0af01" containerName="node"
Mar 12 13:56:44.487440 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:56:44.485510 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e342fb05-ff4e-4093-958f-ff1a18a0af01" containerName="node"
Mar 12 13:56:44.488489 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:56:44.488445 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-nlrws/progression-disabled-node-0-0-44x4c"
Mar 12 13:56:44.491476 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:56:44.491436 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-nlrws\"/\"kube-root-ca.crt\""
Mar 12 13:56:44.491631 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:56:44.491581 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-nlrws\"/\"default-dockercfg-bn65w\""
Mar 12 13:56:44.493000 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:56:44.492978 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-nlrws\"/\"openshift-service-ca.crt\""
Mar 12 13:56:44.505298 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:56:44.505265 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-nlrws/progression-disabled-node-0-0-44x4c"]
Mar 12 13:56:44.564564 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:56:44.564444 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mddbt\" (UniqueName: \"kubernetes.io/projected/c5dbaac5-b359-4bfa-bf6a-98abf9f7cb2d-kube-api-access-mddbt\") pod \"progression-disabled-node-0-0-44x4c\" (UID: \"c5dbaac5-b359-4bfa-bf6a-98abf9f7cb2d\") " pod="rhai-e2e-progression-nlrws/progression-disabled-node-0-0-44x4c"
Mar 12 13:56:44.665221 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:56:44.665170 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mddbt\" (UniqueName: \"kubernetes.io/projected/c5dbaac5-b359-4bfa-bf6a-98abf9f7cb2d-kube-api-access-mddbt\") pod \"progression-disabled-node-0-0-44x4c\" (UID: \"c5dbaac5-b359-4bfa-bf6a-98abf9f7cb2d\") " pod="rhai-e2e-progression-nlrws/progression-disabled-node-0-0-44x4c"
Mar 12 13:56:44.674991 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:56:44.674965 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mddbt\" (UniqueName: \"kubernetes.io/projected/c5dbaac5-b359-4bfa-bf6a-98abf9f7cb2d-kube-api-access-mddbt\") pod \"progression-disabled-node-0-0-44x4c\" (UID: \"c5dbaac5-b359-4bfa-bf6a-98abf9f7cb2d\") " pod="rhai-e2e-progression-nlrws/progression-disabled-node-0-0-44x4c"
Mar 12 13:56:44.799588 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:56:44.799485 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-nlrws/progression-disabled-node-0-0-44x4c"
Mar 12 13:56:44.931076 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:56:44.931047 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-nlrws/progression-disabled-node-0-0-44x4c"]
Mar 12 13:56:44.933689 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:56:44.933652 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5dbaac5_b359_4bfa_bf6a_98abf9f7cb2d.slice/crio-916a3c250128ffabfa6414134e5da1f2b847a64bfaa05d6e67463687f75dd1b0 WatchSource:0}: Error finding container 916a3c250128ffabfa6414134e5da1f2b847a64bfaa05d6e67463687f75dd1b0: Status 404 returned error can't find the container with id 916a3c250128ffabfa6414134e5da1f2b847a64bfaa05d6e67463687f75dd1b0
Mar 12 13:56:44.935747 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:56:44.935725 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 12 13:56:45.463167 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:56:45.463134 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-nlrws/progression-disabled-node-0-0-44x4c" event={"ID":"c5dbaac5-b359-4bfa-bf6a-98abf9f7cb2d","Type":"ContainerStarted","Data":"21dbcb5856ad693c0bd105cbb7a763c83a31d78d44a3e0a6db70cc8c1e832f9d"}
Mar 12 13:56:45.463167 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:56:45.463169 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-nlrws/progression-disabled-node-0-0-44x4c" event={"ID":"c5dbaac5-b359-4bfa-bf6a-98abf9f7cb2d","Type":"ContainerStarted","Data":"916a3c250128ffabfa6414134e5da1f2b847a64bfaa05d6e67463687f75dd1b0"}
Mar 12 13:56:45.463445 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:56:45.463206 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-nlrws/progression-disabled-node-0-0-44x4c"
Mar 12 13:56:45.483970 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:56:45.483915 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-nlrws/progression-disabled-node-0-0-44x4c" podStartSLOduration=1.483899111 podStartE2EDuration="1.483899111s" podCreationTimestamp="2026-03-12 13:56:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:56:45.481789866 +0000 UTC m=+1140.993723893" watchObservedRunningTime="2026-03-12 13:56:45.483899111 +0000 UTC m=+1140.995833138"
Mar 12 13:56:47.469893 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:56:47.469863 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="rhai-e2e-progression-nlrws/progression-disabled-node-0-0-44x4c"
Mar 12 13:57:01.467791 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:01.467745 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-nlrws/progression-disabled-node-0-0-44x4c" podUID="c5dbaac5-b359-4bfa-bf6a-98abf9f7cb2d" containerName="node" probeResult="failure" output="Get \"http://10.132.0.35:28080/metrics\": dial tcp 10.132.0.35:28080: connect: connection refused"
Mar 12 13:57:01.523852 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:01.523758 2576 generic.go:358] "Generic (PLEG): container finished" podID="c5dbaac5-b359-4bfa-bf6a-98abf9f7cb2d" containerID="21dbcb5856ad693c0bd105cbb7a763c83a31d78d44a3e0a6db70cc8c1e832f9d" exitCode=0
Mar 12 13:57:01.523852 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:01.523834 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-nlrws/progression-disabled-node-0-0-44x4c" event={"ID":"c5dbaac5-b359-4bfa-bf6a-98abf9f7cb2d","Type":"ContainerDied","Data":"21dbcb5856ad693c0bd105cbb7a763c83a31d78d44a3e0a6db70cc8c1e832f9d"}
Mar 12 13:57:02.667606 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:02.667574 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-nlrws/progression-disabled-node-0-0-44x4c"
Mar 12 13:57:02.830573 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:02.830490 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mddbt\" (UniqueName: \"kubernetes.io/projected/c5dbaac5-b359-4bfa-bf6a-98abf9f7cb2d-kube-api-access-mddbt\") pod \"c5dbaac5-b359-4bfa-bf6a-98abf9f7cb2d\" (UID: \"c5dbaac5-b359-4bfa-bf6a-98abf9f7cb2d\") "
Mar 12 13:57:02.832847 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:02.832815 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5dbaac5-b359-4bfa-bf6a-98abf9f7cb2d-kube-api-access-mddbt" (OuterVolumeSpecName: "kube-api-access-mddbt") pod "c5dbaac5-b359-4bfa-bf6a-98abf9f7cb2d" (UID: "c5dbaac5-b359-4bfa-bf6a-98abf9f7cb2d"). InnerVolumeSpecName "kube-api-access-mddbt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 13:57:02.932002 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:02.931955 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mddbt\" (UniqueName: \"kubernetes.io/projected/c5dbaac5-b359-4bfa-bf6a-98abf9f7cb2d-kube-api-access-mddbt\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\""
Mar 12 13:57:03.532216 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:03.532190 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-nlrws/progression-disabled-node-0-0-44x4c"
Mar 12 13:57:03.532391 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:03.532189 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-nlrws/progression-disabled-node-0-0-44x4c" event={"ID":"c5dbaac5-b359-4bfa-bf6a-98abf9f7cb2d","Type":"ContainerDied","Data":"916a3c250128ffabfa6414134e5da1f2b847a64bfaa05d6e67463687f75dd1b0"}
Mar 12 13:57:03.532391 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:03.532295 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="916a3c250128ffabfa6414134e5da1f2b847a64bfaa05d6e67463687f75dd1b0"
Mar 12 13:57:19.458489 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:19.458418 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-nlrws/progression-invalid-node-0-0-4kczt"]
Mar 12 13:57:19.458967 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:19.458848 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5dbaac5-b359-4bfa-bf6a-98abf9f7cb2d" containerName="node"
Mar 12 13:57:19.458967 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:19.458860 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5dbaac5-b359-4bfa-bf6a-98abf9f7cb2d" containerName="node"
Mar 12 13:57:19.458967 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:19.458951 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c5dbaac5-b359-4bfa-bf6a-98abf9f7cb2d" containerName="node"
Mar 12 13:57:19.461761 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:19.461741 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-nlrws/progression-invalid-node-0-0-4kczt"
Mar 12 13:57:19.464543 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:19.464516 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-nlrws\"/\"kube-root-ca.crt\""
Mar 12 13:57:19.464690 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:19.464553 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-nlrws\"/\"openshift-service-ca.crt\""
Mar 12 13:57:19.464690 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:19.464578 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-nlrws\"/\"default-dockercfg-bn65w\""
Mar 12 13:57:19.470444 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:19.470415 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-nlrws/progression-invalid-node-0-0-4kczt"]
Mar 12 13:57:19.481415 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:19.481374 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stdx7\" (UniqueName: \"kubernetes.io/projected/fc74be50-d2f5-4b6e-959f-1b2dff5d5e11-kube-api-access-stdx7\") pod \"progression-invalid-node-0-0-4kczt\" (UID: \"fc74be50-d2f5-4b6e-959f-1b2dff5d5e11\") " pod="rhai-e2e-progression-nlrws/progression-invalid-node-0-0-4kczt"
Mar 12 13:57:19.582518 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:19.582445 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stdx7\" (UniqueName: \"kubernetes.io/projected/fc74be50-d2f5-4b6e-959f-1b2dff5d5e11-kube-api-access-stdx7\") pod \"progression-invalid-node-0-0-4kczt\" (UID: \"fc74be50-d2f5-4b6e-959f-1b2dff5d5e11\") " pod="rhai-e2e-progression-nlrws/progression-invalid-node-0-0-4kczt"
Mar 12 13:57:19.592179 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:19.592149 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-stdx7\" (UniqueName: \"kubernetes.io/projected/fc74be50-d2f5-4b6e-959f-1b2dff5d5e11-kube-api-access-stdx7\") pod \"progression-invalid-node-0-0-4kczt\" (UID: \"fc74be50-d2f5-4b6e-959f-1b2dff5d5e11\") " pod="rhai-e2e-progression-nlrws/progression-invalid-node-0-0-4kczt"
Mar 12 13:57:19.772680 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:19.772570 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-nlrws/progression-invalid-node-0-0-4kczt"
Mar 12 13:57:19.905831 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:19.905802 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-nlrws/progression-invalid-node-0-0-4kczt"]
Mar 12 13:57:19.908049 ip-10-0-142-16 kubenswrapper[2576]: W0312 13:57:19.908018 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc74be50_d2f5_4b6e_959f_1b2dff5d5e11.slice/crio-40022d693fce95c8a627e8d600ab1dfbe6abaada4ed7581c59c6fa49efdef8c5 WatchSource:0}: Error finding container 40022d693fce95c8a627e8d600ab1dfbe6abaada4ed7581c59c6fa49efdef8c5: Status 404 returned error can't find the container with id 40022d693fce95c8a627e8d600ab1dfbe6abaada4ed7581c59c6fa49efdef8c5
Mar 12 13:57:20.599241 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:20.599199 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-nlrws/progression-invalid-node-0-0-4kczt" event={"ID":"fc74be50-d2f5-4b6e-959f-1b2dff5d5e11","Type":"ContainerStarted","Data":"b3b9cbddbe653e90c8a75578bbf44845a16c9643eaddeee5d416c21c6633d737"}
Mar 12 13:57:20.599241 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:20.599248 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-nlrws/progression-invalid-node-0-0-4kczt" event={"ID":"fc74be50-d2f5-4b6e-959f-1b2dff5d5e11","Type":"ContainerStarted","Data":"40022d693fce95c8a627e8d600ab1dfbe6abaada4ed7581c59c6fa49efdef8c5"}
Mar 12 13:57:20.599800 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:20.599567 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-nlrws/progression-invalid-node-0-0-4kczt"
Mar 12 13:57:20.626701 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:20.626625 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-nlrws/progression-invalid-node-0-0-4kczt" podStartSLOduration=1.6265984919999998 podStartE2EDuration="1.626598492s" podCreationTimestamp="2026-03-12 13:57:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 13:57:20.620604047 +0000 UTC m=+1176.132538110" watchObservedRunningTime="2026-03-12 13:57:20.626598492 +0000 UTC m=+1176.138532527"
Mar 12 13:57:21.602373 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:21.602343 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="rhai-e2e-progression-nlrws/progression-invalid-node-0-0-4kczt"
Mar 12 13:57:36.600333 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:36.600235 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-nlrws/progression-invalid-node-0-0-4kczt" podUID="fc74be50-d2f5-4b6e-959f-1b2dff5d5e11" containerName="node" probeResult="failure" output="Get \"http://10.132.0.36:28080/metrics\": dial tcp 10.132.0.36:28080: connect: connection refused"
Mar 12 13:57:36.666112 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:36.666077 2576 generic.go:358] "Generic (PLEG): container finished" podID="fc74be50-d2f5-4b6e-959f-1b2dff5d5e11" containerID="b3b9cbddbe653e90c8a75578bbf44845a16c9643eaddeee5d416c21c6633d737" exitCode=0
Mar 12 13:57:36.666290 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:36.666153 2576 kubelet.go:2569]
"SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-nlrws/progression-invalid-node-0-0-4kczt" event={"ID":"fc74be50-d2f5-4b6e-959f-1b2dff5d5e11","Type":"ContainerDied","Data":"b3b9cbddbe653e90c8a75578bbf44845a16c9643eaddeee5d416c21c6633d737"} Mar 12 13:57:37.800958 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:37.800934 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-nlrws/progression-invalid-node-0-0-4kczt" Mar 12 13:57:37.854258 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:37.854215 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stdx7\" (UniqueName: \"kubernetes.io/projected/fc74be50-d2f5-4b6e-959f-1b2dff5d5e11-kube-api-access-stdx7\") pod \"fc74be50-d2f5-4b6e-959f-1b2dff5d5e11\" (UID: \"fc74be50-d2f5-4b6e-959f-1b2dff5d5e11\") " Mar 12 13:57:37.856563 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:37.856524 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc74be50-d2f5-4b6e-959f-1b2dff5d5e11-kube-api-access-stdx7" (OuterVolumeSpecName: "kube-api-access-stdx7") pod "fc74be50-d2f5-4b6e-959f-1b2dff5d5e11" (UID: "fc74be50-d2f5-4b6e-959f-1b2dff5d5e11"). InnerVolumeSpecName "kube-api-access-stdx7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 13:57:37.955490 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:37.955379 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-stdx7\" (UniqueName: \"kubernetes.io/projected/fc74be50-d2f5-4b6e-959f-1b2dff5d5e11-kube-api-access-stdx7\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\"" Mar 12 13:57:38.674999 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:38.674967 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="rhai-e2e-progression-nlrws/progression-invalid-node-0-0-4kczt" Mar 12 13:57:38.675207 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:38.674958 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-nlrws/progression-invalid-node-0-0-4kczt" event={"ID":"fc74be50-d2f5-4b6e-959f-1b2dff5d5e11","Type":"ContainerDied","Data":"40022d693fce95c8a627e8d600ab1dfbe6abaada4ed7581c59c6fa49efdef8c5"} Mar 12 13:57:38.675207 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:38.675078 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40022d693fce95c8a627e8d600ab1dfbe6abaada4ed7581c59c6fa49efdef8c5" Mar 12 13:57:45.097565 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:45.097533 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-8c2br_b87ad3f0-c310-4833-bfac-cd1e3b15a7d9/console-operator/2.log" Mar 12 13:57:45.101900 ip-10-0-142-16 kubenswrapper[2576]: I0312 13:57:45.101876 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-8c2br_b87ad3f0-c310-4833-bfac-cd1e3b15a7d9/console-operator/2.log" Mar 12 14:02:45.124114 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:02:45.124084 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-8c2br_b87ad3f0-c310-4833-bfac-cd1e3b15a7d9/console-operator/2.log" Mar 12 14:02:45.135755 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:02:45.135732 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-8c2br_b87ad3f0-c310-4833-bfac-cd1e3b15a7d9/console-operator/2.log" Mar 12 14:04:25.099902 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:04:25.099867 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-nlrws/progression-no-metrics-node-0-0-dbc94"] Mar 12 14:04:25.100478 ip-10-0-142-16 
kubenswrapper[2576]: I0312 14:04:25.100299 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fc74be50-d2f5-4b6e-959f-1b2dff5d5e11" containerName="node" Mar 12 14:04:25.100478 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:04:25.100312 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc74be50-d2f5-4b6e-959f-1b2dff5d5e11" containerName="node" Mar 12 14:04:25.100478 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:04:25.100383 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="fc74be50-d2f5-4b6e-959f-1b2dff5d5e11" containerName="node" Mar 12 14:04:25.103497 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:04:25.103479 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-nlrws/progression-no-metrics-node-0-0-dbc94" Mar 12 14:04:25.106528 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:04:25.106496 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-nlrws\"/\"kube-root-ca.crt\"" Mar 12 14:04:25.107754 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:04:25.107735 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-nlrws\"/\"default-dockercfg-bn65w\"" Mar 12 14:04:25.107859 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:04:25.107764 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-nlrws\"/\"openshift-service-ca.crt\"" Mar 12 14:04:25.123532 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:04:25.123494 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-nlrws/progression-no-metrics-node-0-0-dbc94"] Mar 12 14:04:25.158757 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:04:25.158717 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4fhd\" (UniqueName: \"kubernetes.io/projected/c3b3b793-6c86-491f-aa04-b70a500a1750-kube-api-access-r4fhd\") 
pod \"progression-no-metrics-node-0-0-dbc94\" (UID: \"c3b3b793-6c86-491f-aa04-b70a500a1750\") " pod="rhai-e2e-progression-nlrws/progression-no-metrics-node-0-0-dbc94" Mar 12 14:04:25.259832 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:04:25.259789 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r4fhd\" (UniqueName: \"kubernetes.io/projected/c3b3b793-6c86-491f-aa04-b70a500a1750-kube-api-access-r4fhd\") pod \"progression-no-metrics-node-0-0-dbc94\" (UID: \"c3b3b793-6c86-491f-aa04-b70a500a1750\") " pod="rhai-e2e-progression-nlrws/progression-no-metrics-node-0-0-dbc94" Mar 12 14:04:25.269547 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:04:25.269513 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4fhd\" (UniqueName: \"kubernetes.io/projected/c3b3b793-6c86-491f-aa04-b70a500a1750-kube-api-access-r4fhd\") pod \"progression-no-metrics-node-0-0-dbc94\" (UID: \"c3b3b793-6c86-491f-aa04-b70a500a1750\") " pod="rhai-e2e-progression-nlrws/progression-no-metrics-node-0-0-dbc94" Mar 12 14:04:25.413218 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:04:25.413172 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="rhai-e2e-progression-nlrws/progression-no-metrics-node-0-0-dbc94" Mar 12 14:04:25.553719 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:04:25.553683 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-nlrws/progression-no-metrics-node-0-0-dbc94"] Mar 12 14:04:25.556369 ip-10-0-142-16 kubenswrapper[2576]: W0312 14:04:25.556335 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3b3b793_6c86_491f_aa04_b70a500a1750.slice/crio-51b9b6526b7466856692a59dc7f46c08fde07aaf46bb2b75f2874d1c17e1f4b5 WatchSource:0}: Error finding container 51b9b6526b7466856692a59dc7f46c08fde07aaf46bb2b75f2874d1c17e1f4b5: Status 404 returned error can't find the container with id 51b9b6526b7466856692a59dc7f46c08fde07aaf46bb2b75f2874d1c17e1f4b5 Mar 12 14:04:25.558443 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:04:25.558424 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 14:04:26.157693 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:04:26.157655 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-nlrws/progression-no-metrics-node-0-0-dbc94" event={"ID":"c3b3b793-6c86-491f-aa04-b70a500a1750","Type":"ContainerStarted","Data":"d74a7d998a75fd0414d3e158075e266d4c3d881baf0da93cf7aa7e03312d89ce"} Mar 12 14:04:26.157693 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:04:26.157690 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-nlrws/progression-no-metrics-node-0-0-dbc94" event={"ID":"c3b3b793-6c86-491f-aa04-b70a500a1750","Type":"ContainerStarted","Data":"51b9b6526b7466856692a59dc7f46c08fde07aaf46bb2b75f2874d1c17e1f4b5"} Mar 12 14:04:26.179240 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:04:26.179189 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-nlrws/progression-no-metrics-node-0-0-dbc94" 
podStartSLOduration=1.179170547 podStartE2EDuration="1.179170547s" podCreationTimestamp="2026-03-12 14:04:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:04:26.177854657 +0000 UTC m=+1601.689788684" watchObservedRunningTime="2026-03-12 14:04:26.179170547 +0000 UTC m=+1601.691104574" Mar 12 14:04:31.177056 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:04:31.177022 2576 generic.go:358] "Generic (PLEG): container finished" podID="c3b3b793-6c86-491f-aa04-b70a500a1750" containerID="d74a7d998a75fd0414d3e158075e266d4c3d881baf0da93cf7aa7e03312d89ce" exitCode=0 Mar 12 14:04:31.177490 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:04:31.177098 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-nlrws/progression-no-metrics-node-0-0-dbc94" event={"ID":"c3b3b793-6c86-491f-aa04-b70a500a1750","Type":"ContainerDied","Data":"d74a7d998a75fd0414d3e158075e266d4c3d881baf0da93cf7aa7e03312d89ce"} Mar 12 14:04:32.312539 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:04:32.312510 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="rhai-e2e-progression-nlrws/progression-no-metrics-node-0-0-dbc94" Mar 12 14:04:32.423451 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:04:32.423387 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4fhd\" (UniqueName: \"kubernetes.io/projected/c3b3b793-6c86-491f-aa04-b70a500a1750-kube-api-access-r4fhd\") pod \"c3b3b793-6c86-491f-aa04-b70a500a1750\" (UID: \"c3b3b793-6c86-491f-aa04-b70a500a1750\") " Mar 12 14:04:32.425720 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:04:32.425689 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3b3b793-6c86-491f-aa04-b70a500a1750-kube-api-access-r4fhd" (OuterVolumeSpecName: "kube-api-access-r4fhd") pod "c3b3b793-6c86-491f-aa04-b70a500a1750" (UID: "c3b3b793-6c86-491f-aa04-b70a500a1750"). InnerVolumeSpecName "kube-api-access-r4fhd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 12 14:04:32.524596 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:04:32.524545 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r4fhd\" (UniqueName: \"kubernetes.io/projected/c3b3b793-6c86-491f-aa04-b70a500a1750-kube-api-access-r4fhd\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\"" Mar 12 14:04:33.184950 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:04:33.184913 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-nlrws/progression-no-metrics-node-0-0-dbc94" event={"ID":"c3b3b793-6c86-491f-aa04-b70a500a1750","Type":"ContainerDied","Data":"51b9b6526b7466856692a59dc7f46c08fde07aaf46bb2b75f2874d1c17e1f4b5"} Mar 12 14:04:33.184950 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:04:33.184948 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51b9b6526b7466856692a59dc7f46c08fde07aaf46bb2b75f2874d1c17e1f4b5" Mar 12 14:04:33.184950 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:04:33.184947 2576 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="rhai-e2e-progression-nlrws/progression-no-metrics-node-0-0-dbc94" Mar 12 14:07:45.154546 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:07:45.154516 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-8c2br_b87ad3f0-c310-4833-bfac-cd1e3b15a7d9/console-operator/2.log" Mar 12 14:07:45.165121 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:07:45.165095 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-8c2br_b87ad3f0-c310-4833-bfac-cd1e3b15a7d9/console-operator/2.log" Mar 12 14:09:37.232508 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:37.232402 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-nlrws/progression-prestop-hook-node-0-0-bh2ts"] Mar 12 14:09:37.232986 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:37.232808 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3b3b793-6c86-491f-aa04-b70a500a1750" containerName="node" Mar 12 14:09:37.232986 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:37.232819 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3b3b793-6c86-491f-aa04-b70a500a1750" containerName="node" Mar 12 14:09:37.232986 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:37.232888 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c3b3b793-6c86-491f-aa04-b70a500a1750" containerName="node" Mar 12 14:09:37.236086 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:37.236067 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="rhai-e2e-progression-nlrws/progression-prestop-hook-node-0-0-bh2ts" Mar 12 14:09:37.238968 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:37.238942 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-nlrws\"/\"default-dockercfg-bn65w\"" Mar 12 14:09:37.238968 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:37.238950 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-nlrws\"/\"kube-root-ca.crt\"" Mar 12 14:09:37.240137 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:37.240118 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-nlrws\"/\"openshift-service-ca.crt\"" Mar 12 14:09:37.255201 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:37.255169 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-nlrws/progression-prestop-hook-node-0-0-bh2ts"] Mar 12 14:09:37.324987 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:37.324949 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7ghw\" (UniqueName: \"kubernetes.io/projected/7370775c-2b21-4616-a050-8079030af7c1-kube-api-access-w7ghw\") pod \"progression-prestop-hook-node-0-0-bh2ts\" (UID: \"7370775c-2b21-4616-a050-8079030af7c1\") " pod="rhai-e2e-progression-nlrws/progression-prestop-hook-node-0-0-bh2ts" Mar 12 14:09:37.426234 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:37.426195 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w7ghw\" (UniqueName: \"kubernetes.io/projected/7370775c-2b21-4616-a050-8079030af7c1-kube-api-access-w7ghw\") pod \"progression-prestop-hook-node-0-0-bh2ts\" (UID: \"7370775c-2b21-4616-a050-8079030af7c1\") " pod="rhai-e2e-progression-nlrws/progression-prestop-hook-node-0-0-bh2ts" Mar 12 14:09:37.435389 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:37.435354 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7ghw\" (UniqueName: \"kubernetes.io/projected/7370775c-2b21-4616-a050-8079030af7c1-kube-api-access-w7ghw\") pod \"progression-prestop-hook-node-0-0-bh2ts\" (UID: \"7370775c-2b21-4616-a050-8079030af7c1\") " pod="rhai-e2e-progression-nlrws/progression-prestop-hook-node-0-0-bh2ts" Mar 12 14:09:37.546079 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:37.545984 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-nlrws/progression-prestop-hook-node-0-0-bh2ts" Mar 12 14:09:37.676548 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:37.676518 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-nlrws/progression-prestop-hook-node-0-0-bh2ts"] Mar 12 14:09:37.679366 ip-10-0-142-16 kubenswrapper[2576]: W0312 14:09:37.679340 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7370775c_2b21_4616_a050_8079030af7c1.slice/crio-039f632824fc79d9f83a0a68da0c745c0944da5b8a6c43246e306a46e7002d81 WatchSource:0}: Error finding container 039f632824fc79d9f83a0a68da0c745c0944da5b8a6c43246e306a46e7002d81: Status 404 returned error can't find the container with id 039f632824fc79d9f83a0a68da0c745c0944da5b8a6c43246e306a46e7002d81 Mar 12 14:09:37.681421 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:37.681402 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 14:09:38.349589 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:38.349540 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-nlrws/progression-prestop-hook-node-0-0-bh2ts" event={"ID":"7370775c-2b21-4616-a050-8079030af7c1","Type":"ContainerStarted","Data":"b06d60762a921d6c020a1de48fc81b443bf0fbcd964464da28cb70b37be53e53"} Mar 12 14:09:38.350057 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:38.349600 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-nlrws/progression-prestop-hook-node-0-0-bh2ts" event={"ID":"7370775c-2b21-4616-a050-8079030af7c1","Type":"ContainerStarted","Data":"039f632824fc79d9f83a0a68da0c745c0944da5b8a6c43246e306a46e7002d81"} Mar 12 14:09:38.350442 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:38.350416 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-nlrws/progression-prestop-hook-node-0-0-bh2ts" Mar 12 14:09:39.352960 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:39.352921 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-nlrws/progression-prestop-hook-node-0-0-bh2ts" podUID="7370775c-2b21-4616-a050-8079030af7c1" containerName="node" probeResult="failure" output="Get \"http://10.132.0.38:28080/metrics\": dial tcp 10.132.0.38:28080: connect: connection refused" Mar 12 14:09:39.353626 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:39.353598 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-nlrws/progression-prestop-hook-node-0-0-bh2ts" podUID="7370775c-2b21-4616-a050-8079030af7c1" containerName="node" probeResult="failure" output="Get \"http://10.132.0.38:28080/metrics\": dial tcp 10.132.0.38:28080: connect: connection refused" Mar 12 14:09:40.355203 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:40.355168 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="rhai-e2e-progression-nlrws/progression-prestop-hook-node-0-0-bh2ts" Mar 12 14:09:40.377306 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:40.377251 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-nlrws/progression-prestop-hook-node-0-0-bh2ts" podStartSLOduration=3.377235892 podStartE2EDuration="3.377235892s" podCreationTimestamp="2026-03-12 14:09:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-12 14:09:38.371415888 +0000 UTC m=+1913.883349921" watchObservedRunningTime="2026-03-12 14:09:40.377235892 +0000 UTC m=+1915.889169978" Mar 12 14:09:41.564276 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:41.564235 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-nlrws/progression-multi-container-node-0-0-bbk45"] Mar 12 14:09:41.567592 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:41.567571 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-nlrws/progression-multi-container-node-0-0-bbk45" Mar 12 14:09:41.576443 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:41.576412 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-nlrws/progression-multi-container-node-0-0-bbk45"] Mar 12 14:09:41.663825 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:41.663787 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whplh\" (UniqueName: \"kubernetes.io/projected/8d660fa5-e41a-45d3-8a04-cd0473ef344b-kube-api-access-whplh\") pod \"progression-multi-container-node-0-0-bbk45\" (UID: \"8d660fa5-e41a-45d3-8a04-cd0473ef344b\") " pod="rhai-e2e-progression-nlrws/progression-multi-container-node-0-0-bbk45" Mar 12 14:09:41.764346 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:41.764309 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-whplh\" (UniqueName: \"kubernetes.io/projected/8d660fa5-e41a-45d3-8a04-cd0473ef344b-kube-api-access-whplh\") pod \"progression-multi-container-node-0-0-bbk45\" (UID: \"8d660fa5-e41a-45d3-8a04-cd0473ef344b\") " pod="rhai-e2e-progression-nlrws/progression-multi-container-node-0-0-bbk45" Mar 12 14:09:41.773112 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:41.773076 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-whplh\" (UniqueName: 
\"kubernetes.io/projected/8d660fa5-e41a-45d3-8a04-cd0473ef344b-kube-api-access-whplh\") pod \"progression-multi-container-node-0-0-bbk45\" (UID: \"8d660fa5-e41a-45d3-8a04-cd0473ef344b\") " pod="rhai-e2e-progression-nlrws/progression-multi-container-node-0-0-bbk45" Mar 12 14:09:41.879315 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:41.879223 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-nlrws/progression-multi-container-node-0-0-bbk45" Mar 12 14:09:42.008594 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:42.008565 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-nlrws/progression-multi-container-node-0-0-bbk45"] Mar 12 14:09:42.010916 ip-10-0-142-16 kubenswrapper[2576]: W0312 14:09:42.010883 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d660fa5_e41a_45d3_8a04_cd0473ef344b.slice/crio-ce176035908bc513810a7e678c2f191a7a899e680a135ea8670539939b47b5fe WatchSource:0}: Error finding container ce176035908bc513810a7e678c2f191a7a899e680a135ea8670539939b47b5fe: Status 404 returned error can't find the container with id ce176035908bc513810a7e678c2f191a7a899e680a135ea8670539939b47b5fe Mar 12 14:09:42.364655 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:42.364617 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-nlrws/progression-multi-container-node-0-0-bbk45" event={"ID":"8d660fa5-e41a-45d3-8a04-cd0473ef344b","Type":"ContainerStarted","Data":"d539fa26bf94cbc61db986041c9a8a5196a52f5184f0919b1133daf782880069"} Mar 12 14:09:42.364655 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:42.364661 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-nlrws/progression-multi-container-node-0-0-bbk45" event={"ID":"8d660fa5-e41a-45d3-8a04-cd0473ef344b","Type":"ContainerStarted","Data":"ce176035908bc513810a7e678c2f191a7a899e680a135ea8670539939b47b5fe"} Mar 
12 14:09:42.364982 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:42.364680 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-nlrws/progression-multi-container-node-0-0-bbk45" Mar 12 14:09:42.381857 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:42.381790 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-nlrws/progression-multi-container-node-0-0-bbk45" podStartSLOduration=1.381767558 podStartE2EDuration="1.381767558s" podCreationTimestamp="2026-03-12 14:09:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:09:42.380418302 +0000 UTC m=+1917.892352329" watchObservedRunningTime="2026-03-12 14:09:42.381767558 +0000 UTC m=+1917.893701606" Mar 12 14:09:43.552586 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:43.552558 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-785785fffb-8rn7p_2cbd7640-6fcf-4b92-a260-6de0c73e1b78/manager/0.log" Mar 12 14:09:44.371587 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:44.371557 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="rhai-e2e-progression-nlrws/progression-multi-container-node-0-0-bbk45" Mar 12 14:09:48.460727 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:48.460688 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-nlrws/progression-disabled-node-0-0-44x4c"] Mar 12 14:09:48.466072 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:48.466038 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-nlrws/progression-disabled-node-0-0-44x4c"] Mar 12 14:09:48.471274 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:48.471220 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-nlrws/progression-enabled-node-0-0-rs8ch"] Mar 12 14:09:48.473838 ip-10-0-142-16 
kubenswrapper[2576]: I0312 14:09:48.473817 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-nlrws/progression-enabled-node-0-0-rs8ch"]
Mar 12 14:09:48.478043 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:48.478010 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-nlrws/progression-invalid-node-0-0-4kczt"]
Mar 12 14:09:48.482157 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:48.482135 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-nlrws/progression-invalid-node-0-0-4kczt"]
Mar 12 14:09:48.494219 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:48.494196 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-nlrws/progression-multi-container-node-0-0-bbk45"]
Mar 12 14:09:48.494439 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:48.494401 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="rhai-e2e-progression-nlrws/progression-multi-container-node-0-0-bbk45" podUID="8d660fa5-e41a-45d3-8a04-cd0473ef344b" containerName="node" containerID="cri-o://d539fa26bf94cbc61db986041c9a8a5196a52f5184f0919b1133daf782880069" gracePeriod=30
Mar 12 14:09:48.501022 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:48.500995 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-nlrws/progression-no-metrics-node-0-0-dbc94"]
Mar 12 14:09:48.508913 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:48.508886 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-nlrws/progression-no-metrics-node-0-0-dbc94"]
Mar 12 14:09:48.513895 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:48.513864 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-nlrws/progression-prestop-hook-node-0-0-bh2ts"]
Mar 12 14:09:48.514116 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:48.514092 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="rhai-e2e-progression-nlrws/progression-prestop-hook-node-0-0-bh2ts" podUID="7370775c-2b21-4616-a050-8079030af7c1" containerName="node" containerID="cri-o://b06d60762a921d6c020a1de48fc81b443bf0fbcd964464da28cb70b37be53e53" gracePeriod=30
Mar 12 14:09:49.087033 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:49.086995 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3b3b793-6c86-491f-aa04-b70a500a1750" path="/var/lib/kubelet/pods/c3b3b793-6c86-491f-aa04-b70a500a1750/volumes"
Mar 12 14:09:49.087330 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:49.087317 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5dbaac5-b359-4bfa-bf6a-98abf9f7cb2d" path="/var/lib/kubelet/pods/c5dbaac5-b359-4bfa-bf6a-98abf9f7cb2d/volumes"
Mar 12 14:09:49.087625 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:49.087612 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e342fb05-ff4e-4093-958f-ff1a18a0af01" path="/var/lib/kubelet/pods/e342fb05-ff4e-4093-958f-ff1a18a0af01/volumes"
Mar 12 14:09:49.087905 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:49.087893 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc74be50-d2f5-4b6e-959f-1b2dff5d5e11" path="/var/lib/kubelet/pods/fc74be50-d2f5-4b6e-959f-1b2dff5d5e11/volumes"
Mar 12 14:09:54.160720 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:54.160693 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-nlrws/progression-prestop-hook-node-0-0-bh2ts"
Mar 12 14:09:54.281412 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:54.281319 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7ghw\" (UniqueName: \"kubernetes.io/projected/7370775c-2b21-4616-a050-8079030af7c1-kube-api-access-w7ghw\") pod \"7370775c-2b21-4616-a050-8079030af7c1\" (UID: \"7370775c-2b21-4616-a050-8079030af7c1\") "
Mar 12 14:09:54.283628 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:54.283603 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7370775c-2b21-4616-a050-8079030af7c1-kube-api-access-w7ghw" (OuterVolumeSpecName: "kube-api-access-w7ghw") pod "7370775c-2b21-4616-a050-8079030af7c1" (UID: "7370775c-2b21-4616-a050-8079030af7c1"). InnerVolumeSpecName "kube-api-access-w7ghw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 14:09:54.382573 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:54.382542 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w7ghw\" (UniqueName: \"kubernetes.io/projected/7370775c-2b21-4616-a050-8079030af7c1-kube-api-access-w7ghw\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\""
Mar 12 14:09:54.411841 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:54.411808 2576 generic.go:358] "Generic (PLEG): container finished" podID="7370775c-2b21-4616-a050-8079030af7c1" containerID="b06d60762a921d6c020a1de48fc81b443bf0fbcd964464da28cb70b37be53e53" exitCode=0
Mar 12 14:09:54.412038 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:54.411868 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-nlrws/progression-prestop-hook-node-0-0-bh2ts"
Mar 12 14:09:54.412038 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:54.411891 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-nlrws/progression-prestop-hook-node-0-0-bh2ts" event={"ID":"7370775c-2b21-4616-a050-8079030af7c1","Type":"ContainerDied","Data":"b06d60762a921d6c020a1de48fc81b443bf0fbcd964464da28cb70b37be53e53"}
Mar 12 14:09:54.412038 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:54.411936 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-nlrws/progression-prestop-hook-node-0-0-bh2ts" event={"ID":"7370775c-2b21-4616-a050-8079030af7c1","Type":"ContainerDied","Data":"039f632824fc79d9f83a0a68da0c745c0944da5b8a6c43246e306a46e7002d81"}
Mar 12 14:09:54.412038 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:54.411954 2576 scope.go:117] "RemoveContainer" containerID="b06d60762a921d6c020a1de48fc81b443bf0fbcd964464da28cb70b37be53e53"
Mar 12 14:09:54.421921 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:54.421898 2576 scope.go:117] "RemoveContainer" containerID="b06d60762a921d6c020a1de48fc81b443bf0fbcd964464da28cb70b37be53e53"
Mar 12 14:09:54.422247 ip-10-0-142-16 kubenswrapper[2576]: E0312 14:09:54.422228 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b06d60762a921d6c020a1de48fc81b443bf0fbcd964464da28cb70b37be53e53\": container with ID starting with b06d60762a921d6c020a1de48fc81b443bf0fbcd964464da28cb70b37be53e53 not found: ID does not exist" containerID="b06d60762a921d6c020a1de48fc81b443bf0fbcd964464da28cb70b37be53e53"
Mar 12 14:09:54.422297 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:54.422257 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b06d60762a921d6c020a1de48fc81b443bf0fbcd964464da28cb70b37be53e53"} err="failed to get container status \"b06d60762a921d6c020a1de48fc81b443bf0fbcd964464da28cb70b37be53e53\": rpc error: code = NotFound desc = could not find container \"b06d60762a921d6c020a1de48fc81b443bf0fbcd964464da28cb70b37be53e53\": container with ID starting with b06d60762a921d6c020a1de48fc81b443bf0fbcd964464da28cb70b37be53e53 not found: ID does not exist"
Mar 12 14:09:54.433507 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:54.433436 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-nlrws/progression-prestop-hook-node-0-0-bh2ts"]
Mar 12 14:09:54.435500 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:54.435451 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-nlrws/progression-prestop-hook-node-0-0-bh2ts"]
Mar 12 14:09:55.085876 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:55.085848 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7370775c-2b21-4616-a050-8079030af7c1" path="/var/lib/kubelet/pods/7370775c-2b21-4616-a050-8079030af7c1/volumes"
Mar 12 14:09:58.370119 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:58.370073 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-nlrws/progression-multi-container-node-0-0-bbk45" podUID="8d660fa5-e41a-45d3-8a04-cd0473ef344b" containerName="node" probeResult="failure" output="Get \"http://10.132.0.39:28080/metrics\": dial tcp 10.132.0.39:28080: connect: connection refused"
Mar 12 14:09:58.436677 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:58.436627 2576 generic.go:358] "Generic (PLEG): container finished" podID="8d660fa5-e41a-45d3-8a04-cd0473ef344b" containerID="d539fa26bf94cbc61db986041c9a8a5196a52f5184f0919b1133daf782880069" exitCode=0
Mar 12 14:09:58.436888 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:58.436701 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-nlrws/progression-multi-container-node-0-0-bbk45" event={"ID":"8d660fa5-e41a-45d3-8a04-cd0473ef344b","Type":"ContainerDied","Data":"d539fa26bf94cbc61db986041c9a8a5196a52f5184f0919b1133daf782880069"}
Mar 12 14:09:58.544897 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:58.544865 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-nlrws/progression-multi-container-node-0-0-bbk45"
Mar 12 14:09:58.618945 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:58.618836 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whplh\" (UniqueName: \"kubernetes.io/projected/8d660fa5-e41a-45d3-8a04-cd0473ef344b-kube-api-access-whplh\") pod \"8d660fa5-e41a-45d3-8a04-cd0473ef344b\" (UID: \"8d660fa5-e41a-45d3-8a04-cd0473ef344b\") "
Mar 12 14:09:58.621154 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:58.621114 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d660fa5-e41a-45d3-8a04-cd0473ef344b-kube-api-access-whplh" (OuterVolumeSpecName: "kube-api-access-whplh") pod "8d660fa5-e41a-45d3-8a04-cd0473ef344b" (UID: "8d660fa5-e41a-45d3-8a04-cd0473ef344b"). InnerVolumeSpecName "kube-api-access-whplh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Mar 12 14:09:58.720086 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:58.720042 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-whplh\" (UniqueName: \"kubernetes.io/projected/8d660fa5-e41a-45d3-8a04-cd0473ef344b-kube-api-access-whplh\") on node \"ip-10-0-142-16.ec2.internal\" DevicePath \"\""
Mar 12 14:09:59.442252 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:59.442219 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-nlrws/progression-multi-container-node-0-0-bbk45"
Mar 12 14:09:59.442252 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:59.442241 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-nlrws/progression-multi-container-node-0-0-bbk45" event={"ID":"8d660fa5-e41a-45d3-8a04-cd0473ef344b","Type":"ContainerDied","Data":"ce176035908bc513810a7e678c2f191a7a899e680a135ea8670539939b47b5fe"}
Mar 12 14:09:59.442813 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:59.442291 2576 scope.go:117] "RemoveContainer" containerID="d539fa26bf94cbc61db986041c9a8a5196a52f5184f0919b1133daf782880069"
Mar 12 14:09:59.461221 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:59.461187 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-nlrws/progression-multi-container-node-0-0-bbk45"]
Mar 12 14:09:59.464819 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:09:59.464789 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-nlrws/progression-multi-container-node-0-0-bbk45"]
Mar 12 14:10:01.086642 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:01.086605 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d660fa5-e41a-45d3-8a04-cd0473ef344b" path="/var/lib/kubelet/pods/8d660fa5-e41a-45d3-8a04-cd0473ef344b/volumes"
Mar 12 14:10:41.122397 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:41.122319 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-x4rlt_05421762-de84-4ca6-bedf-542c55460252/global-pull-secret-syncer/0.log"
Mar 12 14:10:41.238059 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:41.238026 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-w7qxr_7c9fffc5-b110-46de-8d20-728d29e23748/konnectivity-agent/0.log"
Mar 12 14:10:41.306395 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:41.306364 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-142-16.ec2.internal_6158045a7611177883f6a632f015315d/haproxy/0.log"
Mar 12 14:10:44.172161 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:44.172121 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-b58cd5d8d-vw6kv_f25d8f1b-e18e-4880-899f-71ae7bff54be/cluster-monitoring-operator/0.log"
Mar 12 14:10:44.202391 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:44.202364 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-6df7999c47-f2ksk_35520300-1852-426c-b98c-0d7a5de8f5d1/kube-state-metrics/0.log"
Mar 12 14:10:44.234526 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:44.234492 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-6df7999c47-f2ksk_35520300-1852-426c-b98c-0d7a5de8f5d1/kube-rbac-proxy-main/0.log"
Mar 12 14:10:44.262738 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:44.262709 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-6df7999c47-f2ksk_35520300-1852-426c-b98c-0d7a5de8f5d1/kube-rbac-proxy-self/0.log"
Mar 12 14:10:44.305433 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:44.305410 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-649779b44-9xxmf_e434420d-6db4-40f3-8792-88d3b03b0e31/metrics-server/0.log"
Mar 12 14:10:44.338392 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:44.338354 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-6d47bdb78d-jvgbr_56935d74-a7ce-4152-8c4e-752ce08b5343/monitoring-plugin/0.log"
Mar 12 14:10:44.373496 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:44.373469 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-22tl6_0c2c8159-b212-44a8-bcfa-0dd001c47b97/node-exporter/0.log"
Mar 12 14:10:44.399635 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:44.399598 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-22tl6_0c2c8159-b212-44a8-bcfa-0dd001c47b97/kube-rbac-proxy/0.log"
Mar 12 14:10:44.426006 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:44.425933 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-22tl6_0c2c8159-b212-44a8-bcfa-0dd001c47b97/init-textfile/0.log"
Mar 12 14:10:44.622188 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:44.622159 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-68b5d5d464-8bgkd_8a686a49-082d-4561-b5c9-4bf4c4cc0943/kube-rbac-proxy-main/0.log"
Mar 12 14:10:44.646665 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:44.646636 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-68b5d5d464-8bgkd_8a686a49-082d-4561-b5c9-4bf4c4cc0943/kube-rbac-proxy-self/0.log"
Mar 12 14:10:44.673724 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:44.673700 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-68b5d5d464-8bgkd_8a686a49-082d-4561-b5c9-4bf4c4cc0943/openshift-state-metrics/0.log"
Mar 12 14:10:44.718519 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:44.718498 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_46214ae2-1cfa-409e-822c-adac5e4d0f1c/prometheus/0.log"
Mar 12 14:10:44.743351 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:44.743328 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_46214ae2-1cfa-409e-822c-adac5e4d0f1c/config-reloader/0.log"
Mar 12 14:10:44.768800 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:44.768770 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_46214ae2-1cfa-409e-822c-adac5e4d0f1c/thanos-sidecar/0.log"
Mar 12 14:10:44.794231 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:44.794206 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_46214ae2-1cfa-409e-822c-adac5e4d0f1c/kube-rbac-proxy-web/0.log"
Mar 12 14:10:44.822589 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:44.822551 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_46214ae2-1cfa-409e-822c-adac5e4d0f1c/kube-rbac-proxy/0.log"
Mar 12 14:10:44.850886 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:44.850846 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_46214ae2-1cfa-409e-822c-adac5e4d0f1c/kube-rbac-proxy-thanos/0.log"
Mar 12 14:10:44.878017 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:44.877978 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_46214ae2-1cfa-409e-822c-adac5e4d0f1c/init-config-reloader/0.log"
Mar 12 14:10:44.909203 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:44.909172 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-6b948c769-rbppk_799696f8-16a2-4084-b56e-ea433f73f682/prometheus-operator/0.log"
Mar 12 14:10:44.933897 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:44.933868 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-6b948c769-rbppk_799696f8-16a2-4084-b56e-ea433f73f682/kube-rbac-proxy/0.log"
Mar 12 14:10:44.961890 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:44.961862 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-8444df798b-8smw6_ef8299ba-434c-47bc-9ea7-ed35925f508a/prometheus-operator-admission-webhook/0.log"
Mar 12 14:10:45.105802 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:45.105774 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5d66bcc7f4-l7sjt_8ce11cd6-c7dc-489b-8f06-caf49da9ae97/telemeter-client/0.log"
Mar 12 14:10:45.133903 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:45.133877 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5d66bcc7f4-l7sjt_8ce11cd6-c7dc-489b-8f06-caf49da9ae97/reload/0.log"
Mar 12 14:10:45.158638 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:45.158608 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5d66bcc7f4-l7sjt_8ce11cd6-c7dc-489b-8f06-caf49da9ae97/kube-rbac-proxy/0.log"
Mar 12 14:10:45.192530 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:45.192492 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7b6c584c7d-tmkl9_053e32cf-a4f5-4fbb-a8f3-df217ce3e68d/thanos-query/0.log"
Mar 12 14:10:45.193082 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:45.193063 2576 scope.go:117] "RemoveContainer" containerID="d74a7d998a75fd0414d3e158075e266d4c3d881baf0da93cf7aa7e03312d89ce"
Mar 12 14:10:45.202066 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:45.202046 2576 scope.go:117] "RemoveContainer" containerID="1f68251bf82f4d24733f858b4ea119abc9389291f699d67ccf2dbe8ae6eb2831"
Mar 12 14:10:45.215089 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:45.215070 2576 scope.go:117] "RemoveContainer" containerID="b3b9cbddbe653e90c8a75578bbf44845a16c9643eaddeee5d416c21c6633d737"
Mar 12 14:10:45.219573 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:45.219549 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7b6c584c7d-tmkl9_053e32cf-a4f5-4fbb-a8f3-df217ce3e68d/kube-rbac-proxy-web/0.log"
Mar 12 14:10:45.227799 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:45.227781 2576 scope.go:117] "RemoveContainer" containerID="21dbcb5856ad693c0bd105cbb7a763c83a31d78d44a3e0a6db70cc8c1e832f9d"
Mar 12 14:10:45.248683 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:45.248657 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7b6c584c7d-tmkl9_053e32cf-a4f5-4fbb-a8f3-df217ce3e68d/kube-rbac-proxy/0.log"
Mar 12 14:10:45.274514 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:45.274486 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7b6c584c7d-tmkl9_053e32cf-a4f5-4fbb-a8f3-df217ce3e68d/prom-label-proxy/0.log"
Mar 12 14:10:45.299162 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:45.299136 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7b6c584c7d-tmkl9_053e32cf-a4f5-4fbb-a8f3-df217ce3e68d/kube-rbac-proxy-rules/0.log"
Mar 12 14:10:45.323931 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:45.323899 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7b6c584c7d-tmkl9_053e32cf-a4f5-4fbb-a8f3-df217ce3e68d/kube-rbac-proxy-metrics/0.log"
Mar 12 14:10:46.491655 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:46.491623 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-55b77584bb-dx5pp_12cccf55-aade-4101-9250-96d5b498a6af/networking-console-plugin/0.log"
Mar 12 14:10:46.907361 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:46.907313 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-8c2br_b87ad3f0-c310-4833-bfac-cd1e3b15a7d9/console-operator/2.log"
Mar 12 14:10:46.912494 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:46.912452 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-76b8565867-8c2br_b87ad3f0-c310-4833-bfac-cd1e3b15a7d9/console-operator/3.log"
Mar 12 14:10:47.304778 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:47.304740 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-5b85974fd6-sqphd_4dac3f26-1539-4a57-8570-dba478ffb63f/download-server/0.log"
Mar 12 14:10:47.702694 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:47.702654 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nssb5/perf-node-gather-daemonset-h7659"]
Mar 12 14:10:47.703335 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:47.703263 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7370775c-2b21-4616-a050-8079030af7c1" containerName="node"
Mar 12 14:10:47.703335 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:47.703288 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7370775c-2b21-4616-a050-8079030af7c1" containerName="node"
Mar 12 14:10:47.703335 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:47.703318 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d660fa5-e41a-45d3-8a04-cd0473ef344b" containerName="node"
Mar 12 14:10:47.703335 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:47.703327 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d660fa5-e41a-45d3-8a04-cd0473ef344b" containerName="node"
Mar 12 14:10:47.703581 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:47.703410 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7370775c-2b21-4616-a050-8079030af7c1" containerName="node"
Mar 12 14:10:47.703581 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:47.703421 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8d660fa5-e41a-45d3-8a04-cd0473ef344b" containerName="node"
Mar 12 14:10:47.708103 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:47.708074 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-h7659"
Mar 12 14:10:47.710874 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:47.710848 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nssb5\"/\"kube-root-ca.crt\""
Mar 12 14:10:47.712045 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:47.712023 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nssb5\"/\"openshift-service-ca.crt\""
Mar 12 14:10:47.712160 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:47.712034 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-nssb5\"/\"default-dockercfg-5x56g\""
Mar 12 14:10:47.714843 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:47.714821 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-67fdcb5769-4j9p7_8571836d-0cec-490b-a68b-8ecee11112d1/volume-data-source-validator/0.log"
Mar 12 14:10:47.715557 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:47.715533 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nssb5/perf-node-gather-daemonset-h7659"]
Mar 12 14:10:47.764757 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:47.764715 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eb80c523-1d2b-45f5-89c3-ac9c01dbd67f-sys\") pod \"perf-node-gather-daemonset-h7659\" (UID: \"eb80c523-1d2b-45f5-89c3-ac9c01dbd67f\") " pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-h7659"
Mar 12 14:10:47.764925 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:47.764766 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcn5z\" (UniqueName: \"kubernetes.io/projected/eb80c523-1d2b-45f5-89c3-ac9c01dbd67f-kube-api-access-qcn5z\") pod \"perf-node-gather-daemonset-h7659\" (UID: \"eb80c523-1d2b-45f5-89c3-ac9c01dbd67f\") " pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-h7659"
Mar 12 14:10:47.764925 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:47.764837 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/eb80c523-1d2b-45f5-89c3-ac9c01dbd67f-podres\") pod \"perf-node-gather-daemonset-h7659\" (UID: \"eb80c523-1d2b-45f5-89c3-ac9c01dbd67f\") " pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-h7659"
Mar 12 14:10:47.764925 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:47.764886 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/eb80c523-1d2b-45f5-89c3-ac9c01dbd67f-proc\") pod \"perf-node-gather-daemonset-h7659\" (UID: \"eb80c523-1d2b-45f5-89c3-ac9c01dbd67f\") " pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-h7659"
Mar 12 14:10:47.765083 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:47.764953 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eb80c523-1d2b-45f5-89c3-ac9c01dbd67f-lib-modules\") pod \"perf-node-gather-daemonset-h7659\" (UID: \"eb80c523-1d2b-45f5-89c3-ac9c01dbd67f\") " pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-h7659"
Mar 12 14:10:47.865965 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:47.865926 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eb80c523-1d2b-45f5-89c3-ac9c01dbd67f-sys\") pod \"perf-node-gather-daemonset-h7659\" (UID: \"eb80c523-1d2b-45f5-89c3-ac9c01dbd67f\") " pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-h7659"
Mar 12 14:10:47.866168 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:47.865973 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qcn5z\" (UniqueName: \"kubernetes.io/projected/eb80c523-1d2b-45f5-89c3-ac9c01dbd67f-kube-api-access-qcn5z\") pod \"perf-node-gather-daemonset-h7659\" (UID: \"eb80c523-1d2b-45f5-89c3-ac9c01dbd67f\") " pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-h7659"
Mar 12 14:10:47.866168 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:47.865999 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/eb80c523-1d2b-45f5-89c3-ac9c01dbd67f-podres\") pod \"perf-node-gather-daemonset-h7659\" (UID: \"eb80c523-1d2b-45f5-89c3-ac9c01dbd67f\") " pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-h7659"
Mar 12 14:10:47.866168 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:47.866018 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/eb80c523-1d2b-45f5-89c3-ac9c01dbd67f-proc\") pod \"perf-node-gather-daemonset-h7659\" (UID: \"eb80c523-1d2b-45f5-89c3-ac9c01dbd67f\") " pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-h7659"
Mar 12 14:10:47.866168 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:47.866062 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eb80c523-1d2b-45f5-89c3-ac9c01dbd67f-lib-modules\") pod \"perf-node-gather-daemonset-h7659\" (UID: \"eb80c523-1d2b-45f5-89c3-ac9c01dbd67f\") " pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-h7659"
Mar 12 14:10:47.866168 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:47.866085 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/eb80c523-1d2b-45f5-89c3-ac9c01dbd67f-sys\") pod \"perf-node-gather-daemonset-h7659\" (UID: \"eb80c523-1d2b-45f5-89c3-ac9c01dbd67f\") " pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-h7659"
Mar 12 14:10:47.866168 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:47.866129 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/eb80c523-1d2b-45f5-89c3-ac9c01dbd67f-podres\") pod \"perf-node-gather-daemonset-h7659\" (UID: \"eb80c523-1d2b-45f5-89c3-ac9c01dbd67f\") " pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-h7659"
Mar 12 14:10:47.866168 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:47.866158 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/eb80c523-1d2b-45f5-89c3-ac9c01dbd67f-proc\") pod \"perf-node-gather-daemonset-h7659\" (UID: \"eb80c523-1d2b-45f5-89c3-ac9c01dbd67f\") " pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-h7659"
Mar 12 14:10:47.866532 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:47.866224 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eb80c523-1d2b-45f5-89c3-ac9c01dbd67f-lib-modules\") pod \"perf-node-gather-daemonset-h7659\" (UID: \"eb80c523-1d2b-45f5-89c3-ac9c01dbd67f\") " pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-h7659"
Mar 12 14:10:47.874230 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:47.874205 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcn5z\" (UniqueName: \"kubernetes.io/projected/eb80c523-1d2b-45f5-89c3-ac9c01dbd67f-kube-api-access-qcn5z\") pod \"perf-node-gather-daemonset-h7659\" (UID: \"eb80c523-1d2b-45f5-89c3-ac9c01dbd67f\") " pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-h7659"
Mar 12 14:10:48.020991 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:48.020890 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-h7659"
Mar 12 14:10:48.151081 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:48.151054 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nssb5/perf-node-gather-daemonset-h7659"]
Mar 12 14:10:48.153076 ip-10-0-142-16 kubenswrapper[2576]: W0312 14:10:48.153042 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podeb80c523_1d2b_45f5_89c3_ac9c01dbd67f.slice/crio-410d188546987fb4a2304f83dc1af80ee2936bbdd903168bee7df011b9829cdf WatchSource:0}: Error finding container 410d188546987fb4a2304f83dc1af80ee2936bbdd903168bee7df011b9829cdf: Status 404 returned error can't find the container with id 410d188546987fb4a2304f83dc1af80ee2936bbdd903168bee7df011b9829cdf
Mar 12 14:10:48.404032 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:48.404011 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ddks7_99423593-d38b-4662-bacc-b1f6bb3d0d90/dns/0.log"
Mar 12 14:10:48.429296 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:48.429270 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ddks7_99423593-d38b-4662-bacc-b1f6bb3d0d90/kube-rbac-proxy/0.log"
Mar 12 14:10:48.569066 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:48.569034 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-lc7zn_8a746e42-a15c-4c61-b754-446d49beeae9/dns-node-resolver/0.log"
Mar 12 14:10:48.634410 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:48.634368 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-h7659" event={"ID":"eb80c523-1d2b-45f5-89c3-ac9c01dbd67f","Type":"ContainerStarted","Data":"e5c2b245c2307c97a3c2cfe426bb9efa9d5f33bd1b79106ae09b1bb0e53b45c3"}
Mar 12 14:10:48.634410 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:48.634408 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-h7659" event={"ID":"eb80c523-1d2b-45f5-89c3-ac9c01dbd67f","Type":"ContainerStarted","Data":"410d188546987fb4a2304f83dc1af80ee2936bbdd903168bee7df011b9829cdf"}
Mar 12 14:10:48.634832 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:48.634810 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-h7659"
Mar 12 14:10:48.652229 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:48.652179 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-h7659" podStartSLOduration=1.6521635479999999 podStartE2EDuration="1.652163548s" podCreationTimestamp="2026-03-12 14:10:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 14:10:48.650753453 +0000 UTC m=+1984.162687481" watchObservedRunningTime="2026-03-12 14:10:48.652163548 +0000 UTC m=+1984.164097575"
Mar 12 14:10:49.055742 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:49.055700 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-p28bf_1aca7d52-35bb-4dbf-8bb1-4a032145c336/node-ca/0.log"
Mar 12 14:10:49.759995 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:49.759966 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-f747dc4bb-qfvq7_c11ab84b-7d66-4e49-a456-66c0ee53f865/router/0.log"
Mar 12 14:10:50.170717 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:50.170687 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-v9hjg_92914013-06d0-470c-b843-48a641e447a6/serve-healthcheck-canary/0.log"
Mar 12 14:10:50.534294 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:50.534195 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-76bdd9f478-q8flt_16e1b1af-b744-44fd-a3d5-c817249f6767/insights-operator/0.log"
Mar 12 14:10:50.534667 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:50.534644 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-76bdd9f478-q8flt_16e1b1af-b744-44fd-a3d5-c817249f6767/insights-operator/1.log"
Mar 12 14:10:50.710697 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:50.710670 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rpc6g_035b3ed4-c514-4d2a-a1a9-e485a1c21535/kube-rbac-proxy/0.log"
Mar 12 14:10:50.763507 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:50.763445 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rpc6g_035b3ed4-c514-4d2a-a1a9-e485a1c21535/exporter/0.log"
Mar 12 14:10:50.815218 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:50.815139 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rpc6g_035b3ed4-c514-4d2a-a1a9-e485a1c21535/extractor/0.log"
Mar 12 14:10:54.649409 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:54.649379 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-nssb5/perf-node-gather-daemonset-h7659"
Mar 12 14:10:56.052042 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:56.052010 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-866f46547-wlnh5_20ecc6d8-57fe-48c1-bbcf-886289ff3155/kube-storage-version-migrator-operator/1.log"
Mar 12 14:10:56.052826 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:56.052807 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-866f46547-wlnh5_20ecc6d8-57fe-48c1-bbcf-886289ff3155/kube-storage-version-migrator-operator/0.log"
Mar 12 14:10:57.148906 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:57.148878 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6wcbt_9288d2b3-d104-4c0d-8417-6d70d1c70098/kube-multus-additional-cni-plugins/0.log"
Mar 12 14:10:57.175664 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:57.175635 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6wcbt_9288d2b3-d104-4c0d-8417-6d70d1c70098/egress-router-binary-copy/0.log"
Mar 12 14:10:57.202400 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:57.202367 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6wcbt_9288d2b3-d104-4c0d-8417-6d70d1c70098/cni-plugins/0.log"
Mar 12 14:10:57.228697 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:57.228666 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6wcbt_9288d2b3-d104-4c0d-8417-6d70d1c70098/bond-cni-plugin/0.log"
Mar 12 14:10:57.256399 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:57.256357 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6wcbt_9288d2b3-d104-4c0d-8417-6d70d1c70098/routeoverride-cni/0.log"
Mar 12 14:10:57.281992 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:57.281911 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6wcbt_9288d2b3-d104-4c0d-8417-6d70d1c70098/whereabouts-cni-bincopy/0.log"
Mar 12 14:10:57.306442 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:57.306418 2576 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6wcbt_9288d2b3-d104-4c0d-8417-6d70d1c70098/whereabouts-cni/0.log" Mar 12 14:10:57.755491 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:57.755443 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gd9sq_b3dfbf90-71a8-4aa2-a65c-3f9e522e8d39/kube-multus/0.log" Mar 12 14:10:57.865390 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:57.865359 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-wbtg8_37e0d595-d2d4-4caf-8700-ffaeb5c3401f/network-metrics-daemon/0.log" Mar 12 14:10:57.887102 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:57.887073 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-wbtg8_37e0d595-d2d4-4caf-8700-ffaeb5c3401f/kube-rbac-proxy/0.log" Mar 12 14:10:58.632550 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:58.632521 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8qtnl_e5825179-9ef4-493d-9fc5-0868f3084a95/ovn-controller/0.log" Mar 12 14:10:58.659830 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:58.659797 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8qtnl_e5825179-9ef4-493d-9fc5-0868f3084a95/ovn-acl-logging/0.log" Mar 12 14:10:58.682738 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:58.682654 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8qtnl_e5825179-9ef4-493d-9fc5-0868f3084a95/kube-rbac-proxy-node/0.log" Mar 12 14:10:58.708475 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:58.708435 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8qtnl_e5825179-9ef4-493d-9fc5-0868f3084a95/kube-rbac-proxy-ovn-metrics/0.log" Mar 12 14:10:58.733351 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:58.733324 2576 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8qtnl_e5825179-9ef4-493d-9fc5-0868f3084a95/northd/0.log" Mar 12 14:10:58.759068 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:58.759015 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8qtnl_e5825179-9ef4-493d-9fc5-0868f3084a95/nbdb/0.log" Mar 12 14:10:58.785020 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:58.784992 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8qtnl_e5825179-9ef4-493d-9fc5-0868f3084a95/sbdb/0.log" Mar 12 14:10:58.900646 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:10:58.900610 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8qtnl_e5825179-9ef4-493d-9fc5-0868f3084a95/ovnkube-controller/0.log" Mar 12 14:11:00.342302 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:11:00.342265 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-cc88fdd44-z42dm_44b94cdd-3ad4-4d6b-a6a2-2eedc6664c77/check-endpoints/0.log" Mar 12 14:11:00.392134 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:11:00.392106 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-fnfhc_a2b57a5c-1736-4008-9bcc-41669382f70a/network-check-target-container/0.log" Mar 12 14:11:01.330797 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:11:01.330769 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-trfvm_a3207bc4-cf48-4af8-b377-bbd83ed2fe47/iptables-alerter/0.log" Mar 12 14:11:02.085230 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:11:02.085199 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-rjb6h_bf0068fb-0001-44e4-9194-485a2618370f/tuned/0.log" Mar 12 14:11:03.909772 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:11:03.909736 2576 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-d5df4776c-clm4m_f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca/cluster-samples-operator/0.log" Mar 12 14:11:03.928780 ip-10-0-142-16 kubenswrapper[2576]: I0312 14:11:03.928756 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-d5df4776c-clm4m_f3b28d7a-a6c7-4e0f-98a0-df655c6df5ca/cluster-samples-operator-watch/0.log"