Apr 21 15:10:31.292638 ip-10-0-131-11 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 15:10:31.729285 ip-10-0-131-11 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 15:10:31.729285 ip-10-0-131-11 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 15:10:31.729285 ip-10-0-131-11 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 15:10:31.729285 ip-10-0-131-11 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 15:10:31.729285 ip-10-0-131-11 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 15:10:31.730687 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.730600 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 15:10:31.733657 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.733641 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 15:10:31.733657 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.733656 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 15:10:31.733719 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.733661 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 15:10:31.733719 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.733706 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 15:10:31.733815 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.733806 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 15:10:31.733845 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.733816 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 15:10:31.733845 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.733820 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 15:10:31.733845 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.733825 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 15:10:31.733845 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.733829 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 15:10:31.733845 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.733833 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 15:10:31.733845 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.733837 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 15:10:31.733845 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.733841 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 15:10:31.733845 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.733846 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 15:10:31.734179 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.733851 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 15:10:31.734857 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.734842 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 15:10:31.734857 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.734853 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 15:10:31.734857 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.734858 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 15:10:31.734992 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.734868 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 15:10:31.734992 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.734872 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 15:10:31.734992 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.734876 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 15:10:31.734992 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.734881 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 15:10:31.734992 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.734884 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 15:10:31.734992 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.734888 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 15:10:31.734992 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.734896 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 15:10:31.734992 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.734902 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 15:10:31.734992 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.734906 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 15:10:31.734992 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.734969 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 15:10:31.734992 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.734991 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 15:10:31.734992 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.734996 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 15:10:31.735282 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735000 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 15:10:31.735282 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735004 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 15:10:31.735282 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735007 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 15:10:31.735282 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735010 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 15:10:31.735282 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735014 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 15:10:31.735282 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735017 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 15:10:31.735282 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735020 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 15:10:31.735282 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735023 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 15:10:31.735282 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735026 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 15:10:31.735282 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735038 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 15:10:31.735282 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735165 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 15:10:31.735282 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735173 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 15:10:31.735282 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735178 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 15:10:31.735282 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735182 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 15:10:31.735282 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735186 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 15:10:31.735282 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735190 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 15:10:31.735282 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735193 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 15:10:31.735282 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735197 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 15:10:31.735282 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735201 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 15:10:31.735282 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735205 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 15:10:31.735781 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735213 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 15:10:31.735781 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735217 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 15:10:31.735781 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735222 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 15:10:31.735781 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735225 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 15:10:31.735781 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735229 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 15:10:31.735781 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735233 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 15:10:31.735781 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735238 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 15:10:31.735781 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735242 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 15:10:31.735781 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735245 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 15:10:31.735781 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735250 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 15:10:31.735781 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735254 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 15:10:31.735781 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735257 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 15:10:31.735781 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735261 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 15:10:31.735781 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735266 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 15:10:31.735781 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735271 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 15:10:31.735781 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735275 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 15:10:31.735781 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735279 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 15:10:31.735781 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735283 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 15:10:31.735781 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735290 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 15:10:31.736240 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735294 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 15:10:31.736240 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735298 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 15:10:31.736240 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735303 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 15:10:31.736240 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735307 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 15:10:31.736240 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735311 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 15:10:31.736240 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735315 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 15:10:31.736240 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735319 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 15:10:31.736240 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735323 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 15:10:31.736240 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735328 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 15:10:31.736240 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735331 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 15:10:31.736240 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735336 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 15:10:31.736240 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735339 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 15:10:31.736240 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735342 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 15:10:31.736240 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735344 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 15:10:31.736240 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735347 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 15:10:31.736240 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735350 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 15:10:31.736240 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735352 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 15:10:31.736240 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.735355 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 15:10:31.737100 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737088 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 15:10:31.737100 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737100 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 15:10:31.737155 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737104 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 15:10:31.737155 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737107 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 15:10:31.737155 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737111 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 15:10:31.737155 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737114 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 15:10:31.737155 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737117 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 15:10:31.737155 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737120 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 15:10:31.737155 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737124 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 15:10:31.737155 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737126 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 15:10:31.737155 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737129 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 15:10:31.737155 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737133 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 15:10:31.737155 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737138 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 15:10:31.737155 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737141 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 15:10:31.737155 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737144 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 15:10:31.737155 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737147 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 15:10:31.737155 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737150 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 15:10:31.737155 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737152 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 15:10:31.737155 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737155 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 15:10:31.737155 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737158 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 15:10:31.737155 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737160 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 15:10:31.737632 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737164 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 15:10:31.737632 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737168 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 15:10:31.737632 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737172 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 15:10:31.737632 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737175 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 15:10:31.737632 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737178 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 15:10:31.737632 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737181 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 15:10:31.737632 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737184 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 15:10:31.737632 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737187 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 15:10:31.737632 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737189 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 15:10:31.737632 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737191 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 15:10:31.737632 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737194 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 15:10:31.737632 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737196 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 15:10:31.737632 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737199 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 15:10:31.737632 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737201 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 15:10:31.737632 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737203 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 15:10:31.737632 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737206 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 15:10:31.737632 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737208 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 15:10:31.737632 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737211 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 15:10:31.737632 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737213 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 15:10:31.738132 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737216 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 15:10:31.738132 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737218 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 15:10:31.738132 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737221 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 15:10:31.738132 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737224 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 15:10:31.738132 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737226 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 15:10:31.738132 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737228 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 15:10:31.738132 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737231 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 15:10:31.738132 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737233 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 15:10:31.738132 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737235 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 15:10:31.738132 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737238 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 15:10:31.738132 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737240 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 15:10:31.738132 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737243 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 15:10:31.738132 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737245 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 15:10:31.738132 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737247 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 15:10:31.738132 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737251 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 15:10:31.738132 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737253 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 15:10:31.738132 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737256 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 15:10:31.738132 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737258 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 15:10:31.738132 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737261 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 15:10:31.738132 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737263 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 15:10:31.738659 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737266 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 15:10:31.738659 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737269 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 15:10:31.738659 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737271 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 15:10:31.738659 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737274 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 15:10:31.738659 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737276 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 15:10:31.738659 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737279 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 15:10:31.738659 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737281 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 15:10:31.738659 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737284 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 15:10:31.738659 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737287 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 15:10:31.738659 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737289 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 15:10:31.738659 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737292 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 15:10:31.738659 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737294 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 15:10:31.738659 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737296 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 15:10:31.738659 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737299 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 15:10:31.738659 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737301 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 15:10:31.738659 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737304 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 15:10:31.738659 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737306 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 15:10:31.738659 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737309 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 15:10:31.738659 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737311 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 15:10:31.738659 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737314 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 15:10:31.739159 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737316 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 15:10:31.739159 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737319 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 15:10:31.739159 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737321 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 15:10:31.739159 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737323 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 15:10:31.739159 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737326 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 15:10:31.739159 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.737328 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 15:10:31.739159 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737428 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 15:10:31.739159 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737436 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 15:10:31.739159 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737442 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 15:10:31.739159 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737446 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 15:10:31.739159 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737451 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 15:10:31.739159 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737454 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 15:10:31.739159 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737459 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 15:10:31.739159 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737472 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 15:10:31.739159 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737476 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 15:10:31.739159 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737479 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 15:10:31.739159 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737482 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 15:10:31.739159 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737485 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 15:10:31.739159 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737488 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 15:10:31.739159 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737491 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 21 15:10:31.739159 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737494 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 15:10:31.739159 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737497 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 21 15:10:31.739159 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737500 2575 flags.go:64] FLAG: --cloud-config=""
Apr 21 15:10:31.739744 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737502 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 15:10:31.739744 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737505 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 15:10:31.739744 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737509 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 21 15:10:31.739744 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737512 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 15:10:31.739744 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737515 2575 flags.go:64] FLAG: --config-dir=""
Apr 21 15:10:31.739744 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737518 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 15:10:31.739744 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737521 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 15:10:31.739744 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737525 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 15:10:31.739744 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737528 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 15:10:31.739744 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737531 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 15:10:31.739744 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737534 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 15:10:31.739744 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737537 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 15:10:31.739744 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737540 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 15:10:31.739744 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737543 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 15:10:31.739744 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737546 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 15:10:31.739744 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737550 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 15:10:31.739744 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737554 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 15:10:31.739744 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737558 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 15:10:31.739744 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737560 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 15:10:31.739744 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737563 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 15:10:31.739744 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737566 2575 flags.go:64] FLAG: --enable-server="true"
Apr 21 15:10:31.739744 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737569 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 15:10:31.739744 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737573 2575 flags.go:64] FLAG: --event-burst="100"
Apr 21 15:10:31.739744 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737577 2575 flags.go:64] FLAG: --event-qps="50"
Apr 21 15:10:31.739744 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737580 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 15:10:31.740332 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737583 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 15:10:31.740332 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737586 2575 flags.go:64] FLAG: --eviction-hard=""
Apr 21 15:10:31.740332 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737589 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 15:10:31.740332 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737592 2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 15:10:31.740332 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737595 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21 15:10:31.740332 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737598 2575 flags.go:64] FLAG: --eviction-soft=""
Apr 21 15:10:31.740332 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737601 2575 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 21 15:10:31.740332 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737604 2575 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 21 15:10:31.740332 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737606 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 21 15:10:31.740332 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737609 2575 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 21 15:10:31.740332 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737612 2575 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 21 15:10:31.740332 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737615 2575 flags.go:64] FLAG: --fail-swap-on="true"
Apr 21 15:10:31.740332 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737617 2575 flags.go:64] FLAG: --feature-gates=""
Apr 21 15:10:31.740332 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737621 2575 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 21 15:10:31.740332 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737624 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 21 15:10:31.740332 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737627 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 21 15:10:31.740332 ip-10-0-131-11 kubenswrapper[2575]: I0421
15:10:31.737630 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 15:10:31.740332 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737634 2575 flags.go:64] FLAG: --healthz-port="10248" Apr 21 15:10:31.740332 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737637 2575 flags.go:64] FLAG: --help="false" Apr 21 15:10:31.740332 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737640 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-131-11.ec2.internal" Apr 21 15:10:31.740332 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737643 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 15:10:31.740332 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737646 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 15:10:31.740332 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737650 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 15:10:31.740906 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737653 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 15:10:31.740906 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737656 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 15:10:31.740906 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737659 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 15:10:31.740906 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737662 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 15:10:31.740906 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737665 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 15:10:31.740906 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737667 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 15:10:31.740906 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737670 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 
15:10:31.740906 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737674 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 15:10:31.740906 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737677 2575 flags.go:64] FLAG: --kube-reserved="" Apr 21 15:10:31.740906 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737680 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 15:10:31.740906 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737682 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 15:10:31.740906 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737685 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 15:10:31.740906 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737688 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 15:10:31.740906 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737691 2575 flags.go:64] FLAG: --lock-file="" Apr 21 15:10:31.740906 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737694 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 15:10:31.740906 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737696 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 15:10:31.740906 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737699 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 15:10:31.740906 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737705 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 15:10:31.740906 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737708 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 15:10:31.740906 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737711 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 15:10:31.740906 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737713 2575 flags.go:64] FLAG: --logging-format="text" Apr 21 15:10:31.740906 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737716 2575 flags.go:64] FLAG: 
--machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 15:10:31.740906 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737719 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 15:10:31.740906 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737722 2575 flags.go:64] FLAG: --manifest-url="" Apr 21 15:10:31.741505 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737725 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 21 15:10:31.741505 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737730 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 15:10:31.741505 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737732 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 15:10:31.741505 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737740 2575 flags.go:64] FLAG: --max-pods="110" Apr 21 15:10:31.741505 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737743 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 15:10:31.741505 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737745 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 15:10:31.741505 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737748 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 15:10:31.741505 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737751 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 15:10:31.741505 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737754 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 15:10:31.741505 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737758 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 15:10:31.741505 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737760 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 15:10:31.741505 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737767 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 
21 15:10:31.741505 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737770 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 15:10:31.741505 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737773 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 15:10:31.741505 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737776 2575 flags.go:64] FLAG: --pod-cidr="" Apr 21 15:10:31.741505 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737779 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 15:10:31.741505 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737784 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 15:10:31.741505 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737787 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 15:10:31.741505 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737790 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 21 15:10:31.741505 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737793 2575 flags.go:64] FLAG: --port="10250" Apr 21 15:10:31.741505 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737796 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 15:10:31.741505 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737799 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0a04adf746967bdfa" Apr 21 15:10:31.741505 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737802 2575 flags.go:64] FLAG: --qos-reserved="" Apr 21 15:10:31.741505 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737805 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 21 15:10:31.742069 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737808 2575 flags.go:64] FLAG: --register-node="true" Apr 21 15:10:31.742069 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737810 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 21 15:10:31.742069 ip-10-0-131-11 
kubenswrapper[2575]: I0421 15:10:31.737813 2575 flags.go:64] FLAG: --register-with-taints="" Apr 21 15:10:31.742069 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737816 2575 flags.go:64] FLAG: --registry-burst="10" Apr 21 15:10:31.742069 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737819 2575 flags.go:64] FLAG: --registry-qps="5" Apr 21 15:10:31.742069 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737822 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 21 15:10:31.742069 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737825 2575 flags.go:64] FLAG: --reserved-memory="" Apr 21 15:10:31.742069 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737829 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 15:10:31.742069 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737832 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 15:10:31.742069 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737835 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 15:10:31.742069 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737837 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 15:10:31.742069 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737840 2575 flags.go:64] FLAG: --runonce="false" Apr 21 15:10:31.742069 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737843 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 15:10:31.742069 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737846 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 15:10:31.742069 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737849 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 21 15:10:31.742069 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737852 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 15:10:31.742069 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737855 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 15:10:31.742069 
ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737858 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 15:10:31.742069 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737861 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 15:10:31.742069 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737864 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 15:10:31.742069 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737867 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 15:10:31.742069 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737870 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 15:10:31.742069 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737873 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 15:10:31.742069 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737875 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 15:10:31.742069 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737878 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 15:10:31.742069 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737881 2575 flags.go:64] FLAG: --system-cgroups="" Apr 21 15:10:31.742830 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737884 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 15:10:31.742830 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737890 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 15:10:31.742830 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737892 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 21 15:10:31.742830 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737895 2575 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 15:10:31.742830 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737901 2575 flags.go:64] FLAG: --tls-min-version="" Apr 21 15:10:31.742830 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737904 2575 flags.go:64] FLAG: 
--tls-private-key-file="" Apr 21 15:10:31.742830 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737906 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 15:10:31.742830 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737909 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 15:10:31.742830 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737912 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 15:10:31.742830 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737915 2575 flags.go:64] FLAG: --v="2" Apr 21 15:10:31.742830 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737919 2575 flags.go:64] FLAG: --version="false" Apr 21 15:10:31.742830 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737923 2575 flags.go:64] FLAG: --vmodule="" Apr 21 15:10:31.742830 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737927 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 15:10:31.742830 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.737930 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 15:10:31.742830 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738016 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 15:10:31.742830 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738019 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 15:10:31.742830 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738023 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 15:10:31.742830 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738026 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 15:10:31.742830 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738030 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 15:10:31.742830 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738033 2575 
feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 15:10:31.742830 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738035 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 15:10:31.742830 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738038 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 15:10:31.742830 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738041 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 15:10:31.743401 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738044 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 15:10:31.743401 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738047 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 15:10:31.743401 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738049 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 15:10:31.743401 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738052 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 21 15:10:31.743401 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738055 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 15:10:31.743401 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738057 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 15:10:31.743401 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738059 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 15:10:31.743401 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738062 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 15:10:31.743401 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738064 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 
15:10:31.743401 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738067 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 15:10:31.743401 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738070 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 15:10:31.743401 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738073 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 15:10:31.743401 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738075 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 15:10:31.743401 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738082 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 15:10:31.743401 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738085 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 15:10:31.743401 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738087 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 15:10:31.743401 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738093 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 15:10:31.743401 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738096 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 15:10:31.743401 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738099 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 15:10:31.743401 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738101 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 15:10:31.743897 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738104 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 15:10:31.743897 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738106 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter 
Apr 21 15:10:31.743897 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738108 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 15:10:31.743897 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738111 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 15:10:31.743897 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738113 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 15:10:31.743897 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738116 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 15:10:31.743897 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738119 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 15:10:31.743897 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738121 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 15:10:31.743897 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738124 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 15:10:31.743897 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738126 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 15:10:31.743897 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738129 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 15:10:31.743897 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738132 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 15:10:31.743897 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738135 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 15:10:31.743897 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738137 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 15:10:31.743897 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738140 2575 feature_gate.go:328] unrecognized feature 
gate: AWSClusterHostedDNSInstall Apr 21 15:10:31.743897 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738143 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 15:10:31.743897 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738145 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 15:10:31.743897 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738148 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 15:10:31.743897 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738150 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 15:10:31.743897 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738153 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 15:10:31.744399 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738155 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 15:10:31.744399 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738157 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 15:10:31.744399 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738161 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 15:10:31.744399 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738165 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 15:10:31.744399 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738168 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 15:10:31.744399 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738172 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 15:10:31.744399 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738174 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 15:10:31.744399 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738178 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 15:10:31.744399 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738182 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 15:10:31.744399 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738185 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 15:10:31.744399 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738187 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 15:10:31.744399 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738189 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 15:10:31.744399 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738192 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 15:10:31.744399 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738194 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 15:10:31.744399 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738197 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 15:10:31.744399 ip-10-0-131-11 
kubenswrapper[2575]: W0421 15:10:31.738200 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 15:10:31.744399 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738202 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 15:10:31.744399 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738205 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 15:10:31.744399 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738207 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 15:10:31.744866 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738210 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 15:10:31.744866 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738212 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 15:10:31.744866 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738214 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 15:10:31.744866 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738217 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 15:10:31.744866 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738222 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 15:10:31.744866 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738224 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 15:10:31.744866 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738227 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 15:10:31.744866 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738229 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 15:10:31.744866 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738232 2575 feature_gate.go:328] unrecognized feature gate: 
NewOLMWebhookProviderOpenshiftServiceCA Apr 21 15:10:31.744866 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738234 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 15:10:31.744866 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738237 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 15:10:31.744866 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738239 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 15:10:31.744866 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738242 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 15:10:31.744866 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738244 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 15:10:31.744866 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738247 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 15:10:31.744866 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738249 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 15:10:31.744866 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738252 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 15:10:31.744866 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.738257 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 15:10:31.745311 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.739468 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true 
UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 15:10:31.745816 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.745796 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 21 15:10:31.745847 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.745817 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 21 15:10:31.745873 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745865 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 15:10:31.745873 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745870 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 15:10:31.745873 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745873 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 15:10:31.745952 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745876 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 15:10:31.745952 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745879 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 15:10:31.745952 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745882 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 15:10:31.745952 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745884 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 15:10:31.745952 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745887 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 15:10:31.745952 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745889 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 15:10:31.745952 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745892 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 15:10:31.745952 ip-10-0-131-11 
kubenswrapper[2575]: W0421 15:10:31.745894 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 15:10:31.745952 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745897 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 15:10:31.745952 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745899 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 15:10:31.745952 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745903 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 15:10:31.745952 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745905 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 15:10:31.745952 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745907 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 15:10:31.745952 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745910 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 15:10:31.745952 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745913 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 15:10:31.745952 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745917 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 15:10:31.745952 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745920 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 15:10:31.745952 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745923 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 15:10:31.745952 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745925 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 15:10:31.746457 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745927 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 15:10:31.746457 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745930 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 15:10:31.746457 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745932 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 15:10:31.746457 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745935 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 15:10:31.746457 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745937 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 15:10:31.746457 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745940 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 15:10:31.746457 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745943 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 15:10:31.746457 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745946 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 15:10:31.746457 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745948 2575 feature_gate.go:328] unrecognized feature gate: 
InsightsConfigAPI Apr 21 15:10:31.746457 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745951 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 15:10:31.746457 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745953 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 15:10:31.746457 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745956 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 15:10:31.746457 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745959 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 15:10:31.746457 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745962 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 15:10:31.746457 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745964 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 15:10:31.746457 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745967 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 15:10:31.746457 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745970 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 15:10:31.746457 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745972 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 15:10:31.746457 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745975 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 15:10:31.746457 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745977 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 15:10:31.746966 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745979 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 15:10:31.746966 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745982 2575 feature_gate.go:328] unrecognized 
feature gate: OpenShiftPodSecurityAdmission Apr 21 15:10:31.746966 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745985 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 15:10:31.746966 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745987 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 15:10:31.746966 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745990 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 15:10:31.746966 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745993 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 15:10:31.746966 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745995 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 15:10:31.746966 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.745999 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 15:10:31.746966 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746002 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 15:10:31.746966 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746005 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 15:10:31.746966 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746008 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 15:10:31.746966 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746011 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 15:10:31.746966 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746014 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 15:10:31.746966 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746018 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 15:10:31.746966 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746021 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 15:10:31.746966 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746023 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 15:10:31.746966 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746026 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 15:10:31.746966 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746028 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 15:10:31.746966 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746031 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 15:10:31.747436 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746034 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 15:10:31.747436 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746036 2575 
feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 15:10:31.747436 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746039 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 15:10:31.747436 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746041 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 15:10:31.747436 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746044 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 15:10:31.747436 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746046 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 15:10:31.747436 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746048 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 15:10:31.747436 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746051 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 15:10:31.747436 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746053 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 15:10:31.747436 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746056 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 15:10:31.747436 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746058 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 21 15:10:31.747436 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746061 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 15:10:31.747436 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746064 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 15:10:31.747436 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746066 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 15:10:31.747436 ip-10-0-131-11 kubenswrapper[2575]: 
W0421 15:10:31.746069 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 15:10:31.747436 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746072 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 15:10:31.747436 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746074 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 15:10:31.747436 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746077 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 15:10:31.747436 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746080 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 15:10:31.747436 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746082 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 15:10:31.747923 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746084 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 15:10:31.747923 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746087 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 15:10:31.747923 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746089 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 15:10:31.747923 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746091 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 15:10:31.747923 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746094 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 15:10:31.747923 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.746099 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true 
RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 15:10:31.747923 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746194 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 15:10:31.747923 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746198 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 15:10:31.747923 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746201 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 15:10:31.747923 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746204 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 15:10:31.747923 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746207 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 15:10:31.747923 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746210 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 15:10:31.747923 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746212 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 15:10:31.747923 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746216 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 15:10:31.747923 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746218 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 15:10:31.748303 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746221 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 15:10:31.748303 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746223 2575 feature_gate.go:328] unrecognized feature gate: 
InsightsConfigAPI Apr 21 15:10:31.748303 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746226 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 15:10:31.748303 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746228 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 15:10:31.748303 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746230 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 15:10:31.748303 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746233 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 15:10:31.748303 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746235 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 15:10:31.748303 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746238 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 15:10:31.748303 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746240 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 15:10:31.748303 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746242 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 15:10:31.748303 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746245 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 15:10:31.748303 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746247 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 15:10:31.748303 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746249 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 15:10:31.748303 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746252 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 15:10:31.748303 ip-10-0-131-11 
kubenswrapper[2575]: W0421 15:10:31.746254 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 15:10:31.748303 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746257 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 15:10:31.748303 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746259 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 15:10:31.748303 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746261 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 15:10:31.748303 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746264 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 15:10:31.748303 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746266 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 15:10:31.748797 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746268 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 15:10:31.748797 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746271 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 15:10:31.748797 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746273 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 15:10:31.748797 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746276 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 15:10:31.748797 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746278 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 15:10:31.748797 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746280 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 15:10:31.748797 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746283 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 15:10:31.748797 ip-10-0-131-11 
kubenswrapper[2575]: W0421 15:10:31.746285 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 15:10:31.748797 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746288 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 15:10:31.748797 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746290 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 15:10:31.748797 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746293 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 15:10:31.748797 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746296 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 15:10:31.748797 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746299 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 15:10:31.748797 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746301 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 15:10:31.748797 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746304 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 15:10:31.748797 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746306 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 15:10:31.748797 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746308 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 15:10:31.748797 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746311 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 15:10:31.748797 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746313 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 15:10:31.748797 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746316 2575 feature_gate.go:328] unrecognized feature gate: 
GCPClusterHostedDNS Apr 21 15:10:31.749279 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746318 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 15:10:31.749279 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746330 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 15:10:31.749279 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746334 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 15:10:31.749279 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746338 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 15:10:31.749279 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746341 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 15:10:31.749279 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746343 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 15:10:31.749279 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746346 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 15:10:31.749279 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746348 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 15:10:31.749279 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746351 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 15:10:31.749279 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746354 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 15:10:31.749279 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746356 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 15:10:31.749279 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746358 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 15:10:31.749279 ip-10-0-131-11 
kubenswrapper[2575]: W0421 15:10:31.746361 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 15:10:31.749279 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746363 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 15:10:31.749279 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746366 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 15:10:31.749279 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746383 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 15:10:31.749279 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746386 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 15:10:31.749279 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746389 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 21 15:10:31.749279 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746391 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 15:10:31.749866 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746393 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 15:10:31.749866 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746396 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 15:10:31.749866 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746399 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 15:10:31.749866 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746402 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 15:10:31.749866 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746404 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 15:10:31.749866 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746408 2575 feature_gate.go:328] unrecognized feature gate: 
GCPCustomAPIEndpointsInstall Apr 21 15:10:31.749866 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746411 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 15:10:31.749866 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746413 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 15:10:31.749866 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746416 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 15:10:31.749866 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746418 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 15:10:31.749866 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746420 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 15:10:31.749866 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746423 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 15:10:31.749866 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746425 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 15:10:31.749866 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746427 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 15:10:31.749866 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746431 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 15:10:31.749866 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746434 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 15:10:31.749866 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746436 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 15:10:31.749866 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:31.746439 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 15:10:31.750289 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.746444 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 15:10:31.750289 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.747078 2575 server.go:962] "Client rotation is on, will bootstrap in background" Apr 21 15:10:31.750289 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.748970 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 21 15:10:31.750289 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.749827 2575 server.go:1019] "Starting client certificate rotation" Apr 21 15:10:31.750289 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.749922 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 21 15:10:31.750289 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.749960 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 21 15:10:31.772488 ip-10-0-131-11 
kubenswrapper[2575]: I0421 15:10:31.772470 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 21 15:10:31.776586 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.776573 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 21 15:10:31.791588 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.791556 2575 log.go:25] "Validated CRI v1 runtime API" Apr 21 15:10:31.797341 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.797328 2575 log.go:25] "Validated CRI v1 image API" Apr 21 15:10:31.798516 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.798496 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 21 15:10:31.801289 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.801267 2575 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 9d7cb677-0a91-45ba-b33c-dbffee6559b8:/dev/nvme0n1p4 a6ba09a8-3daf-4d48-9fe5-0ad4c210b7bc:/dev/nvme0n1p3] Apr 21 15:10:31.801365 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.801289 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 21 15:10:31.804504 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.804487 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 15:10:31.807923 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.807817 2575 manager.go:217] Machine: 
{Timestamp:2026-04-21 15:10:31.805723319 +0000 UTC m=+0.401294187 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3089380 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec27157e796820da38f18057f39db1d2 SystemUUID:ec27157e-7968-20da-38f1-8057f39db1d2 BootID:2fa0d98b-c28c-49ad-8650-b619509c0217 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:b6:de:ff:ce:45 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:b6:de:ff:ce:45 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ee:4a:75:af:ac:43 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 
Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 21 15:10:31.807923 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.807917 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 21 15:10:31.808052 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.807996 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 21 15:10:31.810242 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.810221 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 21 15:10:31.810409 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.810245 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ip-10-0-131-11.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 21 15:10:31.810455 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.810419 2575 topology_manager.go:138] "Creating topology manager with none policy" Apr 21 15:10:31.810455 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.810428 2575 container_manager_linux.go:306] "Creating device plugin manager" Apr 21 15:10:31.810455 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.810441 2575 
manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 15:10:31.812562 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.812551 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 15:10:31.814265 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.814254 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 21 15:10:31.814397 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.814388 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 21 15:10:31.817732 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.817721 2575 kubelet.go:491] "Attempting to sync node with API server" Apr 21 15:10:31.817772 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.817742 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 21 15:10:31.817772 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.817757 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 21 15:10:31.817772 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.817765 2575 kubelet.go:397] "Adding apiserver pod source" Apr 21 15:10:31.817902 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.817773 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 21 15:10:31.818731 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.818717 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 15:10:31.818769 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.818744 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 15:10:31.822974 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.822957 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 21 15:10:31.824192 ip-10-0-131-11 
kubenswrapper[2575]: I0421 15:10:31.824177 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 15:10:31.826940 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.826927 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 15:10:31.827001 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.826946 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 15:10:31.827001 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.826952 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 15:10:31.827001 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.826967 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 15:10:31.827001 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.826976 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 21 15:10:31.827001 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.826985 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 21 15:10:31.827001 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.826992 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 21 15:10:31.827001 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.826998 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 15:10:31.827001 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.827005 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 15:10:31.827209 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.827011 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 15:10:31.827209 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.827024 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 15:10:31.827209 
ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.827033 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 15:10:31.828445 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.828430 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 15:10:31.828484 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.828448 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 21 15:10:31.830770 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:31.830731 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 21 15:10:31.830770 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:31.830761 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-11.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 21 15:10:31.831983 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.831969 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 15:10:31.832034 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.832005 2575 server.go:1295] "Started kubelet" Apr 21 15:10:31.832127 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.832101 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 15:10:31.832258 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.832094 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 21 15:10:31.832308 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.832298 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 15:10:31.832781 
ip-10-0-131-11 systemd[1]: Started Kubernetes Kubelet. Apr 21 15:10:31.833328 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.833290 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 21 15:10:31.834860 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.834841 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 21 15:10:31.838021 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.838005 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 21 15:10:31.838114 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.838020 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 21 15:10:31.842438 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:31.840040 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-11.ec2.internal\" not found" Apr 21 15:10:31.842438 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.840465 2575 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 21 15:10:31.842438 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.840508 2575 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 21 15:10:31.842438 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.840552 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 21 15:10:31.842438 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.840853 2575 reconstruct.go:97] "Volume reconstruction finished" Apr 21 15:10:31.842438 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.840862 2575 reconciler.go:26] "Reconciler: start to sync state" Apr 21 15:10:31.844242 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.843432 2575 factory.go:55] Registering systemd factory Apr 21 15:10:31.844242 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.843459 2575 factory.go:223] Registration of the systemd container factory successfully Apr 21 15:10:31.844242 
ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.843736 2575 factory.go:153] Registering CRI-O factory Apr 21 15:10:31.844242 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.843756 2575 factory.go:223] Registration of the crio container factory successfully Apr 21 15:10:31.844242 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.843806 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 21 15:10:31.844242 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.843847 2575 factory.go:103] Registering Raw factory Apr 21 15:10:31.844242 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.843863 2575 manager.go:1196] Started watching for new ooms in manager Apr 21 15:10:31.844749 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.844438 2575 manager.go:319] Starting recovery of all containers Apr 21 15:10:31.847340 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.847317 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-11.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 21 15:10:31.847340 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:31.847321 2575 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-131-11.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 21 15:10:31.848889 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:31.848845 2575 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 21 15:10:31.853228 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:31.853201 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 21 15:10:31.856999 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:31.856027 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-11.ec2.internal.18a867d854dc507d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-11.ec2.internal,UID:ip-10-0-131-11.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-131-11.ec2.internal,},FirstTimestamp:2026-04-21 15:10:31.831982205 +0000 UTC m=+0.427553070,LastTimestamp:2026-04-21 15:10:31.831982205 +0000 UTC m=+0.427553070,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-11.ec2.internal,}" Apr 21 15:10:31.857123 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.857091 2575 manager.go:324] Recovery completed Apr 21 15:10:31.861458 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.861443 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 15:10:31.864587 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.864564 2575 kubelet_node_status.go:736] "Recording event message for node" 
node="ip-10-0-131-11.ec2.internal" event="NodeHasSufficientMemory" Apr 21 15:10:31.864657 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.864599 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-11.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 15:10:31.864657 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.864609 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-11.ec2.internal" event="NodeHasSufficientPID" Apr 21 15:10:31.865056 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.865043 2575 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 21 15:10:31.865056 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.865054 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 21 15:10:31.865165 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.865068 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 21 15:10:31.866611 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:31.866552 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-11.ec2.internal.18a867d856cdcf64 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-11.ec2.internal,UID:ip-10-0-131-11.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-131-11.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-131-11.ec2.internal,},FirstTimestamp:2026-04-21 15:10:31.864586084 +0000 UTC m=+0.460156950,LastTimestamp:2026-04-21 15:10:31.864586084 +0000 UTC m=+0.460156950,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-11.ec2.internal,}" Apr 21 
15:10:31.867496 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.867484 2575 policy_none.go:49] "None policy: Start" Apr 21 15:10:31.867586 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.867501 2575 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 21 15:10:31.867586 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.867511 2575 state_mem.go:35] "Initializing new in-memory state store" Apr 21 15:10:31.872241 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.872152 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-kq5c9" Apr 21 15:10:31.878561 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:31.878497 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-11.ec2.internal.18a867d856ce14be default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-11.ec2.internal,UID:ip-10-0-131-11.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-131-11.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-131-11.ec2.internal,},FirstTimestamp:2026-04-21 15:10:31.864603838 +0000 UTC m=+0.460174704,LastTimestamp:2026-04-21 15:10:31.864603838 +0000 UTC m=+0.460174704,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-11.ec2.internal,}" Apr 21 15:10:31.887867 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:31.887805 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{ip-10-0-131-11.ec2.internal.18a867d856ce38e9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-11.ec2.internal,UID:ip-10-0-131-11.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-131-11.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-131-11.ec2.internal,},FirstTimestamp:2026-04-21 15:10:31.864613097 +0000 UTC m=+0.460183963,LastTimestamp:2026-04-21 15:10:31.864613097 +0000 UTC m=+0.460183963,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-11.ec2.internal,}" Apr 21 15:10:31.889800 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.889784 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-kq5c9" Apr 21 15:10:31.906939 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.906924 2575 manager.go:341] "Starting Device Plugin manager" Apr 21 15:10:31.913637 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:31.906957 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 21 15:10:31.913637 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.906971 2575 server.go:85] "Starting device plugin registration server" Apr 21 15:10:31.913637 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.907206 2575 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 21 15:10:31.913637 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.907218 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 21 15:10:31.913637 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.907305 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 21 
15:10:31.913637 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.907390 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 21 15:10:31.913637 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.907402 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 21 15:10:31.913637 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:31.907908 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 21 15:10:31.913637 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:31.907936 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-11.ec2.internal\" not found" Apr 21 15:10:31.973400 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.973347 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 21 15:10:31.974742 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.974720 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 21 15:10:31.974742 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.974746 2575 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 21 15:10:31.974897 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.974766 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 21 15:10:31.974897 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.974772 2575 kubelet.go:2451] "Starting kubelet main sync loop" Apr 21 15:10:31.974897 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:31.974802 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 21 15:10:31.978365 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:31.978344 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 15:10:32.007654 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.007600 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 15:10:32.008342 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.008325 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-11.ec2.internal" event="NodeHasSufficientMemory" Apr 21 15:10:32.008429 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.008362 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-11.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 15:10:32.008429 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.008395 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-11.ec2.internal" event="NodeHasSufficientPID" Apr 21 15:10:32.008429 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.008424 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-11.ec2.internal" Apr 21 15:10:32.017407 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.017393 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-11.ec2.internal" Apr 21 15:10:32.017461 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:32.017414 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-11.ec2.internal\": node \"ip-10-0-131-11.ec2.internal\" not found" Apr 21 15:10:32.029756 
ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:32.029738 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-11.ec2.internal\" not found" Apr 21 15:10:32.075744 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.075702 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-11.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-131-11.ec2.internal"] Apr 21 15:10:32.075861 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.075791 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 15:10:32.076584 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.076568 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-11.ec2.internal" event="NodeHasSufficientMemory" Apr 21 15:10:32.076654 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.076602 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-11.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 15:10:32.076654 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.076616 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-11.ec2.internal" event="NodeHasSufficientPID" Apr 21 15:10:32.077740 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.077729 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 15:10:32.077882 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.077868 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-11.ec2.internal" Apr 21 15:10:32.077918 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.077896 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 15:10:32.078464 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.078443 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-11.ec2.internal" event="NodeHasSufficientMemory" Apr 21 15:10:32.078464 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.078453 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-11.ec2.internal" event="NodeHasSufficientMemory" Apr 21 15:10:32.078586 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.078472 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-11.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 15:10:32.078586 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.078473 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-11.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 15:10:32.078586 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.078489 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-11.ec2.internal" event="NodeHasSufficientPID" Apr 21 15:10:32.078586 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.078497 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-11.ec2.internal" event="NodeHasSufficientPID" Apr 21 15:10:32.079603 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.079588 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-11.ec2.internal" Apr 21 15:10:32.079671 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.079612 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 15:10:32.080241 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.080225 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-11.ec2.internal" event="NodeHasSufficientMemory" Apr 21 15:10:32.080346 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.080252 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-11.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 15:10:32.080346 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.080267 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-11.ec2.internal" event="NodeHasSufficientPID" Apr 21 15:10:32.107917 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:32.107893 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-11.ec2.internal\" not found" node="ip-10-0-131-11.ec2.internal" Apr 21 15:10:32.112186 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:32.112169 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-11.ec2.internal\" not found" node="ip-10-0-131-11.ec2.internal" Apr 21 15:10:32.130479 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:32.130459 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-11.ec2.internal\" not found" Apr 21 15:10:32.142984 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.142963 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f4a1f42e43969c311a11e30a08e26ff0-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-131-11.ec2.internal\" (UID: \"f4a1f42e43969c311a11e30a08e26ff0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-11.ec2.internal" Apr 21 15:10:32.143052 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.142987 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f4a1f42e43969c311a11e30a08e26ff0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-11.ec2.internal\" (UID: \"f4a1f42e43969c311a11e30a08e26ff0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-11.ec2.internal" Apr 21 15:10:32.143052 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.143004 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f339e642e0590241739bde2677ce9dbf-config\") pod \"kube-apiserver-proxy-ip-10-0-131-11.ec2.internal\" (UID: \"f339e642e0590241739bde2677ce9dbf\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-11.ec2.internal" Apr 21 15:10:32.230785 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:32.230753 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-11.ec2.internal\" not found" Apr 21 15:10:32.243278 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.243258 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f4a1f42e43969c311a11e30a08e26ff0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-11.ec2.internal\" (UID: \"f4a1f42e43969c311a11e30a08e26ff0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-11.ec2.internal" Apr 21 15:10:32.243338 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.243282 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/f4a1f42e43969c311a11e30a08e26ff0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-11.ec2.internal\" (UID: \"f4a1f42e43969c311a11e30a08e26ff0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-11.ec2.internal" Apr 21 15:10:32.243338 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.243300 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f339e642e0590241739bde2677ce9dbf-config\") pod \"kube-apiserver-proxy-ip-10-0-131-11.ec2.internal\" (UID: \"f339e642e0590241739bde2677ce9dbf\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-11.ec2.internal" Apr 21 15:10:32.243458 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.243338 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f339e642e0590241739bde2677ce9dbf-config\") pod \"kube-apiserver-proxy-ip-10-0-131-11.ec2.internal\" (UID: \"f339e642e0590241739bde2677ce9dbf\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-11.ec2.internal" Apr 21 15:10:32.243458 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.243385 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f4a1f42e43969c311a11e30a08e26ff0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-11.ec2.internal\" (UID: \"f4a1f42e43969c311a11e30a08e26ff0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-11.ec2.internal" Apr 21 15:10:32.243458 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.243407 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f4a1f42e43969c311a11e30a08e26ff0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-11.ec2.internal\" (UID: \"f4a1f42e43969c311a11e30a08e26ff0\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-11.ec2.internal" Apr 21 15:10:32.331747 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:32.331677 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-11.ec2.internal\" not found" Apr 21 15:10:32.409347 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.409312 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-11.ec2.internal" Apr 21 15:10:32.415026 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.415009 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-11.ec2.internal" Apr 21 15:10:32.432595 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:32.432576 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-11.ec2.internal\" not found" Apr 21 15:10:32.533249 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:32.533217 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-11.ec2.internal\" not found" Apr 21 15:10:32.633941 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:32.633876 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-11.ec2.internal\" not found" Apr 21 15:10:32.734610 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:32.734578 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-11.ec2.internal\" not found" Apr 21 15:10:32.750200 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.750178 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 21 15:10:32.750405 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.750352 2575 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 15:10:32.804992 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.804963 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 15:10:32.835413 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:32.835363 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-11.ec2.internal\" not found" Apr 21 15:10:32.838545 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.838530 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 21 15:10:32.851504 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.851480 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 15:10:32.869185 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.869162 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-xn7cr" Apr 21 15:10:32.877795 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.877775 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-xn7cr" Apr 21 15:10:32.892144 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.892094 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 15:05:31 +0000 UTC" deadline="2027-11-03 05:49:21.796404124 +0000 UTC" Apr 21 15:10:32.892144 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:32.892116 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" 
sleep="13454h38m48.904290646s" Apr 21 15:10:32.936455 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:32.936425 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-11.ec2.internal\" not found" Apr 21 15:10:33.006647 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:33.006614 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4a1f42e43969c311a11e30a08e26ff0.slice/crio-a461d3a869be1929a868222288aa4aa966a195220aa20a2dcfdc389cf80ecea8 WatchSource:0}: Error finding container a461d3a869be1929a868222288aa4aa966a195220aa20a2dcfdc389cf80ecea8: Status 404 returned error can't find the container with id a461d3a869be1929a868222288aa4aa966a195220aa20a2dcfdc389cf80ecea8 Apr 21 15:10:33.006974 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:33.006950 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf339e642e0590241739bde2677ce9dbf.slice/crio-acf0c6f264387c7cbb4256beaf50af15e019d2101b94a03f819b60ae9bff8da4 WatchSource:0}: Error finding container acf0c6f264387c7cbb4256beaf50af15e019d2101b94a03f819b60ae9bff8da4: Status 404 returned error can't find the container with id acf0c6f264387c7cbb4256beaf50af15e019d2101b94a03f819b60ae9bff8da4 Apr 21 15:10:33.011198 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.011182 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 15:10:33.037119 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:33.037099 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-11.ec2.internal\" not found" Apr 21 15:10:33.109289 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.109269 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 15:10:33.138187 ip-10-0-131-11 kubenswrapper[2575]: I0421 
15:10:33.138156 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-11.ec2.internal" Apr 21 15:10:33.158743 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.158682 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 15:10:33.159603 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.159590 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-11.ec2.internal" Apr 21 15:10:33.173297 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.173276 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 15:10:33.244858 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.244830 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 15:10:33.818214 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.818132 2575 apiserver.go:52] "Watching apiserver" Apr 21 15:10:33.825382 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.825341 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 21 15:10:33.825755 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.825731 2575 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-network-diagnostics/network-check-target-hfcw6","kube-system/kube-apiserver-proxy-ip-10-0-131-11.ec2.internal","openshift-dns/node-resolver-hdbc9","openshift-multus/network-metrics-daemon-96snf","openshift-network-operator/iptables-alerter-rjhlf","openshift-ovn-kubernetes/ovnkube-node-mbnjg","kube-system/konnectivity-agent-rw7pj","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ksrsc","openshift-cluster-node-tuning-operator/tuned-h47wt","openshift-image-registry/node-ca-4hwvs","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-11.ec2.internal","openshift-multus/multus-additional-cni-plugins-m4ccx","openshift-multus/multus-jkh68"] Apr 21 15:10:33.828235 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.828212 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-rw7pj" Apr 21 15:10:33.828621 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.828576 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hdbc9" Apr 21 15:10:33.830334 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.830312 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96snf" Apr 21 15:10:33.830520 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:33.830425 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-96snf" podUID="a17c6c3f-25ab-4414-92a4-946230c882ea" Apr 21 15:10:33.831219 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.831197 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 21 15:10:33.831910 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.831485 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 21 15:10:33.831910 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.831571 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-kf7kv\"" Apr 21 15:10:33.831910 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.831593 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-9rlwd\"" Apr 21 15:10:33.831910 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.831739 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 21 15:10:33.832139 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.831953 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 21 15:10:33.834734 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.834613 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-rjhlf" Apr 21 15:10:33.835565 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.835547 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" Apr 21 15:10:33.837436 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.837420 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-596kz\"" Apr 21 15:10:33.837661 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.837642 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 21 15:10:33.837661 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.837656 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 21 15:10:33.837979 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.837867 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 21 15:10:33.838459 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.838174 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hfcw6" Apr 21 15:10:33.838459 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:33.838234 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hfcw6" podUID="9562db30-abde-4e93-96e5-77429f548f83" Apr 21 15:10:33.838872 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.838831 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 21 15:10:33.839144 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.839122 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 21 15:10:33.839144 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.839137 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-h92dl\"" Apr 21 15:10:33.839277 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.839242 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 21 15:10:33.839332 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.839300 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 21 15:10:33.840456 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.839995 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 21 15:10:33.840456 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.840216 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 21 15:10:33.841680 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.841209 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ksrsc" Apr 21 15:10:33.841680 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.841329 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-h47wt" Apr 21 15:10:33.842922 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.842461 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4hwvs" Apr 21 15:10:33.844219 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.844201 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 21 15:10:33.844466 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.844446 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 21 15:10:33.844698 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.844681 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 21 15:10:33.845827 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.845318 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 21 15:10:33.845827 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.845438 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 21 15:10:33.845827 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.845496 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-jkh68" Apr 21 15:10:33.845827 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.845551 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-kb2k2\"" Apr 21 15:10:33.845827 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.845643 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-g5fts\"" Apr 21 15:10:33.845827 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.845817 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-4dxv9\"" Apr 21 15:10:33.845827 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.845828 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 21 15:10:33.846152 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.845907 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-m4ccx" Apr 21 15:10:33.846152 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.845968 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 21 15:10:33.846269 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.846248 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 21 15:10:33.847842 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.847822 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 21 15:10:33.848022 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.848005 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 21 15:10:33.848100 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.848071 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 21 15:10:33.848220 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.848204 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-j4nnx\"" Apr 21 15:10:33.848303 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.848288 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 21 15:10:33.848428 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.848289 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 21 15:10:33.848513 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.848481 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-wpfg2\"" Apr 21 15:10:33.848513 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.848504 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 21 15:10:33.853053 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.853032 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/71299419-e249-4660-891c-24ba490f5c36-hosts-file\") pod \"node-resolver-hdbc9\" (UID: \"71299419-e249-4660-891c-24ba490f5c36\") " pod="openshift-dns/node-resolver-hdbc9" Apr 21 15:10:33.853214 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.853194 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5xrg\" (UniqueName: \"kubernetes.io/projected/981aa28d-2a57-4f14-8411-4d80c9ed2911-kube-api-access-t5xrg\") pod \"iptables-alerter-rjhlf\" (UID: \"981aa28d-2a57-4f14-8411-4d80c9ed2911\") " pod="openshift-network-operator/iptables-alerter-rjhlf" Apr 21 15:10:33.853344 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.853322 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-var-lib-openvswitch\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" Apr 21 15:10:33.853471 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.853353 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-systemd-units\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" 
Apr 21 15:10:33.853471 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.853400 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-host-run-netns\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" Apr 21 15:10:33.853471 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.853424 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0bdcacfa-6992-4542-8cbe-df76abfeb25b-ovnkube-config\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" Apr 21 15:10:33.853471 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.853448 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sc4x\" (UniqueName: \"kubernetes.io/projected/71299419-e249-4660-891c-24ba490f5c36-kube-api-access-4sc4x\") pod \"node-resolver-hdbc9\" (UID: \"71299419-e249-4660-891c-24ba490f5c36\") " pod="openshift-dns/node-resolver-hdbc9" Apr 21 15:10:33.853626 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.853470 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-host-slash\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" Apr 21 15:10:33.853626 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.853502 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-run-ovn\") pod \"ovnkube-node-mbnjg\" (UID: 
\"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" Apr 21 15:10:33.853626 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.853528 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2be9773d-32f2-4dbe-a3cf-557fd268ad7d-socket-dir\") pod \"aws-ebs-csi-driver-node-ksrsc\" (UID: \"2be9773d-32f2-4dbe-a3cf-557fd268ad7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ksrsc" Apr 21 15:10:33.853626 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.853554 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/85bafbc7-2166-40e1-825d-81c20339ab1e-sys\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt" Apr 21 15:10:33.853626 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.853578 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/85bafbc7-2166-40e1-825d-81c20339ab1e-etc-tuned\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt" Apr 21 15:10:33.853626 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.853602 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54bhw\" (UniqueName: \"kubernetes.io/projected/85bafbc7-2166-40e1-825d-81c20339ab1e-kube-api-access-54bhw\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt" Apr 21 15:10:33.853626 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.853625 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/71299419-e249-4660-891c-24ba490f5c36-tmp-dir\") pod \"node-resolver-hdbc9\" (UID: \"71299419-e249-4660-891c-24ba490f5c36\") " pod="openshift-dns/node-resolver-hdbc9" Apr 21 15:10:33.853821 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.853659 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0bdcacfa-6992-4542-8cbe-df76abfeb25b-ovnkube-script-lib\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" Apr 21 15:10:33.853821 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.853682 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/85bafbc7-2166-40e1-825d-81c20339ab1e-etc-systemd\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt" Apr 21 15:10:33.853821 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.853701 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/85bafbc7-2166-40e1-825d-81c20339ab1e-tmp\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt" Apr 21 15:10:33.853821 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.853724 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3b4c8c1d-205b-43ec-9a00-3a7d4989ef9e-konnectivity-ca\") pod \"konnectivity-agent-rw7pj\" (UID: \"3b4c8c1d-205b-43ec-9a00-3a7d4989ef9e\") " pod="kube-system/konnectivity-agent-rw7pj" Apr 21 15:10:33.853821 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.853745 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0bdcacfa-6992-4542-8cbe-df76abfeb25b-env-overrides\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" Apr 21 15:10:33.853821 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.853768 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmvqf\" (UniqueName: \"kubernetes.io/projected/0bdcacfa-6992-4542-8cbe-df76abfeb25b-kube-api-access-bmvqf\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" Apr 21 15:10:33.853821 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.853789 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2be9773d-32f2-4dbe-a3cf-557fd268ad7d-device-dir\") pod \"aws-ebs-csi-driver-node-ksrsc\" (UID: \"2be9773d-32f2-4dbe-a3cf-557fd268ad7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ksrsc" Apr 21 15:10:33.853821 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.853817 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/85bafbc7-2166-40e1-825d-81c20339ab1e-etc-sysconfig\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt" Apr 21 15:10:33.854107 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.853842 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/85bafbc7-2166-40e1-825d-81c20339ab1e-etc-kubernetes\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " 
pod="openshift-cluster-node-tuning-operator/tuned-h47wt"
Apr 21 15:10:33.854107 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.853877 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-etc-openvswitch\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg"
Apr 21 15:10:33.854107 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.853903 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-run-openvswitch\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg"
Apr 21 15:10:33.854107 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.853926 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-host-cni-netd\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg"
Apr 21 15:10:33.854107 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.853951 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg"
Apr 21 15:10:33.854107 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.853976 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/2be9773d-32f2-4dbe-a3cf-557fd268ad7d-etc-selinux\") pod \"aws-ebs-csi-driver-node-ksrsc\" (UID: \"2be9773d-32f2-4dbe-a3cf-557fd268ad7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ksrsc"
Apr 21 15:10:33.854107 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.854011 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4jwz\" (UniqueName: \"kubernetes.io/projected/2be9773d-32f2-4dbe-a3cf-557fd268ad7d-kube-api-access-z4jwz\") pod \"aws-ebs-csi-driver-node-ksrsc\" (UID: \"2be9773d-32f2-4dbe-a3cf-557fd268ad7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ksrsc"
Apr 21 15:10:33.854107 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.854031 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/85bafbc7-2166-40e1-825d-81c20339ab1e-run\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt"
Apr 21 15:10:33.854107 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.854053 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/85bafbc7-2166-40e1-825d-81c20339ab1e-var-lib-kubelet\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt"
Apr 21 15:10:33.854107 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.854076 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/981aa28d-2a57-4f14-8411-4d80c9ed2911-iptables-alerter-script\") pod \"iptables-alerter-rjhlf\" (UID: \"981aa28d-2a57-4f14-8411-4d80c9ed2911\") " pod="openshift-network-operator/iptables-alerter-rjhlf"
Apr 21 15:10:33.854107 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.854103 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxf5g\" (UniqueName: \"kubernetes.io/projected/9562db30-abde-4e93-96e5-77429f548f83-kube-api-access-zxf5g\") pod \"network-check-target-hfcw6\" (UID: \"9562db30-abde-4e93-96e5-77429f548f83\") " pod="openshift-network-diagnostics/network-check-target-hfcw6"
Apr 21 15:10:33.854570 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.854127 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-host-run-ovn-kubernetes\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg"
Apr 21 15:10:33.854570 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.854162 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2be9773d-32f2-4dbe-a3cf-557fd268ad7d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ksrsc\" (UID: \"2be9773d-32f2-4dbe-a3cf-557fd268ad7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ksrsc"
Apr 21 15:10:33.854570 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.854187 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2be9773d-32f2-4dbe-a3cf-557fd268ad7d-registration-dir\") pod \"aws-ebs-csi-driver-node-ksrsc\" (UID: \"2be9773d-32f2-4dbe-a3cf-557fd268ad7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ksrsc"
Apr 21 15:10:33.854570 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.854211 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2be9773d-32f2-4dbe-a3cf-557fd268ad7d-sys-fs\") pod \"aws-ebs-csi-driver-node-ksrsc\" (UID: \"2be9773d-32f2-4dbe-a3cf-557fd268ad7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ksrsc"
Apr 21 15:10:33.854570 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.854237 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/85bafbc7-2166-40e1-825d-81c20339ab1e-etc-modprobe-d\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt"
Apr 21 15:10:33.854570 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.854262 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/85bafbc7-2166-40e1-825d-81c20339ab1e-etc-sysctl-conf\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt"
Apr 21 15:10:33.854570 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.854284 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-host-kubelet\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg"
Apr 21 15:10:33.854570 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.854309 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0bdcacfa-6992-4542-8cbe-df76abfeb25b-ovn-node-metrics-cert\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg"
Apr 21 15:10:33.854570 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.854333 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/85bafbc7-2166-40e1-825d-81c20339ab1e-etc-sysctl-d\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt"
Apr 21 15:10:33.854570 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.854355 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/85bafbc7-2166-40e1-825d-81c20339ab1e-lib-modules\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt"
Apr 21 15:10:33.854570 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.854404 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/85bafbc7-2166-40e1-825d-81c20339ab1e-host\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt"
Apr 21 15:10:33.854570 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.854429 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-run-systemd\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg"
Apr 21 15:10:33.854570 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.854453 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-node-log\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg"
Apr 21 15:10:33.854570 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.854483 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-log-socket\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg"
Apr 21 15:10:33.854570 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.854506 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-host-cni-bin\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg"
Apr 21 15:10:33.854570 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.854531 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwzgw\" (UniqueName: \"kubernetes.io/projected/a17c6c3f-25ab-4414-92a4-946230c882ea-kube-api-access-cwzgw\") pod \"network-metrics-daemon-96snf\" (UID: \"a17c6c3f-25ab-4414-92a4-946230c882ea\") " pod="openshift-multus/network-metrics-daemon-96snf"
Apr 21 15:10:33.855150 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.854553 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/981aa28d-2a57-4f14-8411-4d80c9ed2911-host-slash\") pod \"iptables-alerter-rjhlf\" (UID: \"981aa28d-2a57-4f14-8411-4d80c9ed2911\") " pod="openshift-network-operator/iptables-alerter-rjhlf"
Apr 21 15:10:33.855150 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.854576 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3b4c8c1d-205b-43ec-9a00-3a7d4989ef9e-agent-certs\") pod \"konnectivity-agent-rw7pj\" (UID: \"3b4c8c1d-205b-43ec-9a00-3a7d4989ef9e\") " pod="kube-system/konnectivity-agent-rw7pj"
Apr 21 15:10:33.855150 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.854600 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a17c6c3f-25ab-4414-92a4-946230c882ea-metrics-certs\") pod \"network-metrics-daemon-96snf\" (UID: \"a17c6c3f-25ab-4414-92a4-946230c882ea\") " pod="openshift-multus/network-metrics-daemon-96snf"
Apr 21 15:10:33.879907 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.879840 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 15:05:32 +0000 UTC" deadline="2027-11-18 05:52:41.31922817 +0000 UTC"
Apr 21 15:10:33.879907 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.879870 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13814h42m7.439361852s"
Apr 21 15:10:33.941997 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.941964 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 21 15:10:33.955437 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.955410 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-system-cni-dir\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68"
Apr 21 15:10:33.955606 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.955450 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/85bafbc7-2166-40e1-825d-81c20339ab1e-tmp\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt"
Apr 21 15:10:33.955606 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.955479 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-etc-openvswitch\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg"
Apr 21 15:10:33.955606 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.955501 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-host-cni-netd\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg"
Apr 21 15:10:33.955606 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.955524 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg"
Apr 21 15:10:33.955606 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.955548 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/2be9773d-32f2-4dbe-a3cf-557fd268ad7d-etc-selinux\") pod \"aws-ebs-csi-driver-node-ksrsc\" (UID: \"2be9773d-32f2-4dbe-a3cf-557fd268ad7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ksrsc"
Apr 21 15:10:33.955606 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.955569 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z4jwz\" (UniqueName: \"kubernetes.io/projected/2be9773d-32f2-4dbe-a3cf-557fd268ad7d-kube-api-access-z4jwz\") pod \"aws-ebs-csi-driver-node-ksrsc\" (UID: \"2be9773d-32f2-4dbe-a3cf-557fd268ad7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ksrsc"
Apr 21 15:10:33.955606 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.955566 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-etc-openvswitch\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg"
Apr 21 15:10:33.955606 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.955595 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/cfb26da3-8175-4742-a038-7b5d5d082af2-serviceca\") pod \"node-ca-4hwvs\" (UID: \"cfb26da3-8175-4742-a038-7b5d5d082af2\") " pod="openshift-image-registry/node-ca-4hwvs"
Apr 21 15:10:33.955975 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.955609 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg"
Apr 21 15:10:33.955975 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.955671 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-host-cni-netd\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg"
Apr 21 15:10:33.955975 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.955707 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/2be9773d-32f2-4dbe-a3cf-557fd268ad7d-etc-selinux\") pod \"aws-ebs-csi-driver-node-ksrsc\" (UID: \"2be9773d-32f2-4dbe-a3cf-557fd268ad7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ksrsc"
Apr 21 15:10:33.955975 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.955745 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-host-var-lib-cni-bin\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68"
Apr 21 15:10:33.955975 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.955778 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2be9773d-32f2-4dbe-a3cf-557fd268ad7d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ksrsc\" (UID: \"2be9773d-32f2-4dbe-a3cf-557fd268ad7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ksrsc"
Apr 21 15:10:33.955975 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.955823 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2be9773d-32f2-4dbe-a3cf-557fd268ad7d-registration-dir\") pod \"aws-ebs-csi-driver-node-ksrsc\" (UID: \"2be9773d-32f2-4dbe-a3cf-557fd268ad7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ksrsc"
Apr 21 15:10:33.955975 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.955850 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2be9773d-32f2-4dbe-a3cf-557fd268ad7d-sys-fs\") pod \"aws-ebs-csi-driver-node-ksrsc\" (UID: \"2be9773d-32f2-4dbe-a3cf-557fd268ad7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ksrsc"
Apr 21 15:10:33.955975 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.955870 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2be9773d-32f2-4dbe-a3cf-557fd268ad7d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ksrsc\" (UID: \"2be9773d-32f2-4dbe-a3cf-557fd268ad7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ksrsc"
Apr 21 15:10:33.955975 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.955876 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/85bafbc7-2166-40e1-825d-81c20339ab1e-etc-sysctl-conf\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt"
Apr 21 15:10:33.955975 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.955913 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2be9773d-32f2-4dbe-a3cf-557fd268ad7d-registration-dir\") pod \"aws-ebs-csi-driver-node-ksrsc\" (UID: \"2be9773d-32f2-4dbe-a3cf-557fd268ad7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ksrsc"
Apr 21 15:10:33.955975 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.955916 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0a58254f-d46b-4b42-b89e-5f65cdf19d34-system-cni-dir\") pod \"multus-additional-cni-plugins-m4ccx\" (UID: \"0a58254f-d46b-4b42-b89e-5f65cdf19d34\") " pod="openshift-multus/multus-additional-cni-plugins-m4ccx"
Apr 21 15:10:33.955975 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.955963 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/85bafbc7-2166-40e1-825d-81c20339ab1e-etc-sysctl-conf\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt"
Apr 21 15:10:33.955975 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.955968 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/2be9773d-32f2-4dbe-a3cf-557fd268ad7d-sys-fs\") pod \"aws-ebs-csi-driver-node-ksrsc\" (UID: \"2be9773d-32f2-4dbe-a3cf-557fd268ad7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ksrsc"
Apr 21 15:10:33.955975 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.955928 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 21 15:10:33.956635 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.955964 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0a58254f-d46b-4b42-b89e-5f65cdf19d34-os-release\") pod \"multus-additional-cni-plugins-m4ccx\" (UID: \"0a58254f-d46b-4b42-b89e-5f65cdf19d34\") " pod="openshift-multus/multus-additional-cni-plugins-m4ccx"
Apr 21 15:10:33.956635 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.956043 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-host-var-lib-kubelet\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68"
Apr 21 15:10:33.956635 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.956080 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-multus-conf-dir\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68"
Apr 21 15:10:33.956635 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.956111 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0bdcacfa-6992-4542-8cbe-df76abfeb25b-ovn-node-metrics-cert\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg"
Apr 21 15:10:33.956635 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.956137 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/85bafbc7-2166-40e1-825d-81c20339ab1e-etc-sysctl-d\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt"
Apr 21 15:10:33.956635 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.956170 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0a58254f-d46b-4b42-b89e-5f65cdf19d34-cnibin\") pod \"multus-additional-cni-plugins-m4ccx\" (UID: \"0a58254f-d46b-4b42-b89e-5f65cdf19d34\") " pod="openshift-multus/multus-additional-cni-plugins-m4ccx"
Apr 21 15:10:33.956635 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.956196 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c1596851-f71d-43c8-b7f4-f92f0a29bb06-cni-binary-copy\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68"
Apr 21 15:10:33.956635 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.956221 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-hostroot\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68"
Apr 21 15:10:33.956635 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.956247 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-host-run-netns\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg"
Apr 21 15:10:33.956635 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.956271 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-run-systemd\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg"
Apr 21 15:10:33.956635 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.956296 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-host-cni-bin\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg"
Apr 21 15:10:33.956635 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.956313 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/85bafbc7-2166-40e1-825d-81c20339ab1e-etc-sysctl-d\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt"
Apr 21 15:10:33.956635 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.956322 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cwzgw\" (UniqueName: \"kubernetes.io/projected/a17c6c3f-25ab-4414-92a4-946230c882ea-kube-api-access-cwzgw\") pod \"network-metrics-daemon-96snf\" (UID: \"a17c6c3f-25ab-4414-92a4-946230c882ea\") " pod="openshift-multus/network-metrics-daemon-96snf"
Apr 21 15:10:33.956635 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.956385 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-host-run-netns\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg"
Apr 21 15:10:33.956635 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.956385 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0a58254f-d46b-4b42-b89e-5f65cdf19d34-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m4ccx\" (UID: \"0a58254f-d46b-4b42-b89e-5f65cdf19d34\") " pod="openshift-multus/multus-additional-cni-plugins-m4ccx"
Apr 21 15:10:33.956635 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.956424 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-run-systemd\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg"
Apr 21 15:10:33.956635 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.956424 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0a58254f-d46b-4b42-b89e-5f65cdf19d34-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-m4ccx\" (UID: \"0a58254f-d46b-4b42-b89e-5f65cdf19d34\") " pod="openshift-multus/multus-additional-cni-plugins-m4ccx"
Apr 21 15:10:33.957429 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.956447 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-host-cni-bin\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg"
Apr 21 15:10:33.957429 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.956452 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-multus-socket-dir-parent\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68"
Apr 21 15:10:33.957429 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.956493 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-host-run-multus-certs\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68"
Apr 21 15:10:33.957429 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.956537 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/71299419-e249-4660-891c-24ba490f5c36-hosts-file\") pod \"node-resolver-hdbc9\" (UID: \"71299419-e249-4660-891c-24ba490f5c36\") " pod="openshift-dns/node-resolver-hdbc9"
Apr 21 15:10:33.957429 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.956564 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5xrg\" (UniqueName: \"kubernetes.io/projected/981aa28d-2a57-4f14-8411-4d80c9ed2911-kube-api-access-t5xrg\") pod \"iptables-alerter-rjhlf\" (UID: \"981aa28d-2a57-4f14-8411-4d80c9ed2911\") " pod="openshift-network-operator/iptables-alerter-rjhlf"
Apr 21 15:10:33.957429 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.956591 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-var-lib-openvswitch\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg"
Apr 21 15:10:33.957429 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.956598 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/71299419-e249-4660-891c-24ba490f5c36-hosts-file\") pod \"node-resolver-hdbc9\" (UID: \"71299419-e249-4660-891c-24ba490f5c36\") " pod="openshift-dns/node-resolver-hdbc9"
Apr 21 15:10:33.957429 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.956617 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/85bafbc7-2166-40e1-825d-81c20339ab1e-sys\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt"
Apr 21 15:10:33.957429 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.956639 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-var-lib-openvswitch\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg"
Apr 21 15:10:33.957429 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.956641 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/85bafbc7-2166-40e1-825d-81c20339ab1e-etc-tuned\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt"
Apr 21 15:10:33.957429 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.956681 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqp9j\" (UniqueName: \"kubernetes.io/projected/cfb26da3-8175-4742-a038-7b5d5d082af2-kube-api-access-tqp9j\") pod \"node-ca-4hwvs\" (UID: \"cfb26da3-8175-4742-a038-7b5d5d082af2\") " pod="openshift-image-registry/node-ca-4hwvs"
Apr 21 15:10:33.957429 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.956705 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/85bafbc7-2166-40e1-825d-81c20339ab1e-sys\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt"
Apr 21 15:10:33.957429 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.956715 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-host-slash\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg"
Apr 21 15:10:33.957429 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.956739 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-54bhw\" (UniqueName: \"kubernetes.io/projected/85bafbc7-2166-40e1-825d-81c20339ab1e-kube-api-access-54bhw\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt"
Apr 21 15:10:33.957429 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.956762 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-host-slash\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg"
Apr 21 15:10:33.957429 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.956765 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-host-var-lib-cni-multus\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68"
Apr 21 15:10:33.957429 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.956805 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-etc-kubernetes\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68"
Apr 21 15:10:33.958199 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.956837 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58tk4\" (UniqueName: \"kubernetes.io/projected/c1596851-f71d-43c8-b7f4-f92f0a29bb06-kube-api-access-58tk4\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68"
Apr 21 15:10:33.958199 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.957037 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2be9773d-32f2-4dbe-a3cf-557fd268ad7d-device-dir\") pod \"aws-ebs-csi-driver-node-ksrsc\" (UID: \"2be9773d-32f2-4dbe-a3cf-557fd268ad7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ksrsc"
Apr 21 15:10:33.958199 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.957070 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/85bafbc7-2166-40e1-825d-81c20339ab1e-etc-kubernetes\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt"
Apr 21 15:10:33.958199 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.957082 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/2be9773d-32f2-4dbe-a3cf-557fd268ad7d-device-dir\") pod \"aws-ebs-csi-driver-node-ksrsc\" (UID: \"2be9773d-32f2-4dbe-a3cf-557fd268ad7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ksrsc"
Apr 21 15:10:33.958199 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.957094 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-run-openvswitch\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg"
Apr 21 15:10:33.958199 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.957118 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/85bafbc7-2166-40e1-825d-81c20339ab1e-run\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt"
Apr 21 15:10:33.958199 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.957139 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/85bafbc7-2166-40e1-825d-81c20339ab1e-var-lib-kubelet\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt"
Apr 21 15:10:33.958199 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.957144 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-run-openvswitch\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg"
Apr 21 15:10:33.958199 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.957120 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/85bafbc7-2166-40e1-825d-81c20339ab1e-etc-kubernetes\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt"
Apr 21 15:10:33.958199 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.957173 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0a58254f-d46b-4b42-b89e-5f65cdf19d34-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m4ccx\" (UID: \"0a58254f-d46b-4b42-b89e-5f65cdf19d34\") " pod="openshift-multus/multus-additional-cni-plugins-m4ccx"
Apr 21 15:10:33.958199 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.957192 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/85bafbc7-2166-40e1-825d-81c20339ab1e-var-lib-kubelet\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt"
Apr 21 15:10:33.958199 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.957191 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/85bafbc7-2166-40e1-825d-81c20339ab1e-run\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt"
Apr 21 15:10:33.958199 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.957193 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-os-release\") pod
\"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:33.958199 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.957239 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/981aa28d-2a57-4f14-8411-4d80c9ed2911-iptables-alerter-script\") pod \"iptables-alerter-rjhlf\" (UID: \"981aa28d-2a57-4f14-8411-4d80c9ed2911\") " pod="openshift-network-operator/iptables-alerter-rjhlf" Apr 21 15:10:33.958199 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.957269 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxf5g\" (UniqueName: \"kubernetes.io/projected/9562db30-abde-4e93-96e5-77429f548f83-kube-api-access-zxf5g\") pod \"network-check-target-hfcw6\" (UID: \"9562db30-abde-4e93-96e5-77429f548f83\") " pod="openshift-network-diagnostics/network-check-target-hfcw6" Apr 21 15:10:33.958199 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.957351 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-host-run-ovn-kubernetes\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" Apr 21 15:10:33.958199 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.957523 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/85bafbc7-2166-40e1-825d-81c20339ab1e-etc-modprobe-d\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt" Apr 21 15:10:33.958737 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.957551 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" 
(UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-host-run-netns\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:33.958737 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.957605 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-host-run-ovn-kubernetes\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" Apr 21 15:10:33.958737 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.957640 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c1596851-f71d-43c8-b7f4-f92f0a29bb06-multus-daemon-config\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:33.958737 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.957658 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/85bafbc7-2166-40e1-825d-81c20339ab1e-etc-modprobe-d\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt" Apr 21 15:10:33.958737 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.957668 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-host-kubelet\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" Apr 21 15:10:33.958737 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.957693 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" 
(UniqueName: \"kubernetes.io/host-path/85bafbc7-2166-40e1-825d-81c20339ab1e-lib-modules\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt" Apr 21 15:10:33.958737 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.957708 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-host-kubelet\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" Apr 21 15:10:33.958737 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.957713 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/85bafbc7-2166-40e1-825d-81c20339ab1e-host\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt" Apr 21 15:10:33.958737 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.957736 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/85bafbc7-2166-40e1-825d-81c20339ab1e-host\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt" Apr 21 15:10:33.958737 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.957752 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0a58254f-d46b-4b42-b89e-5f65cdf19d34-cni-binary-copy\") pod \"multus-additional-cni-plugins-m4ccx\" (UID: \"0a58254f-d46b-4b42-b89e-5f65cdf19d34\") " pod="openshift-multus/multus-additional-cni-plugins-m4ccx" Apr 21 15:10:33.958737 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.957781 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" 
(UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-systemd-units\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" Apr 21 15:10:33.958737 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.957799 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/85bafbc7-2166-40e1-825d-81c20339ab1e-lib-modules\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt" Apr 21 15:10:33.958737 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.957805 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0bdcacfa-6992-4542-8cbe-df76abfeb25b-ovnkube-config\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" Apr 21 15:10:33.958737 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.957826 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-systemd-units\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" Apr 21 15:10:33.958737 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.957825 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/981aa28d-2a57-4f14-8411-4d80c9ed2911-iptables-alerter-script\") pod \"iptables-alerter-rjhlf\" (UID: \"981aa28d-2a57-4f14-8411-4d80c9ed2911\") " pod="openshift-network-operator/iptables-alerter-rjhlf" Apr 21 15:10:33.958737 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.957833 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" 
(UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-node-log\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" Apr 21 15:10:33.958737 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.957909 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-node-log\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" Apr 21 15:10:33.959479 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.957914 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-log-socket\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" Apr 21 15:10:33.959479 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.957941 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/981aa28d-2a57-4f14-8411-4d80c9ed2911-host-slash\") pod \"iptables-alerter-rjhlf\" (UID: \"981aa28d-2a57-4f14-8411-4d80c9ed2911\") " pod="openshift-network-operator/iptables-alerter-rjhlf" Apr 21 15:10:33.959479 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.957966 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3b4c8c1d-205b-43ec-9a00-3a7d4989ef9e-agent-certs\") pod \"konnectivity-agent-rw7pj\" (UID: \"3b4c8c1d-205b-43ec-9a00-3a7d4989ef9e\") " pod="kube-system/konnectivity-agent-rw7pj" Apr 21 15:10:33.959479 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.957992 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/a17c6c3f-25ab-4414-92a4-946230c882ea-metrics-certs\") pod \"network-metrics-daemon-96snf\" (UID: \"a17c6c3f-25ab-4414-92a4-946230c882ea\") " pod="openshift-multus/network-metrics-daemon-96snf" Apr 21 15:10:33.959479 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.958017 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2be9773d-32f2-4dbe-a3cf-557fd268ad7d-socket-dir\") pod \"aws-ebs-csi-driver-node-ksrsc\" (UID: \"2be9773d-32f2-4dbe-a3cf-557fd268ad7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ksrsc" Apr 21 15:10:33.959479 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.958025 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/981aa28d-2a57-4f14-8411-4d80c9ed2911-host-slash\") pod \"iptables-alerter-rjhlf\" (UID: \"981aa28d-2a57-4f14-8411-4d80c9ed2911\") " pod="openshift-network-operator/iptables-alerter-rjhlf" Apr 21 15:10:33.959479 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.958041 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4sc4x\" (UniqueName: \"kubernetes.io/projected/71299419-e249-4660-891c-24ba490f5c36-kube-api-access-4sc4x\") pod \"node-resolver-hdbc9\" (UID: \"71299419-e249-4660-891c-24ba490f5c36\") " pod="openshift-dns/node-resolver-hdbc9" Apr 21 15:10:33.959479 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.958066 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-run-ovn\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" Apr 21 15:10:33.959479 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.958091 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cfb26da3-8175-4742-a038-7b5d5d082af2-host\") pod \"node-ca-4hwvs\" (UID: \"cfb26da3-8175-4742-a038-7b5d5d082af2\") " pod="openshift-image-registry/node-ca-4hwvs" Apr 21 15:10:33.959479 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.958115 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-multus-cni-dir\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:33.959479 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.958138 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/71299419-e249-4660-891c-24ba490f5c36-tmp-dir\") pod \"node-resolver-hdbc9\" (UID: \"71299419-e249-4660-891c-24ba490f5c36\") " pod="openshift-dns/node-resolver-hdbc9" Apr 21 15:10:33.959479 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:33.958150 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:10:33.959479 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.958159 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0bdcacfa-6992-4542-8cbe-df76abfeb25b-ovnkube-script-lib\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" Apr 21 15:10:33.959479 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.958183 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/85bafbc7-2166-40e1-825d-81c20339ab1e-etc-systemd\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " 
pod="openshift-cluster-node-tuning-operator/tuned-h47wt" Apr 21 15:10:33.959479 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.958202 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0bdcacfa-6992-4542-8cbe-df76abfeb25b-ovnkube-config\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" Apr 21 15:10:33.959479 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.958205 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kml2r\" (UniqueName: \"kubernetes.io/projected/0a58254f-d46b-4b42-b89e-5f65cdf19d34-kube-api-access-kml2r\") pod \"multus-additional-cni-plugins-m4ccx\" (UID: \"0a58254f-d46b-4b42-b89e-5f65cdf19d34\") " pod="openshift-multus/multus-additional-cni-plugins-m4ccx" Apr 21 15:10:33.959479 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:33.958243 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a17c6c3f-25ab-4414-92a4-946230c882ea-metrics-certs podName:a17c6c3f-25ab-4414-92a4-946230c882ea nodeName:}" failed. No retries permitted until 2026-04-21 15:10:34.45819958 +0000 UTC m=+3.053770646 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a17c6c3f-25ab-4414-92a4-946230c882ea-metrics-certs") pod "network-metrics-daemon-96snf" (UID: "a17c6c3f-25ab-4414-92a4-946230c882ea") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:10:33.960251 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.958289 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2be9773d-32f2-4dbe-a3cf-557fd268ad7d-socket-dir\") pod \"aws-ebs-csi-driver-node-ksrsc\" (UID: \"2be9773d-32f2-4dbe-a3cf-557fd268ad7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ksrsc" Apr 21 15:10:33.960251 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.958324 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-cnibin\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:33.960251 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.958341 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-host-run-k8s-cni-cncf-io\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:33.960251 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.958357 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3b4c8c1d-205b-43ec-9a00-3a7d4989ef9e-konnectivity-ca\") pod \"konnectivity-agent-rw7pj\" (UID: \"3b4c8c1d-205b-43ec-9a00-3a7d4989ef9e\") " pod="kube-system/konnectivity-agent-rw7pj" Apr 21 15:10:33.960251 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.958406 
2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0bdcacfa-6992-4542-8cbe-df76abfeb25b-env-overrides\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" Apr 21 15:10:33.960251 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.958412 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-run-ovn\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" Apr 21 15:10:33.960251 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.958065 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0bdcacfa-6992-4542-8cbe-df76abfeb25b-log-socket\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" Apr 21 15:10:33.960251 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.958431 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bmvqf\" (UniqueName: \"kubernetes.io/projected/0bdcacfa-6992-4542-8cbe-df76abfeb25b-kube-api-access-bmvqf\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" Apr 21 15:10:33.960251 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.958455 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/85bafbc7-2166-40e1-825d-81c20339ab1e-etc-sysconfig\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt" Apr 21 15:10:33.960251 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.958492 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/85bafbc7-2166-40e1-825d-81c20339ab1e-etc-systemd\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt" Apr 21 15:10:33.960251 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.958863 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0bdcacfa-6992-4542-8cbe-df76abfeb25b-env-overrides\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" Apr 21 15:10:33.960251 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.959012 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0bdcacfa-6992-4542-8cbe-df76abfeb25b-ovnkube-script-lib\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" Apr 21 15:10:33.960251 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.959100 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3b4c8c1d-205b-43ec-9a00-3a7d4989ef9e-konnectivity-ca\") pod \"konnectivity-agent-rw7pj\" (UID: \"3b4c8c1d-205b-43ec-9a00-3a7d4989ef9e\") " pod="kube-system/konnectivity-agent-rw7pj" Apr 21 15:10:33.960251 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.959130 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/71299419-e249-4660-891c-24ba490f5c36-tmp-dir\") pod \"node-resolver-hdbc9\" (UID: \"71299419-e249-4660-891c-24ba490f5c36\") " pod="openshift-dns/node-resolver-hdbc9" Apr 21 15:10:33.960251 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.959176 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/85bafbc7-2166-40e1-825d-81c20339ab1e-etc-sysconfig\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt" Apr 21 15:10:33.960251 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.959637 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/85bafbc7-2166-40e1-825d-81c20339ab1e-etc-tuned\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt" Apr 21 15:10:33.960251 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.960139 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/85bafbc7-2166-40e1-825d-81c20339ab1e-tmp\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt" Apr 21 15:10:33.961517 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.961481 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0bdcacfa-6992-4542-8cbe-df76abfeb25b-ovn-node-metrics-cert\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" Apr 21 15:10:33.961777 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.961753 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3b4c8c1d-205b-43ec-9a00-3a7d4989ef9e-agent-certs\") pod \"konnectivity-agent-rw7pj\" (UID: \"3b4c8c1d-205b-43ec-9a00-3a7d4989ef9e\") " pod="kube-system/konnectivity-agent-rw7pj" Apr 21 15:10:33.974281 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.972290 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4jwz\" (UniqueName: 
\"kubernetes.io/projected/2be9773d-32f2-4dbe-a3cf-557fd268ad7d-kube-api-access-z4jwz\") pod \"aws-ebs-csi-driver-node-ksrsc\" (UID: \"2be9773d-32f2-4dbe-a3cf-557fd268ad7d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ksrsc" Apr 21 15:10:33.974281 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.974208 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmvqf\" (UniqueName: \"kubernetes.io/projected/0bdcacfa-6992-4542-8cbe-df76abfeb25b-kube-api-access-bmvqf\") pod \"ovnkube-node-mbnjg\" (UID: \"0bdcacfa-6992-4542-8cbe-df76abfeb25b\") " pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" Apr 21 15:10:33.974851 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.974826 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5xrg\" (UniqueName: \"kubernetes.io/projected/981aa28d-2a57-4f14-8411-4d80c9ed2911-kube-api-access-t5xrg\") pod \"iptables-alerter-rjhlf\" (UID: \"981aa28d-2a57-4f14-8411-4d80c9ed2911\") " pod="openshift-network-operator/iptables-alerter-rjhlf" Apr 21 15:10:33.975948 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:33.975927 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:10:33.975948 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:33.975948 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:10:33.976083 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:33.975960 2575 projected.go:194] Error preparing data for projected volume kube-api-access-zxf5g for pod openshift-network-diagnostics/network-check-target-hfcw6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:10:33.976083 
ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:33.976022 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9562db30-abde-4e93-96e5-77429f548f83-kube-api-access-zxf5g podName:9562db30-abde-4e93-96e5-77429f548f83 nodeName:}" failed. No retries permitted until 2026-04-21 15:10:34.476005617 +0000 UTC m=+3.071576472 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-zxf5g" (UniqueName: "kubernetes.io/projected/9562db30-abde-4e93-96e5-77429f548f83-kube-api-access-zxf5g") pod "network-check-target-hfcw6" (UID: "9562db30-abde-4e93-96e5-77429f548f83") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:10:33.979641 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.979598 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwzgw\" (UniqueName: \"kubernetes.io/projected/a17c6c3f-25ab-4414-92a4-946230c882ea-kube-api-access-cwzgw\") pod \"network-metrics-daemon-96snf\" (UID: \"a17c6c3f-25ab-4414-92a4-946230c882ea\") " pod="openshift-multus/network-metrics-daemon-96snf" Apr 21 15:10:33.979768 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.979750 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-54bhw\" (UniqueName: \"kubernetes.io/projected/85bafbc7-2166-40e1-825d-81c20339ab1e-kube-api-access-54bhw\") pod \"tuned-h47wt\" (UID: \"85bafbc7-2166-40e1-825d-81c20339ab1e\") " pod="openshift-cluster-node-tuning-operator/tuned-h47wt" Apr 21 15:10:33.979850 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.979803 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-11.ec2.internal" event={"ID":"f4a1f42e43969c311a11e30a08e26ff0","Type":"ContainerStarted","Data":"a461d3a869be1929a868222288aa4aa966a195220aa20a2dcfdc389cf80ecea8"} Apr 21 
15:10:33.981416 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.981392 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sc4x\" (UniqueName: \"kubernetes.io/projected/71299419-e249-4660-891c-24ba490f5c36-kube-api-access-4sc4x\") pod \"node-resolver-hdbc9\" (UID: \"71299419-e249-4660-891c-24ba490f5c36\") " pod="openshift-dns/node-resolver-hdbc9" Apr 21 15:10:33.981895 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:33.981873 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-11.ec2.internal" event={"ID":"f339e642e0590241739bde2677ce9dbf","Type":"ContainerStarted","Data":"acf0c6f264387c7cbb4256beaf50af15e019d2101b94a03f819b60ae9bff8da4"} Apr 21 15:10:34.058957 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.058917 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kml2r\" (UniqueName: \"kubernetes.io/projected/0a58254f-d46b-4b42-b89e-5f65cdf19d34-kube-api-access-kml2r\") pod \"multus-additional-cni-plugins-m4ccx\" (UID: \"0a58254f-d46b-4b42-b89e-5f65cdf19d34\") " pod="openshift-multus/multus-additional-cni-plugins-m4ccx" Apr 21 15:10:34.058957 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.058961 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-cnibin\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:34.059176 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.059038 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-cnibin\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:34.059176 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.059113 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-host-run-k8s-cni-cncf-io\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:34.059176 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.059160 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-system-cni-dir\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:34.059315 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.059190 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/cfb26da3-8175-4742-a038-7b5d5d082af2-serviceca\") pod \"node-ca-4hwvs\" (UID: \"cfb26da3-8175-4742-a038-7b5d5d082af2\") " pod="openshift-image-registry/node-ca-4hwvs" Apr 21 15:10:34.059315 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.059215 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-host-var-lib-cni-bin\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:34.059315 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.059231 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-host-run-k8s-cni-cncf-io\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:34.059315 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.059243 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0a58254f-d46b-4b42-b89e-5f65cdf19d34-system-cni-dir\") pod \"multus-additional-cni-plugins-m4ccx\" (UID: \"0a58254f-d46b-4b42-b89e-5f65cdf19d34\") " pod="openshift-multus/multus-additional-cni-plugins-m4ccx" Apr 21 15:10:34.059315 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.059268 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0a58254f-d46b-4b42-b89e-5f65cdf19d34-os-release\") pod \"multus-additional-cni-plugins-m4ccx\" (UID: \"0a58254f-d46b-4b42-b89e-5f65cdf19d34\") " pod="openshift-multus/multus-additional-cni-plugins-m4ccx" Apr 21 15:10:34.059315 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.059268 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-system-cni-dir\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:34.059315 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.059282 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-host-var-lib-cni-bin\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:34.059315 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.059288 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-host-var-lib-kubelet\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:34.059315 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.059313 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-multus-conf-dir\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:34.059713 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.059397 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-multus-conf-dir\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:34.059713 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.059429 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0a58254f-d46b-4b42-b89e-5f65cdf19d34-system-cni-dir\") pod \"multus-additional-cni-plugins-m4ccx\" (UID: \"0a58254f-d46b-4b42-b89e-5f65cdf19d34\") " pod="openshift-multus/multus-additional-cni-plugins-m4ccx" Apr 21 15:10:34.059713 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.059443 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0a58254f-d46b-4b42-b89e-5f65cdf19d34-os-release\") pod \"multus-additional-cni-plugins-m4ccx\" (UID: \"0a58254f-d46b-4b42-b89e-5f65cdf19d34\") " pod="openshift-multus/multus-additional-cni-plugins-m4ccx" Apr 21 15:10:34.059713 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.059432 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-host-var-lib-kubelet\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:34.059713 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.059473 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0a58254f-d46b-4b42-b89e-5f65cdf19d34-cnibin\") pod \"multus-additional-cni-plugins-m4ccx\" (UID: \"0a58254f-d46b-4b42-b89e-5f65cdf19d34\") " pod="openshift-multus/multus-additional-cni-plugins-m4ccx" Apr 21 15:10:34.059713 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.059509 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c1596851-f71d-43c8-b7f4-f92f0a29bb06-cni-binary-copy\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:34.059713 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.059532 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-hostroot\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:34.059713 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.059563 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0a58254f-d46b-4b42-b89e-5f65cdf19d34-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m4ccx\" (UID: \"0a58254f-d46b-4b42-b89e-5f65cdf19d34\") " pod="openshift-multus/multus-additional-cni-plugins-m4ccx" Apr 21 15:10:34.059713 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.059573 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0a58254f-d46b-4b42-b89e-5f65cdf19d34-cnibin\") pod \"multus-additional-cni-plugins-m4ccx\" (UID: \"0a58254f-d46b-4b42-b89e-5f65cdf19d34\") " pod="openshift-multus/multus-additional-cni-plugins-m4ccx" Apr 21 15:10:34.059713 ip-10-0-131-11 kubenswrapper[2575]: I0421 
15:10:34.059588 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0a58254f-d46b-4b42-b89e-5f65cdf19d34-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-m4ccx\" (UID: \"0a58254f-d46b-4b42-b89e-5f65cdf19d34\") " pod="openshift-multus/multus-additional-cni-plugins-m4ccx" Apr 21 15:10:34.059713 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.059605 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-multus-socket-dir-parent\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:34.059713 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.059617 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-hostroot\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:34.059713 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.059621 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-host-run-multus-certs\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:34.059713 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.059648 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tqp9j\" (UniqueName: \"kubernetes.io/projected/cfb26da3-8175-4742-a038-7b5d5d082af2-kube-api-access-tqp9j\") pod \"node-ca-4hwvs\" (UID: \"cfb26da3-8175-4742-a038-7b5d5d082af2\") " pod="openshift-image-registry/node-ca-4hwvs" Apr 21 15:10:34.059713 ip-10-0-131-11 
kubenswrapper[2575]: I0421 15:10:34.059666 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-host-var-lib-cni-multus\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:34.059713 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.059681 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-etc-kubernetes\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:34.059713 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.059680 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/cfb26da3-8175-4742-a038-7b5d5d082af2-serviceca\") pod \"node-ca-4hwvs\" (UID: \"cfb26da3-8175-4742-a038-7b5d5d082af2\") " pod="openshift-image-registry/node-ca-4hwvs" Apr 21 15:10:34.060153 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.059695 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58tk4\" (UniqueName: \"kubernetes.io/projected/c1596851-f71d-43c8-b7f4-f92f0a29bb06-kube-api-access-58tk4\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:34.060153 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.059736 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0a58254f-d46b-4b42-b89e-5f65cdf19d34-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m4ccx\" (UID: \"0a58254f-d46b-4b42-b89e-5f65cdf19d34\") " pod="openshift-multus/multus-additional-cni-plugins-m4ccx" Apr 21 15:10:34.060153 ip-10-0-131-11 
kubenswrapper[2575]: I0421 15:10:34.059765 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-os-release\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:34.060153 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.059807 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-host-run-netns\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:34.060153 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.059835 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c1596851-f71d-43c8-b7f4-f92f0a29bb06-multus-daemon-config\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:34.060153 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.059876 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0a58254f-d46b-4b42-b89e-5f65cdf19d34-cni-binary-copy\") pod \"multus-additional-cni-plugins-m4ccx\" (UID: \"0a58254f-d46b-4b42-b89e-5f65cdf19d34\") " pod="openshift-multus/multus-additional-cni-plugins-m4ccx" Apr 21 15:10:34.060153 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.059896 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-multus-socket-dir-parent\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:34.060153 ip-10-0-131-11 kubenswrapper[2575]: 
I0421 15:10:34.059922 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cfb26da3-8175-4742-a038-7b5d5d082af2-host\") pod \"node-ca-4hwvs\" (UID: \"cfb26da3-8175-4742-a038-7b5d5d082af2\") " pod="openshift-image-registry/node-ca-4hwvs" Apr 21 15:10:34.060153 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.059938 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-multus-cni-dir\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:34.060153 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.060025 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-multus-cni-dir\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:34.060153 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.060032 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c1596851-f71d-43c8-b7f4-f92f0a29bb06-cni-binary-copy\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:34.060153 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.060049 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-host-run-multus-certs\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:34.060153 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.060051 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0a58254f-d46b-4b42-b89e-5f65cdf19d34-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-m4ccx\" (UID: \"0a58254f-d46b-4b42-b89e-5f65cdf19d34\") " pod="openshift-multus/multus-additional-cni-plugins-m4ccx" Apr 21 15:10:34.060153 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.060112 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-os-release\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:34.060686 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.060167 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0a58254f-d46b-4b42-b89e-5f65cdf19d34-tuning-conf-dir\") pod \"multus-additional-cni-plugins-m4ccx\" (UID: \"0a58254f-d46b-4b42-b89e-5f65cdf19d34\") " pod="openshift-multus/multus-additional-cni-plugins-m4ccx" Apr 21 15:10:34.060686 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.060193 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-host-var-lib-cni-multus\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:34.060686 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.060213 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-etc-kubernetes\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:34.060686 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.060217 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"host\" (UniqueName: \"kubernetes.io/host-path/cfb26da3-8175-4742-a038-7b5d5d082af2-host\") pod \"node-ca-4hwvs\" (UID: \"cfb26da3-8175-4742-a038-7b5d5d082af2\") " pod="openshift-image-registry/node-ca-4hwvs" Apr 21 15:10:34.060686 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.060319 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c1596851-f71d-43c8-b7f4-f92f0a29bb06-host-run-netns\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:34.060686 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.060454 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0a58254f-d46b-4b42-b89e-5f65cdf19d34-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-m4ccx\" (UID: \"0a58254f-d46b-4b42-b89e-5f65cdf19d34\") " pod="openshift-multus/multus-additional-cni-plugins-m4ccx" Apr 21 15:10:34.060686 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.060648 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c1596851-f71d-43c8-b7f4-f92f0a29bb06-multus-daemon-config\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:34.061154 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.060794 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0a58254f-d46b-4b42-b89e-5f65cdf19d34-cni-binary-copy\") pod \"multus-additional-cni-plugins-m4ccx\" (UID: \"0a58254f-d46b-4b42-b89e-5f65cdf19d34\") " pod="openshift-multus/multus-additional-cni-plugins-m4ccx" Apr 21 15:10:34.068476 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.068418 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kml2r\" (UniqueName: \"kubernetes.io/projected/0a58254f-d46b-4b42-b89e-5f65cdf19d34-kube-api-access-kml2r\") pod \"multus-additional-cni-plugins-m4ccx\" (UID: \"0a58254f-d46b-4b42-b89e-5f65cdf19d34\") " pod="openshift-multus/multus-additional-cni-plugins-m4ccx" Apr 21 15:10:34.068620 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.068600 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqp9j\" (UniqueName: \"kubernetes.io/projected/cfb26da3-8175-4742-a038-7b5d5d082af2-kube-api-access-tqp9j\") pod \"node-ca-4hwvs\" (UID: \"cfb26da3-8175-4742-a038-7b5d5d082af2\") " pod="openshift-image-registry/node-ca-4hwvs" Apr 21 15:10:34.068735 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.068712 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-58tk4\" (UniqueName: \"kubernetes.io/projected/c1596851-f71d-43c8-b7f4-f92f0a29bb06-kube-api-access-58tk4\") pod \"multus-jkh68\" (UID: \"c1596851-f71d-43c8-b7f4-f92f0a29bb06\") " pod="openshift-multus/multus-jkh68" Apr 21 15:10:34.144712 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.144671 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-rw7pj" Apr 21 15:10:34.151895 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.151868 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hdbc9" Apr 21 15:10:34.161621 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.161596 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-rjhlf" Apr 21 15:10:34.167325 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.167294 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" Apr 21 15:10:34.177004 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.176980 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ksrsc" Apr 21 15:10:34.183561 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.183541 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-h47wt" Apr 21 15:10:34.190097 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.190074 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4hwvs" Apr 21 15:10:34.196991 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.196969 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jkh68" Apr 21 15:10:34.212668 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.212647 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-m4ccx" Apr 21 15:10:34.294297 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.294262 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 15:10:34.463232 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.463164 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a17c6c3f-25ab-4414-92a4-946230c882ea-metrics-certs\") pod \"network-metrics-daemon-96snf\" (UID: \"a17c6c3f-25ab-4414-92a4-946230c882ea\") " pod="openshift-multus/network-metrics-daemon-96snf" Apr 21 15:10:34.463359 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:34.463293 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:10:34.463359 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:34.463349 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a17c6c3f-25ab-4414-92a4-946230c882ea-metrics-certs podName:a17c6c3f-25ab-4414-92a4-946230c882ea nodeName:}" failed. No retries permitted until 2026-04-21 15:10:35.463334207 +0000 UTC m=+4.058905065 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a17c6c3f-25ab-4414-92a4-946230c882ea-metrics-certs") pod "network-metrics-daemon-96snf" (UID: "a17c6c3f-25ab-4414-92a4-946230c882ea") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:10:34.564461 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.564416 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxf5g\" (UniqueName: \"kubernetes.io/projected/9562db30-abde-4e93-96e5-77429f548f83-kube-api-access-zxf5g\") pod \"network-check-target-hfcw6\" (UID: \"9562db30-abde-4e93-96e5-77429f548f83\") " pod="openshift-network-diagnostics/network-check-target-hfcw6" Apr 21 15:10:34.564630 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:34.564580 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:10:34.564630 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:34.564602 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:10:34.564630 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:34.564617 2575 projected.go:194] Error preparing data for projected volume kube-api-access-zxf5g for pod openshift-network-diagnostics/network-check-target-hfcw6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:10:34.564787 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:34.564690 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9562db30-abde-4e93-96e5-77429f548f83-kube-api-access-zxf5g podName:9562db30-abde-4e93-96e5-77429f548f83 nodeName:}" failed. 
No retries permitted until 2026-04-21 15:10:35.564666575 +0000 UTC m=+4.160237432 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-zxf5g" (UniqueName: "kubernetes.io/projected/9562db30-abde-4e93-96e5-77429f548f83-kube-api-access-zxf5g") pod "network-check-target-hfcw6" (UID: "9562db30-abde-4e93-96e5-77429f548f83") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:10:34.718071 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:34.718039 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71299419_e249_4660_891c_24ba490f5c36.slice/crio-9aaf1a765233d40ee271b9acdbb76f0ee76950adb043b5b300ffe97941e55921 WatchSource:0}: Error finding container 9aaf1a765233d40ee271b9acdbb76f0ee76950adb043b5b300ffe97941e55921: Status 404 returned error can't find the container with id 9aaf1a765233d40ee271b9acdbb76f0ee76950adb043b5b300ffe97941e55921 Apr 21 15:10:34.720530 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:34.720509 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod981aa28d_2a57_4f14_8411_4d80c9ed2911.slice/crio-52c865e7666bc3f61f470bd49a9aa40b4dae596d254c6175a56555eb73282a3a WatchSource:0}: Error finding container 52c865e7666bc3f61f470bd49a9aa40b4dae596d254c6175a56555eb73282a3a: Status 404 returned error can't find the container with id 52c865e7666bc3f61f470bd49a9aa40b4dae596d254c6175a56555eb73282a3a Apr 21 15:10:34.721336 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:34.721314 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bdcacfa_6992_4542_8cbe_df76abfeb25b.slice/crio-fcfe87fcb744d5ff8c8a3e11591b1fec34638c15daca655512d93e9b6859b0e7 WatchSource:0}: Error finding container 
fcfe87fcb744d5ff8c8a3e11591b1fec34638c15daca655512d93e9b6859b0e7: Status 404 returned error can't find the container with id fcfe87fcb744d5ff8c8a3e11591b1fec34638c15daca655512d93e9b6859b0e7 Apr 21 15:10:34.743012 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:34.742987 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85bafbc7_2166_40e1_825d_81c20339ab1e.slice/crio-a6e7b5108848fc9f97742d03941994a9a43310e220c6c71c65fba569e0d78795 WatchSource:0}: Error finding container a6e7b5108848fc9f97742d03941994a9a43310e220c6c71c65fba569e0d78795: Status 404 returned error can't find the container with id a6e7b5108848fc9f97742d03941994a9a43310e220c6c71c65fba569e0d78795 Apr 21 15:10:34.743708 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:34.743668 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a58254f_d46b_4b42_b89e_5f65cdf19d34.slice/crio-22c7b2b68832ee2f2038f34145d49f4cfd39c81276a6fff63ecc3db30d275e40 WatchSource:0}: Error finding container 22c7b2b68832ee2f2038f34145d49f4cfd39c81276a6fff63ecc3db30d275e40: Status 404 returned error can't find the container with id 22c7b2b68832ee2f2038f34145d49f4cfd39c81276a6fff63ecc3db30d275e40 Apr 21 15:10:34.744297 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:34.744264 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2be9773d_32f2_4dbe_a3cf_557fd268ad7d.slice/crio-c4b4fcf2633df1d37e2eba4e122c1165140f9c4c0cd5f06acc159cdfb6230865 WatchSource:0}: Error finding container c4b4fcf2633df1d37e2eba4e122c1165140f9c4c0cd5f06acc159cdfb6230865: Status 404 returned error can't find the container with id c4b4fcf2633df1d37e2eba4e122c1165140f9c4c0cd5f06acc159cdfb6230865 Apr 21 15:10:34.745251 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:34.745152 2575 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1596851_f71d_43c8_b7f4_f92f0a29bb06.slice/crio-3e7a52bb99932112ab1c8f88d8632a8975e952a93770ee5981c776da9098cd63 WatchSource:0}: Error finding container 3e7a52bb99932112ab1c8f88d8632a8975e952a93770ee5981c776da9098cd63: Status 404 returned error can't find the container with id 3e7a52bb99932112ab1c8f88d8632a8975e952a93770ee5981c776da9098cd63 Apr 21 15:10:34.746537 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:34.746520 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b4c8c1d_205b_43ec_9a00_3a7d4989ef9e.slice/crio-2079ac27252dbb4c406b41046dcc8d4d43ba543c5cbe3cc1426c93c9c6b7ead8 WatchSource:0}: Error finding container 2079ac27252dbb4c406b41046dcc8d4d43ba543c5cbe3cc1426c93c9c6b7ead8: Status 404 returned error can't find the container with id 2079ac27252dbb4c406b41046dcc8d4d43ba543c5cbe3cc1426c93c9c6b7ead8 Apr 21 15:10:34.747272 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:10:34.747248 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfb26da3_8175_4742_a038_7b5d5d082af2.slice/crio-fc3aac21cb686e4532b8b0731b5630b812a42a4456640a784ec45907c242eb80 WatchSource:0}: Error finding container fc3aac21cb686e4532b8b0731b5630b812a42a4456640a784ec45907c242eb80: Status 404 returned error can't find the container with id fc3aac21cb686e4532b8b0731b5630b812a42a4456640a784ec45907c242eb80 Apr 21 15:10:34.880281 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.880249 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 15:05:32 +0000 UTC" deadline="2028-01-17 10:33:37.890082658 +0000 UTC" Apr 21 15:10:34.880281 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.880275 2575 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="15259h23m3.009809586s" Apr 21 15:10:34.985224 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.985150 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rjhlf" event={"ID":"981aa28d-2a57-4f14-8411-4d80c9ed2911","Type":"ContainerStarted","Data":"52c865e7666bc3f61f470bd49a9aa40b4dae596d254c6175a56555eb73282a3a"} Apr 21 15:10:34.986818 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.986795 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-11.ec2.internal" event={"ID":"f339e642e0590241739bde2677ce9dbf","Type":"ContainerStarted","Data":"79eabbf1904f3387063cfff8b7aeebb95635201f88d2becc70b8709a8dfef71b"} Apr 21 15:10:34.988142 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.988121 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4hwvs" event={"ID":"cfb26da3-8175-4742-a038-7b5d5d082af2","Type":"ContainerStarted","Data":"fc3aac21cb686e4532b8b0731b5630b812a42a4456640a784ec45907c242eb80"} Apr 21 15:10:34.989117 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.989031 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rw7pj" event={"ID":"3b4c8c1d-205b-43ec-9a00-3a7d4989ef9e","Type":"ContainerStarted","Data":"2079ac27252dbb4c406b41046dcc8d4d43ba543c5cbe3cc1426c93c9c6b7ead8"} Apr 21 15:10:34.990261 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.990231 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jkh68" event={"ID":"c1596851-f71d-43c8-b7f4-f92f0a29bb06","Type":"ContainerStarted","Data":"3e7a52bb99932112ab1c8f88d8632a8975e952a93770ee5981c776da9098cd63"} Apr 21 15:10:34.991145 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.991128 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m4ccx" 
event={"ID":"0a58254f-d46b-4b42-b89e-5f65cdf19d34","Type":"ContainerStarted","Data":"22c7b2b68832ee2f2038f34145d49f4cfd39c81276a6fff63ecc3db30d275e40"} Apr 21 15:10:34.992089 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.992070 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-h47wt" event={"ID":"85bafbc7-2166-40e1-825d-81c20339ab1e","Type":"ContainerStarted","Data":"a6e7b5108848fc9f97742d03941994a9a43310e220c6c71c65fba569e0d78795"} Apr 21 15:10:34.993080 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.993062 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hdbc9" event={"ID":"71299419-e249-4660-891c-24ba490f5c36","Type":"ContainerStarted","Data":"9aaf1a765233d40ee271b9acdbb76f0ee76950adb043b5b300ffe97941e55921"} Apr 21 15:10:34.994028 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.993999 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" event={"ID":"0bdcacfa-6992-4542-8cbe-df76abfeb25b","Type":"ContainerStarted","Data":"fcfe87fcb744d5ff8c8a3e11591b1fec34638c15daca655512d93e9b6859b0e7"} Apr 21 15:10:34.995080 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:34.995057 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ksrsc" event={"ID":"2be9773d-32f2-4dbe-a3cf-557fd268ad7d","Type":"ContainerStarted","Data":"c4b4fcf2633df1d37e2eba4e122c1165140f9c4c0cd5f06acc159cdfb6230865"} Apr 21 15:10:35.010260 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:35.010218 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-11.ec2.internal" podStartSLOduration=2.01020773 podStartE2EDuration="2.01020773s" podCreationTimestamp="2026-04-21 15:10:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 
15:10:35.009802949 +0000 UTC m=+3.605373824" watchObservedRunningTime="2026-04-21 15:10:35.01020773 +0000 UTC m=+3.605778605" Apr 21 15:10:35.034723 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:35.034701 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-p62vr"] Apr 21 15:10:35.037328 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:35.037307 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p62vr" Apr 21 15:10:35.037449 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:35.037400 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-p62vr" podUID="01bee1dc-5579-465f-b2a5-53d7e0d89cae" Apr 21 15:10:35.069088 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:35.069049 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/01bee1dc-5579-465f-b2a5-53d7e0d89cae-original-pull-secret\") pod \"global-pull-secret-syncer-p62vr\" (UID: \"01bee1dc-5579-465f-b2a5-53d7e0d89cae\") " pod="kube-system/global-pull-secret-syncer-p62vr" Apr 21 15:10:35.069260 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:35.069136 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/01bee1dc-5579-465f-b2a5-53d7e0d89cae-kubelet-config\") pod \"global-pull-secret-syncer-p62vr\" (UID: \"01bee1dc-5579-465f-b2a5-53d7e0d89cae\") " pod="kube-system/global-pull-secret-syncer-p62vr" Apr 21 15:10:35.069260 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:35.069171 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/01bee1dc-5579-465f-b2a5-53d7e0d89cae-dbus\") pod \"global-pull-secret-syncer-p62vr\" (UID: \"01bee1dc-5579-465f-b2a5-53d7e0d89cae\") " pod="kube-system/global-pull-secret-syncer-p62vr" Apr 21 15:10:35.170261 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:35.169533 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/01bee1dc-5579-465f-b2a5-53d7e0d89cae-original-pull-secret\") pod \"global-pull-secret-syncer-p62vr\" (UID: \"01bee1dc-5579-465f-b2a5-53d7e0d89cae\") " pod="kube-system/global-pull-secret-syncer-p62vr" Apr 21 15:10:35.170261 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:35.169586 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/01bee1dc-5579-465f-b2a5-53d7e0d89cae-kubelet-config\") pod \"global-pull-secret-syncer-p62vr\" (UID: \"01bee1dc-5579-465f-b2a5-53d7e0d89cae\") " pod="kube-system/global-pull-secret-syncer-p62vr" Apr 21 15:10:35.170261 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:35.169602 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/01bee1dc-5579-465f-b2a5-53d7e0d89cae-dbus\") pod \"global-pull-secret-syncer-p62vr\" (UID: \"01bee1dc-5579-465f-b2a5-53d7e0d89cae\") " pod="kube-system/global-pull-secret-syncer-p62vr" Apr 21 15:10:35.170261 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:35.169755 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/01bee1dc-5579-465f-b2a5-53d7e0d89cae-dbus\") pod \"global-pull-secret-syncer-p62vr\" (UID: \"01bee1dc-5579-465f-b2a5-53d7e0d89cae\") " pod="kube-system/global-pull-secret-syncer-p62vr" Apr 21 15:10:35.170261 ip-10-0-131-11 kubenswrapper[2575]: E0421 
15:10:35.169853 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 15:10:35.170261 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:35.169906 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01bee1dc-5579-465f-b2a5-53d7e0d89cae-original-pull-secret podName:01bee1dc-5579-465f-b2a5-53d7e0d89cae nodeName:}" failed. No retries permitted until 2026-04-21 15:10:35.669888101 +0000 UTC m=+4.265458960 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/01bee1dc-5579-465f-b2a5-53d7e0d89cae-original-pull-secret") pod "global-pull-secret-syncer-p62vr" (UID: "01bee1dc-5579-465f-b2a5-53d7e0d89cae") : object "kube-system"/"original-pull-secret" not registered Apr 21 15:10:35.170261 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:35.170168 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/01bee1dc-5579-465f-b2a5-53d7e0d89cae-kubelet-config\") pod \"global-pull-secret-syncer-p62vr\" (UID: \"01bee1dc-5579-465f-b2a5-53d7e0d89cae\") " pod="kube-system/global-pull-secret-syncer-p62vr" Apr 21 15:10:35.473303 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:35.473218 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a17c6c3f-25ab-4414-92a4-946230c882ea-metrics-certs\") pod \"network-metrics-daemon-96snf\" (UID: \"a17c6c3f-25ab-4414-92a4-946230c882ea\") " pod="openshift-multus/network-metrics-daemon-96snf" Apr 21 15:10:35.473489 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:35.473383 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:10:35.473489 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:35.473452 2575 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a17c6c3f-25ab-4414-92a4-946230c882ea-metrics-certs podName:a17c6c3f-25ab-4414-92a4-946230c882ea nodeName:}" failed. No retries permitted until 2026-04-21 15:10:37.473431789 +0000 UTC m=+6.069002647 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a17c6c3f-25ab-4414-92a4-946230c882ea-metrics-certs") pod "network-metrics-daemon-96snf" (UID: "a17c6c3f-25ab-4414-92a4-946230c882ea") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:10:35.574805 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:35.574661 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxf5g\" (UniqueName: \"kubernetes.io/projected/9562db30-abde-4e93-96e5-77429f548f83-kube-api-access-zxf5g\") pod \"network-check-target-hfcw6\" (UID: \"9562db30-abde-4e93-96e5-77429f548f83\") " pod="openshift-network-diagnostics/network-check-target-hfcw6" Apr 21 15:10:35.574912 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:35.574889 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:10:35.574963 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:35.574918 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:10:35.574963 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:35.574932 2575 projected.go:194] Error preparing data for projected volume kube-api-access-zxf5g for pod openshift-network-diagnostics/network-check-target-hfcw6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:10:35.575050 ip-10-0-131-11 kubenswrapper[2575]: 
E0421 15:10:35.574995 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9562db30-abde-4e93-96e5-77429f548f83-kube-api-access-zxf5g podName:9562db30-abde-4e93-96e5-77429f548f83 nodeName:}" failed. No retries permitted until 2026-04-21 15:10:37.57497578 +0000 UTC m=+6.170546648 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-zxf5g" (UniqueName: "kubernetes.io/projected/9562db30-abde-4e93-96e5-77429f548f83-kube-api-access-zxf5g") pod "network-check-target-hfcw6" (UID: "9562db30-abde-4e93-96e5-77429f548f83") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:10:35.676177 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:35.676136 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/01bee1dc-5579-465f-b2a5-53d7e0d89cae-original-pull-secret\") pod \"global-pull-secret-syncer-p62vr\" (UID: \"01bee1dc-5579-465f-b2a5-53d7e0d89cae\") " pod="kube-system/global-pull-secret-syncer-p62vr" Apr 21 15:10:35.676332 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:35.676296 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 15:10:35.676438 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:35.676360 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01bee1dc-5579-465f-b2a5-53d7e0d89cae-original-pull-secret podName:01bee1dc-5579-465f-b2a5-53d7e0d89cae nodeName:}" failed. No retries permitted until 2026-04-21 15:10:36.67634183 +0000 UTC m=+5.271912711 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/01bee1dc-5579-465f-b2a5-53d7e0d89cae-original-pull-secret") pod "global-pull-secret-syncer-p62vr" (UID: "01bee1dc-5579-465f-b2a5-53d7e0d89cae") : object "kube-system"/"original-pull-secret" not registered Apr 21 15:10:35.975469 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:35.975434 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hfcw6" Apr 21 15:10:35.975912 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:35.975434 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96snf" Apr 21 15:10:35.975912 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:35.975587 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hfcw6" podUID="9562db30-abde-4e93-96e5-77429f548f83" Apr 21 15:10:35.975912 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:35.975620 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-96snf" podUID="a17c6c3f-25ab-4414-92a4-946230c882ea" Apr 21 15:10:36.007451 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:36.007333 2575 generic.go:358] "Generic (PLEG): container finished" podID="f4a1f42e43969c311a11e30a08e26ff0" containerID="23bd6f3d4b08300fa040b8c4343c3130e2f8c60b320ff970d7b70facab447a3d" exitCode=0 Apr 21 15:10:36.008268 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:36.008242 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-11.ec2.internal" event={"ID":"f4a1f42e43969c311a11e30a08e26ff0","Type":"ContainerDied","Data":"23bd6f3d4b08300fa040b8c4343c3130e2f8c60b320ff970d7b70facab447a3d"} Apr 21 15:10:36.685764 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:36.685679 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/01bee1dc-5579-465f-b2a5-53d7e0d89cae-original-pull-secret\") pod \"global-pull-secret-syncer-p62vr\" (UID: \"01bee1dc-5579-465f-b2a5-53d7e0d89cae\") " pod="kube-system/global-pull-secret-syncer-p62vr" Apr 21 15:10:36.685918 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:36.685859 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 15:10:36.685918 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:36.685917 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01bee1dc-5579-465f-b2a5-53d7e0d89cae-original-pull-secret podName:01bee1dc-5579-465f-b2a5-53d7e0d89cae nodeName:}" failed. No retries permitted until 2026-04-21 15:10:38.685898864 +0000 UTC m=+7.281469719 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/01bee1dc-5579-465f-b2a5-53d7e0d89cae-original-pull-secret") pod "global-pull-secret-syncer-p62vr" (UID: "01bee1dc-5579-465f-b2a5-53d7e0d89cae") : object "kube-system"/"original-pull-secret" not registered Apr 21 15:10:36.975528 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:36.975439 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p62vr" Apr 21 15:10:36.975959 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:36.975596 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-p62vr" podUID="01bee1dc-5579-465f-b2a5-53d7e0d89cae" Apr 21 15:10:37.027257 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:37.027176 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-11.ec2.internal" event={"ID":"f4a1f42e43969c311a11e30a08e26ff0","Type":"ContainerStarted","Data":"44aeac6bd871fb450d990cf810047f3ca0e4006338c84cfc9237e78b47d760ae"} Apr 21 15:10:37.493549 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:37.493510 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a17c6c3f-25ab-4414-92a4-946230c882ea-metrics-certs\") pod \"network-metrics-daemon-96snf\" (UID: \"a17c6c3f-25ab-4414-92a4-946230c882ea\") " pod="openshift-multus/network-metrics-daemon-96snf" Apr 21 15:10:37.493723 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:37.493680 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered 
Apr 21 15:10:37.493779 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:37.493745 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a17c6c3f-25ab-4414-92a4-946230c882ea-metrics-certs podName:a17c6c3f-25ab-4414-92a4-946230c882ea nodeName:}" failed. No retries permitted until 2026-04-21 15:10:41.49372638 +0000 UTC m=+10.089297239 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a17c6c3f-25ab-4414-92a4-946230c882ea-metrics-certs") pod "network-metrics-daemon-96snf" (UID: "a17c6c3f-25ab-4414-92a4-946230c882ea") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:10:37.594532 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:37.594446 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxf5g\" (UniqueName: \"kubernetes.io/projected/9562db30-abde-4e93-96e5-77429f548f83-kube-api-access-zxf5g\") pod \"network-check-target-hfcw6\" (UID: \"9562db30-abde-4e93-96e5-77429f548f83\") " pod="openshift-network-diagnostics/network-check-target-hfcw6" Apr 21 15:10:37.594683 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:37.594583 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:10:37.594683 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:37.594603 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:10:37.594683 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:37.594616 2575 projected.go:194] Error preparing data for projected volume kube-api-access-zxf5g for pod openshift-network-diagnostics/network-check-target-hfcw6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:10:37.594683 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:37.594680 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9562db30-abde-4e93-96e5-77429f548f83-kube-api-access-zxf5g podName:9562db30-abde-4e93-96e5-77429f548f83 nodeName:}" failed. No retries permitted until 2026-04-21 15:10:41.594661145 +0000 UTC m=+10.190232002 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-zxf5g" (UniqueName: "kubernetes.io/projected/9562db30-abde-4e93-96e5-77429f548f83-kube-api-access-zxf5g") pod "network-check-target-hfcw6" (UID: "9562db30-abde-4e93-96e5-77429f548f83") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:10:37.977563 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:37.975499 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hfcw6" Apr 21 15:10:37.977563 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:37.975546 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96snf" Apr 21 15:10:37.977563 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:37.975626 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hfcw6" podUID="9562db30-abde-4e93-96e5-77429f548f83" Apr 21 15:10:37.977563 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:37.975740 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96snf" podUID="a17c6c3f-25ab-4414-92a4-946230c882ea" Apr 21 15:10:38.704963 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:38.704926 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/01bee1dc-5579-465f-b2a5-53d7e0d89cae-original-pull-secret\") pod \"global-pull-secret-syncer-p62vr\" (UID: \"01bee1dc-5579-465f-b2a5-53d7e0d89cae\") " pod="kube-system/global-pull-secret-syncer-p62vr" Apr 21 15:10:38.705151 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:38.705075 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 15:10:38.705151 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:38.705125 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01bee1dc-5579-465f-b2a5-53d7e0d89cae-original-pull-secret podName:01bee1dc-5579-465f-b2a5-53d7e0d89cae nodeName:}" failed. No retries permitted until 2026-04-21 15:10:42.705111956 +0000 UTC m=+11.300682813 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/01bee1dc-5579-465f-b2a5-53d7e0d89cae-original-pull-secret") pod "global-pull-secret-syncer-p62vr" (UID: "01bee1dc-5579-465f-b2a5-53d7e0d89cae") : object "kube-system"/"original-pull-secret" not registered Apr 21 15:10:38.975268 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:38.975190 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p62vr" Apr 21 15:10:38.975442 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:38.975390 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-p62vr" podUID="01bee1dc-5579-465f-b2a5-53d7e0d89cae" Apr 21 15:10:39.976449 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:39.975893 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hfcw6" Apr 21 15:10:39.976449 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:39.976010 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hfcw6" podUID="9562db30-abde-4e93-96e5-77429f548f83" Apr 21 15:10:39.976449 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:39.975903 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-96snf" Apr 21 15:10:39.976449 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:39.976402 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96snf" podUID="a17c6c3f-25ab-4414-92a4-946230c882ea" Apr 21 15:10:40.975974 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:40.975462 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p62vr" Apr 21 15:10:40.975974 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:40.975603 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-p62vr" podUID="01bee1dc-5579-465f-b2a5-53d7e0d89cae" Apr 21 15:10:41.528622 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:41.528580 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a17c6c3f-25ab-4414-92a4-946230c882ea-metrics-certs\") pod \"network-metrics-daemon-96snf\" (UID: \"a17c6c3f-25ab-4414-92a4-946230c882ea\") " pod="openshift-multus/network-metrics-daemon-96snf" Apr 21 15:10:41.529050 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:41.528741 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:10:41.529050 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:41.528827 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a17c6c3f-25ab-4414-92a4-946230c882ea-metrics-certs podName:a17c6c3f-25ab-4414-92a4-946230c882ea nodeName:}" failed. No retries permitted until 2026-04-21 15:10:49.528804498 +0000 UTC m=+18.124375366 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a17c6c3f-25ab-4414-92a4-946230c882ea-metrics-certs") pod "network-metrics-daemon-96snf" (UID: "a17c6c3f-25ab-4414-92a4-946230c882ea") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:10:41.630002 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:41.629877 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxf5g\" (UniqueName: \"kubernetes.io/projected/9562db30-abde-4e93-96e5-77429f548f83-kube-api-access-zxf5g\") pod \"network-check-target-hfcw6\" (UID: \"9562db30-abde-4e93-96e5-77429f548f83\") " pod="openshift-network-diagnostics/network-check-target-hfcw6" Apr 21 15:10:41.630139 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:41.630044 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:10:41.630139 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:41.630068 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:10:41.630139 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:41.630079 2575 projected.go:194] Error preparing data for projected volume kube-api-access-zxf5g for pod openshift-network-diagnostics/network-check-target-hfcw6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:10:41.630139 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:41.630124 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9562db30-abde-4e93-96e5-77429f548f83-kube-api-access-zxf5g podName:9562db30-abde-4e93-96e5-77429f548f83 nodeName:}" failed. 
No retries permitted until 2026-04-21 15:10:49.630110489 +0000 UTC m=+18.225681341 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-zxf5g" (UniqueName: "kubernetes.io/projected/9562db30-abde-4e93-96e5-77429f548f83-kube-api-access-zxf5g") pod "network-check-target-hfcw6" (UID: "9562db30-abde-4e93-96e5-77429f548f83") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:10:41.978366 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:41.977646 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hfcw6" Apr 21 15:10:41.978366 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:41.977771 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hfcw6" podUID="9562db30-abde-4e93-96e5-77429f548f83" Apr 21 15:10:41.978366 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:41.978163 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96snf" Apr 21 15:10:41.978366 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:41.978264 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-96snf" podUID="a17c6c3f-25ab-4414-92a4-946230c882ea" Apr 21 15:10:42.740069 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:42.740029 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/01bee1dc-5579-465f-b2a5-53d7e0d89cae-original-pull-secret\") pod \"global-pull-secret-syncer-p62vr\" (UID: \"01bee1dc-5579-465f-b2a5-53d7e0d89cae\") " pod="kube-system/global-pull-secret-syncer-p62vr" Apr 21 15:10:42.740541 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:42.740231 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 15:10:42.740541 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:42.740291 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01bee1dc-5579-465f-b2a5-53d7e0d89cae-original-pull-secret podName:01bee1dc-5579-465f-b2a5-53d7e0d89cae nodeName:}" failed. No retries permitted until 2026-04-21 15:10:50.740273357 +0000 UTC m=+19.335844226 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/01bee1dc-5579-465f-b2a5-53d7e0d89cae-original-pull-secret") pod "global-pull-secret-syncer-p62vr" (UID: "01bee1dc-5579-465f-b2a5-53d7e0d89cae") : object "kube-system"/"original-pull-secret" not registered Apr 21 15:10:42.974984 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:42.974947 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p62vr" Apr 21 15:10:42.975175 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:42.975080 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-p62vr" podUID="01bee1dc-5579-465f-b2a5-53d7e0d89cae" Apr 21 15:10:43.975223 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:43.975184 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hfcw6" Apr 21 15:10:43.975690 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:43.975189 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96snf" Apr 21 15:10:43.975690 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:43.975306 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hfcw6" podUID="9562db30-abde-4e93-96e5-77429f548f83" Apr 21 15:10:43.975690 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:43.975392 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96snf" podUID="a17c6c3f-25ab-4414-92a4-946230c882ea" Apr 21 15:10:44.975983 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:44.975943 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-p62vr" Apr 21 15:10:44.976402 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:44.976078 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-p62vr" podUID="01bee1dc-5579-465f-b2a5-53d7e0d89cae" Apr 21 15:10:45.975509 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:45.975464 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96snf" Apr 21 15:10:45.975684 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:45.975608 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96snf" podUID="a17c6c3f-25ab-4414-92a4-946230c882ea" Apr 21 15:10:45.975684 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:45.975656 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hfcw6" Apr 21 15:10:45.977407 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:45.976575 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hfcw6" podUID="9562db30-abde-4e93-96e5-77429f548f83" Apr 21 15:10:46.975233 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:46.975193 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p62vr" Apr 21 15:10:46.975425 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:46.975329 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-p62vr" podUID="01bee1dc-5579-465f-b2a5-53d7e0d89cae" Apr 21 15:10:47.975991 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:47.975958 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hfcw6" Apr 21 15:10:47.976325 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:47.976008 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96snf" Apr 21 15:10:47.976325 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:47.976088 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hfcw6" podUID="9562db30-abde-4e93-96e5-77429f548f83" Apr 21 15:10:47.976325 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:47.976172 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96snf" podUID="a17c6c3f-25ab-4414-92a4-946230c882ea" Apr 21 15:10:48.975005 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:48.974971 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p62vr" Apr 21 15:10:48.975206 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:48.975116 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-p62vr" podUID="01bee1dc-5579-465f-b2a5-53d7e0d89cae" Apr 21 15:10:49.586364 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:49.586320 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a17c6c3f-25ab-4414-92a4-946230c882ea-metrics-certs\") pod \"network-metrics-daemon-96snf\" (UID: \"a17c6c3f-25ab-4414-92a4-946230c882ea\") " pod="openshift-multus/network-metrics-daemon-96snf" Apr 21 15:10:49.586845 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:49.586504 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:10:49.586845 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:49.586588 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a17c6c3f-25ab-4414-92a4-946230c882ea-metrics-certs podName:a17c6c3f-25ab-4414-92a4-946230c882ea nodeName:}" failed. No retries permitted until 2026-04-21 15:11:05.586567926 +0000 UTC m=+34.182138795 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a17c6c3f-25ab-4414-92a4-946230c882ea-metrics-certs") pod "network-metrics-daemon-96snf" (UID: "a17c6c3f-25ab-4414-92a4-946230c882ea") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:10:49.687731 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:49.687700 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxf5g\" (UniqueName: \"kubernetes.io/projected/9562db30-abde-4e93-96e5-77429f548f83-kube-api-access-zxf5g\") pod \"network-check-target-hfcw6\" (UID: \"9562db30-abde-4e93-96e5-77429f548f83\") " pod="openshift-network-diagnostics/network-check-target-hfcw6" Apr 21 15:10:49.687909 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:49.687888 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:10:49.687952 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:49.687920 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:10:49.687952 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:49.687933 2575 projected.go:194] Error preparing data for projected volume kube-api-access-zxf5g for pod openshift-network-diagnostics/network-check-target-hfcw6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:10:49.688042 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:49.687997 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9562db30-abde-4e93-96e5-77429f548f83-kube-api-access-zxf5g podName:9562db30-abde-4e93-96e5-77429f548f83 nodeName:}" failed. 
No retries permitted until 2026-04-21 15:11:05.687977298 +0000 UTC m=+34.283548171 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-zxf5g" (UniqueName: "kubernetes.io/projected/9562db30-abde-4e93-96e5-77429f548f83-kube-api-access-zxf5g") pod "network-check-target-hfcw6" (UID: "9562db30-abde-4e93-96e5-77429f548f83") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:10:49.975730 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:49.975700 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96snf" Apr 21 15:10:49.975889 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:49.975701 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hfcw6" Apr 21 15:10:49.975889 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:49.975841 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96snf" podUID="a17c6c3f-25ab-4414-92a4-946230c882ea" Apr 21 15:10:49.975991 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:49.975901 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hfcw6" podUID="9562db30-abde-4e93-96e5-77429f548f83" Apr 21 15:10:50.796802 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:50.796751 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/01bee1dc-5579-465f-b2a5-53d7e0d89cae-original-pull-secret\") pod \"global-pull-secret-syncer-p62vr\" (UID: \"01bee1dc-5579-465f-b2a5-53d7e0d89cae\") " pod="kube-system/global-pull-secret-syncer-p62vr" Apr 21 15:10:50.797276 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:50.796922 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 15:10:50.797276 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:50.797010 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01bee1dc-5579-465f-b2a5-53d7e0d89cae-original-pull-secret podName:01bee1dc-5579-465f-b2a5-53d7e0d89cae nodeName:}" failed. No retries permitted until 2026-04-21 15:11:06.796988848 +0000 UTC m=+35.392559724 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/01bee1dc-5579-465f-b2a5-53d7e0d89cae-original-pull-secret") pod "global-pull-secret-syncer-p62vr" (UID: "01bee1dc-5579-465f-b2a5-53d7e0d89cae") : object "kube-system"/"original-pull-secret" not registered Apr 21 15:10:50.975144 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:50.975110 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-p62vr" Apr 21 15:10:50.975321 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:50.975226 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-p62vr" podUID="01bee1dc-5579-465f-b2a5-53d7e0d89cae" Apr 21 15:10:51.975987 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:51.975947 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hfcw6" Apr 21 15:10:51.976512 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:51.976053 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96snf" Apr 21 15:10:51.976512 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:51.976097 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hfcw6" podUID="9562db30-abde-4e93-96e5-77429f548f83" Apr 21 15:10:51.976512 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:51.976137 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-96snf" podUID="a17c6c3f-25ab-4414-92a4-946230c882ea" Apr 21 15:10:52.975192 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:52.974977 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p62vr" Apr 21 15:10:52.975360 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:52.975328 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-p62vr" podUID="01bee1dc-5579-465f-b2a5-53d7e0d89cae" Apr 21 15:10:53.055185 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:53.053662 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4hwvs" event={"ID":"cfb26da3-8175-4742-a038-7b5d5d082af2","Type":"ContainerStarted","Data":"fc25bfa5fac59220d6f3e936b36331992b99745316ceab42524dbc82fb12c4cf"} Apr 21 15:10:53.056505 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:53.056472 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rw7pj" event={"ID":"3b4c8c1d-205b-43ec-9a00-3a7d4989ef9e","Type":"ContainerStarted","Data":"7a5e5b8f3a3127f79987c9461a142e119e5e1ddf8ad6a934d999cdf9456f8407"} Apr 21 15:10:53.057901 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:53.057870 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jkh68" event={"ID":"c1596851-f71d-43c8-b7f4-f92f0a29bb06","Type":"ContainerStarted","Data":"2893bbd14025da4387b1e64b094c7dfeb21a8d80aac83b7f21b212797a3aa6ff"} Apr 21 15:10:53.059161 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:53.059139 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m4ccx" 
event={"ID":"0a58254f-d46b-4b42-b89e-5f65cdf19d34","Type":"ContainerStarted","Data":"e0507051e1ef8cb47e6fafaae293473d497b6d46f57d20613e23782a710196a5"} Apr 21 15:10:53.060498 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:53.060477 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-h47wt" event={"ID":"85bafbc7-2166-40e1-825d-81c20339ab1e","Type":"ContainerStarted","Data":"18dfb4bde2e524ca1ec37eae05127b2544f418e0a20a2992bceabd4bdd74b282"} Apr 21 15:10:53.061755 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:53.061733 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hdbc9" event={"ID":"71299419-e249-4660-891c-24ba490f5c36","Type":"ContainerStarted","Data":"f10466a752d3fb10963e60cc2f0e7306be00006c651362337bf44949ae560f05"} Apr 21 15:10:53.063531 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:53.063512 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbnjg_0bdcacfa-6992-4542-8cbe-df76abfeb25b/ovn-acl-logging/0.log" Apr 21 15:10:53.063848 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:53.063820 2575 generic.go:358] "Generic (PLEG): container finished" podID="0bdcacfa-6992-4542-8cbe-df76abfeb25b" containerID="7f898a6bdd18d706efa191d1ed14bde2d58609abc892acea971096cb19a1e0c0" exitCode=1 Apr 21 15:10:53.063970 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:53.063894 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" event={"ID":"0bdcacfa-6992-4542-8cbe-df76abfeb25b","Type":"ContainerStarted","Data":"8fc1bd57a8fc624ded2e8db10a83f08cfa47d01f0bc8f0f9c930cae992254f36"} Apr 21 15:10:53.063970 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:53.063921 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" 
event={"ID":"0bdcacfa-6992-4542-8cbe-df76abfeb25b","Type":"ContainerStarted","Data":"2b225f30c8041d04603f736da7538eae5af1841cec2c529dd2003a89bb60a838"} Apr 21 15:10:53.063970 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:53.063935 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" event={"ID":"0bdcacfa-6992-4542-8cbe-df76abfeb25b","Type":"ContainerDied","Data":"7f898a6bdd18d706efa191d1ed14bde2d58609abc892acea971096cb19a1e0c0"} Apr 21 15:10:53.063970 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:53.063950 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" event={"ID":"0bdcacfa-6992-4542-8cbe-df76abfeb25b","Type":"ContainerStarted","Data":"7f75a1d223468e6172c88f9a61ca2c1093a566791db0b692b721bee855b6bc76"} Apr 21 15:10:53.065593 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:53.065574 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ksrsc" event={"ID":"2be9773d-32f2-4dbe-a3cf-557fd268ad7d","Type":"ContainerStarted","Data":"142cdfed2eb2fd67ca8c2c7bb95ef8aa91fb8e26885f51f062aba658372bf192"} Apr 21 15:10:53.066567 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:53.066533 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4hwvs" podStartSLOduration=3.183938353 podStartE2EDuration="21.066523558s" podCreationTimestamp="2026-04-21 15:10:32 +0000 UTC" firstStartedPulling="2026-04-21 15:10:34.749288177 +0000 UTC m=+3.344859029" lastFinishedPulling="2026-04-21 15:10:52.63187338 +0000 UTC m=+21.227444234" observedRunningTime="2026-04-21 15:10:53.066010218 +0000 UTC m=+21.661581096" watchObservedRunningTime="2026-04-21 15:10:53.066523558 +0000 UTC m=+21.662094433" Apr 21 15:10:53.066622 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:53.066599 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-11.ec2.internal" podStartSLOduration=20.066596236 podStartE2EDuration="20.066596236s" podCreationTimestamp="2026-04-21 15:10:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:10:37.041256446 +0000 UTC m=+5.636827322" watchObservedRunningTime="2026-04-21 15:10:53.066596236 +0000 UTC m=+21.662167110" Apr 21 15:10:53.092183 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:53.092133 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-h47wt" podStartSLOduration=3.205262378 podStartE2EDuration="21.092118647s" podCreationTimestamp="2026-04-21 15:10:32 +0000 UTC" firstStartedPulling="2026-04-21 15:10:34.744951768 +0000 UTC m=+3.340522621" lastFinishedPulling="2026-04-21 15:10:52.631808031 +0000 UTC m=+21.227378890" observedRunningTime="2026-04-21 15:10:53.079085176 +0000 UTC m=+21.674656051" watchObservedRunningTime="2026-04-21 15:10:53.092118647 +0000 UTC m=+21.687689523" Apr 21 15:10:53.092642 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:53.092609 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hdbc9" podStartSLOduration=3.180725926 podStartE2EDuration="21.092599739s" podCreationTimestamp="2026-04-21 15:10:32 +0000 UTC" firstStartedPulling="2026-04-21 15:10:34.719958268 +0000 UTC m=+3.315529120" lastFinishedPulling="2026-04-21 15:10:52.631832079 +0000 UTC m=+21.227402933" observedRunningTime="2026-04-21 15:10:53.092309619 +0000 UTC m=+21.687880496" watchObservedRunningTime="2026-04-21 15:10:53.092599739 +0000 UTC m=+21.688170615" Apr 21 15:10:53.106635 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:53.106593 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-jkh68" podStartSLOduration=3.192735975 
podStartE2EDuration="21.106581078s" podCreationTimestamp="2026-04-21 15:10:32 +0000 UTC" firstStartedPulling="2026-04-21 15:10:34.749185108 +0000 UTC m=+3.344755965" lastFinishedPulling="2026-04-21 15:10:52.663030215 +0000 UTC m=+21.258601068" observedRunningTime="2026-04-21 15:10:53.106126191 +0000 UTC m=+21.701697077" watchObservedRunningTime="2026-04-21 15:10:53.106581078 +0000 UTC m=+21.702151954" Apr 21 15:10:53.145489 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:53.145388 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-rw7pj" podStartSLOduration=3.262328643 podStartE2EDuration="21.145360246s" podCreationTimestamp="2026-04-21 15:10:32 +0000 UTC" firstStartedPulling="2026-04-21 15:10:34.748793661 +0000 UTC m=+3.344364517" lastFinishedPulling="2026-04-21 15:10:52.631825264 +0000 UTC m=+21.227396120" observedRunningTime="2026-04-21 15:10:53.14505295 +0000 UTC m=+21.740623821" watchObservedRunningTime="2026-04-21 15:10:53.145360246 +0000 UTC m=+21.740931112" Apr 21 15:10:53.975657 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:53.975621 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hfcw6" Apr 21 15:10:53.975868 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:53.975638 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96snf" Apr 21 15:10:53.975868 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:53.975749 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hfcw6" podUID="9562db30-abde-4e93-96e5-77429f548f83" Apr 21 15:10:53.975868 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:53.975799 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96snf" podUID="a17c6c3f-25ab-4414-92a4-946230c882ea" Apr 21 15:10:54.032728 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:54.032697 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 15:10:54.068567 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:54.068539 2575 generic.go:358] "Generic (PLEG): container finished" podID="0a58254f-d46b-4b42-b89e-5f65cdf19d34" containerID="e0507051e1ef8cb47e6fafaae293473d497b6d46f57d20613e23782a710196a5" exitCode=0 Apr 21 15:10:54.068968 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:54.068615 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m4ccx" event={"ID":"0a58254f-d46b-4b42-b89e-5f65cdf19d34","Type":"ContainerDied","Data":"e0507051e1ef8cb47e6fafaae293473d497b6d46f57d20613e23782a710196a5"} Apr 21 15:10:54.071020 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:54.071006 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbnjg_0bdcacfa-6992-4542-8cbe-df76abfeb25b/ovn-acl-logging/0.log" Apr 21 15:10:54.071396 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:54.071358 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" 
event={"ID":"0bdcacfa-6992-4542-8cbe-df76abfeb25b","Type":"ContainerStarted","Data":"c8a1d1189aaf480efb5e9c4bd84b06a13250a20b06041c330f673fddfa05dc8b"} Apr 21 15:10:54.071484 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:54.071406 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" event={"ID":"0bdcacfa-6992-4542-8cbe-df76abfeb25b","Type":"ContainerStarted","Data":"5f6ea5edee2686a6273fc244f7ac8f7a7676a87171fca98bd566c5c8386fad2b"} Apr 21 15:10:54.072920 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:54.072897 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ksrsc" event={"ID":"2be9773d-32f2-4dbe-a3cf-557fd268ad7d","Type":"ContainerStarted","Data":"be0adb6ada9b3ba8c4c0a6de684d5aa5ee16250a7fa1824614f0e4592ca341df"} Apr 21 15:10:54.074190 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:54.074169 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rjhlf" event={"ID":"981aa28d-2a57-4f14-8411-4d80c9ed2911","Type":"ContainerStarted","Data":"240532e0b2245751d549e2a889063e68d71e7961be90b1b544ed767a4a5fd574"} Apr 21 15:10:54.122103 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:54.122059 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-rjhlf" podStartSLOduration=4.231631788 podStartE2EDuration="22.122048431s" podCreationTimestamp="2026-04-21 15:10:32 +0000 UTC" firstStartedPulling="2026-04-21 15:10:34.741437793 +0000 UTC m=+3.337008646" lastFinishedPulling="2026-04-21 15:10:52.631854435 +0000 UTC m=+21.227425289" observedRunningTime="2026-04-21 15:10:54.121889412 +0000 UTC m=+22.717460287" watchObservedRunningTime="2026-04-21 15:10:54.122048431 +0000 UTC m=+22.717619305" Apr 21 15:10:54.277651 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:54.277572 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kube-system/konnectivity-agent-rw7pj" Apr 21 15:10:54.278140 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:54.278123 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-rw7pj" Apr 21 15:10:54.916700 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:54.916524 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T15:10:54.032720961Z","UUID":"aae0026a-dd8f-40dc-aef2-c545e57d805c","Handler":null,"Name":"","Endpoint":""} Apr 21 15:10:54.919715 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:54.919688 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 15:10:54.919715 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:54.919717 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 15:10:54.975680 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:54.975647 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p62vr" Apr 21 15:10:54.975844 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:54.975769 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-p62vr" podUID="01bee1dc-5579-465f-b2a5-53d7e0d89cae" Apr 21 15:10:55.077108 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:55.077077 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-rw7pj" Apr 21 15:10:55.077590 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:55.077568 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-rw7pj" Apr 21 15:10:55.975666 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:55.975296 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hfcw6" Apr 21 15:10:55.975666 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:55.975353 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96snf" Apr 21 15:10:55.975905 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:55.975673 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hfcw6" podUID="9562db30-abde-4e93-96e5-77429f548f83" Apr 21 15:10:55.975905 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:55.975764 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-96snf" podUID="a17c6c3f-25ab-4414-92a4-946230c882ea" Apr 21 15:10:56.082134 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:56.082094 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbnjg_0bdcacfa-6992-4542-8cbe-df76abfeb25b/ovn-acl-logging/0.log" Apr 21 15:10:56.082528 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:56.082477 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" event={"ID":"0bdcacfa-6992-4542-8cbe-df76abfeb25b","Type":"ContainerStarted","Data":"a75db053003a1e2f650c3faa9eb6e4770cb5aba0f1ed6e3ec591407a9f15b494"} Apr 21 15:10:56.084707 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:56.084680 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ksrsc" event={"ID":"2be9773d-32f2-4dbe-a3cf-557fd268ad7d","Type":"ContainerStarted","Data":"b9ee0a4355df2ec1587cea36f86bdfb0e5239ba018a744b67f400d924810becd"} Apr 21 15:10:56.975407 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:56.975358 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p62vr" Apr 21 15:10:56.975585 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:56.975497 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-p62vr" podUID="01bee1dc-5579-465f-b2a5-53d7e0d89cae" Apr 21 15:10:57.975853 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:57.975815 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-96snf" Apr 21 15:10:57.975853 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:57.975837 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hfcw6" Apr 21 15:10:57.976432 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:57.975945 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96snf" podUID="a17c6c3f-25ab-4414-92a4-946230c882ea" Apr 21 15:10:57.976432 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:57.976025 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hfcw6" podUID="9562db30-abde-4e93-96e5-77429f548f83" Apr 21 15:10:58.091325 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:58.091303 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbnjg_0bdcacfa-6992-4542-8cbe-df76abfeb25b/ovn-acl-logging/0.log" Apr 21 15:10:58.976066 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:58.975891 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-p62vr" Apr 21 15:10:58.976579 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:58.976134 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-p62vr" podUID="01bee1dc-5579-465f-b2a5-53d7e0d89cae" Apr 21 15:10:59.094906 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:59.094876 2575 generic.go:358] "Generic (PLEG): container finished" podID="0a58254f-d46b-4b42-b89e-5f65cdf19d34" containerID="3fee0e8e88cd166439e411f26202197ab9fdef1aa047a1a432be88a55dfadd15" exitCode=0 Apr 21 15:10:59.095065 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:59.094957 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m4ccx" event={"ID":"0a58254f-d46b-4b42-b89e-5f65cdf19d34","Type":"ContainerDied","Data":"3fee0e8e88cd166439e411f26202197ab9fdef1aa047a1a432be88a55dfadd15"} Apr 21 15:10:59.097891 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:59.097872 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbnjg_0bdcacfa-6992-4542-8cbe-df76abfeb25b/ovn-acl-logging/0.log" Apr 21 15:10:59.098234 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:59.098213 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" event={"ID":"0bdcacfa-6992-4542-8cbe-df76abfeb25b","Type":"ContainerStarted","Data":"7414458db0563f6036315ae87ce21c0530787c052ad696c6736ec4f1a74119f1"} Apr 21 15:10:59.098603 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:59.098588 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" Apr 21 15:10:59.098712 
ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:59.098698 2575 scope.go:117] "RemoveContainer" containerID="7f898a6bdd18d706efa191d1ed14bde2d58609abc892acea971096cb19a1e0c0" Apr 21 15:10:59.114237 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:59.114218 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" Apr 21 15:10:59.120887 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:59.120851 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ksrsc" podStartSLOduration=6.726535738 podStartE2EDuration="27.120839948s" podCreationTimestamp="2026-04-21 15:10:32 +0000 UTC" firstStartedPulling="2026-04-21 15:10:34.748926751 +0000 UTC m=+3.344497607" lastFinishedPulling="2026-04-21 15:10:55.143230946 +0000 UTC m=+23.738801817" observedRunningTime="2026-04-21 15:10:56.106292059 +0000 UTC m=+24.701862934" watchObservedRunningTime="2026-04-21 15:10:59.120839948 +0000 UTC m=+27.716410823" Apr 21 15:10:59.975534 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:59.975452 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hfcw6" Apr 21 15:10:59.975712 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:59.975556 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hfcw6" podUID="9562db30-abde-4e93-96e5-77429f548f83" Apr 21 15:10:59.975712 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:10:59.975610 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-96snf" Apr 21 15:10:59.975837 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:10:59.975707 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96snf" podUID="a17c6c3f-25ab-4414-92a4-946230c882ea" Apr 21 15:11:00.022112 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:00.022078 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-p62vr"] Apr 21 15:11:00.022517 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:00.022179 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p62vr" Apr 21 15:11:00.022517 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:00.022261 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-p62vr" podUID="01bee1dc-5579-465f-b2a5-53d7e0d89cae" Apr 21 15:11:00.025312 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:00.025288 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-hfcw6"] Apr 21 15:11:00.026016 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:00.025995 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-96snf"] Apr 21 15:11:00.101311 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:00.101284 2575 generic.go:358] "Generic (PLEG): container finished" podID="0a58254f-d46b-4b42-b89e-5f65cdf19d34" containerID="6aae255cb6cc8002e9b92c39f462d4cc250d9be26810266d513444a208a7214e" exitCode=0 Apr 21 15:11:00.101483 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:00.101348 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m4ccx" event={"ID":"0a58254f-d46b-4b42-b89e-5f65cdf19d34","Type":"ContainerDied","Data":"6aae255cb6cc8002e9b92c39f462d4cc250d9be26810266d513444a208a7214e"} Apr 21 15:11:00.104803 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:00.104785 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbnjg_0bdcacfa-6992-4542-8cbe-df76abfeb25b/ovn-acl-logging/0.log" Apr 21 15:11:00.105150 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:00.105132 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-96snf" Apr 21 15:11:00.105150 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:00.105140 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" event={"ID":"0bdcacfa-6992-4542-8cbe-df76abfeb25b","Type":"ContainerStarted","Data":"ff912d0d7d908cff739e5354df85b4505fe43931227ac467159f03c02323acd8"} Apr 21 15:11:00.105260 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:00.105235 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 21 15:11:00.105260 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:00.105246 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96snf" podUID="a17c6c3f-25ab-4414-92a4-946230c882ea" Apr 21 15:11:00.105410 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:00.105392 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hfcw6" Apr 21 15:11:00.105518 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:00.105411 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" Apr 21 15:11:00.105518 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:00.105480 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-hfcw6" podUID="9562db30-abde-4e93-96e5-77429f548f83" Apr 21 15:11:00.120003 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:00.119986 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" Apr 21 15:11:00.162149 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:00.162020 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" podStartSLOduration=10.193220823 podStartE2EDuration="28.162009861s" podCreationTimestamp="2026-04-21 15:10:32 +0000 UTC" firstStartedPulling="2026-04-21 15:10:34.741680502 +0000 UTC m=+3.337251355" lastFinishedPulling="2026-04-21 15:10:52.710469541 +0000 UTC m=+21.306040393" observedRunningTime="2026-04-21 15:11:00.160658777 +0000 UTC m=+28.756229652" watchObservedRunningTime="2026-04-21 15:11:00.162009861 +0000 UTC m=+28.757580779" Apr 21 15:11:00.835749 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:00.835717 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" Apr 21 15:11:01.109354 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:01.109323 2575 generic.go:358] "Generic (PLEG): container finished" podID="0a58254f-d46b-4b42-b89e-5f65cdf19d34" containerID="eaa19547ae52acfc29c018df3482e54490a17d9272824913489f370f623651b0" exitCode=0 Apr 21 15:11:01.109728 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:01.109410 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m4ccx" event={"ID":"0a58254f-d46b-4b42-b89e-5f65cdf19d34","Type":"ContainerDied","Data":"eaa19547ae52acfc29c018df3482e54490a17d9272824913489f370f623651b0"} Apr 21 15:11:01.977214 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:01.976992 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hfcw6" Apr 21 15:11:01.977426 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:01.977068 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p62vr" Apr 21 15:11:01.977426 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:01.977309 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hfcw6" podUID="9562db30-abde-4e93-96e5-77429f548f83" Apr 21 15:11:01.977426 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:01.977406 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-p62vr" podUID="01bee1dc-5579-465f-b2a5-53d7e0d89cae" Apr 21 15:11:01.977603 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:01.977096 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96snf" Apr 21 15:11:01.977603 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:01.977511 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-96snf" podUID="a17c6c3f-25ab-4414-92a4-946230c882ea" Apr 21 15:11:03.975503 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:03.975463 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hfcw6" Apr 21 15:11:03.976310 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:03.975590 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p62vr" Apr 21 15:11:03.976310 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:03.975603 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-hfcw6" podUID="9562db30-abde-4e93-96e5-77429f548f83" Apr 21 15:11:03.976310 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:03.975619 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96snf" Apr 21 15:11:03.976310 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:03.975701 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-p62vr" podUID="01bee1dc-5579-465f-b2a5-53d7e0d89cae" Apr 21 15:11:03.976310 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:03.975767 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-96snf" podUID="a17c6c3f-25ab-4414-92a4-946230c882ea" Apr 21 15:11:05.620298 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:05.620214 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a17c6c3f-25ab-4414-92a4-946230c882ea-metrics-certs\") pod \"network-metrics-daemon-96snf\" (UID: \"a17c6c3f-25ab-4414-92a4-946230c882ea\") " pod="openshift-multus/network-metrics-daemon-96snf" Apr 21 15:11:05.620817 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:05.620341 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:11:05.620817 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:05.620414 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a17c6c3f-25ab-4414-92a4-946230c882ea-metrics-certs podName:a17c6c3f-25ab-4414-92a4-946230c882ea nodeName:}" failed. No retries permitted until 2026-04-21 15:11:37.620399634 +0000 UTC m=+66.215970487 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a17c6c3f-25ab-4414-92a4-946230c882ea-metrics-certs") pod "network-metrics-daemon-96snf" (UID: "a17c6c3f-25ab-4414-92a4-946230c882ea") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 15:11:05.721660 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:05.721532 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxf5g\" (UniqueName: \"kubernetes.io/projected/9562db30-abde-4e93-96e5-77429f548f83-kube-api-access-zxf5g\") pod \"network-check-target-hfcw6\" (UID: \"9562db30-abde-4e93-96e5-77429f548f83\") " pod="openshift-network-diagnostics/network-check-target-hfcw6" Apr 21 15:11:05.721848 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:05.721708 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 15:11:05.721848 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:05.721734 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 15:11:05.721848 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:05.721748 2575 projected.go:194] Error preparing data for projected volume kube-api-access-zxf5g for pod openshift-network-diagnostics/network-check-target-hfcw6: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:11:05.721848 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:05.721820 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9562db30-abde-4e93-96e5-77429f548f83-kube-api-access-zxf5g podName:9562db30-abde-4e93-96e5-77429f548f83 nodeName:}" failed. 
No retries permitted until 2026-04-21 15:11:37.721799759 +0000 UTC m=+66.317370615 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-zxf5g" (UniqueName: "kubernetes.io/projected/9562db30-abde-4e93-96e5-77429f548f83-kube-api-access-zxf5g") pod "network-check-target-hfcw6" (UID: "9562db30-abde-4e93-96e5-77429f548f83") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 15:11:05.769982 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:05.769940 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-11.ec2.internal" event="NodeReady" Apr 21 15:11:05.770123 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:05.770100 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 15:11:05.824663 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:05.824632 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-85fb9cc7f7-zl6lq"] Apr 21 15:11:05.855598 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:05.855367 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-fr77k"] Apr 21 15:11:05.855598 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:05.855521 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" Apr 21 15:11:05.872572 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:05.858189 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 21 15:11:05.883429 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:05.882886 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 21 15:11:05.884665 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:05.883696 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 21 15:11:05.886581 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:05.886556 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-gcqst\"" Apr 21 15:11:05.888233 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:05.888202 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 21 15:11:05.897909 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:05.897885 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-dmzt8"] Apr 21 15:11:05.898066 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:05.898043 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-fr77k" Apr 21 15:11:05.901354 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:05.901312 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 15:11:05.901759 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:05.901736 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-76666\"" Apr 21 15:11:05.901950 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:05.901931 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 15:11:05.916045 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:05.916026 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-85fb9cc7f7-zl6lq"] Apr 21 15:11:05.916135 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:05.916049 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dmzt8"] Apr 21 15:11:05.916135 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:05.916058 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fr77k"] Apr 21 15:11:05.916219 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:05.916140 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dmzt8" Apr 21 15:11:05.918583 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:05.918562 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 15:11:05.918683 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:05.918562 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-l7xgv\"" Apr 21 15:11:05.918683 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:05.918610 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 15:11:05.918683 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:05.918611 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 15:11:05.975523 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:05.975491 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-p62vr" Apr 21 15:11:05.975685 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:05.975612 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hfcw6" Apr 21 15:11:05.975685 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:05.975666 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-96snf" Apr 21 15:11:05.979914 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:05.979891 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 15:11:05.980085 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:05.980062 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 21 15:11:05.980262 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:05.980244 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-4fs5d\"" Apr 21 15:11:05.980461 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:05.980407 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 15:11:05.980461 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:05.980418 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zqtzq\"" Apr 21 15:11:05.980603 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:05.980545 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 15:11:06.023838 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.023812 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be-config-volume\") pod \"dns-default-fr77k\" (UID: \"a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be\") " pod="openshift-dns/dns-default-fr77k" Apr 21 15:11:06.023964 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.023843 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be-metrics-tls\") pod \"dns-default-fr77k\" (UID: \"a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be\") " pod="openshift-dns/dns-default-fr77k" Apr 21 15:11:06.023964 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.023861 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/da72ba61-5c83-47ce-a285-a58cd7b77246-ca-trust-extracted\") pod \"image-registry-85fb9cc7f7-zl6lq\" (UID: \"da72ba61-5c83-47ce-a285-a58cd7b77246\") " pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" Apr 21 15:11:06.023964 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.023937 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-registry-tls\") pod \"image-registry-85fb9cc7f7-zl6lq\" (UID: \"da72ba61-5c83-47ce-a285-a58cd7b77246\") " pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" Apr 21 15:11:06.024126 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.023999 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lptnd\" (UniqueName: \"kubernetes.io/projected/578cec2c-16fd-469e-931a-b7cf421795a1-kube-api-access-lptnd\") pod \"ingress-canary-dmzt8\" (UID: \"578cec2c-16fd-469e-931a-b7cf421795a1\") " pod="openshift-ingress-canary/ingress-canary-dmzt8" Apr 21 15:11:06.024126 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.024026 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/da72ba61-5c83-47ce-a285-a58cd7b77246-trusted-ca\") pod \"image-registry-85fb9cc7f7-zl6lq\" (UID: \"da72ba61-5c83-47ce-a285-a58cd7b77246\") " pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" Apr 21 15:11:06.024126 
ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.024052 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7kmm\" (UniqueName: \"kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-kube-api-access-n7kmm\") pod \"image-registry-85fb9cc7f7-zl6lq\" (UID: \"da72ba61-5c83-47ce-a285-a58cd7b77246\") " pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" Apr 21 15:11:06.024126 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.024100 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzgqw\" (UniqueName: \"kubernetes.io/projected/a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be-kube-api-access-tzgqw\") pod \"dns-default-fr77k\" (UID: \"a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be\") " pod="openshift-dns/dns-default-fr77k" Apr 21 15:11:06.024329 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.024156 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be-tmp-dir\") pod \"dns-default-fr77k\" (UID: \"a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be\") " pod="openshift-dns/dns-default-fr77k" Apr 21 15:11:06.024329 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.024180 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/da72ba61-5c83-47ce-a285-a58cd7b77246-image-registry-private-configuration\") pod \"image-registry-85fb9cc7f7-zl6lq\" (UID: \"da72ba61-5c83-47ce-a285-a58cd7b77246\") " pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" Apr 21 15:11:06.024329 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.024218 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-bound-sa-token\") pod \"image-registry-85fb9cc7f7-zl6lq\" (UID: \"da72ba61-5c83-47ce-a285-a58cd7b77246\") " pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" Apr 21 15:11:06.024329 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.024246 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/da72ba61-5c83-47ce-a285-a58cd7b77246-installation-pull-secrets\") pod \"image-registry-85fb9cc7f7-zl6lq\" (UID: \"da72ba61-5c83-47ce-a285-a58cd7b77246\") " pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" Apr 21 15:11:06.024329 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.024274 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/578cec2c-16fd-469e-931a-b7cf421795a1-cert\") pod \"ingress-canary-dmzt8\" (UID: \"578cec2c-16fd-469e-931a-b7cf421795a1\") " pod="openshift-ingress-canary/ingress-canary-dmzt8" Apr 21 15:11:06.024329 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.024289 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/da72ba61-5c83-47ce-a285-a58cd7b77246-registry-certificates\") pod \"image-registry-85fb9cc7f7-zl6lq\" (UID: \"da72ba61-5c83-47ce-a285-a58cd7b77246\") " pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" Apr 21 15:11:06.125560 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.125478 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lptnd\" (UniqueName: \"kubernetes.io/projected/578cec2c-16fd-469e-931a-b7cf421795a1-kube-api-access-lptnd\") pod \"ingress-canary-dmzt8\" (UID: \"578cec2c-16fd-469e-931a-b7cf421795a1\") " pod="openshift-ingress-canary/ingress-canary-dmzt8" Apr 21 
15:11:06.125560 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.125526 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/da72ba61-5c83-47ce-a285-a58cd7b77246-trusted-ca\") pod \"image-registry-85fb9cc7f7-zl6lq\" (UID: \"da72ba61-5c83-47ce-a285-a58cd7b77246\") " pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" Apr 21 15:11:06.125560 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.125559 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7kmm\" (UniqueName: \"kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-kube-api-access-n7kmm\") pod \"image-registry-85fb9cc7f7-zl6lq\" (UID: \"da72ba61-5c83-47ce-a285-a58cd7b77246\") " pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" Apr 21 15:11:06.125783 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.125597 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tzgqw\" (UniqueName: \"kubernetes.io/projected/a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be-kube-api-access-tzgqw\") pod \"dns-default-fr77k\" (UID: \"a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be\") " pod="openshift-dns/dns-default-fr77k" Apr 21 15:11:06.125939 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.125916 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be-tmp-dir\") pod \"dns-default-fr77k\" (UID: \"a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be\") " pod="openshift-dns/dns-default-fr77k" Apr 21 15:11:06.125986 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.125952 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/da72ba61-5c83-47ce-a285-a58cd7b77246-image-registry-private-configuration\") pod \"image-registry-85fb9cc7f7-zl6lq\" (UID: 
\"da72ba61-5c83-47ce-a285-a58cd7b77246\") " pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" Apr 21 15:11:06.125986 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.125975 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-bound-sa-token\") pod \"image-registry-85fb9cc7f7-zl6lq\" (UID: \"da72ba61-5c83-47ce-a285-a58cd7b77246\") " pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" Apr 21 15:11:06.126047 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.126002 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/da72ba61-5c83-47ce-a285-a58cd7b77246-installation-pull-secrets\") pod \"image-registry-85fb9cc7f7-zl6lq\" (UID: \"da72ba61-5c83-47ce-a285-a58cd7b77246\") " pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" Apr 21 15:11:06.126047 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.126034 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/578cec2c-16fd-469e-931a-b7cf421795a1-cert\") pod \"ingress-canary-dmzt8\" (UID: \"578cec2c-16fd-469e-931a-b7cf421795a1\") " pod="openshift-ingress-canary/ingress-canary-dmzt8" Apr 21 15:11:06.126117 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.126049 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/da72ba61-5c83-47ce-a285-a58cd7b77246-registry-certificates\") pod \"image-registry-85fb9cc7f7-zl6lq\" (UID: \"da72ba61-5c83-47ce-a285-a58cd7b77246\") " pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" Apr 21 15:11:06.126117 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.126078 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be-config-volume\") pod \"dns-default-fr77k\" (UID: \"a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be\") " pod="openshift-dns/dns-default-fr77k" Apr 21 15:11:06.126117 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.126098 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be-metrics-tls\") pod \"dns-default-fr77k\" (UID: \"a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be\") " pod="openshift-dns/dns-default-fr77k" Apr 21 15:11:06.126117 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.126115 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/da72ba61-5c83-47ce-a285-a58cd7b77246-ca-trust-extracted\") pod \"image-registry-85fb9cc7f7-zl6lq\" (UID: \"da72ba61-5c83-47ce-a285-a58cd7b77246\") " pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" Apr 21 15:11:06.126247 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.126142 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-registry-tls\") pod \"image-registry-85fb9cc7f7-zl6lq\" (UID: \"da72ba61-5c83-47ce-a285-a58cd7b77246\") " pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" Apr 21 15:11:06.126247 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:06.126234 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 15:11:06.126247 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:06.126245 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85fb9cc7f7-zl6lq: secret "image-registry-tls" not found Apr 21 15:11:06.126395 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:06.126288 2575 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-registry-tls podName:da72ba61-5c83-47ce-a285-a58cd7b77246 nodeName:}" failed. No retries permitted until 2026-04-21 15:11:06.626277219 +0000 UTC m=+35.221848071 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-registry-tls") pod "image-registry-85fb9cc7f7-zl6lq" (UID: "da72ba61-5c83-47ce-a285-a58cd7b77246") : secret "image-registry-tls" not found Apr 21 15:11:06.126395 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:06.126299 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 15:11:06.126395 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:06.126345 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/578cec2c-16fd-469e-931a-b7cf421795a1-cert podName:578cec2c-16fd-469e-931a-b7cf421795a1 nodeName:}" failed. No retries permitted until 2026-04-21 15:11:06.626332039 +0000 UTC m=+35.221902894 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/578cec2c-16fd-469e-931a-b7cf421795a1-cert") pod "ingress-canary-dmzt8" (UID: "578cec2c-16fd-469e-931a-b7cf421795a1") : secret "canary-serving-cert" not found Apr 21 15:11:06.126586 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.126571 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be-tmp-dir\") pod \"dns-default-fr77k\" (UID: \"a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be\") " pod="openshift-dns/dns-default-fr77k" Apr 21 15:11:06.126637 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.126621 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/da72ba61-5c83-47ce-a285-a58cd7b77246-trusted-ca\") pod \"image-registry-85fb9cc7f7-zl6lq\" (UID: \"da72ba61-5c83-47ce-a285-a58cd7b77246\") " pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" Apr 21 15:11:06.126711 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:06.126700 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 15:11:06.126755 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:06.126746 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be-metrics-tls podName:a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be nodeName:}" failed. No retries permitted until 2026-04-21 15:11:06.626734427 +0000 UTC m=+35.222305280 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be-metrics-tls") pod "dns-default-fr77k" (UID: "a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be") : secret "dns-default-metrics-tls" not found Apr 21 15:11:06.126813 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.126797 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/da72ba61-5c83-47ce-a285-a58cd7b77246-registry-certificates\") pod \"image-registry-85fb9cc7f7-zl6lq\" (UID: \"da72ba61-5c83-47ce-a285-a58cd7b77246\") " pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" Apr 21 15:11:06.127012 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.126987 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/da72ba61-5c83-47ce-a285-a58cd7b77246-ca-trust-extracted\") pod \"image-registry-85fb9cc7f7-zl6lq\" (UID: \"da72ba61-5c83-47ce-a285-a58cd7b77246\") " pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" Apr 21 15:11:06.127142 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.127120 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be-config-volume\") pod \"dns-default-fr77k\" (UID: \"a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be\") " pod="openshift-dns/dns-default-fr77k" Apr 21 15:11:06.130995 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.130975 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/da72ba61-5c83-47ce-a285-a58cd7b77246-image-registry-private-configuration\") pod \"image-registry-85fb9cc7f7-zl6lq\" (UID: \"da72ba61-5c83-47ce-a285-a58cd7b77246\") " pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" Apr 21 15:11:06.131307 ip-10-0-131-11 
kubenswrapper[2575]: I0421 15:11:06.131285 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/da72ba61-5c83-47ce-a285-a58cd7b77246-installation-pull-secrets\") pod \"image-registry-85fb9cc7f7-zl6lq\" (UID: \"da72ba61-5c83-47ce-a285-a58cd7b77246\") " pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" Apr 21 15:11:06.137632 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.137584 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzgqw\" (UniqueName: \"kubernetes.io/projected/a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be-kube-api-access-tzgqw\") pod \"dns-default-fr77k\" (UID: \"a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be\") " pod="openshift-dns/dns-default-fr77k" Apr 21 15:11:06.138206 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.138163 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7kmm\" (UniqueName: \"kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-kube-api-access-n7kmm\") pod \"image-registry-85fb9cc7f7-zl6lq\" (UID: \"da72ba61-5c83-47ce-a285-a58cd7b77246\") " pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" Apr 21 15:11:06.141560 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.141535 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-bound-sa-token\") pod \"image-registry-85fb9cc7f7-zl6lq\" (UID: \"da72ba61-5c83-47ce-a285-a58cd7b77246\") " pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" Apr 21 15:11:06.149790 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.149769 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lptnd\" (UniqueName: \"kubernetes.io/projected/578cec2c-16fd-469e-931a-b7cf421795a1-kube-api-access-lptnd\") pod \"ingress-canary-dmzt8\" (UID: 
\"578cec2c-16fd-469e-931a-b7cf421795a1\") " pod="openshift-ingress-canary/ingress-canary-dmzt8" Apr 21 15:11:06.630969 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.630923 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/578cec2c-16fd-469e-931a-b7cf421795a1-cert\") pod \"ingress-canary-dmzt8\" (UID: \"578cec2c-16fd-469e-931a-b7cf421795a1\") " pod="openshift-ingress-canary/ingress-canary-dmzt8" Apr 21 15:11:06.631586 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.630982 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be-metrics-tls\") pod \"dns-default-fr77k\" (UID: \"a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be\") " pod="openshift-dns/dns-default-fr77k" Apr 21 15:11:06.631586 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.631029 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-registry-tls\") pod \"image-registry-85fb9cc7f7-zl6lq\" (UID: \"da72ba61-5c83-47ce-a285-a58cd7b77246\") " pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" Apr 21 15:11:06.631586 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:06.631088 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 15:11:06.631586 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:06.631159 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 15:11:06.631586 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:06.631170 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 15:11:06.631586 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:06.631187 2575 
projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85fb9cc7f7-zl6lq: secret "image-registry-tls" not found Apr 21 15:11:06.631586 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:06.631163 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/578cec2c-16fd-469e-931a-b7cf421795a1-cert podName:578cec2c-16fd-469e-931a-b7cf421795a1 nodeName:}" failed. No retries permitted until 2026-04-21 15:11:07.63113993 +0000 UTC m=+36.226710807 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/578cec2c-16fd-469e-931a-b7cf421795a1-cert") pod "ingress-canary-dmzt8" (UID: "578cec2c-16fd-469e-931a-b7cf421795a1") : secret "canary-serving-cert" not found Apr 21 15:11:06.631586 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:06.631245 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be-metrics-tls podName:a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be nodeName:}" failed. No retries permitted until 2026-04-21 15:11:07.631225169 +0000 UTC m=+36.226796028 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be-metrics-tls") pod "dns-default-fr77k" (UID: "a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be") : secret "dns-default-metrics-tls" not found Apr 21 15:11:06.631586 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:06.631263 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-registry-tls podName:da72ba61-5c83-47ce-a285-a58cd7b77246 nodeName:}" failed. No retries permitted until 2026-04-21 15:11:07.631253379 +0000 UTC m=+36.226824232 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-registry-tls") pod "image-registry-85fb9cc7f7-zl6lq" (UID: "da72ba61-5c83-47ce-a285-a58cd7b77246") : secret "image-registry-tls" not found Apr 21 15:11:06.833253 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.833214 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/01bee1dc-5579-465f-b2a5-53d7e0d89cae-original-pull-secret\") pod \"global-pull-secret-syncer-p62vr\" (UID: \"01bee1dc-5579-465f-b2a5-53d7e0d89cae\") " pod="kube-system/global-pull-secret-syncer-p62vr" Apr 21 15:11:06.835585 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.835558 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/01bee1dc-5579-465f-b2a5-53d7e0d89cae-original-pull-secret\") pod \"global-pull-secret-syncer-p62vr\" (UID: \"01bee1dc-5579-465f-b2a5-53d7e0d89cae\") " pod="kube-system/global-pull-secret-syncer-p62vr" Apr 21 15:11:06.888060 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:06.888000 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-p62vr" Apr 21 15:11:07.116494 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:07.116303 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-p62vr"] Apr 21 15:11:07.125019 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:11:07.124986 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01bee1dc_5579_465f_b2a5_53d7e0d89cae.slice/crio-96999c3d753ab27bdc6e681940d3452d14a1af771bc8beb0e8dfd581bf7f2833 WatchSource:0}: Error finding container 96999c3d753ab27bdc6e681940d3452d14a1af771bc8beb0e8dfd581bf7f2833: Status 404 returned error can't find the container with id 96999c3d753ab27bdc6e681940d3452d14a1af771bc8beb0e8dfd581bf7f2833 Apr 21 15:11:07.639711 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:07.639679 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/578cec2c-16fd-469e-931a-b7cf421795a1-cert\") pod \"ingress-canary-dmzt8\" (UID: \"578cec2c-16fd-469e-931a-b7cf421795a1\") " pod="openshift-ingress-canary/ingress-canary-dmzt8" Apr 21 15:11:07.639711 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:07.639712 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be-metrics-tls\") pod \"dns-default-fr77k\" (UID: \"a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be\") " pod="openshift-dns/dns-default-fr77k" Apr 21 15:11:07.640180 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:07.639740 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-registry-tls\") pod \"image-registry-85fb9cc7f7-zl6lq\" (UID: \"da72ba61-5c83-47ce-a285-a58cd7b77246\") " 
pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" Apr 21 15:11:07.640180 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:07.639842 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 15:11:07.640180 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:07.639843 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 15:11:07.640180 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:07.639855 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85fb9cc7f7-zl6lq: secret "image-registry-tls" not found Apr 21 15:11:07.640180 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:07.639903 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be-metrics-tls podName:a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be nodeName:}" failed. No retries permitted until 2026-04-21 15:11:09.639888123 +0000 UTC m=+38.235458977 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be-metrics-tls") pod "dns-default-fr77k" (UID: "a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be") : secret "dns-default-metrics-tls" not found Apr 21 15:11:07.640180 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:07.639925 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-registry-tls podName:da72ba61-5c83-47ce-a285-a58cd7b77246 nodeName:}" failed. No retries permitted until 2026-04-21 15:11:09.639914974 +0000 UTC m=+38.235485827 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-registry-tls") pod "image-registry-85fb9cc7f7-zl6lq" (UID: "da72ba61-5c83-47ce-a285-a58cd7b77246") : secret "image-registry-tls" not found Apr 21 15:11:07.640180 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:07.639848 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 15:11:07.640180 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:07.639959 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/578cec2c-16fd-469e-931a-b7cf421795a1-cert podName:578cec2c-16fd-469e-931a-b7cf421795a1 nodeName:}" failed. No retries permitted until 2026-04-21 15:11:09.639950626 +0000 UTC m=+38.235521478 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/578cec2c-16fd-469e-931a-b7cf421795a1-cert") pod "ingress-canary-dmzt8" (UID: "578cec2c-16fd-469e-931a-b7cf421795a1") : secret "canary-serving-cert" not found Apr 21 15:11:08.127832 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:08.127796 2575 generic.go:358] "Generic (PLEG): container finished" podID="0a58254f-d46b-4b42-b89e-5f65cdf19d34" containerID="6c8e7eb3932fd990913a489a9086accae47d5487c9ae173aa95806f483ebe8c7" exitCode=0 Apr 21 15:11:08.128021 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:08.127889 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m4ccx" event={"ID":"0a58254f-d46b-4b42-b89e-5f65cdf19d34","Type":"ContainerDied","Data":"6c8e7eb3932fd990913a489a9086accae47d5487c9ae173aa95806f483ebe8c7"} Apr 21 15:11:08.129347 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:08.128933 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-p62vr" 
event={"ID":"01bee1dc-5579-465f-b2a5-53d7e0d89cae","Type":"ContainerStarted","Data":"96999c3d753ab27bdc6e681940d3452d14a1af771bc8beb0e8dfd581bf7f2833"} Apr 21 15:11:09.133518 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:09.133490 2575 generic.go:358] "Generic (PLEG): container finished" podID="0a58254f-d46b-4b42-b89e-5f65cdf19d34" containerID="be27286565f0fdae8fc837033af627f336019ddf3fd0e8655146fa6bc20bbab6" exitCode=0 Apr 21 15:11:09.133898 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:09.133545 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m4ccx" event={"ID":"0a58254f-d46b-4b42-b89e-5f65cdf19d34","Type":"ContainerDied","Data":"be27286565f0fdae8fc837033af627f336019ddf3fd0e8655146fa6bc20bbab6"} Apr 21 15:11:09.657443 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:09.657248 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/578cec2c-16fd-469e-931a-b7cf421795a1-cert\") pod \"ingress-canary-dmzt8\" (UID: \"578cec2c-16fd-469e-931a-b7cf421795a1\") " pod="openshift-ingress-canary/ingress-canary-dmzt8" Apr 21 15:11:09.657587 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:09.657454 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be-metrics-tls\") pod \"dns-default-fr77k\" (UID: \"a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be\") " pod="openshift-dns/dns-default-fr77k" Apr 21 15:11:09.657587 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:09.657484 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-registry-tls\") pod \"image-registry-85fb9cc7f7-zl6lq\" (UID: \"da72ba61-5c83-47ce-a285-a58cd7b77246\") " pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" Apr 21 15:11:09.657587 ip-10-0-131-11 
kubenswrapper[2575]: E0421 15:11:09.657409 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 15:11:09.657587 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:09.657566 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/578cec2c-16fd-469e-931a-b7cf421795a1-cert podName:578cec2c-16fd-469e-931a-b7cf421795a1 nodeName:}" failed. No retries permitted until 2026-04-21 15:11:13.657549131 +0000 UTC m=+42.253119985 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/578cec2c-16fd-469e-931a-b7cf421795a1-cert") pod "ingress-canary-dmzt8" (UID: "578cec2c-16fd-469e-931a-b7cf421795a1") : secret "canary-serving-cert" not found Apr 21 15:11:09.657587 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:09.657580 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 15:11:09.657769 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:09.657589 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 15:11:09.657769 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:09.657602 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85fb9cc7f7-zl6lq: secret "image-registry-tls" not found Apr 21 15:11:09.657769 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:09.657628 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be-metrics-tls podName:a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be nodeName:}" failed. No retries permitted until 2026-04-21 15:11:13.657616891 +0000 UTC m=+42.253187743 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be-metrics-tls") pod "dns-default-fr77k" (UID: "a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be") : secret "dns-default-metrics-tls" not found Apr 21 15:11:09.657769 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:09.657641 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-registry-tls podName:da72ba61-5c83-47ce-a285-a58cd7b77246 nodeName:}" failed. No retries permitted until 2026-04-21 15:11:13.65763551 +0000 UTC m=+42.253206364 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-registry-tls") pod "image-registry-85fb9cc7f7-zl6lq" (UID: "da72ba61-5c83-47ce-a285-a58cd7b77246") : secret "image-registry-tls" not found Apr 21 15:11:10.139291 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:10.139248 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-m4ccx" event={"ID":"0a58254f-d46b-4b42-b89e-5f65cdf19d34","Type":"ContainerStarted","Data":"8448574f4d3fa1d67f37c9b18b88f841e1c0b3b3953bf42447fc8050d4ba932b"} Apr 21 15:11:12.010588 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:12.010527 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-m4ccx" podStartSLOduration=7.594008442 podStartE2EDuration="40.010506671s" podCreationTimestamp="2026-04-21 15:10:32 +0000 UTC" firstStartedPulling="2026-04-21 15:10:34.749049472 +0000 UTC m=+3.344620338" lastFinishedPulling="2026-04-21 15:11:07.165547699 +0000 UTC m=+35.761118567" observedRunningTime="2026-04-21 15:11:10.177624822 +0000 UTC m=+38.773195697" watchObservedRunningTime="2026-04-21 15:11:12.010506671 +0000 UTC m=+40.606077546" Apr 21 15:11:12.144546 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:12.144508 
2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-p62vr" event={"ID":"01bee1dc-5579-465f-b2a5-53d7e0d89cae","Type":"ContainerStarted","Data":"441dcc9086d670dd5e51facd66ea8e2a9c5b4af9d1d12e2e1f8c64f85e237bc3"} Apr 21 15:11:12.170716 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:12.170671 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-p62vr" podStartSLOduration=32.751426737 podStartE2EDuration="37.170656885s" podCreationTimestamp="2026-04-21 15:10:35 +0000 UTC" firstStartedPulling="2026-04-21 15:11:07.143890711 +0000 UTC m=+35.739461564" lastFinishedPulling="2026-04-21 15:11:11.56312086 +0000 UTC m=+40.158691712" observedRunningTime="2026-04-21 15:11:12.17007508 +0000 UTC m=+40.765645957" watchObservedRunningTime="2026-04-21 15:11:12.170656885 +0000 UTC m=+40.766227749" Apr 21 15:11:13.689068 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:13.689025 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/578cec2c-16fd-469e-931a-b7cf421795a1-cert\") pod \"ingress-canary-dmzt8\" (UID: \"578cec2c-16fd-469e-931a-b7cf421795a1\") " pod="openshift-ingress-canary/ingress-canary-dmzt8" Apr 21 15:11:13.689068 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:13.689071 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be-metrics-tls\") pod \"dns-default-fr77k\" (UID: \"a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be\") " pod="openshift-dns/dns-default-fr77k" Apr 21 15:11:13.689558 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:13.689098 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-registry-tls\") pod \"image-registry-85fb9cc7f7-zl6lq\" (UID: 
\"da72ba61-5c83-47ce-a285-a58cd7b77246\") " pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" Apr 21 15:11:13.689558 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:13.689172 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 15:11:13.689558 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:13.689203 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 15:11:13.689558 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:13.689214 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85fb9cc7f7-zl6lq: secret "image-registry-tls" not found Apr 21 15:11:13.689558 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:13.689215 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 15:11:13.689558 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:13.689236 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/578cec2c-16fd-469e-931a-b7cf421795a1-cert podName:578cec2c-16fd-469e-931a-b7cf421795a1 nodeName:}" failed. No retries permitted until 2026-04-21 15:11:21.689221803 +0000 UTC m=+50.284792655 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/578cec2c-16fd-469e-931a-b7cf421795a1-cert") pod "ingress-canary-dmzt8" (UID: "578cec2c-16fd-469e-931a-b7cf421795a1") : secret "canary-serving-cert" not found Apr 21 15:11:13.689558 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:13.689253 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-registry-tls podName:da72ba61-5c83-47ce-a285-a58cd7b77246 nodeName:}" failed. 
No retries permitted until 2026-04-21 15:11:21.689242197 +0000 UTC m=+50.284813050 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-registry-tls") pod "image-registry-85fb9cc7f7-zl6lq" (UID: "da72ba61-5c83-47ce-a285-a58cd7b77246") : secret "image-registry-tls" not found Apr 21 15:11:13.689558 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:13.689265 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be-metrics-tls podName:a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be nodeName:}" failed. No retries permitted until 2026-04-21 15:11:21.689259452 +0000 UTC m=+50.284830305 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be-metrics-tls") pod "dns-default-fr77k" (UID: "a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be") : secret "dns-default-metrics-tls" not found Apr 21 15:11:21.747688 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:21.747649 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/578cec2c-16fd-469e-931a-b7cf421795a1-cert\") pod \"ingress-canary-dmzt8\" (UID: \"578cec2c-16fd-469e-931a-b7cf421795a1\") " pod="openshift-ingress-canary/ingress-canary-dmzt8" Apr 21 15:11:21.747688 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:21.747688 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be-metrics-tls\") pod \"dns-default-fr77k\" (UID: \"a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be\") " pod="openshift-dns/dns-default-fr77k" Apr 21 15:11:21.748241 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:21.747717 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-registry-tls\") pod \"image-registry-85fb9cc7f7-zl6lq\" (UID: \"da72ba61-5c83-47ce-a285-a58cd7b77246\") " pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" Apr 21 15:11:21.748241 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:21.747800 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 15:11:21.748241 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:21.747820 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 15:11:21.748241 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:21.747839 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 15:11:21.748241 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:21.747856 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85fb9cc7f7-zl6lq: secret "image-registry-tls" not found Apr 21 15:11:21.748241 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:21.747864 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/578cec2c-16fd-469e-931a-b7cf421795a1-cert podName:578cec2c-16fd-469e-931a-b7cf421795a1 nodeName:}" failed. No retries permitted until 2026-04-21 15:11:37.747848766 +0000 UTC m=+66.343419624 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/578cec2c-16fd-469e-931a-b7cf421795a1-cert") pod "ingress-canary-dmzt8" (UID: "578cec2c-16fd-469e-931a-b7cf421795a1") : secret "canary-serving-cert" not found Apr 21 15:11:21.748241 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:21.747878 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be-metrics-tls podName:a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be nodeName:}" failed. No retries permitted until 2026-04-21 15:11:37.747872239 +0000 UTC m=+66.343443091 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be-metrics-tls") pod "dns-default-fr77k" (UID: "a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be") : secret "dns-default-metrics-tls" not found Apr 21 15:11:21.748241 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:21.747889 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-registry-tls podName:da72ba61-5c83-47ce-a285-a58cd7b77246 nodeName:}" failed. No retries permitted until 2026-04-21 15:11:37.747883476 +0000 UTC m=+66.343454328 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-registry-tls") pod "image-registry-85fb9cc7f7-zl6lq" (UID: "da72ba61-5c83-47ce-a285-a58cd7b77246") : secret "image-registry-tls" not found Apr 21 15:11:32.122072 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:32.122040 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mbnjg" Apr 21 15:11:37.660690 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:37.660651 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a17c6c3f-25ab-4414-92a4-946230c882ea-metrics-certs\") pod \"network-metrics-daemon-96snf\" (UID: \"a17c6c3f-25ab-4414-92a4-946230c882ea\") " pod="openshift-multus/network-metrics-daemon-96snf" Apr 21 15:11:37.663608 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:37.663588 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 15:11:37.670884 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:37.670867 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 15:11:37.670961 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:37.670952 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a17c6c3f-25ab-4414-92a4-946230c882ea-metrics-certs podName:a17c6c3f-25ab-4414-92a4-946230c882ea nodeName:}" failed. No retries permitted until 2026-04-21 15:12:41.670935628 +0000 UTC m=+130.266506481 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a17c6c3f-25ab-4414-92a4-946230c882ea-metrics-certs") pod "network-metrics-daemon-96snf" (UID: "a17c6c3f-25ab-4414-92a4-946230c882ea") : secret "metrics-daemon-secret" not found Apr 21 15:11:37.761040 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:37.761011 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/578cec2c-16fd-469e-931a-b7cf421795a1-cert\") pod \"ingress-canary-dmzt8\" (UID: \"578cec2c-16fd-469e-931a-b7cf421795a1\") " pod="openshift-ingress-canary/ingress-canary-dmzt8" Apr 21 15:11:37.761170 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:37.761047 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be-metrics-tls\") pod \"dns-default-fr77k\" (UID: \"a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be\") " pod="openshift-dns/dns-default-fr77k" Apr 21 15:11:37.761170 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:37.761074 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-registry-tls\") pod \"image-registry-85fb9cc7f7-zl6lq\" (UID: \"da72ba61-5c83-47ce-a285-a58cd7b77246\") " pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" Apr 21 15:11:37.761170 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:37.761115 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxf5g\" (UniqueName: \"kubernetes.io/projected/9562db30-abde-4e93-96e5-77429f548f83-kube-api-access-zxf5g\") pod \"network-check-target-hfcw6\" (UID: \"9562db30-abde-4e93-96e5-77429f548f83\") " pod="openshift-network-diagnostics/network-check-target-hfcw6" Apr 21 15:11:37.761170 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:37.761140 2575 secret.go:189] 
Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 15:11:37.761354 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:37.761194 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be-metrics-tls podName:a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be nodeName:}" failed. No retries permitted until 2026-04-21 15:12:09.761178867 +0000 UTC m=+98.356749724 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be-metrics-tls") pod "dns-default-fr77k" (UID: "a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be") : secret "dns-default-metrics-tls" not found Apr 21 15:11:37.761354 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:37.761195 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 15:11:37.761354 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:37.761209 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85fb9cc7f7-zl6lq: secret "image-registry-tls" not found Apr 21 15:11:37.761354 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:37.761141 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 15:11:37.761354 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:37.761259 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-registry-tls podName:da72ba61-5c83-47ce-a285-a58cd7b77246 nodeName:}" failed. No retries permitted until 2026-04-21 15:12:09.761246061 +0000 UTC m=+98.356816919 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-registry-tls") pod "image-registry-85fb9cc7f7-zl6lq" (UID: "da72ba61-5c83-47ce-a285-a58cd7b77246") : secret "image-registry-tls" not found Apr 21 15:11:37.761354 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:11:37.761300 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/578cec2c-16fd-469e-931a-b7cf421795a1-cert podName:578cec2c-16fd-469e-931a-b7cf421795a1 nodeName:}" failed. No retries permitted until 2026-04-21 15:12:09.761278541 +0000 UTC m=+98.356849394 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/578cec2c-16fd-469e-931a-b7cf421795a1-cert") pod "ingress-canary-dmzt8" (UID: "578cec2c-16fd-469e-931a-b7cf421795a1") : secret "canary-serving-cert" not found Apr 21 15:11:37.763982 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:37.763966 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 15:11:37.773698 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:37.773679 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 15:11:37.786429 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:37.786405 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxf5g\" (UniqueName: \"kubernetes.io/projected/9562db30-abde-4e93-96e5-77429f548f83-kube-api-access-zxf5g\") pod \"network-check-target-hfcw6\" (UID: \"9562db30-abde-4e93-96e5-77429f548f83\") " pod="openshift-network-diagnostics/network-check-target-hfcw6" Apr 21 15:11:37.803579 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:37.803557 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-4fs5d\"" Apr 21 
15:11:37.811828 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:37.811799 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-hfcw6" Apr 21 15:11:37.925637 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:37.925565 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-hfcw6"] Apr 21 15:11:37.928678 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:11:37.928653 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9562db30_abde_4e93_96e5_77429f548f83.slice/crio-a432eb67072bee58cdc997447759c9eb32f88f99f15c66f9cf110158a0ba2ba9 WatchSource:0}: Error finding container a432eb67072bee58cdc997447759c9eb32f88f99f15c66f9cf110158a0ba2ba9: Status 404 returned error can't find the container with id a432eb67072bee58cdc997447759c9eb32f88f99f15c66f9cf110158a0ba2ba9 Apr 21 15:11:38.193981 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:38.193895 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-hfcw6" event={"ID":"9562db30-abde-4e93-96e5-77429f548f83","Type":"ContainerStarted","Data":"a432eb67072bee58cdc997447759c9eb32f88f99f15c66f9cf110158a0ba2ba9"} Apr 21 15:11:42.204899 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:42.204858 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-hfcw6" event={"ID":"9562db30-abde-4e93-96e5-77429f548f83","Type":"ContainerStarted","Data":"8fd4982bc64e85220d4dbba0bde8dc448a43e28b3719d1152f36fb19191f3b7d"} Apr 21 15:11:42.205363 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:11:42.205010 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-hfcw6" Apr 21 15:12:09.788975 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:09.788932 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/578cec2c-16fd-469e-931a-b7cf421795a1-cert\") pod \"ingress-canary-dmzt8\" (UID: \"578cec2c-16fd-469e-931a-b7cf421795a1\") " pod="openshift-ingress-canary/ingress-canary-dmzt8" Apr 21 15:12:09.788975 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:09.788979 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be-metrics-tls\") pod \"dns-default-fr77k\" (UID: \"a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be\") " pod="openshift-dns/dns-default-fr77k" Apr 21 15:12:09.789490 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:09.789008 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-registry-tls\") pod \"image-registry-85fb9cc7f7-zl6lq\" (UID: \"da72ba61-5c83-47ce-a285-a58cd7b77246\") " pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" Apr 21 15:12:09.789490 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:12:09.789085 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 15:12:09.789490 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:12:09.789104 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 15:12:09.789490 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:12:09.789115 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-85fb9cc7f7-zl6lq: secret "image-registry-tls" not found Apr 21 15:12:09.789490 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:12:09.789134 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 
15:12:09.789490 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:12:09.789153 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/578cec2c-16fd-469e-931a-b7cf421795a1-cert podName:578cec2c-16fd-469e-931a-b7cf421795a1 nodeName:}" failed. No retries permitted until 2026-04-21 15:13:13.789137928 +0000 UTC m=+162.384708782 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/578cec2c-16fd-469e-931a-b7cf421795a1-cert") pod "ingress-canary-dmzt8" (UID: "578cec2c-16fd-469e-931a-b7cf421795a1") : secret "canary-serving-cert" not found Apr 21 15:12:09.789490 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:12:09.789167 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-registry-tls podName:da72ba61-5c83-47ce-a285-a58cd7b77246 nodeName:}" failed. No retries permitted until 2026-04-21 15:13:13.789161342 +0000 UTC m=+162.384732195 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-registry-tls") pod "image-registry-85fb9cc7f7-zl6lq" (UID: "da72ba61-5c83-47ce-a285-a58cd7b77246") : secret "image-registry-tls" not found Apr 21 15:12:09.789490 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:12:09.789204 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be-metrics-tls podName:a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be nodeName:}" failed. No retries permitted until 2026-04-21 15:13:13.789184634 +0000 UTC m=+162.384755494 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be-metrics-tls") pod "dns-default-fr77k" (UID: "a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be") : secret "dns-default-metrics-tls" not found Apr 21 15:12:13.210024 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:13.209995 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-hfcw6" Apr 21 15:12:13.237115 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:13.237061 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-hfcw6" podStartSLOduration=97.929562419 podStartE2EDuration="1m41.237049192s" podCreationTimestamp="2026-04-21 15:10:32 +0000 UTC" firstStartedPulling="2026-04-21 15:11:37.9304883 +0000 UTC m=+66.526059154" lastFinishedPulling="2026-04-21 15:11:41.237975058 +0000 UTC m=+69.833545927" observedRunningTime="2026-04-21 15:11:42.233751779 +0000 UTC m=+70.829322654" watchObservedRunningTime="2026-04-21 15:12:13.237049192 +0000 UTC m=+101.832620067" Apr 21 15:12:39.556482 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.556446 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-7q6q5"] Apr 21 15:12:39.559385 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.559350 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7q6q5" Apr 21 15:12:39.565656 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.565639 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 21 15:12:39.568008 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.567977 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 21 15:12:39.568008 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.567994 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 21 15:12:39.568159 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.568036 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-jjg2x\"" Apr 21 15:12:39.568159 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.568057 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 21 15:12:39.582282 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.582261 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-k5mdv"] Apr 21 15:12:39.586408 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.586205 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-7q6q5"] Apr 21 15:12:39.586408 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.586345 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-k5mdv" Apr 21 15:12:39.588997 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.588975 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 21 15:12:39.589222 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.589096 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-k88sj\"" Apr 21 15:12:39.589727 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.589706 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 21 15:12:39.600348 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.600328 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62tgx\" (UniqueName: \"kubernetes.io/projected/7af4b8c2-dc3e-4dbc-8156-428c4db62671-kube-api-access-62tgx\") pod \"cluster-monitoring-operator-75587bd455-7q6q5\" (UID: \"7af4b8c2-dc3e-4dbc-8156-428c4db62671\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7q6q5" Apr 21 15:12:39.600443 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.600429 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8d25\" (UniqueName: \"kubernetes.io/projected/00b7c8c2-18f1-4ec5-a6f8-97f9dd462a77-kube-api-access-d8d25\") pod \"volume-data-source-validator-7c6cbb6c87-k5mdv\" (UID: \"00b7c8c2-18f1-4ec5-a6f8-97f9dd462a77\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-k5mdv" Apr 21 15:12:39.600486 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.600447 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7af4b8c2-dc3e-4dbc-8156-428c4db62671-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7q6q5\" (UID: \"7af4b8c2-dc3e-4dbc-8156-428c4db62671\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7q6q5" Apr 21 15:12:39.600486 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.600466 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7af4b8c2-dc3e-4dbc-8156-428c4db62671-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-7q6q5\" (UID: \"7af4b8c2-dc3e-4dbc-8156-428c4db62671\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7q6q5" Apr 21 15:12:39.638153 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.638131 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-k5mdv"] Apr 21 15:12:39.666746 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.666720 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-bqs6l"] Apr 21 15:12:39.669509 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.669484 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-bqs6l" Apr 21 15:12:39.676552 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:12:39.676527 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:ip-10-0-131-11.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-console-operator\": no relationship found between node 'ip-10-0-131-11.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" type="*v1.ConfigMap" Apr 21 15:12:39.677079 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:12:39.677049 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"trusted-ca\" is forbidden: User \"system:node:ip-10-0-131-11.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-console-operator\": no relationship found between node 'ip-10-0-131-11.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" type="*v1.ConfigMap" Apr 21 15:12:39.677188 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:12:39.677090 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"console-operator-dockercfg-ksd9p\" is forbidden: User \"system:node:ip-10-0-131-11.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-console-operator\": no relationship found between node 'ip-10-0-131-11.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-ksd9p\"" type="*v1.Secret" Apr 21 15:12:39.678048 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:12:39.678025 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"console-operator-config\" is forbidden: User 
\"system:node:ip-10-0-131-11.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-console-operator\": no relationship found between node 'ip-10-0-131-11.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" type="*v1.ConfigMap" Apr 21 15:12:39.678149 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:12:39.678092 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ip-10-0-131-11.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-console-operator\": no relationship found between node 'ip-10-0-131-11.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" type="*v1.ConfigMap" Apr 21 15:12:39.678207 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.678184 2575 status_manager.go:895] "Failed to get status for pod" podUID="2e6ce7e1-9053-481a-924e-dcb6e2859d45" pod="openshift-console-operator/console-operator-9d4b6777b-bqs6l" err="pods \"console-operator-9d4b6777b-bqs6l\" is forbidden: User \"system:node:ip-10-0-131-11.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-console-operator\": no relationship found between node 'ip-10-0-131-11.ec2.internal' and this object" Apr 21 15:12:39.686866 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:12:39.686841 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:ip-10-0-131-11.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-console-operator\": no relationship found between node 'ip-10-0-131-11.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" type="*v1.Secret" Apr 21 
15:12:39.700762 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.700741 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7af4b8c2-dc3e-4dbc-8156-428c4db62671-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-7q6q5\" (UID: \"7af4b8c2-dc3e-4dbc-8156-428c4db62671\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7q6q5"
Apr 21 15:12:39.700882 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.700777 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e6ce7e1-9053-481a-924e-dcb6e2859d45-config\") pod \"console-operator-9d4b6777b-bqs6l\" (UID: \"2e6ce7e1-9053-481a-924e-dcb6e2859d45\") " pod="openshift-console-operator/console-operator-9d4b6777b-bqs6l"
Apr 21 15:12:39.700882 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.700809 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2e6ce7e1-9053-481a-924e-dcb6e2859d45-trusted-ca\") pod \"console-operator-9d4b6777b-bqs6l\" (UID: \"2e6ce7e1-9053-481a-924e-dcb6e2859d45\") " pod="openshift-console-operator/console-operator-9d4b6777b-bqs6l"
Apr 21 15:12:39.700991 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.700876 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-62tgx\" (UniqueName: \"kubernetes.io/projected/7af4b8c2-dc3e-4dbc-8156-428c4db62671-kube-api-access-62tgx\") pod \"cluster-monitoring-operator-75587bd455-7q6q5\" (UID: \"7af4b8c2-dc3e-4dbc-8156-428c4db62671\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7q6q5"
Apr 21 15:12:39.700991 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.700911 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rckq\" (UniqueName: \"kubernetes.io/projected/2e6ce7e1-9053-481a-924e-dcb6e2859d45-kube-api-access-5rckq\") pod \"console-operator-9d4b6777b-bqs6l\" (UID: \"2e6ce7e1-9053-481a-924e-dcb6e2859d45\") " pod="openshift-console-operator/console-operator-9d4b6777b-bqs6l"
Apr 21 15:12:39.700991 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.700947 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e6ce7e1-9053-481a-924e-dcb6e2859d45-serving-cert\") pod \"console-operator-9d4b6777b-bqs6l\" (UID: \"2e6ce7e1-9053-481a-924e-dcb6e2859d45\") " pod="openshift-console-operator/console-operator-9d4b6777b-bqs6l"
Apr 21 15:12:39.701149 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.701040 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d8d25\" (UniqueName: \"kubernetes.io/projected/00b7c8c2-18f1-4ec5-a6f8-97f9dd462a77-kube-api-access-d8d25\") pod \"volume-data-source-validator-7c6cbb6c87-k5mdv\" (UID: \"00b7c8c2-18f1-4ec5-a6f8-97f9dd462a77\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-k5mdv"
Apr 21 15:12:39.701149 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.701062 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7af4b8c2-dc3e-4dbc-8156-428c4db62671-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7q6q5\" (UID: \"7af4b8c2-dc3e-4dbc-8156-428c4db62671\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7q6q5"
Apr 21 15:12:39.701149 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:12:39.701143 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 21 15:12:39.701295 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:12:39.701192 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7af4b8c2-dc3e-4dbc-8156-428c4db62671-cluster-monitoring-operator-tls podName:7af4b8c2-dc3e-4dbc-8156-428c4db62671 nodeName:}" failed. No retries permitted until 2026-04-21 15:12:40.201178389 +0000 UTC m=+128.796749242 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7af4b8c2-dc3e-4dbc-8156-428c4db62671-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-7q6q5" (UID: "7af4b8c2-dc3e-4dbc-8156-428c4db62671") : secret "cluster-monitoring-operator-tls" not found
Apr 21 15:12:39.701540 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.701521 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7af4b8c2-dc3e-4dbc-8156-428c4db62671-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-7q6q5\" (UID: \"7af4b8c2-dc3e-4dbc-8156-428c4db62671\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7q6q5"
Apr 21 15:12:39.714038 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.714016 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-bqs6l"]
Apr 21 15:12:39.721277 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.721255 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-62tgx\" (UniqueName: \"kubernetes.io/projected/7af4b8c2-dc3e-4dbc-8156-428c4db62671-kube-api-access-62tgx\") pod \"cluster-monitoring-operator-75587bd455-7q6q5\" (UID: \"7af4b8c2-dc3e-4dbc-8156-428c4db62671\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7q6q5"
Apr 21 15:12:39.732626 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.732604 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8d25\" (UniqueName: \"kubernetes.io/projected/00b7c8c2-18f1-4ec5-a6f8-97f9dd462a77-kube-api-access-d8d25\") pod \"volume-data-source-validator-7c6cbb6c87-k5mdv\" (UID: \"00b7c8c2-18f1-4ec5-a6f8-97f9dd462a77\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-k5mdv"
Apr 21 15:12:39.801856 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.801820 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5rckq\" (UniqueName: \"kubernetes.io/projected/2e6ce7e1-9053-481a-924e-dcb6e2859d45-kube-api-access-5rckq\") pod \"console-operator-9d4b6777b-bqs6l\" (UID: \"2e6ce7e1-9053-481a-924e-dcb6e2859d45\") " pod="openshift-console-operator/console-operator-9d4b6777b-bqs6l"
Apr 21 15:12:39.802044 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.801863 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e6ce7e1-9053-481a-924e-dcb6e2859d45-serving-cert\") pod \"console-operator-9d4b6777b-bqs6l\" (UID: \"2e6ce7e1-9053-481a-924e-dcb6e2859d45\") " pod="openshift-console-operator/console-operator-9d4b6777b-bqs6l"
Apr 21 15:12:39.802044 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.801927 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e6ce7e1-9053-481a-924e-dcb6e2859d45-config\") pod \"console-operator-9d4b6777b-bqs6l\" (UID: \"2e6ce7e1-9053-481a-924e-dcb6e2859d45\") " pod="openshift-console-operator/console-operator-9d4b6777b-bqs6l"
Apr 21 15:12:39.802044 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.801949 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2e6ce7e1-9053-481a-924e-dcb6e2859d45-trusted-ca\") pod \"console-operator-9d4b6777b-bqs6l\" (UID: \"2e6ce7e1-9053-481a-924e-dcb6e2859d45\") " pod="openshift-console-operator/console-operator-9d4b6777b-bqs6l"
Apr 21 15:12:39.849222 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.849135 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wcz2b"]
Apr 21 15:12:39.851988 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.851969 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r4nrx"]
Apr 21 15:12:39.852152 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.852133 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wcz2b"
Apr 21 15:12:39.854470 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.854451 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r4nrx"
Apr 21 15:12:39.856987 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.856962 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 21 15:12:39.857214 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.857192 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 21 15:12:39.857389 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.857351 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 21 15:12:39.857553 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.857537 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 21 15:12:39.857642 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.857596 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 21 15:12:39.857705 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.857542 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-5twlz\""
Apr 21 15:12:39.857835 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.857806 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 21 15:12:39.857900 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.857832 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-tljw5\""
Apr 21 15:12:39.857954 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.857896 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 21 15:12:39.858127 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.858112 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 21 15:12:39.881023 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.881002 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r4nrx"]
Apr 21 15:12:39.881759 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.881740 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wcz2b"]
Apr 21 15:12:39.894968 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.894944 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-k5mdv"
Apr 21 15:12:39.902996 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.902970 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4gvz\" (UniqueName: \"kubernetes.io/projected/b4f29a84-c932-46e5-8d58-0e2fb5ab05f8-kube-api-access-m4gvz\") pod \"kube-storage-version-migrator-operator-6769c5d45-wcz2b\" (UID: \"b4f29a84-c932-46e5-8d58-0e2fb5ab05f8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wcz2b"
Apr 21 15:12:39.903150 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.903128 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4f29a84-c932-46e5-8d58-0e2fb5ab05f8-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-wcz2b\" (UID: \"b4f29a84-c932-46e5-8d58-0e2fb5ab05f8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wcz2b"
Apr 21 15:12:39.903193 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.903167 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mxg8\" (UniqueName: \"kubernetes.io/projected/54d54903-009c-4bc7-97cd-f27fea133502-kube-api-access-8mxg8\") pod \"service-ca-operator-d6fc45fc5-r4nrx\" (UID: \"54d54903-009c-4bc7-97cd-f27fea133502\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r4nrx"
Apr 21 15:12:39.903265 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.903243 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54d54903-009c-4bc7-97cd-f27fea133502-config\") pod \"service-ca-operator-d6fc45fc5-r4nrx\" (UID: \"54d54903-009c-4bc7-97cd-f27fea133502\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r4nrx"
Apr 21 15:12:39.903307 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.903292 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54d54903-009c-4bc7-97cd-f27fea133502-serving-cert\") pod \"service-ca-operator-d6fc45fc5-r4nrx\" (UID: \"54d54903-009c-4bc7-97cd-f27fea133502\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r4nrx"
Apr 21 15:12:39.903362 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:39.903347 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4f29a84-c932-46e5-8d58-0e2fb5ab05f8-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-wcz2b\" (UID: \"b4f29a84-c932-46e5-8d58-0e2fb5ab05f8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wcz2b"
Apr 21 15:12:40.003703 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:40.003667 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m4gvz\" (UniqueName: \"kubernetes.io/projected/b4f29a84-c932-46e5-8d58-0e2fb5ab05f8-kube-api-access-m4gvz\") pod \"kube-storage-version-migrator-operator-6769c5d45-wcz2b\" (UID: \"b4f29a84-c932-46e5-8d58-0e2fb5ab05f8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wcz2b"
Apr 21 15:12:40.003862 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:40.003782 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4f29a84-c932-46e5-8d58-0e2fb5ab05f8-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-wcz2b\" (UID: \"b4f29a84-c932-46e5-8d58-0e2fb5ab05f8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wcz2b"
Apr 21 15:12:40.003862 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:40.003811 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8mxg8\" (UniqueName: \"kubernetes.io/projected/54d54903-009c-4bc7-97cd-f27fea133502-kube-api-access-8mxg8\") pod \"service-ca-operator-d6fc45fc5-r4nrx\" (UID: \"54d54903-009c-4bc7-97cd-f27fea133502\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r4nrx"
Apr 21 15:12:40.003862 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:40.003848 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54d54903-009c-4bc7-97cd-f27fea133502-config\") pod \"service-ca-operator-d6fc45fc5-r4nrx\" (UID: \"54d54903-009c-4bc7-97cd-f27fea133502\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r4nrx"
Apr 21 15:12:40.004042 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:40.004020 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54d54903-009c-4bc7-97cd-f27fea133502-serving-cert\") pod \"service-ca-operator-d6fc45fc5-r4nrx\" (UID: \"54d54903-009c-4bc7-97cd-f27fea133502\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r4nrx"
Apr 21 15:12:40.004099 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:40.004078 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4f29a84-c932-46e5-8d58-0e2fb5ab05f8-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-wcz2b\" (UID: \"b4f29a84-c932-46e5-8d58-0e2fb5ab05f8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wcz2b"
Apr 21 15:12:40.004550 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:40.004528 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4f29a84-c932-46e5-8d58-0e2fb5ab05f8-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-wcz2b\" (UID: \"b4f29a84-c932-46e5-8d58-0e2fb5ab05f8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wcz2b"
Apr 21 15:12:40.004954 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:40.004929 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54d54903-009c-4bc7-97cd-f27fea133502-config\") pod \"service-ca-operator-d6fc45fc5-r4nrx\" (UID: \"54d54903-009c-4bc7-97cd-f27fea133502\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r4nrx"
Apr 21 15:12:40.006142 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:40.006123 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4f29a84-c932-46e5-8d58-0e2fb5ab05f8-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-wcz2b\" (UID: \"b4f29a84-c932-46e5-8d58-0e2fb5ab05f8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wcz2b"
Apr 21 15:12:40.006583 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:40.006563 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54d54903-009c-4bc7-97cd-f27fea133502-serving-cert\") pod \"service-ca-operator-d6fc45fc5-r4nrx\" (UID: \"54d54903-009c-4bc7-97cd-f27fea133502\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r4nrx"
Apr 21 15:12:40.020116 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:40.020091 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mxg8\" (UniqueName: \"kubernetes.io/projected/54d54903-009c-4bc7-97cd-f27fea133502-kube-api-access-8mxg8\") pod \"service-ca-operator-d6fc45fc5-r4nrx\" (UID: \"54d54903-009c-4bc7-97cd-f27fea133502\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r4nrx"
Apr 21 15:12:40.020992 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:40.020974 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4gvz\" (UniqueName: \"kubernetes.io/projected/b4f29a84-c932-46e5-8d58-0e2fb5ab05f8-kube-api-access-m4gvz\") pod \"kube-storage-version-migrator-operator-6769c5d45-wcz2b\" (UID: \"b4f29a84-c932-46e5-8d58-0e2fb5ab05f8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wcz2b"
Apr 21 15:12:40.031648 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:40.031626 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-k5mdv"]
Apr 21 15:12:40.035315 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:12:40.035293 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00b7c8c2_18f1_4ec5_a6f8_97f9dd462a77.slice/crio-39d73206c5e50f2159b3a98c9d7b38bd9ce31edd2ac114277ee9d89198c72ce4 WatchSource:0}: Error finding container 39d73206c5e50f2159b3a98c9d7b38bd9ce31edd2ac114277ee9d89198c72ce4: Status 404 returned error can't find the container with id 39d73206c5e50f2159b3a98c9d7b38bd9ce31edd2ac114277ee9d89198c72ce4
Apr 21 15:12:40.162533 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:40.162445 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wcz2b"
Apr 21 15:12:40.167288 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:40.167268 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r4nrx"
Apr 21 15:12:40.205587 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:40.205544 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7af4b8c2-dc3e-4dbc-8156-428c4db62671-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7q6q5\" (UID: \"7af4b8c2-dc3e-4dbc-8156-428c4db62671\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7q6q5"
Apr 21 15:12:40.205786 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:12:40.205666 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 21 15:12:40.205786 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:12:40.205736 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7af4b8c2-dc3e-4dbc-8156-428c4db62671-cluster-monitoring-operator-tls podName:7af4b8c2-dc3e-4dbc-8156-428c4db62671 nodeName:}" failed. No retries permitted until 2026-04-21 15:12:41.205716733 +0000 UTC m=+129.801287585 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7af4b8c2-dc3e-4dbc-8156-428c4db62671-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-7q6q5" (UID: "7af4b8c2-dc3e-4dbc-8156-428c4db62671") : secret "cluster-monitoring-operator-tls" not found
Apr 21 15:12:40.287100 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:40.287071 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wcz2b"]
Apr 21 15:12:40.289989 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:12:40.289960 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4f29a84_c932_46e5_8d58_0e2fb5ab05f8.slice/crio-712d95a1cfd57f3a666922caee0ee5a03cd18f90aea91b858d5c9eef443556ce WatchSource:0}: Error finding container 712d95a1cfd57f3a666922caee0ee5a03cd18f90aea91b858d5c9eef443556ce: Status 404 returned error can't find the container with id 712d95a1cfd57f3a666922caee0ee5a03cd18f90aea91b858d5c9eef443556ce
Apr 21 15:12:40.312953 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:40.312930 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r4nrx"]
Apr 21 15:12:40.313235 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:40.313212 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-k5mdv" event={"ID":"00b7c8c2-18f1-4ec5-a6f8-97f9dd462a77","Type":"ContainerStarted","Data":"39d73206c5e50f2159b3a98c9d7b38bd9ce31edd2ac114277ee9d89198c72ce4"}
Apr 21 15:12:40.314222 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:40.314197 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wcz2b" event={"ID":"b4f29a84-c932-46e5-8d58-0e2fb5ab05f8","Type":"ContainerStarted","Data":"712d95a1cfd57f3a666922caee0ee5a03cd18f90aea91b858d5c9eef443556ce"}
Apr 21 15:12:40.316183 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:12:40.316163 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54d54903_009c_4bc7_97cd_f27fea133502.slice/crio-fcd3dba174b40b76274d42559acbc4bbe4e78494031532d21856af17e521ae3c WatchSource:0}: Error finding container fcd3dba174b40b76274d42559acbc4bbe4e78494031532d21856af17e521ae3c: Status 404 returned error can't find the container with id fcd3dba174b40b76274d42559acbc4bbe4e78494031532d21856af17e521ae3c
Apr 21 15:12:40.697620 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:40.697587 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-ksd9p\""
Apr 21 15:12:40.779306 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:40.779275 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 21 15:12:40.802349 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:12:40.802309 2575 configmap.go:193] Couldn't get configMap openshift-console-operator/console-operator-config: failed to sync configmap cache: timed out waiting for the condition
Apr 21 15:12:40.802525 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:12:40.802412 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2e6ce7e1-9053-481a-924e-dcb6e2859d45-config podName:2e6ce7e1-9053-481a-924e-dcb6e2859d45 nodeName:}" failed. No retries permitted until 2026-04-21 15:12:41.302391065 +0000 UTC m=+129.897961919 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/2e6ce7e1-9053-481a-924e-dcb6e2859d45-config") pod "console-operator-9d4b6777b-bqs6l" (UID: "2e6ce7e1-9053-481a-924e-dcb6e2859d45") : failed to sync configmap cache: timed out waiting for the condition
Apr 21 15:12:40.802671 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:12:40.802645 2575 secret.go:189] Couldn't get secret openshift-console-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition
Apr 21 15:12:40.802671 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:12:40.802658 2575 configmap.go:193] Couldn't get configMap openshift-console-operator/trusted-ca: failed to sync configmap cache: timed out waiting for the condition
Apr 21 15:12:40.802823 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:12:40.802729 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2e6ce7e1-9053-481a-924e-dcb6e2859d45-trusted-ca podName:2e6ce7e1-9053-481a-924e-dcb6e2859d45 nodeName:}" failed. No retries permitted until 2026-04-21 15:12:41.302710075 +0000 UTC m=+129.898280948 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/2e6ce7e1-9053-481a-924e-dcb6e2859d45-trusted-ca") pod "console-operator-9d4b6777b-bqs6l" (UID: "2e6ce7e1-9053-481a-924e-dcb6e2859d45") : failed to sync configmap cache: timed out waiting for the condition
Apr 21 15:12:40.802866 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:12:40.802828 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2e6ce7e1-9053-481a-924e-dcb6e2859d45-serving-cert podName:2e6ce7e1-9053-481a-924e-dcb6e2859d45 nodeName:}" failed. No retries permitted until 2026-04-21 15:12:41.302809512 +0000 UTC m=+129.898380373 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/2e6ce7e1-9053-481a-924e-dcb6e2859d45-serving-cert") pod "console-operator-9d4b6777b-bqs6l" (UID: "2e6ce7e1-9053-481a-924e-dcb6e2859d45") : failed to sync secret cache: timed out waiting for the condition
Apr 21 15:12:40.815461 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:12:40.815434 2575 projected.go:289] Couldn't get configMap openshift-console-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Apr 21 15:12:40.815584 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:12:40.815473 2575 projected.go:194] Error preparing data for projected volume kube-api-access-5rckq for pod openshift-console-operator/console-operator-9d4b6777b-bqs6l: failed to sync configmap cache: timed out waiting for the condition
Apr 21 15:12:40.815584 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:12:40.815525 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2e6ce7e1-9053-481a-924e-dcb6e2859d45-kube-api-access-5rckq podName:2e6ce7e1-9053-481a-924e-dcb6e2859d45 nodeName:}" failed. No retries permitted until 2026-04-21 15:12:41.315507838 +0000 UTC m=+129.911078710 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-5rckq" (UniqueName: "kubernetes.io/projected/2e6ce7e1-9053-481a-924e-dcb6e2859d45-kube-api-access-5rckq") pod "console-operator-9d4b6777b-bqs6l" (UID: "2e6ce7e1-9053-481a-924e-dcb6e2859d45") : failed to sync configmap cache: timed out waiting for the condition
Apr 21 15:12:40.853167 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:40.853137 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 21 15:12:40.956912 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:40.956883 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 21 15:12:41.030792 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:41.030568 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 21 15:12:41.215699 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:41.215600 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7af4b8c2-dc3e-4dbc-8156-428c4db62671-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7q6q5\" (UID: \"7af4b8c2-dc3e-4dbc-8156-428c4db62671\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7q6q5"
Apr 21 15:12:41.215875 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:12:41.215856 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 21 15:12:41.215950 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:12:41.215935 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7af4b8c2-dc3e-4dbc-8156-428c4db62671-cluster-monitoring-operator-tls podName:7af4b8c2-dc3e-4dbc-8156-428c4db62671 nodeName:}" failed. No retries permitted until 2026-04-21 15:12:43.215913618 +0000 UTC m=+131.811484470 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7af4b8c2-dc3e-4dbc-8156-428c4db62671-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-7q6q5" (UID: "7af4b8c2-dc3e-4dbc-8156-428c4db62671") : secret "cluster-monitoring-operator-tls" not found
Apr 21 15:12:41.264391 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:41.264344 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 21 15:12:41.316650 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:41.316298 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5rckq\" (UniqueName: \"kubernetes.io/projected/2e6ce7e1-9053-481a-924e-dcb6e2859d45-kube-api-access-5rckq\") pod \"console-operator-9d4b6777b-bqs6l\" (UID: \"2e6ce7e1-9053-481a-924e-dcb6e2859d45\") " pod="openshift-console-operator/console-operator-9d4b6777b-bqs6l"
Apr 21 15:12:41.316650 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:41.316354 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e6ce7e1-9053-481a-924e-dcb6e2859d45-serving-cert\") pod \"console-operator-9d4b6777b-bqs6l\" (UID: \"2e6ce7e1-9053-481a-924e-dcb6e2859d45\") " pod="openshift-console-operator/console-operator-9d4b6777b-bqs6l"
Apr 21 15:12:41.316650 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:41.316463 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e6ce7e1-9053-481a-924e-dcb6e2859d45-config\") pod \"console-operator-9d4b6777b-bqs6l\" (UID: \"2e6ce7e1-9053-481a-924e-dcb6e2859d45\") " pod="openshift-console-operator/console-operator-9d4b6777b-bqs6l"
Apr 21 15:12:41.316650 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:41.316497 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2e6ce7e1-9053-481a-924e-dcb6e2859d45-trusted-ca\") pod \"console-operator-9d4b6777b-bqs6l\" (UID: \"2e6ce7e1-9053-481a-924e-dcb6e2859d45\") " pod="openshift-console-operator/console-operator-9d4b6777b-bqs6l"
Apr 21 15:12:41.318463 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:41.317759 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2e6ce7e1-9053-481a-924e-dcb6e2859d45-trusted-ca\") pod \"console-operator-9d4b6777b-bqs6l\" (UID: \"2e6ce7e1-9053-481a-924e-dcb6e2859d45\") " pod="openshift-console-operator/console-operator-9d4b6777b-bqs6l"
Apr 21 15:12:41.318463 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:41.318263 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e6ce7e1-9053-481a-924e-dcb6e2859d45-config\") pod \"console-operator-9d4b6777b-bqs6l\" (UID: \"2e6ce7e1-9053-481a-924e-dcb6e2859d45\") " pod="openshift-console-operator/console-operator-9d4b6777b-bqs6l"
Apr 21 15:12:41.319507 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:41.319473 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r4nrx" event={"ID":"54d54903-009c-4bc7-97cd-f27fea133502","Type":"ContainerStarted","Data":"fcd3dba174b40b76274d42559acbc4bbe4e78494031532d21856af17e521ae3c"}
Apr 21 15:12:41.321183 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:41.321157 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2e6ce7e1-9053-481a-924e-dcb6e2859d45-serving-cert\") pod \"console-operator-9d4b6777b-bqs6l\" (UID: \"2e6ce7e1-9053-481a-924e-dcb6e2859d45\") " pod="openshift-console-operator/console-operator-9d4b6777b-bqs6l"
Apr 21 15:12:41.321678 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:41.321620 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rckq\" (UniqueName: \"kubernetes.io/projected/2e6ce7e1-9053-481a-924e-dcb6e2859d45-kube-api-access-5rckq\") pod \"console-operator-9d4b6777b-bqs6l\" (UID: \"2e6ce7e1-9053-481a-924e-dcb6e2859d45\") " pod="openshift-console-operator/console-operator-9d4b6777b-bqs6l"
Apr 21 15:12:41.478748 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:41.478665 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-bqs6l"
Apr 21 15:12:41.630889 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:41.630835 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-bqs6l"]
Apr 21 15:12:41.634578 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:12:41.634542 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e6ce7e1_9053_481a_924e_dcb6e2859d45.slice/crio-9c77034ba423ff8454509c9f5f9fa71001b95a3331b3625c216addeb39ff4f95 WatchSource:0}: Error finding container 9c77034ba423ff8454509c9f5f9fa71001b95a3331b3625c216addeb39ff4f95: Status 404 returned error can't find the container with id 9c77034ba423ff8454509c9f5f9fa71001b95a3331b3625c216addeb39ff4f95
Apr 21 15:12:41.720666 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:41.720628 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a17c6c3f-25ab-4414-92a4-946230c882ea-metrics-certs\") pod \"network-metrics-daemon-96snf\" (UID: \"a17c6c3f-25ab-4414-92a4-946230c882ea\") " pod="openshift-multus/network-metrics-daemon-96snf"
Apr 21 15:12:41.721030 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:12:41.720734 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 21 15:12:41.721030 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:12:41.720810 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a17c6c3f-25ab-4414-92a4-946230c882ea-metrics-certs podName:a17c6c3f-25ab-4414-92a4-946230c882ea nodeName:}" failed. No retries permitted until 2026-04-21 15:14:43.720787491 +0000 UTC m=+252.316358383 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a17c6c3f-25ab-4414-92a4-946230c882ea-metrics-certs") pod "network-metrics-daemon-96snf" (UID: "a17c6c3f-25ab-4414-92a4-946230c882ea") : secret "metrics-daemon-secret" not found
Apr 21 15:12:42.322788 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:42.322750 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-k5mdv" event={"ID":"00b7c8c2-18f1-4ec5-a6f8-97f9dd462a77","Type":"ContainerStarted","Data":"3c213ea45ab899c61ccc290241823a7146d4b581c4edff59366dfa9e2278dc73"}
Apr 21 15:12:42.323862 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:42.323826 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-bqs6l" event={"ID":"2e6ce7e1-9053-481a-924e-dcb6e2859d45","Type":"ContainerStarted","Data":"9c77034ba423ff8454509c9f5f9fa71001b95a3331b3625c216addeb39ff4f95"}
Apr 21 15:12:42.350260 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:42.350215 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-k5mdv" podStartSLOduration=1.852181112 podStartE2EDuration="3.350202964s" podCreationTimestamp="2026-04-21 15:12:39 +0000 UTC" firstStartedPulling="2026-04-21 15:12:40.036995757 +0000 UTC m=+128.632566610" lastFinishedPulling="2026-04-21 15:12:41.535017599 +0000 UTC m=+130.130588462" observedRunningTime="2026-04-21 15:12:42.348643265 +0000 UTC m=+130.944214142" watchObservedRunningTime="2026-04-21 15:12:42.350202964 +0000 UTC m=+130.945773838"
Apr 21 15:12:43.232776 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:43.232736 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7af4b8c2-dc3e-4dbc-8156-428c4db62671-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7q6q5\" (UID: \"7af4b8c2-dc3e-4dbc-8156-428c4db62671\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7q6q5"
Apr 21 15:12:43.233219 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:12:43.232900 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 21 15:12:43.233219 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:12:43.232991 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7af4b8c2-dc3e-4dbc-8156-428c4db62671-cluster-monitoring-operator-tls podName:7af4b8c2-dc3e-4dbc-8156-428c4db62671 nodeName:}" failed. No retries permitted until 2026-04-21 15:12:47.232967665 +0000 UTC m=+135.828538533 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7af4b8c2-dc3e-4dbc-8156-428c4db62671-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-7q6q5" (UID: "7af4b8c2-dc3e-4dbc-8156-428c4db62671") : secret "cluster-monitoring-operator-tls" not found Apr 21 15:12:43.328837 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:43.328804 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wcz2b" event={"ID":"b4f29a84-c932-46e5-8d58-0e2fb5ab05f8","Type":"ContainerStarted","Data":"0982c523f6537da4214019e5a0c65f2035bbc9ba58685eed397c4432373938bf"} Apr 21 15:12:43.330787 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:43.330757 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r4nrx" event={"ID":"54d54903-009c-4bc7-97cd-f27fea133502","Type":"ContainerStarted","Data":"6f7baf21c795f6906b56b8d4a1530bc649c9e4c16d6cde076efd01828b5dc008"} Apr 21 15:12:43.398515 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:43.398291 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r4nrx" podStartSLOduration=1.9961567009999999 podStartE2EDuration="4.398275179s" podCreationTimestamp="2026-04-21 15:12:39 +0000 UTC" firstStartedPulling="2026-04-21 15:12:40.317924252 +0000 UTC m=+128.913495105" lastFinishedPulling="2026-04-21 15:12:42.720042727 +0000 UTC m=+131.315613583" observedRunningTime="2026-04-21 15:12:43.397582928 +0000 UTC m=+131.993153804" watchObservedRunningTime="2026-04-21 15:12:43.398275179 +0000 UTC m=+131.993846055" Apr 21 15:12:43.399167 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:43.399130 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wcz2b" podStartSLOduration=1.96900983 podStartE2EDuration="4.399118056s" podCreationTimestamp="2026-04-21 15:12:39 +0000 UTC" firstStartedPulling="2026-04-21 15:12:40.291700087 +0000 UTC m=+128.887270944" lastFinishedPulling="2026-04-21 15:12:42.721808318 +0000 UTC m=+131.317379170" observedRunningTime="2026-04-21 15:12:43.358212957 +0000 UTC m=+131.953783833" watchObservedRunningTime="2026-04-21 15:12:43.399118056 +0000 UTC m=+131.994688931" Apr 21 15:12:44.335276 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:44.335247 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bqs6l_2e6ce7e1-9053-481a-924e-dcb6e2859d45/console-operator/0.log" Apr 21 15:12:44.335717 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:44.335287 2575 generic.go:358] "Generic (PLEG): container finished" podID="2e6ce7e1-9053-481a-924e-dcb6e2859d45" containerID="2b9f0cafe076c36fda549f753b873ab031c734cafe138cac01f41cabd2d0b793" exitCode=255 Apr 21 15:12:44.335717 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:44.335403 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-bqs6l" event={"ID":"2e6ce7e1-9053-481a-924e-dcb6e2859d45","Type":"ContainerDied","Data":"2b9f0cafe076c36fda549f753b873ab031c734cafe138cac01f41cabd2d0b793"} Apr 21 15:12:44.335717 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:44.335652 2575 scope.go:117] "RemoveContainer" containerID="2b9f0cafe076c36fda549f753b873ab031c734cafe138cac01f41cabd2d0b793" Apr 21 15:12:45.339106 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:45.339076 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bqs6l_2e6ce7e1-9053-481a-924e-dcb6e2859d45/console-operator/1.log" Apr 21 15:12:45.339623 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:45.339494 2575 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bqs6l_2e6ce7e1-9053-481a-924e-dcb6e2859d45/console-operator/0.log" Apr 21 15:12:45.339623 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:45.339527 2575 generic.go:358] "Generic (PLEG): container finished" podID="2e6ce7e1-9053-481a-924e-dcb6e2859d45" containerID="55d6e5013f875ba6db3c71fb8a33b223490ea682cc4d9882b2ba9bc1865b54f2" exitCode=255 Apr 21 15:12:45.339623 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:45.339587 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-bqs6l" event={"ID":"2e6ce7e1-9053-481a-924e-dcb6e2859d45","Type":"ContainerDied","Data":"55d6e5013f875ba6db3c71fb8a33b223490ea682cc4d9882b2ba9bc1865b54f2"} Apr 21 15:12:45.339623 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:45.339623 2575 scope.go:117] "RemoveContainer" containerID="2b9f0cafe076c36fda549f753b873ab031c734cafe138cac01f41cabd2d0b793" Apr 21 15:12:45.339838 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:45.339820 2575 scope.go:117] "RemoveContainer" containerID="55d6e5013f875ba6db3c71fb8a33b223490ea682cc4d9882b2ba9bc1865b54f2" Apr 21 15:12:45.340024 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:12:45.340008 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-bqs6l_openshift-console-operator(2e6ce7e1-9053-481a-924e-dcb6e2859d45)\"" pod="openshift-console-operator/console-operator-9d4b6777b-bqs6l" podUID="2e6ce7e1-9053-481a-924e-dcb6e2859d45" Apr 21 15:12:46.343246 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:46.343219 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bqs6l_2e6ce7e1-9053-481a-924e-dcb6e2859d45/console-operator/1.log" Apr 21 15:12:46.343642 
ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:46.343556 2575 scope.go:117] "RemoveContainer" containerID="55d6e5013f875ba6db3c71fb8a33b223490ea682cc4d9882b2ba9bc1865b54f2" Apr 21 15:12:46.343730 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:12:46.343713 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-bqs6l_openshift-console-operator(2e6ce7e1-9053-481a-924e-dcb6e2859d45)\"" pod="openshift-console-operator/console-operator-9d4b6777b-bqs6l" podUID="2e6ce7e1-9053-481a-924e-dcb6e2859d45" Apr 21 15:12:47.060325 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:47.060290 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-lztd7"] Apr 21 15:12:47.064456 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:47.064438 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-lztd7" Apr 21 15:12:47.070023 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:47.069997 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 21 15:12:47.070871 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:47.070769 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-7cwnj\"" Apr 21 15:12:47.070871 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:47.070777 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 21 15:12:47.070871 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:47.070821 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 21 15:12:47.071090 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:47.071074 2575 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 21 15:12:47.082367 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:47.082343 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-lztd7"] Apr 21 15:12:47.164652 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:47.164609 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fbqk\" (UniqueName: \"kubernetes.io/projected/f0d6f5bc-43d9-4178-9d4f-96e44160cd78-kube-api-access-8fbqk\") pod \"service-ca-865cb79987-lztd7\" (UID: \"f0d6f5bc-43d9-4178-9d4f-96e44160cd78\") " pod="openshift-service-ca/service-ca-865cb79987-lztd7" Apr 21 15:12:47.164652 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:47.164657 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f0d6f5bc-43d9-4178-9d4f-96e44160cd78-signing-key\") pod \"service-ca-865cb79987-lztd7\" (UID: \"f0d6f5bc-43d9-4178-9d4f-96e44160cd78\") " pod="openshift-service-ca/service-ca-865cb79987-lztd7" Apr 21 15:12:47.164852 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:47.164759 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f0d6f5bc-43d9-4178-9d4f-96e44160cd78-signing-cabundle\") pod \"service-ca-865cb79987-lztd7\" (UID: \"f0d6f5bc-43d9-4178-9d4f-96e44160cd78\") " pod="openshift-service-ca/service-ca-865cb79987-lztd7" Apr 21 15:12:47.266019 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:47.265966 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f0d6f5bc-43d9-4178-9d4f-96e44160cd78-signing-cabundle\") pod \"service-ca-865cb79987-lztd7\" (UID: \"f0d6f5bc-43d9-4178-9d4f-96e44160cd78\") 
" pod="openshift-service-ca/service-ca-865cb79987-lztd7" Apr 21 15:12:47.266193 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:47.266076 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8fbqk\" (UniqueName: \"kubernetes.io/projected/f0d6f5bc-43d9-4178-9d4f-96e44160cd78-kube-api-access-8fbqk\") pod \"service-ca-865cb79987-lztd7\" (UID: \"f0d6f5bc-43d9-4178-9d4f-96e44160cd78\") " pod="openshift-service-ca/service-ca-865cb79987-lztd7" Apr 21 15:12:47.266193 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:47.266112 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f0d6f5bc-43d9-4178-9d4f-96e44160cd78-signing-key\") pod \"service-ca-865cb79987-lztd7\" (UID: \"f0d6f5bc-43d9-4178-9d4f-96e44160cd78\") " pod="openshift-service-ca/service-ca-865cb79987-lztd7" Apr 21 15:12:47.266193 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:47.266148 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7af4b8c2-dc3e-4dbc-8156-428c4db62671-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7q6q5\" (UID: \"7af4b8c2-dc3e-4dbc-8156-428c4db62671\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7q6q5" Apr 21 15:12:47.266294 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:12:47.266242 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 15:12:47.266336 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:12:47.266302 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7af4b8c2-dc3e-4dbc-8156-428c4db62671-cluster-monitoring-operator-tls podName:7af4b8c2-dc3e-4dbc-8156-428c4db62671 nodeName:}" failed. 
No retries permitted until 2026-04-21 15:12:55.266282545 +0000 UTC m=+143.861853401 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7af4b8c2-dc3e-4dbc-8156-428c4db62671-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-7q6q5" (UID: "7af4b8c2-dc3e-4dbc-8156-428c4db62671") : secret "cluster-monitoring-operator-tls" not found Apr 21 15:12:47.266803 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:47.266785 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f0d6f5bc-43d9-4178-9d4f-96e44160cd78-signing-cabundle\") pod \"service-ca-865cb79987-lztd7\" (UID: \"f0d6f5bc-43d9-4178-9d4f-96e44160cd78\") " pod="openshift-service-ca/service-ca-865cb79987-lztd7" Apr 21 15:12:47.268696 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:47.268675 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f0d6f5bc-43d9-4178-9d4f-96e44160cd78-signing-key\") pod \"service-ca-865cb79987-lztd7\" (UID: \"f0d6f5bc-43d9-4178-9d4f-96e44160cd78\") " pod="openshift-service-ca/service-ca-865cb79987-lztd7" Apr 21 15:12:47.275021 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:47.274999 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fbqk\" (UniqueName: \"kubernetes.io/projected/f0d6f5bc-43d9-4178-9d4f-96e44160cd78-kube-api-access-8fbqk\") pod \"service-ca-865cb79987-lztd7\" (UID: \"f0d6f5bc-43d9-4178-9d4f-96e44160cd78\") " pod="openshift-service-ca/service-ca-865cb79987-lztd7" Apr 21 15:12:47.373211 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:47.373129 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-lztd7" Apr 21 15:12:47.494195 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:47.494167 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-lztd7"] Apr 21 15:12:47.495431 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:12:47.495403 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0d6f5bc_43d9_4178_9d4f_96e44160cd78.slice/crio-0415ca6771b649e434923ae988c6409d74dba87c74eb70d4c28f05f153dc5539 WatchSource:0}: Error finding container 0415ca6771b649e434923ae988c6409d74dba87c74eb70d4c28f05f153dc5539: Status 404 returned error can't find the container with id 0415ca6771b649e434923ae988c6409d74dba87c74eb70d4c28f05f153dc5539 Apr 21 15:12:48.074715 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:48.074685 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hdbc9_71299419-e249-4660-891c-24ba490f5c36/dns-node-resolver/0.log" Apr 21 15:12:48.350601 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:48.350519 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-lztd7" event={"ID":"f0d6f5bc-43d9-4178-9d4f-96e44160cd78","Type":"ContainerStarted","Data":"ee9bc71dcbe45b721b1ffdaa80221c27a808c98fb9ca109bf305efa57a149a7c"} Apr 21 15:12:48.350601 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:48.350554 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-lztd7" event={"ID":"f0d6f5bc-43d9-4178-9d4f-96e44160cd78","Type":"ContainerStarted","Data":"0415ca6771b649e434923ae988c6409d74dba87c74eb70d4c28f05f153dc5539"} Apr 21 15:12:48.875286 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:48.875255 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4hwvs_cfb26da3-8175-4742-a038-7b5d5d082af2/node-ca/0.log" Apr 21 
15:12:50.674875 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:50.674842 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-wcz2b_b4f29a84-c932-46e5-8d58-0e2fb5ab05f8/kube-storage-version-migrator-operator/0.log" Apr 21 15:12:51.479365 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:51.479320 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-bqs6l" Apr 21 15:12:51.479365 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:51.479366 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-bqs6l" Apr 21 15:12:51.479754 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:51.479741 2575 scope.go:117] "RemoveContainer" containerID="55d6e5013f875ba6db3c71fb8a33b223490ea682cc4d9882b2ba9bc1865b54f2" Apr 21 15:12:51.479921 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:12:51.479905 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-bqs6l_openshift-console-operator(2e6ce7e1-9053-481a-924e-dcb6e2859d45)\"" pod="openshift-console-operator/console-operator-9d4b6777b-bqs6l" podUID="2e6ce7e1-9053-481a-924e-dcb6e2859d45" Apr 21 15:12:55.331053 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:12:55.331017 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7af4b8c2-dc3e-4dbc-8156-428c4db62671-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7q6q5\" (UID: \"7af4b8c2-dc3e-4dbc-8156-428c4db62671\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7q6q5" Apr 21 15:12:55.331462 ip-10-0-131-11 
kubenswrapper[2575]: E0421 15:12:55.331206 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 15:12:55.331462 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:12:55.331289 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7af4b8c2-dc3e-4dbc-8156-428c4db62671-cluster-monitoring-operator-tls podName:7af4b8c2-dc3e-4dbc-8156-428c4db62671 nodeName:}" failed. No retries permitted until 2026-04-21 15:13:11.331268414 +0000 UTC m=+159.926839268 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7af4b8c2-dc3e-4dbc-8156-428c4db62671-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-7q6q5" (UID: "7af4b8c2-dc3e-4dbc-8156-428c4db62671") : secret "cluster-monitoring-operator-tls" not found Apr 21 15:13:05.975273 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:05.975233 2575 scope.go:117] "RemoveContainer" containerID="55d6e5013f875ba6db3c71fb8a33b223490ea682cc4d9882b2ba9bc1865b54f2" Apr 21 15:13:06.397763 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:06.397687 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bqs6l_2e6ce7e1-9053-481a-924e-dcb6e2859d45/console-operator/1.log" Apr 21 15:13:06.397903 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:06.397765 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-bqs6l" event={"ID":"2e6ce7e1-9053-481a-924e-dcb6e2859d45","Type":"ContainerStarted","Data":"f294e056413922cda9f6a50d83101a8e0a90150a6cc3f876cb662d637b85e984"} Apr 21 15:13:06.398065 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:06.398047 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-bqs6l" Apr 21 15:13:06.420790 
ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:06.420713 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-lztd7" podStartSLOduration=19.420701583 podStartE2EDuration="19.420701583s" podCreationTimestamp="2026-04-21 15:12:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:12:48.376505726 +0000 UTC m=+136.972076613" watchObservedRunningTime="2026-04-21 15:13:06.420701583 +0000 UTC m=+155.016272457" Apr 21 15:13:06.420936 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:06.420873 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-bqs6l" podStartSLOduration=25.24621338 podStartE2EDuration="27.420865325s" podCreationTimestamp="2026-04-21 15:12:39 +0000 UTC" firstStartedPulling="2026-04-21 15:12:41.636517854 +0000 UTC m=+130.232088707" lastFinishedPulling="2026-04-21 15:12:43.811169792 +0000 UTC m=+132.406740652" observedRunningTime="2026-04-21 15:13:06.419924329 +0000 UTC m=+155.015495217" watchObservedRunningTime="2026-04-21 15:13:06.420865325 +0000 UTC m=+155.016436201" Apr 21 15:13:06.806943 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:06.806913 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-bqs6l" Apr 21 15:13:08.892430 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:13:08.892361 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" podUID="da72ba61-5c83-47ce-a285-a58cd7b77246" Apr 21 15:13:08.908618 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:13:08.908591 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], 
unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-fr77k" podUID="a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be" Apr 21 15:13:08.926276 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:13:08.926239 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-dmzt8" podUID="578cec2c-16fd-469e-931a-b7cf421795a1" Apr 21 15:13:08.995139 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:13:08.995101 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-96snf" podUID="a17c6c3f-25ab-4414-92a4-946230c882ea" Apr 21 15:13:09.405478 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:09.405449 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fr77k" Apr 21 15:13:09.405658 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:09.405489 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" Apr 21 15:13:11.359777 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:11.359731 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7af4b8c2-dc3e-4dbc-8156-428c4db62671-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7q6q5\" (UID: \"7af4b8c2-dc3e-4dbc-8156-428c4db62671\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7q6q5" Apr 21 15:13:11.362187 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:11.362162 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7af4b8c2-dc3e-4dbc-8156-428c4db62671-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-7q6q5\" (UID: \"7af4b8c2-dc3e-4dbc-8156-428c4db62671\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7q6q5" Apr 21 15:13:11.368082 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:11.368061 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7q6q5" Apr 21 15:13:11.487855 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:11.487822 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-7q6q5"] Apr 21 15:13:11.491076 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:13:11.491044 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7af4b8c2_dc3e_4dbc_8156_428c4db62671.slice/crio-5f00762cafa83e36a93247f872fbd0adeedc2a7894bd9a939246a705b4f4288e WatchSource:0}: Error finding container 5f00762cafa83e36a93247f872fbd0adeedc2a7894bd9a939246a705b4f4288e: Status 404 returned error can't find the container with id 5f00762cafa83e36a93247f872fbd0adeedc2a7894bd9a939246a705b4f4288e Apr 21 15:13:12.418991 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:12.418959 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7q6q5" event={"ID":"7af4b8c2-dc3e-4dbc-8156-428c4db62671","Type":"ContainerStarted","Data":"5f00762cafa83e36a93247f872fbd0adeedc2a7894bd9a939246a705b4f4288e"} Apr 21 15:13:12.844775 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:12.844742 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-46kkk"] Apr 21 15:13:12.849645 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:12.849617 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-46kkk" Apr 21 15:13:12.853907 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:12.853876 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-5pq58\"" Apr 21 15:13:12.854249 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:12.854228 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 15:13:12.854928 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:12.854885 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 15:13:12.855150 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:12.855128 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 15:13:12.855271 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:12.855255 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 15:13:12.874537 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:12.874514 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-46kkk"] Apr 21 15:13:12.923425 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:12.923388 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-gdmsq"] Apr 21 15:13:12.926537 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:12.926501 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-gdmsq" Apr 21 15:13:12.932631 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:12.932608 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 21 15:13:12.932993 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:12.932972 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 21 15:13:12.933101 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:12.932994 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-t25vf"] Apr 21 15:13:12.933825 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:12.933803 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-z69g7\"" Apr 21 15:13:12.936326 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:12.936308 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-t25vf" Apr 21 15:13:12.939460 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:12.938959 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 21 15:13:12.939460 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:12.939080 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-7r4rh\"" Apr 21 15:13:12.939460 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:12.939257 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 21 15:13:12.939460 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:12.939423 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-gdmsq"] Apr 21 15:13:12.952760 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:12.952736 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-t25vf"] Apr 21 15:13:12.973395 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:12.973347 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a8c54d77-8e37-4e0f-8831-80a62651bcaa-crio-socket\") pod \"insights-runtime-extractor-46kkk\" (UID: \"a8c54d77-8e37-4e0f-8831-80a62651bcaa\") " pod="openshift-insights/insights-runtime-extractor-46kkk" Apr 21 15:13:12.973554 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:12.973398 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a8c54d77-8e37-4e0f-8831-80a62651bcaa-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-46kkk\" (UID: \"a8c54d77-8e37-4e0f-8831-80a62651bcaa\") " pod="openshift-insights/insights-runtime-extractor-46kkk" Apr 21 
15:13:12.973554 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:12.973499 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a8c54d77-8e37-4e0f-8831-80a62651bcaa-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-46kkk\" (UID: \"a8c54d77-8e37-4e0f-8831-80a62651bcaa\") " pod="openshift-insights/insights-runtime-extractor-46kkk" Apr 21 15:13:12.973554 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:12.973546 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssvxb\" (UniqueName: \"kubernetes.io/projected/a8c54d77-8e37-4e0f-8831-80a62651bcaa-kube-api-access-ssvxb\") pod \"insights-runtime-extractor-46kkk\" (UID: \"a8c54d77-8e37-4e0f-8831-80a62651bcaa\") " pod="openshift-insights/insights-runtime-extractor-46kkk" Apr 21 15:13:12.973690 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:12.973576 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a8c54d77-8e37-4e0f-8831-80a62651bcaa-data-volume\") pod \"insights-runtime-extractor-46kkk\" (UID: \"a8c54d77-8e37-4e0f-8831-80a62651bcaa\") " pod="openshift-insights/insights-runtime-extractor-46kkk" Apr 21 15:13:13.074445 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.074409 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b8e66f46-305b-420c-9537-07986d9fd92a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-gdmsq\" (UID: \"b8e66f46-305b-420c-9537-07986d9fd92a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-gdmsq" Apr 21 15:13:13.074590 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.074470 2575 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-ssvxb\" (UniqueName: \"kubernetes.io/projected/a8c54d77-8e37-4e0f-8831-80a62651bcaa-kube-api-access-ssvxb\") pod \"insights-runtime-extractor-46kkk\" (UID: \"a8c54d77-8e37-4e0f-8831-80a62651bcaa\") " pod="openshift-insights/insights-runtime-extractor-46kkk" Apr 21 15:13:13.074590 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.074500 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a8c54d77-8e37-4e0f-8831-80a62651bcaa-data-volume\") pod \"insights-runtime-extractor-46kkk\" (UID: \"a8c54d77-8e37-4e0f-8831-80a62651bcaa\") " pod="openshift-insights/insights-runtime-extractor-46kkk" Apr 21 15:13:13.074590 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.074523 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b8e66f46-305b-420c-9537-07986d9fd92a-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-gdmsq\" (UID: \"b8e66f46-305b-420c-9537-07986d9fd92a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-gdmsq" Apr 21 15:13:13.074722 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.074642 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a8c54d77-8e37-4e0f-8831-80a62651bcaa-crio-socket\") pod \"insights-runtime-extractor-46kkk\" (UID: \"a8c54d77-8e37-4e0f-8831-80a62651bcaa\") " pod="openshift-insights/insights-runtime-extractor-46kkk" Apr 21 15:13:13.074722 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.074679 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a8c54d77-8e37-4e0f-8831-80a62651bcaa-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-46kkk\" (UID: \"a8c54d77-8e37-4e0f-8831-80a62651bcaa\") " 
pod="openshift-insights/insights-runtime-extractor-46kkk" Apr 21 15:13:13.074817 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.074763 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a8c54d77-8e37-4e0f-8831-80a62651bcaa-crio-socket\") pod \"insights-runtime-extractor-46kkk\" (UID: \"a8c54d77-8e37-4e0f-8831-80a62651bcaa\") " pod="openshift-insights/insights-runtime-extractor-46kkk" Apr 21 15:13:13.074817 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.074807 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a8c54d77-8e37-4e0f-8831-80a62651bcaa-data-volume\") pod \"insights-runtime-extractor-46kkk\" (UID: \"a8c54d77-8e37-4e0f-8831-80a62651bcaa\") " pod="openshift-insights/insights-runtime-extractor-46kkk" Apr 21 15:13:13.074911 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.074845 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jhfx\" (UniqueName: \"kubernetes.io/projected/abead490-d43d-4f40-bf13-41e9c8573f7f-kube-api-access-7jhfx\") pod \"downloads-6bcc868b7-t25vf\" (UID: \"abead490-d43d-4f40-bf13-41e9c8573f7f\") " pod="openshift-console/downloads-6bcc868b7-t25vf" Apr 21 15:13:13.074962 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.074949 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a8c54d77-8e37-4e0f-8831-80a62651bcaa-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-46kkk\" (UID: \"a8c54d77-8e37-4e0f-8831-80a62651bcaa\") " pod="openshift-insights/insights-runtime-extractor-46kkk" Apr 21 15:13:13.075180 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.075144 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/a8c54d77-8e37-4e0f-8831-80a62651bcaa-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-46kkk\" (UID: \"a8c54d77-8e37-4e0f-8831-80a62651bcaa\") " pod="openshift-insights/insights-runtime-extractor-46kkk" Apr 21 15:13:13.077322 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.077303 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a8c54d77-8e37-4e0f-8831-80a62651bcaa-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-46kkk\" (UID: \"a8c54d77-8e37-4e0f-8831-80a62651bcaa\") " pod="openshift-insights/insights-runtime-extractor-46kkk" Apr 21 15:13:13.097521 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.097465 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssvxb\" (UniqueName: \"kubernetes.io/projected/a8c54d77-8e37-4e0f-8831-80a62651bcaa-kube-api-access-ssvxb\") pod \"insights-runtime-extractor-46kkk\" (UID: \"a8c54d77-8e37-4e0f-8831-80a62651bcaa\") " pod="openshift-insights/insights-runtime-extractor-46kkk" Apr 21 15:13:13.161632 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.161599 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-46kkk" Apr 21 15:13:13.176435 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.176406 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b8e66f46-305b-420c-9537-07986d9fd92a-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-gdmsq\" (UID: \"b8e66f46-305b-420c-9537-07986d9fd92a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-gdmsq" Apr 21 15:13:13.176536 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.176480 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7jhfx\" (UniqueName: \"kubernetes.io/projected/abead490-d43d-4f40-bf13-41e9c8573f7f-kube-api-access-7jhfx\") pod \"downloads-6bcc868b7-t25vf\" (UID: \"abead490-d43d-4f40-bf13-41e9c8573f7f\") " pod="openshift-console/downloads-6bcc868b7-t25vf" Apr 21 15:13:13.176613 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.176550 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b8e66f46-305b-420c-9537-07986d9fd92a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-gdmsq\" (UID: \"b8e66f46-305b-420c-9537-07986d9fd92a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-gdmsq" Apr 21 15:13:13.177141 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.177113 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b8e66f46-305b-420c-9537-07986d9fd92a-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-gdmsq\" (UID: \"b8e66f46-305b-420c-9537-07986d9fd92a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-gdmsq" Apr 21 15:13:13.179247 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.179223 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b8e66f46-305b-420c-9537-07986d9fd92a-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-gdmsq\" (UID: \"b8e66f46-305b-420c-9537-07986d9fd92a\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-gdmsq" Apr 21 15:13:13.186257 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.186234 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jhfx\" (UniqueName: \"kubernetes.io/projected/abead490-d43d-4f40-bf13-41e9c8573f7f-kube-api-access-7jhfx\") pod \"downloads-6bcc868b7-t25vf\" (UID: \"abead490-d43d-4f40-bf13-41e9c8573f7f\") " pod="openshift-console/downloads-6bcc868b7-t25vf" Apr 21 15:13:13.238235 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.238208 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-gdmsq" Apr 21 15:13:13.247800 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.247698 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-t25vf" Apr 21 15:13:13.303604 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.302791 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-46kkk"] Apr 21 15:13:13.306998 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:13:13.306931 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8c54d77_8e37_4e0f_8831_80a62651bcaa.slice/crio-48a8b13097830bde2f6dabdac5fc3d55a02ae7ce4e14402eb128bb7a2161fef0 WatchSource:0}: Error finding container 48a8b13097830bde2f6dabdac5fc3d55a02ae7ce4e14402eb128bb7a2161fef0: Status 404 returned error can't find the container with id 48a8b13097830bde2f6dabdac5fc3d55a02ae7ce4e14402eb128bb7a2161fef0 Apr 21 15:13:13.397511 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.397480 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-gdmsq"] Apr 21 15:13:13.400677 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:13:13.400650 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8e66f46_305b_420c_9537_07986d9fd92a.slice/crio-862b9d3dc01fdafe34ad95ce12bf887b72ec8006983d6abb6c04300680647562 WatchSource:0}: Error finding container 862b9d3dc01fdafe34ad95ce12bf887b72ec8006983d6abb6c04300680647562: Status 404 returned error can't find the container with id 862b9d3dc01fdafe34ad95ce12bf887b72ec8006983d6abb6c04300680647562 Apr 21 15:13:13.423450 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.423408 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-t25vf"] Apr 21 15:13:13.424796 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.424746 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7q6q5" 
event={"ID":"7af4b8c2-dc3e-4dbc-8156-428c4db62671","Type":"ContainerStarted","Data":"efc73dcd6d9bd4a2ca868cab0b45611425097c7d4d3113647c30300cb8236417"} Apr 21 15:13:13.426348 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.426319 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-46kkk" event={"ID":"a8c54d77-8e37-4e0f-8831-80a62651bcaa","Type":"ContainerStarted","Data":"894645c8cc9efce7ed25cb311348c912b63f7158234c4852733db6ba9607184e"} Apr 21 15:13:13.426476 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.426356 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-46kkk" event={"ID":"a8c54d77-8e37-4e0f-8831-80a62651bcaa","Type":"ContainerStarted","Data":"48a8b13097830bde2f6dabdac5fc3d55a02ae7ce4e14402eb128bb7a2161fef0"} Apr 21 15:13:13.427543 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:13:13.427506 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabead490_d43d_4f40_bf13_41e9c8573f7f.slice/crio-9896bc8c5979e5ff79a2ec41eccc491e5286ab2ea53c1eaec6cb0db45e6e460d WatchSource:0}: Error finding container 9896bc8c5979e5ff79a2ec41eccc491e5286ab2ea53c1eaec6cb0db45e6e460d: Status 404 returned error can't find the container with id 9896bc8c5979e5ff79a2ec41eccc491e5286ab2ea53c1eaec6cb0db45e6e460d Apr 21 15:13:13.429189 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.428884 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-gdmsq" event={"ID":"b8e66f46-305b-420c-9537-07986d9fd92a","Type":"ContainerStarted","Data":"862b9d3dc01fdafe34ad95ce12bf887b72ec8006983d6abb6c04300680647562"} Apr 21 15:13:13.448464 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.448420 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-7q6q5" 
podStartSLOduration=32.808634498 podStartE2EDuration="34.448405629s" podCreationTimestamp="2026-04-21 15:12:39 +0000 UTC" firstStartedPulling="2026-04-21 15:13:11.492713814 +0000 UTC m=+160.088284667" lastFinishedPulling="2026-04-21 15:13:13.132484945 +0000 UTC m=+161.728055798" observedRunningTime="2026-04-21 15:13:13.447987669 +0000 UTC m=+162.043558545" watchObservedRunningTime="2026-04-21 15:13:13.448405629 +0000 UTC m=+162.043976501" Apr 21 15:13:13.716744 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.716713 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cgs42"] Apr 21 15:13:13.720965 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.720945 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cgs42" Apr 21 15:13:13.723460 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.723441 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-tqn7k\"" Apr 21 15:13:13.724047 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.724028 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 21 15:13:13.743600 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.743566 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cgs42"] Apr 21 15:13:13.782465 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.782426 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9de978e2-783c-4dfd-ac14-798e4da5e14a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-cgs42\" (UID: \"9de978e2-783c-4dfd-ac14-798e4da5e14a\") " 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cgs42" Apr 21 15:13:13.884835 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.883708 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/578cec2c-16fd-469e-931a-b7cf421795a1-cert\") pod \"ingress-canary-dmzt8\" (UID: \"578cec2c-16fd-469e-931a-b7cf421795a1\") " pod="openshift-ingress-canary/ingress-canary-dmzt8" Apr 21 15:13:13.884835 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.883854 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be-metrics-tls\") pod \"dns-default-fr77k\" (UID: \"a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be\") " pod="openshift-dns/dns-default-fr77k" Apr 21 15:13:13.885392 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.883932 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-registry-tls\") pod \"image-registry-85fb9cc7f7-zl6lq\" (UID: \"da72ba61-5c83-47ce-a285-a58cd7b77246\") " pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" Apr 21 15:13:13.885607 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.885529 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9de978e2-783c-4dfd-ac14-798e4da5e14a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-cgs42\" (UID: \"9de978e2-783c-4dfd-ac14-798e4da5e14a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cgs42" Apr 21 15:13:13.885697 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:13:13.885660 2575 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Apr 21 
15:13:13.885770 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:13:13.885735 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9de978e2-783c-4dfd-ac14-798e4da5e14a-tls-certificates podName:9de978e2-783c-4dfd-ac14-798e4da5e14a nodeName:}" failed. No retries permitted until 2026-04-21 15:13:14.385713226 +0000 UTC m=+162.981284102 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/9de978e2-783c-4dfd-ac14-798e4da5e14a-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-cgs42" (UID: "9de978e2-783c-4dfd-ac14-798e4da5e14a") : secret "prometheus-operator-admission-webhook-tls" not found Apr 21 15:13:13.889918 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.889873 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be-metrics-tls\") pod \"dns-default-fr77k\" (UID: \"a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be\") " pod="openshift-dns/dns-default-fr77k" Apr 21 15:13:13.890947 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.890899 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-registry-tls\") pod \"image-registry-85fb9cc7f7-zl6lq\" (UID: \"da72ba61-5c83-47ce-a285-a58cd7b77246\") " pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" Apr 21 15:13:13.892320 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.892297 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/578cec2c-16fd-469e-931a-b7cf421795a1-cert\") pod \"ingress-canary-dmzt8\" (UID: \"578cec2c-16fd-469e-931a-b7cf421795a1\") " pod="openshift-ingress-canary/ingress-canary-dmzt8" Apr 21 15:13:13.908632 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.908606 2575 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-gcqst\"" Apr 21 15:13:13.909923 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.909689 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-76666\"" Apr 21 15:13:13.917342 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.917109 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" Apr 21 15:13:13.917342 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:13.917182 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fr77k" Apr 21 15:13:14.091047 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:14.090984 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-85fb9cc7f7-zl6lq"] Apr 21 15:13:14.111152 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:14.111120 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fr77k"] Apr 21 15:13:14.119035 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:13:14.116789 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8a95ba4_2cdf_4ab6_8c6f_5269bfdca8be.slice/crio-86ff97e7458c8ca241e69486dcbfeac7f8a98eb9cc5d561a0ecfa48e549f5376 WatchSource:0}: Error finding container 86ff97e7458c8ca241e69486dcbfeac7f8a98eb9cc5d561a0ecfa48e549f5376: Status 404 returned error can't find the container with id 86ff97e7458c8ca241e69486dcbfeac7f8a98eb9cc5d561a0ecfa48e549f5376 Apr 21 15:13:14.390326 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:14.390238 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9de978e2-783c-4dfd-ac14-798e4da5e14a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-cgs42\" (UID: 
\"9de978e2-783c-4dfd-ac14-798e4da5e14a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cgs42" Apr 21 15:13:14.390530 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:13:14.390423 2575 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Apr 21 15:13:14.390530 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:13:14.390505 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9de978e2-783c-4dfd-ac14-798e4da5e14a-tls-certificates podName:9de978e2-783c-4dfd-ac14-798e4da5e14a nodeName:}" failed. No retries permitted until 2026-04-21 15:13:15.39048451 +0000 UTC m=+163.986055366 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/9de978e2-783c-4dfd-ac14-798e4da5e14a-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-cgs42" (UID: "9de978e2-783c-4dfd-ac14-798e4da5e14a") : secret "prometheus-operator-admission-webhook-tls" not found Apr 21 15:13:14.436007 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:14.435954 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-46kkk" event={"ID":"a8c54d77-8e37-4e0f-8831-80a62651bcaa","Type":"ContainerStarted","Data":"cc0a0679f0af258eed3759f9ba608d05003c2fe509ce4ea36b0253b4fbefadf2"} Apr 21 15:13:14.437350 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:14.437307 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fr77k" event={"ID":"a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be","Type":"ContainerStarted","Data":"86ff97e7458c8ca241e69486dcbfeac7f8a98eb9cc5d561a0ecfa48e549f5376"} Apr 21 15:13:14.438966 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:14.438918 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-t25vf" 
event={"ID":"abead490-d43d-4f40-bf13-41e9c8573f7f","Type":"ContainerStarted","Data":"9896bc8c5979e5ff79a2ec41eccc491e5286ab2ea53c1eaec6cb0db45e6e460d"} Apr 21 15:13:14.442987 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:14.442900 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" event={"ID":"da72ba61-5c83-47ce-a285-a58cd7b77246","Type":"ContainerStarted","Data":"3c9cab533826ad0d04e6ff0a7d395dc59bd539c8567e029b2501810c022b5d6f"} Apr 21 15:13:14.442987 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:14.442927 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" event={"ID":"da72ba61-5c83-47ce-a285-a58cd7b77246","Type":"ContainerStarted","Data":"f391ca4ea8a23ee78c1c16adae2580ae77cf82628b33d79e1a6f89c8f952225b"} Apr 21 15:13:14.442987 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:14.442963 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" Apr 21 15:13:14.470812 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:14.469463 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" podStartSLOduration=162.469444594 podStartE2EDuration="2m42.469444594s" podCreationTimestamp="2026-04-21 15:10:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:13:14.46912212 +0000 UTC m=+163.064692996" watchObservedRunningTime="2026-04-21 15:13:14.469444594 +0000 UTC m=+163.065015470" Apr 21 15:13:15.402084 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:15.401981 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9de978e2-783c-4dfd-ac14-798e4da5e14a-tls-certificates\") pod 
\"prometheus-operator-admission-webhook-57cf98b594-cgs42\" (UID: \"9de978e2-783c-4dfd-ac14-798e4da5e14a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cgs42" Apr 21 15:13:15.405078 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:15.405044 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9de978e2-783c-4dfd-ac14-798e4da5e14a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-cgs42\" (UID: \"9de978e2-783c-4dfd-ac14-798e4da5e14a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cgs42" Apr 21 15:13:15.446795 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:15.446733 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-gdmsq" event={"ID":"b8e66f46-305b-420c-9537-07986d9fd92a","Type":"ContainerStarted","Data":"1eb6eed7ddd7c1aee102e338ae9ef07eef90a3007ed548840a19171f115b3437"} Apr 21 15:13:15.530550 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:15.530517 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cgs42" Apr 21 15:13:16.792779 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:16.792727 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-gdmsq" podStartSLOduration=3.478544664 podStartE2EDuration="4.792708033s" podCreationTimestamp="2026-04-21 15:13:12 +0000 UTC" firstStartedPulling="2026-04-21 15:13:13.402631791 +0000 UTC m=+161.998202644" lastFinishedPulling="2026-04-21 15:13:14.716795154 +0000 UTC m=+163.312366013" observedRunningTime="2026-04-21 15:13:15.464940849 +0000 UTC m=+164.060511725" watchObservedRunningTime="2026-04-21 15:13:16.792708033 +0000 UTC m=+165.388278909" Apr 21 15:13:16.793405 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:16.793364 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cgs42"] Apr 21 15:13:16.796065 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:13:16.796036 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9de978e2_783c_4dfd_ac14_798e4da5e14a.slice/crio-601449bc6719eb8b85b71d1ef6bb69d984a4cc76c9d89b0bac452239245713ee WatchSource:0}: Error finding container 601449bc6719eb8b85b71d1ef6bb69d984a4cc76c9d89b0bac452239245713ee: Status 404 returned error can't find the container with id 601449bc6719eb8b85b71d1ef6bb69d984a4cc76c9d89b0bac452239245713ee Apr 21 15:13:17.454719 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:17.454668 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-46kkk" event={"ID":"a8c54d77-8e37-4e0f-8831-80a62651bcaa","Type":"ContainerStarted","Data":"b3e6ec48ea411396f3f81563cbb2d3ead46e182b0e74af5cdbc18e686462be47"} Apr 21 15:13:17.457326 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:17.457293 2575 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fr77k" event={"ID":"a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be","Type":"ContainerStarted","Data":"0b9451807400fcfa34d8ab5119fbee80f5db263b651acf230a1cd65da17bd7b3"} Apr 21 15:13:17.457489 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:17.457331 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fr77k" event={"ID":"a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be","Type":"ContainerStarted","Data":"485cea91d048454401b95319b6d4abf53435f4787626c1e1059a776efd2847a8"} Apr 21 15:13:17.457870 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:17.457845 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-fr77k" Apr 21 15:13:17.459392 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:17.459338 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cgs42" event={"ID":"9de978e2-783c-4dfd-ac14-798e4da5e14a","Type":"ContainerStarted","Data":"601449bc6719eb8b85b71d1ef6bb69d984a4cc76c9d89b0bac452239245713ee"} Apr 21 15:13:17.479785 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:17.479732 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-46kkk" podStartSLOduration=2.24759254 podStartE2EDuration="5.479721616s" podCreationTimestamp="2026-04-21 15:13:12 +0000 UTC" firstStartedPulling="2026-04-21 15:13:13.400582282 +0000 UTC m=+161.996153138" lastFinishedPulling="2026-04-21 15:13:16.632711358 +0000 UTC m=+165.228282214" observedRunningTime="2026-04-21 15:13:17.479112527 +0000 UTC m=+166.074683403" watchObservedRunningTime="2026-04-21 15:13:17.479721616 +0000 UTC m=+166.075292491" Apr 21 15:13:17.505931 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:17.505838 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-fr77k" 
podStartSLOduration=129.986740911 podStartE2EDuration="2m12.505819486s" podCreationTimestamp="2026-04-21 15:11:05 +0000 UTC" firstStartedPulling="2026-04-21 15:13:14.11891087 +0000 UTC m=+162.714481728" lastFinishedPulling="2026-04-21 15:13:16.637989438 +0000 UTC m=+165.233560303" observedRunningTime="2026-04-21 15:13:17.505216718 +0000 UTC m=+166.100787592" watchObservedRunningTime="2026-04-21 15:13:17.505819486 +0000 UTC m=+166.101390362" Apr 21 15:13:17.957966 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:17.957803 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6fcb5c5f8d-mn66d"] Apr 21 15:13:17.978717 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:17.978690 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fcb5c5f8d-mn66d"] Apr 21 15:13:17.979052 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:17.979026 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6fcb5c5f8d-mn66d" Apr 21 15:13:17.982033 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:17.982009 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 21 15:13:17.983226 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:17.983093 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 21 15:13:17.983226 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:17.983152 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 21 15:13:17.983226 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:17.983154 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-j2gn9\"" Apr 21 15:13:17.983226 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:17.983100 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 21 15:13:17.983559 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:17.983333 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 21 15:13:18.127739 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:18.127707 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c1af4f4a-515b-4c5b-966c-9bfea44cd4b5-oauth-serving-cert\") pod \"console-6fcb5c5f8d-mn66d\" (UID: \"c1af4f4a-515b-4c5b-966c-9bfea44cd4b5\") " pod="openshift-console/console-6fcb5c5f8d-mn66d" Apr 21 15:13:18.127909 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:18.127770 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c1af4f4a-515b-4c5b-966c-9bfea44cd4b5-console-serving-cert\") pod \"console-6fcb5c5f8d-mn66d\" (UID: \"c1af4f4a-515b-4c5b-966c-9bfea44cd4b5\") " pod="openshift-console/console-6fcb5c5f8d-mn66d" Apr 21 15:13:18.127909 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:18.127806 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c1af4f4a-515b-4c5b-966c-9bfea44cd4b5-service-ca\") pod \"console-6fcb5c5f8d-mn66d\" (UID: \"c1af4f4a-515b-4c5b-966c-9bfea44cd4b5\") " pod="openshift-console/console-6fcb5c5f8d-mn66d" Apr 21 15:13:18.127909 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:18.127881 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c1af4f4a-515b-4c5b-966c-9bfea44cd4b5-console-oauth-config\") pod \"console-6fcb5c5f8d-mn66d\" (UID: \"c1af4f4a-515b-4c5b-966c-9bfea44cd4b5\") " pod="openshift-console/console-6fcb5c5f8d-mn66d" Apr 21 
15:13:18.128048 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:18.127928 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c1af4f4a-515b-4c5b-966c-9bfea44cd4b5-console-config\") pod \"console-6fcb5c5f8d-mn66d\" (UID: \"c1af4f4a-515b-4c5b-966c-9bfea44cd4b5\") " pod="openshift-console/console-6fcb5c5f8d-mn66d" Apr 21 15:13:18.128048 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:18.127974 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4skj\" (UniqueName: \"kubernetes.io/projected/c1af4f4a-515b-4c5b-966c-9bfea44cd4b5-kube-api-access-q4skj\") pod \"console-6fcb5c5f8d-mn66d\" (UID: \"c1af4f4a-515b-4c5b-966c-9bfea44cd4b5\") " pod="openshift-console/console-6fcb5c5f8d-mn66d" Apr 21 15:13:18.229070 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:18.228968 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q4skj\" (UniqueName: \"kubernetes.io/projected/c1af4f4a-515b-4c5b-966c-9bfea44cd4b5-kube-api-access-q4skj\") pod \"console-6fcb5c5f8d-mn66d\" (UID: \"c1af4f4a-515b-4c5b-966c-9bfea44cd4b5\") " pod="openshift-console/console-6fcb5c5f8d-mn66d" Apr 21 15:13:18.229236 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:18.229068 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c1af4f4a-515b-4c5b-966c-9bfea44cd4b5-oauth-serving-cert\") pod \"console-6fcb5c5f8d-mn66d\" (UID: \"c1af4f4a-515b-4c5b-966c-9bfea44cd4b5\") " pod="openshift-console/console-6fcb5c5f8d-mn66d" Apr 21 15:13:18.229236 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:18.229115 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c1af4f4a-515b-4c5b-966c-9bfea44cd4b5-console-serving-cert\") pod 
\"console-6fcb5c5f8d-mn66d\" (UID: \"c1af4f4a-515b-4c5b-966c-9bfea44cd4b5\") " pod="openshift-console/console-6fcb5c5f8d-mn66d" Apr 21 15:13:18.229236 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:18.229138 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c1af4f4a-515b-4c5b-966c-9bfea44cd4b5-service-ca\") pod \"console-6fcb5c5f8d-mn66d\" (UID: \"c1af4f4a-515b-4c5b-966c-9bfea44cd4b5\") " pod="openshift-console/console-6fcb5c5f8d-mn66d" Apr 21 15:13:18.229423 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:18.229307 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c1af4f4a-515b-4c5b-966c-9bfea44cd4b5-console-oauth-config\") pod \"console-6fcb5c5f8d-mn66d\" (UID: \"c1af4f4a-515b-4c5b-966c-9bfea44cd4b5\") " pod="openshift-console/console-6fcb5c5f8d-mn66d" Apr 21 15:13:18.229423 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:18.229362 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c1af4f4a-515b-4c5b-966c-9bfea44cd4b5-console-config\") pod \"console-6fcb5c5f8d-mn66d\" (UID: \"c1af4f4a-515b-4c5b-966c-9bfea44cd4b5\") " pod="openshift-console/console-6fcb5c5f8d-mn66d" Apr 21 15:13:18.229952 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:18.229924 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c1af4f4a-515b-4c5b-966c-9bfea44cd4b5-oauth-serving-cert\") pod \"console-6fcb5c5f8d-mn66d\" (UID: \"c1af4f4a-515b-4c5b-966c-9bfea44cd4b5\") " pod="openshift-console/console-6fcb5c5f8d-mn66d" Apr 21 15:13:18.230067 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:18.230003 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/c1af4f4a-515b-4c5b-966c-9bfea44cd4b5-service-ca\") pod \"console-6fcb5c5f8d-mn66d\" (UID: \"c1af4f4a-515b-4c5b-966c-9bfea44cd4b5\") " pod="openshift-console/console-6fcb5c5f8d-mn66d" Apr 21 15:13:18.230067 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:18.230055 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c1af4f4a-515b-4c5b-966c-9bfea44cd4b5-console-config\") pod \"console-6fcb5c5f8d-mn66d\" (UID: \"c1af4f4a-515b-4c5b-966c-9bfea44cd4b5\") " pod="openshift-console/console-6fcb5c5f8d-mn66d" Apr 21 15:13:18.232264 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:18.232240 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c1af4f4a-515b-4c5b-966c-9bfea44cd4b5-console-oauth-config\") pod \"console-6fcb5c5f8d-mn66d\" (UID: \"c1af4f4a-515b-4c5b-966c-9bfea44cd4b5\") " pod="openshift-console/console-6fcb5c5f8d-mn66d" Apr 21 15:13:18.232428 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:18.232406 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c1af4f4a-515b-4c5b-966c-9bfea44cd4b5-console-serving-cert\") pod \"console-6fcb5c5f8d-mn66d\" (UID: \"c1af4f4a-515b-4c5b-966c-9bfea44cd4b5\") " pod="openshift-console/console-6fcb5c5f8d-mn66d" Apr 21 15:13:18.239513 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:18.239489 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4skj\" (UniqueName: \"kubernetes.io/projected/c1af4f4a-515b-4c5b-966c-9bfea44cd4b5-kube-api-access-q4skj\") pod \"console-6fcb5c5f8d-mn66d\" (UID: \"c1af4f4a-515b-4c5b-966c-9bfea44cd4b5\") " pod="openshift-console/console-6fcb5c5f8d-mn66d" Apr 21 15:13:18.291729 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:18.291690 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6fcb5c5f8d-mn66d" Apr 21 15:13:18.435562 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:18.435353 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fcb5c5f8d-mn66d"] Apr 21 15:13:18.438168 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:13:18.438137 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1af4f4a_515b_4c5b_966c_9bfea44cd4b5.slice/crio-61b43b4a84ea985cd241b6ec28a1b18e676df556f2ad8f9a0a4cbf3866bf3e6d WatchSource:0}: Error finding container 61b43b4a84ea985cd241b6ec28a1b18e676df556f2ad8f9a0a4cbf3866bf3e6d: Status 404 returned error can't find the container with id 61b43b4a84ea985cd241b6ec28a1b18e676df556f2ad8f9a0a4cbf3866bf3e6d Apr 21 15:13:18.463664 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:18.463638 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cgs42" event={"ID":"9de978e2-783c-4dfd-ac14-798e4da5e14a","Type":"ContainerStarted","Data":"79c12d2d3344e39b43aa06f3ac8389a9ea99b3bcd6fdf7bc2fcbc6e8ede402df"} Apr 21 15:13:18.463929 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:18.463906 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cgs42" Apr 21 15:13:18.464993 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:18.464956 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fcb5c5f8d-mn66d" event={"ID":"c1af4f4a-515b-4c5b-966c-9bfea44cd4b5","Type":"ContainerStarted","Data":"61b43b4a84ea985cd241b6ec28a1b18e676df556f2ad8f9a0a4cbf3866bf3e6d"} Apr 21 15:13:18.469237 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:18.469197 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cgs42" Apr 21 15:13:18.479785 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:18.479711 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-cgs42" podStartSLOduration=4.012439132 podStartE2EDuration="5.479700531s" podCreationTimestamp="2026-04-21 15:13:13 +0000 UTC" firstStartedPulling="2026-04-21 15:13:16.798522372 +0000 UTC m=+165.394093230" lastFinishedPulling="2026-04-21 15:13:18.265783776 +0000 UTC m=+166.861354629" observedRunningTime="2026-04-21 15:13:18.479221429 +0000 UTC m=+167.074792303" watchObservedRunningTime="2026-04-21 15:13:18.479700531 +0000 UTC m=+167.075271405" Apr 21 15:13:18.789977 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:18.789895 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-6m8bq"] Apr 21 15:13:18.793544 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:18.793520 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-6m8bq" Apr 21 15:13:18.797277 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:18.797253 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 21 15:13:18.797428 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:18.797311 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 21 15:13:18.797428 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:18.797346 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-bdqvg\"" Apr 21 15:13:18.797563 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:18.797483 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 21 15:13:18.806027 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:18.806000 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-6m8bq"] Apr 21 15:13:18.935562 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:18.935514 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th9nl\" (UniqueName: \"kubernetes.io/projected/064e9703-0c22-4ecf-8b43-2473e8986b8b-kube-api-access-th9nl\") pod \"prometheus-operator-5676c8c784-6m8bq\" (UID: \"064e9703-0c22-4ecf-8b43-2473e8986b8b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-6m8bq" Apr 21 15:13:18.935754 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:18.935593 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/064e9703-0c22-4ecf-8b43-2473e8986b8b-prometheus-operator-kube-rbac-proxy-config\") pod 
\"prometheus-operator-5676c8c784-6m8bq\" (UID: \"064e9703-0c22-4ecf-8b43-2473e8986b8b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-6m8bq" Apr 21 15:13:18.935754 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:18.935641 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/064e9703-0c22-4ecf-8b43-2473e8986b8b-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-6m8bq\" (UID: \"064e9703-0c22-4ecf-8b43-2473e8986b8b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-6m8bq" Apr 21 15:13:18.935754 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:18.935701 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/064e9703-0c22-4ecf-8b43-2473e8986b8b-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-6m8bq\" (UID: \"064e9703-0c22-4ecf-8b43-2473e8986b8b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-6m8bq" Apr 21 15:13:19.037265 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:19.037018 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/064e9703-0c22-4ecf-8b43-2473e8986b8b-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-6m8bq\" (UID: \"064e9703-0c22-4ecf-8b43-2473e8986b8b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-6m8bq" Apr 21 15:13:19.037265 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:19.037081 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-th9nl\" (UniqueName: \"kubernetes.io/projected/064e9703-0c22-4ecf-8b43-2473e8986b8b-kube-api-access-th9nl\") pod \"prometheus-operator-5676c8c784-6m8bq\" (UID: \"064e9703-0c22-4ecf-8b43-2473e8986b8b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-6m8bq" Apr 21 
15:13:19.037265 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:19.037141 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/064e9703-0c22-4ecf-8b43-2473e8986b8b-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-6m8bq\" (UID: \"064e9703-0c22-4ecf-8b43-2473e8986b8b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-6m8bq" Apr 21 15:13:19.037265 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:19.037185 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/064e9703-0c22-4ecf-8b43-2473e8986b8b-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-6m8bq\" (UID: \"064e9703-0c22-4ecf-8b43-2473e8986b8b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-6m8bq" Apr 21 15:13:19.037265 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:13:19.037192 2575 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 21 15:13:19.037265 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:13:19.037265 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/064e9703-0c22-4ecf-8b43-2473e8986b8b-prometheus-operator-tls podName:064e9703-0c22-4ecf-8b43-2473e8986b8b nodeName:}" failed. No retries permitted until 2026-04-21 15:13:19.537244119 +0000 UTC m=+168.132814987 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/064e9703-0c22-4ecf-8b43-2473e8986b8b-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-6m8bq" (UID: "064e9703-0c22-4ecf-8b43-2473e8986b8b") : secret "prometheus-operator-tls" not found Apr 21 15:13:19.038017 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:19.037921 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/064e9703-0c22-4ecf-8b43-2473e8986b8b-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-6m8bq\" (UID: \"064e9703-0c22-4ecf-8b43-2473e8986b8b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-6m8bq" Apr 21 15:13:19.042876 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:19.042813 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/064e9703-0c22-4ecf-8b43-2473e8986b8b-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-6m8bq\" (UID: \"064e9703-0c22-4ecf-8b43-2473e8986b8b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-6m8bq" Apr 21 15:13:19.047346 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:19.047299 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-th9nl\" (UniqueName: \"kubernetes.io/projected/064e9703-0c22-4ecf-8b43-2473e8986b8b-kube-api-access-th9nl\") pod \"prometheus-operator-5676c8c784-6m8bq\" (UID: \"064e9703-0c22-4ecf-8b43-2473e8986b8b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-6m8bq" Apr 21 15:13:19.542672 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:19.542626 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/064e9703-0c22-4ecf-8b43-2473e8986b8b-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-6m8bq\" (UID: 
\"064e9703-0c22-4ecf-8b43-2473e8986b8b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-6m8bq" Apr 21 15:13:19.544098 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:13:19.543461 2575 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 21 15:13:19.544098 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:13:19.543524 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/064e9703-0c22-4ecf-8b43-2473e8986b8b-prometheus-operator-tls podName:064e9703-0c22-4ecf-8b43-2473e8986b8b nodeName:}" failed. No retries permitted until 2026-04-21 15:13:20.543503989 +0000 UTC m=+169.139074850 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/064e9703-0c22-4ecf-8b43-2473e8986b8b-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-6m8bq" (UID: "064e9703-0c22-4ecf-8b43-2473e8986b8b") : secret "prometheus-operator-tls" not found Apr 21 15:13:20.552495 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:20.552448 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/064e9703-0c22-4ecf-8b43-2473e8986b8b-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-6m8bq\" (UID: \"064e9703-0c22-4ecf-8b43-2473e8986b8b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-6m8bq" Apr 21 15:13:20.555486 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:20.555428 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/064e9703-0c22-4ecf-8b43-2473e8986b8b-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-6m8bq\" (UID: \"064e9703-0c22-4ecf-8b43-2473e8986b8b\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-6m8bq" Apr 21 15:13:20.605637 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:20.605529 2575 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-6m8bq" Apr 21 15:13:20.975641 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:20.975572 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dmzt8" Apr 21 15:13:20.978657 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:20.978631 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-l7xgv\"" Apr 21 15:13:20.987364 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:20.986952 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dmzt8" Apr 21 15:13:21.550268 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:21.550226 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-6m8bq"] Apr 21 15:13:21.554652 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:13:21.554608 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod064e9703_0c22_4ecf_8b43_2473e8986b8b.slice/crio-ee168bb62ebbaf4f55f068581ae9507e013bd08de18d6138bebd8eca18c1fc3f WatchSource:0}: Error finding container ee168bb62ebbaf4f55f068581ae9507e013bd08de18d6138bebd8eca18c1fc3f: Status 404 returned error can't find the container with id ee168bb62ebbaf4f55f068581ae9507e013bd08de18d6138bebd8eca18c1fc3f Apr 21 15:13:21.577907 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:21.577862 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dmzt8"] Apr 21 15:13:21.580923 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:13:21.580897 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod578cec2c_16fd_469e_931a_b7cf421795a1.slice/crio-36ae2884ff8c6388b2cb154af5661f8d726e1e46c11ac754ec776d8c0c55c3dd WatchSource:0}: Error finding container 36ae2884ff8c6388b2cb154af5661f8d726e1e46c11ac754ec776d8c0c55c3dd: Status 404 returned error can't find the container with id 36ae2884ff8c6388b2cb154af5661f8d726e1e46c11ac754ec776d8c0c55c3dd Apr 21 15:13:21.979612 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:21.979584 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96snf" Apr 21 15:13:22.491471 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:22.491408 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dmzt8" event={"ID":"578cec2c-16fd-469e-931a-b7cf421795a1","Type":"ContainerStarted","Data":"36ae2884ff8c6388b2cb154af5661f8d726e1e46c11ac754ec776d8c0c55c3dd"} Apr 21 15:13:22.493144 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:22.493088 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fcb5c5f8d-mn66d" event={"ID":"c1af4f4a-515b-4c5b-966c-9bfea44cd4b5","Type":"ContainerStarted","Data":"9c863311a5db4055a8a952d8a700efcec00f08d965bc2a80833d738d42e9278a"} Apr 21 15:13:22.494521 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:22.494493 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-6m8bq" event={"ID":"064e9703-0c22-4ecf-8b43-2473e8986b8b","Type":"ContainerStarted","Data":"ee168bb62ebbaf4f55f068581ae9507e013bd08de18d6138bebd8eca18c1fc3f"} Apr 21 15:13:22.513707 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:22.513651 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6fcb5c5f8d-mn66d" podStartSLOduration=2.515146891 podStartE2EDuration="5.51363436s" podCreationTimestamp="2026-04-21 15:13:17 +0000 UTC" 
firstStartedPulling="2026-04-21 15:13:18.44066594 +0000 UTC m=+167.036236798" lastFinishedPulling="2026-04-21 15:13:21.439153405 +0000 UTC m=+170.034724267" observedRunningTime="2026-04-21 15:13:22.511535872 +0000 UTC m=+171.107106749" watchObservedRunningTime="2026-04-21 15:13:22.51363436 +0000 UTC m=+171.109205236" Apr 21 15:13:28.292974 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:28.292932 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6fcb5c5f8d-mn66d" Apr 21 15:13:28.292974 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:28.292984 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6fcb5c5f8d-mn66d" Apr 21 15:13:28.294593 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:28.294561 2575 patch_prober.go:28] interesting pod/console-6fcb5c5f8d-mn66d container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.132.0.19:8443/health\": dial tcp 10.132.0.19:8443: connect: connection refused" start-of-body= Apr 21 15:13:28.294762 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:28.294616 2575 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-6fcb5c5f8d-mn66d" podUID="c1af4f4a-515b-4c5b-966c-9bfea44cd4b5" containerName="console" probeResult="failure" output="Get \"https://10.132.0.19:8443/health\": dial tcp 10.132.0.19:8443: connect: connection refused" Apr 21 15:13:28.471704 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:28.471661 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-fr77k" Apr 21 15:13:30.864190 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:30.864115 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6fcb5c5f8d-mn66d"] Apr 21 15:13:31.523329 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:31.523284 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/downloads-6bcc868b7-t25vf" event={"ID":"abead490-d43d-4f40-bf13-41e9c8573f7f","Type":"ContainerStarted","Data":"640f1ec6778051da729da3ae19b60f24156ba81b0aab2e6f166491665f325d6b"} Apr 21 15:13:31.523545 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:31.523499 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-t25vf" Apr 21 15:13:31.524857 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:31.524826 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dmzt8" event={"ID":"578cec2c-16fd-469e-931a-b7cf421795a1","Type":"ContainerStarted","Data":"155d67d3f0dd1353b8094886af481c6ecd963ef7062b34c1557195f914318075"} Apr 21 15:13:31.526775 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:31.526750 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-6m8bq" event={"ID":"064e9703-0c22-4ecf-8b43-2473e8986b8b","Type":"ContainerStarted","Data":"ed3b26560a741668a137c66affe99dcc9e1fe133bab199bb8fe20cdc467fd465"} Apr 21 15:13:31.526865 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:31.526781 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-6m8bq" event={"ID":"064e9703-0c22-4ecf-8b43-2473e8986b8b","Type":"ContainerStarted","Data":"9abc93040697386472599e303ae718de29ec1846089814115c022c76687db476"} Apr 21 15:13:31.538273 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:31.538246 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-t25vf" Apr 21 15:13:31.543149 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:31.543106 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-t25vf" podStartSLOduration=2.432751364 podStartE2EDuration="19.543094016s" podCreationTimestamp="2026-04-21 15:13:12 +0000 UTC" 
firstStartedPulling="2026-04-21 15:13:13.4298285 +0000 UTC m=+162.025399354" lastFinishedPulling="2026-04-21 15:13:30.540171153 +0000 UTC m=+179.135742006" observedRunningTime="2026-04-21 15:13:31.541700623 +0000 UTC m=+180.137271501" watchObservedRunningTime="2026-04-21 15:13:31.543094016 +0000 UTC m=+180.138664890" Apr 21 15:13:31.559797 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:31.559742 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-6m8bq" podStartSLOduration=4.620800043 podStartE2EDuration="13.559726741s" podCreationTimestamp="2026-04-21 15:13:18 +0000 UTC" firstStartedPulling="2026-04-21 15:13:21.556523618 +0000 UTC m=+170.152094487" lastFinishedPulling="2026-04-21 15:13:30.49545033 +0000 UTC m=+179.091021185" observedRunningTime="2026-04-21 15:13:31.559121689 +0000 UTC m=+180.154692560" watchObservedRunningTime="2026-04-21 15:13:31.559726741 +0000 UTC m=+180.155297617" Apr 21 15:13:31.576074 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:31.576033 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-dmzt8" podStartSLOduration=137.661071081 podStartE2EDuration="2m26.576021158s" podCreationTimestamp="2026-04-21 15:11:05 +0000 UTC" firstStartedPulling="2026-04-21 15:13:21.582727586 +0000 UTC m=+170.178298456" lastFinishedPulling="2026-04-21 15:13:30.497677679 +0000 UTC m=+179.093248533" observedRunningTime="2026-04-21 15:13:31.574734721 +0000 UTC m=+180.170305597" watchObservedRunningTime="2026-04-21 15:13:31.576021158 +0000 UTC m=+180.171592032" Apr 21 15:13:33.173443 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.173407 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-5rlzv"] Apr 21 15:13:33.204919 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.204884 2575 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/node-exporter-h5wdv"] Apr 21 15:13:33.205189 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.205129 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-5rlzv" Apr 21 15:13:33.209347 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.209323 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-rlfrl\"" Apr 21 15:13:33.210019 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.209757 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 21 15:13:33.210144 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.210056 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 21 15:13:33.210298 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.210220 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 21 15:13:33.221878 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.220940 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-5rlzv"] Apr 21 15:13:33.221878 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.221060 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-h5wdv" Apr 21 15:13:33.226676 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.226496 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 21 15:13:33.227353 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.226911 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 21 15:13:33.227353 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.227213 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 21 15:13:33.227991 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.227831 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-qhrt9\"" Apr 21 15:13:33.276115 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.276083 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f72e9f53-15f9-4ca0-9463-60b025086a02-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-5rlzv\" (UID: \"f72e9f53-15f9-4ca0-9463-60b025086a02\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5rlzv" Apr 21 15:13:33.276273 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.276128 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f72e9f53-15f9-4ca0-9463-60b025086a02-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-5rlzv\" (UID: \"f72e9f53-15f9-4ca0-9463-60b025086a02\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5rlzv" Apr 21 15:13:33.276273 ip-10-0-131-11 
kubenswrapper[2575]: I0421 15:13:33.276166 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/387913de-81bd-4750-b6c9-7e10d0d68401-node-exporter-textfile\") pod \"node-exporter-h5wdv\" (UID: \"387913de-81bd-4750-b6c9-7e10d0d68401\") " pod="openshift-monitoring/node-exporter-h5wdv" Apr 21 15:13:33.276273 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.276209 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f72e9f53-15f9-4ca0-9463-60b025086a02-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-5rlzv\" (UID: \"f72e9f53-15f9-4ca0-9463-60b025086a02\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5rlzv" Apr 21 15:13:33.276273 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.276248 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/387913de-81bd-4750-b6c9-7e10d0d68401-sys\") pod \"node-exporter-h5wdv\" (UID: \"387913de-81bd-4750-b6c9-7e10d0d68401\") " pod="openshift-monitoring/node-exporter-h5wdv" Apr 21 15:13:33.276273 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.276272 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/387913de-81bd-4750-b6c9-7e10d0d68401-root\") pod \"node-exporter-h5wdv\" (UID: \"387913de-81bd-4750-b6c9-7e10d0d68401\") " pod="openshift-monitoring/node-exporter-h5wdv" Apr 21 15:13:33.276561 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.276314 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f72e9f53-15f9-4ca0-9463-60b025086a02-metrics-client-ca\") pod 
\"kube-state-metrics-69db897b98-5rlzv\" (UID: \"f72e9f53-15f9-4ca0-9463-60b025086a02\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5rlzv" Apr 21 15:13:33.276561 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.276347 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/387913de-81bd-4750-b6c9-7e10d0d68401-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-h5wdv\" (UID: \"387913de-81bd-4750-b6c9-7e10d0d68401\") " pod="openshift-monitoring/node-exporter-h5wdv" Apr 21 15:13:33.276561 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.276404 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f72e9f53-15f9-4ca0-9463-60b025086a02-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-5rlzv\" (UID: \"f72e9f53-15f9-4ca0-9463-60b025086a02\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5rlzv" Apr 21 15:13:33.276561 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.276430 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/387913de-81bd-4750-b6c9-7e10d0d68401-node-exporter-accelerators-collector-config\") pod \"node-exporter-h5wdv\" (UID: \"387913de-81bd-4750-b6c9-7e10d0d68401\") " pod="openshift-monitoring/node-exporter-h5wdv" Apr 21 15:13:33.276561 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.276459 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wrwx\" (UniqueName: \"kubernetes.io/projected/f72e9f53-15f9-4ca0-9463-60b025086a02-kube-api-access-5wrwx\") pod \"kube-state-metrics-69db897b98-5rlzv\" (UID: \"f72e9f53-15f9-4ca0-9463-60b025086a02\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-5rlzv" Apr 21 15:13:33.276561 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.276492 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/387913de-81bd-4750-b6c9-7e10d0d68401-node-exporter-tls\") pod \"node-exporter-h5wdv\" (UID: \"387913de-81bd-4750-b6c9-7e10d0d68401\") " pod="openshift-monitoring/node-exporter-h5wdv" Apr 21 15:13:33.276561 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.276515 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4km6\" (UniqueName: \"kubernetes.io/projected/387913de-81bd-4750-b6c9-7e10d0d68401-kube-api-access-p4km6\") pod \"node-exporter-h5wdv\" (UID: \"387913de-81bd-4750-b6c9-7e10d0d68401\") " pod="openshift-monitoring/node-exporter-h5wdv" Apr 21 15:13:33.276561 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.276549 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/387913de-81bd-4750-b6c9-7e10d0d68401-metrics-client-ca\") pod \"node-exporter-h5wdv\" (UID: \"387913de-81bd-4750-b6c9-7e10d0d68401\") " pod="openshift-monitoring/node-exporter-h5wdv" Apr 21 15:13:33.276931 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.276574 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/387913de-81bd-4750-b6c9-7e10d0d68401-node-exporter-wtmp\") pod \"node-exporter-h5wdv\" (UID: \"387913de-81bd-4750-b6c9-7e10d0d68401\") " pod="openshift-monitoring/node-exporter-h5wdv" Apr 21 15:13:33.377241 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.377203 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/387913de-81bd-4750-b6c9-7e10d0d68401-sys\") pod \"node-exporter-h5wdv\" (UID: \"387913de-81bd-4750-b6c9-7e10d0d68401\") " pod="openshift-monitoring/node-exporter-h5wdv" Apr 21 15:13:33.377440 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.377246 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/387913de-81bd-4750-b6c9-7e10d0d68401-root\") pod \"node-exporter-h5wdv\" (UID: \"387913de-81bd-4750-b6c9-7e10d0d68401\") " pod="openshift-monitoring/node-exporter-h5wdv" Apr 21 15:13:33.377440 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.377298 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f72e9f53-15f9-4ca0-9463-60b025086a02-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-5rlzv\" (UID: \"f72e9f53-15f9-4ca0-9463-60b025086a02\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5rlzv" Apr 21 15:13:33.377440 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.377316 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/387913de-81bd-4750-b6c9-7e10d0d68401-root\") pod \"node-exporter-h5wdv\" (UID: \"387913de-81bd-4750-b6c9-7e10d0d68401\") " pod="openshift-monitoring/node-exporter-h5wdv" Apr 21 15:13:33.377440 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.377334 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/387913de-81bd-4750-b6c9-7e10d0d68401-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-h5wdv\" (UID: \"387913de-81bd-4750-b6c9-7e10d0d68401\") " pod="openshift-monitoring/node-exporter-h5wdv" Apr 21 15:13:33.377440 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.377386 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f72e9f53-15f9-4ca0-9463-60b025086a02-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-5rlzv\" (UID: \"f72e9f53-15f9-4ca0-9463-60b025086a02\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5rlzv" Apr 21 15:13:33.377440 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.377407 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/387913de-81bd-4750-b6c9-7e10d0d68401-node-exporter-accelerators-collector-config\") pod \"node-exporter-h5wdv\" (UID: \"387913de-81bd-4750-b6c9-7e10d0d68401\") " pod="openshift-monitoring/node-exporter-h5wdv" Apr 21 15:13:33.377440 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.377428 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5wrwx\" (UniqueName: \"kubernetes.io/projected/f72e9f53-15f9-4ca0-9463-60b025086a02-kube-api-access-5wrwx\") pod \"kube-state-metrics-69db897b98-5rlzv\" (UID: \"f72e9f53-15f9-4ca0-9463-60b025086a02\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5rlzv" Apr 21 15:13:33.377792 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.377452 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/387913de-81bd-4750-b6c9-7e10d0d68401-node-exporter-tls\") pod \"node-exporter-h5wdv\" (UID: \"387913de-81bd-4750-b6c9-7e10d0d68401\") " pod="openshift-monitoring/node-exporter-h5wdv" Apr 21 15:13:33.377792 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.377468 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4km6\" (UniqueName: \"kubernetes.io/projected/387913de-81bd-4750-b6c9-7e10d0d68401-kube-api-access-p4km6\") pod \"node-exporter-h5wdv\" (UID: \"387913de-81bd-4750-b6c9-7e10d0d68401\") " 
pod="openshift-monitoring/node-exporter-h5wdv" Apr 21 15:13:33.377792 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.377493 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/387913de-81bd-4750-b6c9-7e10d0d68401-metrics-client-ca\") pod \"node-exporter-h5wdv\" (UID: \"387913de-81bd-4750-b6c9-7e10d0d68401\") " pod="openshift-monitoring/node-exporter-h5wdv" Apr 21 15:13:33.377792 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.377509 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/387913de-81bd-4750-b6c9-7e10d0d68401-node-exporter-wtmp\") pod \"node-exporter-h5wdv\" (UID: \"387913de-81bd-4750-b6c9-7e10d0d68401\") " pod="openshift-monitoring/node-exporter-h5wdv" Apr 21 15:13:33.377792 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.377531 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f72e9f53-15f9-4ca0-9463-60b025086a02-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-5rlzv\" (UID: \"f72e9f53-15f9-4ca0-9463-60b025086a02\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5rlzv" Apr 21 15:13:33.377792 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.377547 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f72e9f53-15f9-4ca0-9463-60b025086a02-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-5rlzv\" (UID: \"f72e9f53-15f9-4ca0-9463-60b025086a02\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5rlzv" Apr 21 15:13:33.377792 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.377574 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" 
(UniqueName: \"kubernetes.io/empty-dir/387913de-81bd-4750-b6c9-7e10d0d68401-node-exporter-textfile\") pod \"node-exporter-h5wdv\" (UID: \"387913de-81bd-4750-b6c9-7e10d0d68401\") " pod="openshift-monitoring/node-exporter-h5wdv" Apr 21 15:13:33.377792 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.377601 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f72e9f53-15f9-4ca0-9463-60b025086a02-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-5rlzv\" (UID: \"f72e9f53-15f9-4ca0-9463-60b025086a02\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5rlzv" Apr 21 15:13:33.378163 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.377828 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/387913de-81bd-4750-b6c9-7e10d0d68401-sys\") pod \"node-exporter-h5wdv\" (UID: \"387913de-81bd-4750-b6c9-7e10d0d68401\") " pod="openshift-monitoring/node-exporter-h5wdv" Apr 21 15:13:33.378163 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:13:33.377878 2575 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 21 15:13:33.378163 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.377915 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f72e9f53-15f9-4ca0-9463-60b025086a02-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-5rlzv\" (UID: \"f72e9f53-15f9-4ca0-9463-60b025086a02\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5rlzv" Apr 21 15:13:33.378163 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:13:33.377942 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f72e9f53-15f9-4ca0-9463-60b025086a02-kube-state-metrics-tls podName:f72e9f53-15f9-4ca0-9463-60b025086a02 nodeName:}" failed. 
No retries permitted until 2026-04-21 15:13:33.877922248 +0000 UTC m=+182.473493106 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/f72e9f53-15f9-4ca0-9463-60b025086a02-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-5rlzv" (UID: "f72e9f53-15f9-4ca0-9463-60b025086a02") : secret "kube-state-metrics-tls" not found Apr 21 15:13:33.378163 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.378056 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/387913de-81bd-4750-b6c9-7e10d0d68401-node-exporter-wtmp\") pod \"node-exporter-h5wdv\" (UID: \"387913de-81bd-4750-b6c9-7e10d0d68401\") " pod="openshift-monitoring/node-exporter-h5wdv" Apr 21 15:13:33.378163 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.378155 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/387913de-81bd-4750-b6c9-7e10d0d68401-node-exporter-textfile\") pod \"node-exporter-h5wdv\" (UID: \"387913de-81bd-4750-b6c9-7e10d0d68401\") " pod="openshift-monitoring/node-exporter-h5wdv" Apr 21 15:13:33.378465 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:13:33.378338 2575 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 21 15:13:33.378465 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.378355 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/387913de-81bd-4750-b6c9-7e10d0d68401-metrics-client-ca\") pod \"node-exporter-h5wdv\" (UID: \"387913de-81bd-4750-b6c9-7e10d0d68401\") " pod="openshift-monitoring/node-exporter-h5wdv" Apr 21 15:13:33.378465 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:13:33.378401 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/387913de-81bd-4750-b6c9-7e10d0d68401-node-exporter-tls podName:387913de-81bd-4750-b6c9-7e10d0d68401 nodeName:}" failed. No retries permitted until 2026-04-21 15:13:33.87836356 +0000 UTC m=+182.473934417 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/387913de-81bd-4750-b6c9-7e10d0d68401-node-exporter-tls") pod "node-exporter-h5wdv" (UID: "387913de-81bd-4750-b6c9-7e10d0d68401") : secret "node-exporter-tls" not found Apr 21 15:13:33.378608 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.378481 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f72e9f53-15f9-4ca0-9463-60b025086a02-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-5rlzv\" (UID: \"f72e9f53-15f9-4ca0-9463-60b025086a02\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5rlzv" Apr 21 15:13:33.378873 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.378849 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/387913de-81bd-4750-b6c9-7e10d0d68401-node-exporter-accelerators-collector-config\") pod \"node-exporter-h5wdv\" (UID: \"387913de-81bd-4750-b6c9-7e10d0d68401\") " pod="openshift-monitoring/node-exporter-h5wdv" Apr 21 15:13:33.379173 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.379149 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f72e9f53-15f9-4ca0-9463-60b025086a02-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-5rlzv\" (UID: \"f72e9f53-15f9-4ca0-9463-60b025086a02\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5rlzv" Apr 21 15:13:33.381844 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.381821 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f72e9f53-15f9-4ca0-9463-60b025086a02-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-5rlzv\" (UID: \"f72e9f53-15f9-4ca0-9463-60b025086a02\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5rlzv" Apr 21 15:13:33.381987 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.381920 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/387913de-81bd-4750-b6c9-7e10d0d68401-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-h5wdv\" (UID: \"387913de-81bd-4750-b6c9-7e10d0d68401\") " pod="openshift-monitoring/node-exporter-h5wdv" Apr 21 15:13:33.388406 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.388362 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wrwx\" (UniqueName: \"kubernetes.io/projected/f72e9f53-15f9-4ca0-9463-60b025086a02-kube-api-access-5wrwx\") pod \"kube-state-metrics-69db897b98-5rlzv\" (UID: \"f72e9f53-15f9-4ca0-9463-60b025086a02\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5rlzv" Apr 21 15:13:33.388500 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.388454 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4km6\" (UniqueName: \"kubernetes.io/projected/387913de-81bd-4750-b6c9-7e10d0d68401-kube-api-access-p4km6\") pod \"node-exporter-h5wdv\" (UID: \"387913de-81bd-4750-b6c9-7e10d0d68401\") " pod="openshift-monitoring/node-exporter-h5wdv" Apr 21 15:13:33.882179 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.882140 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/387913de-81bd-4750-b6c9-7e10d0d68401-node-exporter-tls\") pod \"node-exporter-h5wdv\" (UID: 
\"387913de-81bd-4750-b6c9-7e10d0d68401\") " pod="openshift-monitoring/node-exporter-h5wdv" Apr 21 15:13:33.882488 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.882208 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f72e9f53-15f9-4ca0-9463-60b025086a02-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-5rlzv\" (UID: \"f72e9f53-15f9-4ca0-9463-60b025086a02\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5rlzv" Apr 21 15:13:33.885029 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.884998 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/387913de-81bd-4750-b6c9-7e10d0d68401-node-exporter-tls\") pod \"node-exporter-h5wdv\" (UID: \"387913de-81bd-4750-b6c9-7e10d0d68401\") " pod="openshift-monitoring/node-exporter-h5wdv" Apr 21 15:13:33.885159 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.885125 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f72e9f53-15f9-4ca0-9463-60b025086a02-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-5rlzv\" (UID: \"f72e9f53-15f9-4ca0-9463-60b025086a02\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5rlzv" Apr 21 15:13:33.922449 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.922400 2575 patch_prober.go:28] interesting pod/image-registry-85fb9cc7f7-zl6lq container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 21 15:13:33.922617 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:33.922482 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" 
podUID="da72ba61-5c83-47ce-a285-a58cd7b77246" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 15:13:34.124098 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.124055 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-5rlzv" Apr 21 15:13:34.136663 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.136627 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-h5wdv" Apr 21 15:13:34.149041 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:13:34.148949 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod387913de_81bd_4750_b6c9_7e10d0d68401.slice/crio-ef0d2bb1384ecc38277d05957d82c90c4193aebb9378fbdb01bc19b19ea23e6c WatchSource:0}: Error finding container ef0d2bb1384ecc38277d05957d82c90c4193aebb9378fbdb01bc19b19ea23e6c: Status 404 returned error can't find the container with id ef0d2bb1384ecc38277d05957d82c90c4193aebb9378fbdb01bc19b19ea23e6c Apr 21 15:13:34.306171 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.306138 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-5rlzv"] Apr 21 15:13:34.310655 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:13:34.310621 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf72e9f53_15f9_4ca0_9463_60b025086a02.slice/crio-50249907e935fc9b947429243a0e1e484bbaec4f3fed80d4ceb026f10a433ee6 WatchSource:0}: Error finding container 50249907e935fc9b947429243a0e1e484bbaec4f3fed80d4ceb026f10a433ee6: Status 404 returned error can't find the container with id 50249907e935fc9b947429243a0e1e484bbaec4f3fed80d4ceb026f10a433ee6 Apr 21 15:13:34.321951 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.321920 2575 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 15:13:34.350109 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.349439 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 15:13:34.350109 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.349631 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.353812 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.352413 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 21 15:13:34.353812 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.352450 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 21 15:13:34.353812 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.352567 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 21 15:13:34.353812 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.352644 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 21 15:13:34.353812 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.352718 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 21 15:13:34.353812 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.352817 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 21 15:13:34.353812 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.352890 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-6cgth\"" Apr 21 
15:13:34.353812 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.352984 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 21 15:13:34.353812 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.353039 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 21 15:13:34.353812 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.353097 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 21 15:13:34.387300 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.387221 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpt5m\" (UniqueName: \"kubernetes.io/projected/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-kube-api-access-vpt5m\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.387300 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.387268 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-config-out\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.387507 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.387301 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.387507 ip-10-0-131-11 kubenswrapper[2575]: I0421 
15:13:34.387400 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-web-config\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.387507 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.387448 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-tls-assets\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.387507 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.387475 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.387507 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.387492 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.387705 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.387534 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.387705 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.387580 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.387705 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.387645 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.387705 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.387678 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-config-volume\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.387887 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.387715 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 
15:13:34.387887 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.387776 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.490559 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.489093 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-web-config\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.490559 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.489156 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-tls-assets\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.490559 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.489187 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.490559 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.489227 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: 
\"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.490559 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.489253 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.490559 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.489277 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.490559 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.489334 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.490559 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.489368 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-config-volume\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.490559 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.489424 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" 
(UniqueName: \"kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.490559 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.489473 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.490559 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.489514 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vpt5m\" (UniqueName: \"kubernetes.io/projected/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-kube-api-access-vpt5m\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.490559 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.489540 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-config-out\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.490559 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.489564 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.490559 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:13:34.489676 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-alertmanager-trusted-ca-bundle podName:0626f62a-710b-42dd-b9e2-c6d6b5e6f366 nodeName:}" failed. No retries permitted until 2026-04-21 15:13:34.989650328 +0000 UTC m=+183.585221182 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "0626f62a-710b-42dd-b9e2-c6d6b5e6f366") : configmap references non-existent config key: ca-bundle.crt Apr 21 15:13:34.490559 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:13:34.489778 2575 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 21 15:13:34.490559 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:13:34.489823 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-secret-alertmanager-main-tls podName:0626f62a-710b-42dd-b9e2-c6d6b5e6f366 nodeName:}" failed. No retries permitted until 2026-04-21 15:13:34.989809719 +0000 UTC m=+183.585380574 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "0626f62a-710b-42dd-b9e2-c6d6b5e6f366") : secret "alertmanager-main-tls" not found Apr 21 15:13:34.491484 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.490204 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.491484 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.490273 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.492581 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.492555 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-tls-assets\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.493960 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.493934 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-web-config\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.494072 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.493975 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-config-out\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.494072 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.493998 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.494295 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.494276 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.494421 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.494396 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.494862 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.494736 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.495125 
ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.495085 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-config-volume\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.512857 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.512833 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpt5m\" (UniqueName: \"kubernetes.io/projected/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-kube-api-access-vpt5m\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.538074 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.538038 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h5wdv" event={"ID":"387913de-81bd-4750-b6c9-7e10d0d68401","Type":"ContainerStarted","Data":"ef0d2bb1384ecc38277d05957d82c90c4193aebb9378fbdb01bc19b19ea23e6c"} Apr 21 15:13:34.539363 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.539331 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-5rlzv" event={"ID":"f72e9f53-15f9-4ca0-9463-60b025086a02","Type":"ContainerStarted","Data":"50249907e935fc9b947429243a0e1e484bbaec4f3fed80d4ceb026f10a433ee6"} Apr 21 15:13:34.995608 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.995051 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.995608 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.995139 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:34.996884 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:34.996832 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:35.005476 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:35.005433 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:35.262789 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:35.262706 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:13:35.301473 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:35.301333 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-85fb9cc7f7-zl6lq"] Apr 21 15:13:35.307495 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:35.307450 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" Apr 21 15:13:35.429549 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:35.429508 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 15:13:35.581892 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:13:35.581812 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0626f62a_710b_42dd_b9e2_c6d6b5e6f366.slice/crio-37f7d68ff26d326bfab252d02079a6f4582fb101772d1f89cd3e0b6b08cca73c WatchSource:0}: Error finding container 37f7d68ff26d326bfab252d02079a6f4582fb101772d1f89cd3e0b6b08cca73c: Status 404 returned error can't find the container with id 37f7d68ff26d326bfab252d02079a6f4582fb101772d1f89cd3e0b6b08cca73c Apr 21 15:13:36.551086 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:36.551050 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-5rlzv" event={"ID":"f72e9f53-15f9-4ca0-9463-60b025086a02","Type":"ContainerStarted","Data":"5e0f173dc990a246192c5d839d072977fc354e23c2281b56f107f0165e32f8f1"} Apr 21 15:13:36.554322 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:36.554294 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0626f62a-710b-42dd-b9e2-c6d6b5e6f366","Type":"ContainerStarted","Data":"37f7d68ff26d326bfab252d02079a6f4582fb101772d1f89cd3e0b6b08cca73c"} Apr 21 15:13:36.558525 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:36.557606 
2575 generic.go:358] "Generic (PLEG): container finished" podID="387913de-81bd-4750-b6c9-7e10d0d68401" containerID="8426bd9ef6459557df2bf1c2ceca4764c1d14b8558e30f6d8e8cfe65d01e979e" exitCode=0 Apr 21 15:13:36.558525 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:36.557672 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h5wdv" event={"ID":"387913de-81bd-4750-b6c9-7e10d0d68401","Type":"ContainerDied","Data":"8426bd9ef6459557df2bf1c2ceca4764c1d14b8558e30f6d8e8cfe65d01e979e"} Apr 21 15:13:37.563643 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:37.563602 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h5wdv" event={"ID":"387913de-81bd-4750-b6c9-7e10d0d68401","Type":"ContainerStarted","Data":"efec17004893c32a3ff0c697433b3f1a6926e1e9642543e26f9218a4f74587ea"} Apr 21 15:13:37.564082 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:37.563651 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h5wdv" event={"ID":"387913de-81bd-4750-b6c9-7e10d0d68401","Type":"ContainerStarted","Data":"ccacdde2cd0685a4827e1197ffeea559f213d6ad46160448ab144c103abb72e2"} Apr 21 15:13:37.565985 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:37.565956 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-5rlzv" event={"ID":"f72e9f53-15f9-4ca0-9463-60b025086a02","Type":"ContainerStarted","Data":"f49a0ef58d249f19fd166d8c45f3847737b97c6071d86968da2f467de2252089"} Apr 21 15:13:37.566093 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:37.565985 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-5rlzv" event={"ID":"f72e9f53-15f9-4ca0-9463-60b025086a02","Type":"ContainerStarted","Data":"95db8efb1285ec9d91c2939dff604b8452a64d0627c589d1fac3af652cdc1ab5"} Apr 21 15:13:37.585939 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:37.585882 2575 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-h5wdv" podStartSLOduration=2.716066822 podStartE2EDuration="4.585866048s" podCreationTimestamp="2026-04-21 15:13:33 +0000 UTC" firstStartedPulling="2026-04-21 15:13:34.151517709 +0000 UTC m=+182.747088578" lastFinishedPulling="2026-04-21 15:13:36.021316929 +0000 UTC m=+184.616887804" observedRunningTime="2026-04-21 15:13:37.585536553 +0000 UTC m=+186.181107427" watchObservedRunningTime="2026-04-21 15:13:37.585866048 +0000 UTC m=+186.181436923" Apr 21 15:13:37.614976 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:37.614932 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-5rlzv" podStartSLOduration=2.572043711 podStartE2EDuration="4.614920692s" podCreationTimestamp="2026-04-21 15:13:33 +0000 UTC" firstStartedPulling="2026-04-21 15:13:34.313123118 +0000 UTC m=+182.908693970" lastFinishedPulling="2026-04-21 15:13:36.356000092 +0000 UTC m=+184.951570951" observedRunningTime="2026-04-21 15:13:37.612650867 +0000 UTC m=+186.208221746" watchObservedRunningTime="2026-04-21 15:13:37.614920692 +0000 UTC m=+186.210491567" Apr 21 15:13:37.685576 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:37.685531 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7fd774b49d-59zfj"] Apr 21 15:13:37.708344 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:37.708312 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7fd774b49d-59zfj"] Apr 21 15:13:37.708491 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:37.708464 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-7fd774b49d-59zfj" Apr 21 15:13:37.711353 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:37.711329 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 21 15:13:37.711517 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:37.711363 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 21 15:13:37.711517 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:37.711472 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-4eedieh4cra2t\"" Apr 21 15:13:37.711517 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:37.711472 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 21 15:13:37.711673 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:37.711329 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-s8gvs\"" Apr 21 15:13:37.711673 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:37.711329 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 21 15:13:37.822593 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:37.822557 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/c9f97ac5-48ff-4c64-a04a-cb1d469f81ed-secret-metrics-server-client-certs\") pod \"metrics-server-7fd774b49d-59zfj\" (UID: \"c9f97ac5-48ff-4c64-a04a-cb1d469f81ed\") " pod="openshift-monitoring/metrics-server-7fd774b49d-59zfj" Apr 21 15:13:37.822762 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:37.822641 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9f97ac5-48ff-4c64-a04a-cb1d469f81ed-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7fd774b49d-59zfj\" (UID: \"c9f97ac5-48ff-4c64-a04a-cb1d469f81ed\") " pod="openshift-monitoring/metrics-server-7fd774b49d-59zfj" Apr 21 15:13:37.822762 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:37.822674 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td4fz\" (UniqueName: \"kubernetes.io/projected/c9f97ac5-48ff-4c64-a04a-cb1d469f81ed-kube-api-access-td4fz\") pod \"metrics-server-7fd774b49d-59zfj\" (UID: \"c9f97ac5-48ff-4c64-a04a-cb1d469f81ed\") " pod="openshift-monitoring/metrics-server-7fd774b49d-59zfj" Apr 21 15:13:37.822762 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:37.822706 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/c9f97ac5-48ff-4c64-a04a-cb1d469f81ed-metrics-server-audit-profiles\") pod \"metrics-server-7fd774b49d-59zfj\" (UID: \"c9f97ac5-48ff-4c64-a04a-cb1d469f81ed\") " pod="openshift-monitoring/metrics-server-7fd774b49d-59zfj" Apr 21 15:13:37.822762 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:37.822737 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9f97ac5-48ff-4c64-a04a-cb1d469f81ed-client-ca-bundle\") pod \"metrics-server-7fd774b49d-59zfj\" (UID: \"c9f97ac5-48ff-4c64-a04a-cb1d469f81ed\") " pod="openshift-monitoring/metrics-server-7fd774b49d-59zfj" Apr 21 15:13:37.822920 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:37.822808 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: 
\"kubernetes.io/empty-dir/c9f97ac5-48ff-4c64-a04a-cb1d469f81ed-audit-log\") pod \"metrics-server-7fd774b49d-59zfj\" (UID: \"c9f97ac5-48ff-4c64-a04a-cb1d469f81ed\") " pod="openshift-monitoring/metrics-server-7fd774b49d-59zfj" Apr 21 15:13:37.822920 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:37.822873 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/c9f97ac5-48ff-4c64-a04a-cb1d469f81ed-secret-metrics-server-tls\") pod \"metrics-server-7fd774b49d-59zfj\" (UID: \"c9f97ac5-48ff-4c64-a04a-cb1d469f81ed\") " pod="openshift-monitoring/metrics-server-7fd774b49d-59zfj" Apr 21 15:13:37.924436 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:37.924403 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/c9f97ac5-48ff-4c64-a04a-cb1d469f81ed-secret-metrics-server-client-certs\") pod \"metrics-server-7fd774b49d-59zfj\" (UID: \"c9f97ac5-48ff-4c64-a04a-cb1d469f81ed\") " pod="openshift-monitoring/metrics-server-7fd774b49d-59zfj" Apr 21 15:13:37.924600 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:37.924479 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9f97ac5-48ff-4c64-a04a-cb1d469f81ed-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7fd774b49d-59zfj\" (UID: \"c9f97ac5-48ff-4c64-a04a-cb1d469f81ed\") " pod="openshift-monitoring/metrics-server-7fd774b49d-59zfj" Apr 21 15:13:37.924600 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:37.924506 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-td4fz\" (UniqueName: \"kubernetes.io/projected/c9f97ac5-48ff-4c64-a04a-cb1d469f81ed-kube-api-access-td4fz\") pod \"metrics-server-7fd774b49d-59zfj\" (UID: \"c9f97ac5-48ff-4c64-a04a-cb1d469f81ed\") " 
pod="openshift-monitoring/metrics-server-7fd774b49d-59zfj" Apr 21 15:13:37.924600 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:37.924525 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/c9f97ac5-48ff-4c64-a04a-cb1d469f81ed-metrics-server-audit-profiles\") pod \"metrics-server-7fd774b49d-59zfj\" (UID: \"c9f97ac5-48ff-4c64-a04a-cb1d469f81ed\") " pod="openshift-monitoring/metrics-server-7fd774b49d-59zfj" Apr 21 15:13:37.924600 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:37.924555 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9f97ac5-48ff-4c64-a04a-cb1d469f81ed-client-ca-bundle\") pod \"metrics-server-7fd774b49d-59zfj\" (UID: \"c9f97ac5-48ff-4c64-a04a-cb1d469f81ed\") " pod="openshift-monitoring/metrics-server-7fd774b49d-59zfj" Apr 21 15:13:37.924799 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:37.924785 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/c9f97ac5-48ff-4c64-a04a-cb1d469f81ed-audit-log\") pod \"metrics-server-7fd774b49d-59zfj\" (UID: \"c9f97ac5-48ff-4c64-a04a-cb1d469f81ed\") " pod="openshift-monitoring/metrics-server-7fd774b49d-59zfj" Apr 21 15:13:37.924867 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:37.924821 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/c9f97ac5-48ff-4c64-a04a-cb1d469f81ed-secret-metrics-server-tls\") pod \"metrics-server-7fd774b49d-59zfj\" (UID: \"c9f97ac5-48ff-4c64-a04a-cb1d469f81ed\") " pod="openshift-monitoring/metrics-server-7fd774b49d-59zfj" Apr 21 15:13:37.925279 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:37.925244 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/c9f97ac5-48ff-4c64-a04a-cb1d469f81ed-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7fd774b49d-59zfj\" (UID: \"c9f97ac5-48ff-4c64-a04a-cb1d469f81ed\") " pod="openshift-monitoring/metrics-server-7fd774b49d-59zfj" Apr 21 15:13:37.925599 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:37.925571 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/c9f97ac5-48ff-4c64-a04a-cb1d469f81ed-audit-log\") pod \"metrics-server-7fd774b49d-59zfj\" (UID: \"c9f97ac5-48ff-4c64-a04a-cb1d469f81ed\") " pod="openshift-monitoring/metrics-server-7fd774b49d-59zfj" Apr 21 15:13:37.927983 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:37.927939 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/c9f97ac5-48ff-4c64-a04a-cb1d469f81ed-secret-metrics-server-client-certs\") pod \"metrics-server-7fd774b49d-59zfj\" (UID: \"c9f97ac5-48ff-4c64-a04a-cb1d469f81ed\") " pod="openshift-monitoring/metrics-server-7fd774b49d-59zfj" Apr 21 15:13:37.928212 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:37.928173 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/c9f97ac5-48ff-4c64-a04a-cb1d469f81ed-secret-metrics-server-tls\") pod \"metrics-server-7fd774b49d-59zfj\" (UID: \"c9f97ac5-48ff-4c64-a04a-cb1d469f81ed\") " pod="openshift-monitoring/metrics-server-7fd774b49d-59zfj" Apr 21 15:13:37.928334 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:37.928304 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9f97ac5-48ff-4c64-a04a-cb1d469f81ed-client-ca-bundle\") pod \"metrics-server-7fd774b49d-59zfj\" (UID: \"c9f97ac5-48ff-4c64-a04a-cb1d469f81ed\") " pod="openshift-monitoring/metrics-server-7fd774b49d-59zfj" Apr 21 15:13:37.936611 
ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:37.936584 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/c9f97ac5-48ff-4c64-a04a-cb1d469f81ed-metrics-server-audit-profiles\") pod \"metrics-server-7fd774b49d-59zfj\" (UID: \"c9f97ac5-48ff-4c64-a04a-cb1d469f81ed\") " pod="openshift-monitoring/metrics-server-7fd774b49d-59zfj" Apr 21 15:13:37.940937 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:37.940914 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-td4fz\" (UniqueName: \"kubernetes.io/projected/c9f97ac5-48ff-4c64-a04a-cb1d469f81ed-kube-api-access-td4fz\") pod \"metrics-server-7fd774b49d-59zfj\" (UID: \"c9f97ac5-48ff-4c64-a04a-cb1d469f81ed\") " pod="openshift-monitoring/metrics-server-7fd774b49d-59zfj" Apr 21 15:13:38.020200 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:38.020151 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7fd774b49d-59zfj" Apr 21 15:13:38.197577 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:38.197528 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7fd774b49d-59zfj"] Apr 21 15:13:38.201018 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:13:38.200983 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9f97ac5_48ff_4c64_a04a_cb1d469f81ed.slice/crio-048f2ab9602173167afbec89d63fe002c416f220f5b53edec8a7fee0d75ffe4d WatchSource:0}: Error finding container 048f2ab9602173167afbec89d63fe002c416f220f5b53edec8a7fee0d75ffe4d: Status 404 returned error can't find the container with id 048f2ab9602173167afbec89d63fe002c416f220f5b53edec8a7fee0d75ffe4d Apr 21 15:13:38.365879 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:38.365793 2575 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-58bcf894bd-d59tb"] Apr 21 15:13:38.407538 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:38.407495 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-58bcf894bd-d59tb"] Apr 21 15:13:38.407538 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:38.407531 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58bcf894bd-d59tb" Apr 21 15:13:38.416733 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:38.416706 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 21 15:13:38.439172 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:38.439139 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bd3e92c4-7626-48a3-973f-59eb6e889bb2-service-ca\") pod \"console-58bcf894bd-d59tb\" (UID: \"bd3e92c4-7626-48a3-973f-59eb6e889bb2\") " pod="openshift-console/console-58bcf894bd-d59tb" Apr 21 15:13:38.439354 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:38.439185 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bd3e92c4-7626-48a3-973f-59eb6e889bb2-console-oauth-config\") pod \"console-58bcf894bd-d59tb\" (UID: \"bd3e92c4-7626-48a3-973f-59eb6e889bb2\") " pod="openshift-console/console-58bcf894bd-d59tb" Apr 21 15:13:38.439354 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:38.439253 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd3e92c4-7626-48a3-973f-59eb6e889bb2-trusted-ca-bundle\") pod \"console-58bcf894bd-d59tb\" (UID: \"bd3e92c4-7626-48a3-973f-59eb6e889bb2\") " pod="openshift-console/console-58bcf894bd-d59tb" Apr 21 15:13:38.439464 ip-10-0-131-11 kubenswrapper[2575]: 
I0421 15:13:38.439385 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bd3e92c4-7626-48a3-973f-59eb6e889bb2-console-config\") pod \"console-58bcf894bd-d59tb\" (UID: \"bd3e92c4-7626-48a3-973f-59eb6e889bb2\") " pod="openshift-console/console-58bcf894bd-d59tb" Apr 21 15:13:38.439503 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:38.439468 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd3e92c4-7626-48a3-973f-59eb6e889bb2-console-serving-cert\") pod \"console-58bcf894bd-d59tb\" (UID: \"bd3e92c4-7626-48a3-973f-59eb6e889bb2\") " pod="openshift-console/console-58bcf894bd-d59tb" Apr 21 15:13:38.439503 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:38.439496 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srrs7\" (UniqueName: \"kubernetes.io/projected/bd3e92c4-7626-48a3-973f-59eb6e889bb2-kube-api-access-srrs7\") pod \"console-58bcf894bd-d59tb\" (UID: \"bd3e92c4-7626-48a3-973f-59eb6e889bb2\") " pod="openshift-console/console-58bcf894bd-d59tb" Apr 21 15:13:38.439589 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:38.439519 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bd3e92c4-7626-48a3-973f-59eb6e889bb2-oauth-serving-cert\") pod \"console-58bcf894bd-d59tb\" (UID: \"bd3e92c4-7626-48a3-973f-59eb6e889bb2\") " pod="openshift-console/console-58bcf894bd-d59tb" Apr 21 15:13:38.540280 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:38.540248 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bd3e92c4-7626-48a3-973f-59eb6e889bb2-service-ca\") pod \"console-58bcf894bd-d59tb\" (UID: 
\"bd3e92c4-7626-48a3-973f-59eb6e889bb2\") " pod="openshift-console/console-58bcf894bd-d59tb" Apr 21 15:13:38.540462 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:38.540287 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bd3e92c4-7626-48a3-973f-59eb6e889bb2-console-oauth-config\") pod \"console-58bcf894bd-d59tb\" (UID: \"bd3e92c4-7626-48a3-973f-59eb6e889bb2\") " pod="openshift-console/console-58bcf894bd-d59tb" Apr 21 15:13:38.540462 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:38.540329 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd3e92c4-7626-48a3-973f-59eb6e889bb2-trusted-ca-bundle\") pod \"console-58bcf894bd-d59tb\" (UID: \"bd3e92c4-7626-48a3-973f-59eb6e889bb2\") " pod="openshift-console/console-58bcf894bd-d59tb" Apr 21 15:13:38.540462 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:38.540361 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bd3e92c4-7626-48a3-973f-59eb6e889bb2-console-config\") pod \"console-58bcf894bd-d59tb\" (UID: \"bd3e92c4-7626-48a3-973f-59eb6e889bb2\") " pod="openshift-console/console-58bcf894bd-d59tb" Apr 21 15:13:38.540462 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:38.540434 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd3e92c4-7626-48a3-973f-59eb6e889bb2-console-serving-cert\") pod \"console-58bcf894bd-d59tb\" (UID: \"bd3e92c4-7626-48a3-973f-59eb6e889bb2\") " pod="openshift-console/console-58bcf894bd-d59tb" Apr 21 15:13:38.540692 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:38.540466 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-srrs7\" (UniqueName: 
\"kubernetes.io/projected/bd3e92c4-7626-48a3-973f-59eb6e889bb2-kube-api-access-srrs7\") pod \"console-58bcf894bd-d59tb\" (UID: \"bd3e92c4-7626-48a3-973f-59eb6e889bb2\") " pod="openshift-console/console-58bcf894bd-d59tb" Apr 21 15:13:38.540692 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:38.540616 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bd3e92c4-7626-48a3-973f-59eb6e889bb2-oauth-serving-cert\") pod \"console-58bcf894bd-d59tb\" (UID: \"bd3e92c4-7626-48a3-973f-59eb6e889bb2\") " pod="openshift-console/console-58bcf894bd-d59tb" Apr 21 15:13:38.541230 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:38.541079 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bd3e92c4-7626-48a3-973f-59eb6e889bb2-console-config\") pod \"console-58bcf894bd-d59tb\" (UID: \"bd3e92c4-7626-48a3-973f-59eb6e889bb2\") " pod="openshift-console/console-58bcf894bd-d59tb" Apr 21 15:13:38.541345 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:38.541327 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bd3e92c4-7626-48a3-973f-59eb6e889bb2-oauth-serving-cert\") pod \"console-58bcf894bd-d59tb\" (UID: \"bd3e92c4-7626-48a3-973f-59eb6e889bb2\") " pod="openshift-console/console-58bcf894bd-d59tb" Apr 21 15:13:38.541847 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:38.541820 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bd3e92c4-7626-48a3-973f-59eb6e889bb2-service-ca\") pod \"console-58bcf894bd-d59tb\" (UID: \"bd3e92c4-7626-48a3-973f-59eb6e889bb2\") " pod="openshift-console/console-58bcf894bd-d59tb" Apr 21 15:13:38.542004 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:38.541900 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd3e92c4-7626-48a3-973f-59eb6e889bb2-trusted-ca-bundle\") pod \"console-58bcf894bd-d59tb\" (UID: \"bd3e92c4-7626-48a3-973f-59eb6e889bb2\") " pod="openshift-console/console-58bcf894bd-d59tb" Apr 21 15:13:38.543348 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:38.543316 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd3e92c4-7626-48a3-973f-59eb6e889bb2-console-serving-cert\") pod \"console-58bcf894bd-d59tb\" (UID: \"bd3e92c4-7626-48a3-973f-59eb6e889bb2\") " pod="openshift-console/console-58bcf894bd-d59tb" Apr 21 15:13:38.543458 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:38.543404 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bd3e92c4-7626-48a3-973f-59eb6e889bb2-console-oauth-config\") pod \"console-58bcf894bd-d59tb\" (UID: \"bd3e92c4-7626-48a3-973f-59eb6e889bb2\") " pod="openshift-console/console-58bcf894bd-d59tb" Apr 21 15:13:38.550307 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:38.550287 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-srrs7\" (UniqueName: \"kubernetes.io/projected/bd3e92c4-7626-48a3-973f-59eb6e889bb2-kube-api-access-srrs7\") pod \"console-58bcf894bd-d59tb\" (UID: \"bd3e92c4-7626-48a3-973f-59eb6e889bb2\") " pod="openshift-console/console-58bcf894bd-d59tb" Apr 21 15:13:38.571544 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:38.571502 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7fd774b49d-59zfj" event={"ID":"c9f97ac5-48ff-4c64-a04a-cb1d469f81ed","Type":"ContainerStarted","Data":"048f2ab9602173167afbec89d63fe002c416f220f5b53edec8a7fee0d75ffe4d"} Apr 21 15:13:38.573146 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:38.573113 2575 generic.go:358] "Generic (PLEG): container finished" 
podID="0626f62a-710b-42dd-b9e2-c6d6b5e6f366" containerID="d7fdb01dd8d723f5665aadc14662288b35e78b6e6c69463617847ebf90b424ec" exitCode=0 Apr 21 15:13:38.573555 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:38.573522 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0626f62a-710b-42dd-b9e2-c6d6b5e6f366","Type":"ContainerDied","Data":"d7fdb01dd8d723f5665aadc14662288b35e78b6e6c69463617847ebf90b424ec"} Apr 21 15:13:38.719861 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:38.719816 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58bcf894bd-d59tb" Apr 21 15:13:38.874741 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:38.874708 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-58bcf894bd-d59tb"] Apr 21 15:13:38.878627 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:13:38.878591 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd3e92c4_7626_48a3_973f_59eb6e889bb2.slice/crio-1a2117ff622a9310653a54ca602e7d0b5d004821d338eaf34acdc37cf89d1660 WatchSource:0}: Error finding container 1a2117ff622a9310653a54ca602e7d0b5d004821d338eaf34acdc37cf89d1660: Status 404 returned error can't find the container with id 1a2117ff622a9310653a54ca602e7d0b5d004821d338eaf34acdc37cf89d1660 Apr 21 15:13:39.580113 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:39.580072 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58bcf894bd-d59tb" event={"ID":"bd3e92c4-7626-48a3-973f-59eb6e889bb2","Type":"ContainerStarted","Data":"e1b52da2dbc7bacfe5df43c6b0ee1ba5c8f4be3bc67008e8854286d3c8c64a50"} Apr 21 15:13:39.580817 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:39.580122 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58bcf894bd-d59tb" 
event={"ID":"bd3e92c4-7626-48a3-973f-59eb6e889bb2","Type":"ContainerStarted","Data":"1a2117ff622a9310653a54ca602e7d0b5d004821d338eaf34acdc37cf89d1660"} Apr 21 15:13:39.610628 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:39.608444 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-58bcf894bd-d59tb" podStartSLOduration=1.608424806 podStartE2EDuration="1.608424806s" podCreationTimestamp="2026-04-21 15:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:13:39.608323896 +0000 UTC m=+188.203894776" watchObservedRunningTime="2026-04-21 15:13:39.608424806 +0000 UTC m=+188.203995682" Apr 21 15:13:41.587782 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:41.587753 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7fd774b49d-59zfj" event={"ID":"c9f97ac5-48ff-4c64-a04a-cb1d469f81ed","Type":"ContainerStarted","Data":"feaaa6d35960f79eae14b2e2a6cc61de7d5c3a252c5ddcd2447a1376a77858cc"} Apr 21 15:13:41.590217 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:41.590195 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0626f62a-710b-42dd-b9e2-c6d6b5e6f366","Type":"ContainerStarted","Data":"a5b0ff5531764eb8f8c0ce0713296843dd1bfc98b5ea6a1f39637ad1e0f93141"} Apr 21 15:13:41.590308 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:41.590221 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0626f62a-710b-42dd-b9e2-c6d6b5e6f366","Type":"ContainerStarted","Data":"d9b646ee47cbe77c5848fa4af606b2e81f51256ce78ccc2f828594926f05d49d"} Apr 21 15:13:41.590308 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:41.590231 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"0626f62a-710b-42dd-b9e2-c6d6b5e6f366","Type":"ContainerStarted","Data":"cb4e507912ea1cb2ee445f6368a7b6976410c80b27dfaa03fda9c43600a8618f"} Apr 21 15:13:41.590308 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:41.590239 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0626f62a-710b-42dd-b9e2-c6d6b5e6f366","Type":"ContainerStarted","Data":"40ccc0a45a60ef0829dc6285a353a513b15d1b0cf1818bfa0f46eb486e757fa9"} Apr 21 15:13:41.606116 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:41.606075 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7fd774b49d-59zfj" podStartSLOduration=1.885945268 podStartE2EDuration="4.606062118s" podCreationTimestamp="2026-04-21 15:13:37 +0000 UTC" firstStartedPulling="2026-04-21 15:13:38.20343058 +0000 UTC m=+186.799001433" lastFinishedPulling="2026-04-21 15:13:40.923547415 +0000 UTC m=+189.519118283" observedRunningTime="2026-04-21 15:13:41.604425421 +0000 UTC m=+190.199996296" watchObservedRunningTime="2026-04-21 15:13:41.606062118 +0000 UTC m=+190.201632984" Apr 21 15:13:42.598171 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:42.598128 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0626f62a-710b-42dd-b9e2-c6d6b5e6f366","Type":"ContainerStarted","Data":"74c32e95717b86694a57a8e969ffa7824026705f900697bc0af5e8d411130cec"} Apr 21 15:13:43.460708 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:43.460669 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-58bcf894bd-d59tb"] Apr 21 15:13:43.489569 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:43.489541 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5d85b7f7b6-jzl76"] Apr 21 15:13:43.516486 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:43.516458 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/console-5d85b7f7b6-jzl76"] Apr 21 15:13:43.516628 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:43.516569 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d85b7f7b6-jzl76" Apr 21 15:13:43.601523 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:43.601493 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz64l\" (UniqueName: \"kubernetes.io/projected/9dc0fad1-319d-4a38-a347-924981ab27d0-kube-api-access-jz64l\") pod \"console-5d85b7f7b6-jzl76\" (UID: \"9dc0fad1-319d-4a38-a347-924981ab27d0\") " pod="openshift-console/console-5d85b7f7b6-jzl76" Apr 21 15:13:43.601962 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:43.601535 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9dc0fad1-319d-4a38-a347-924981ab27d0-console-serving-cert\") pod \"console-5d85b7f7b6-jzl76\" (UID: \"9dc0fad1-319d-4a38-a347-924981ab27d0\") " pod="openshift-console/console-5d85b7f7b6-jzl76" Apr 21 15:13:43.601962 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:43.601585 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9dc0fad1-319d-4a38-a347-924981ab27d0-oauth-serving-cert\") pod \"console-5d85b7f7b6-jzl76\" (UID: \"9dc0fad1-319d-4a38-a347-924981ab27d0\") " pod="openshift-console/console-5d85b7f7b6-jzl76" Apr 21 15:13:43.601962 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:43.601620 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9dc0fad1-319d-4a38-a347-924981ab27d0-console-config\") pod \"console-5d85b7f7b6-jzl76\" (UID: \"9dc0fad1-319d-4a38-a347-924981ab27d0\") " pod="openshift-console/console-5d85b7f7b6-jzl76" Apr 21 
15:13:43.601962 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:43.601718 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9dc0fad1-319d-4a38-a347-924981ab27d0-console-oauth-config\") pod \"console-5d85b7f7b6-jzl76\" (UID: \"9dc0fad1-319d-4a38-a347-924981ab27d0\") " pod="openshift-console/console-5d85b7f7b6-jzl76" Apr 21 15:13:43.601962 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:43.601764 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9dc0fad1-319d-4a38-a347-924981ab27d0-service-ca\") pod \"console-5d85b7f7b6-jzl76\" (UID: \"9dc0fad1-319d-4a38-a347-924981ab27d0\") " pod="openshift-console/console-5d85b7f7b6-jzl76" Apr 21 15:13:43.601962 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:43.601793 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9dc0fad1-319d-4a38-a347-924981ab27d0-trusted-ca-bundle\") pod \"console-5d85b7f7b6-jzl76\" (UID: \"9dc0fad1-319d-4a38-a347-924981ab27d0\") " pod="openshift-console/console-5d85b7f7b6-jzl76" Apr 21 15:13:43.604420 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:43.604363 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0626f62a-710b-42dd-b9e2-c6d6b5e6f366","Type":"ContainerStarted","Data":"7c20c269a078c4ebd4bb807deff7a57a782c79512a47c871d2272af881b34c37"} Apr 21 15:13:43.633433 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:43.633361 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.406075616 podStartE2EDuration="9.633347032s" podCreationTimestamp="2026-04-21 15:13:34 +0000 UTC" firstStartedPulling="2026-04-21 15:13:35.610446116 +0000 UTC 
m=+184.206016971" lastFinishedPulling="2026-04-21 15:13:42.837717531 +0000 UTC m=+191.433288387" observedRunningTime="2026-04-21 15:13:43.631140006 +0000 UTC m=+192.226710881" watchObservedRunningTime="2026-04-21 15:13:43.633347032 +0000 UTC m=+192.228917904" Apr 21 15:13:43.702824 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:43.702791 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jz64l\" (UniqueName: \"kubernetes.io/projected/9dc0fad1-319d-4a38-a347-924981ab27d0-kube-api-access-jz64l\") pod \"console-5d85b7f7b6-jzl76\" (UID: \"9dc0fad1-319d-4a38-a347-924981ab27d0\") " pod="openshift-console/console-5d85b7f7b6-jzl76" Apr 21 15:13:43.703005 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:43.702841 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9dc0fad1-319d-4a38-a347-924981ab27d0-console-serving-cert\") pod \"console-5d85b7f7b6-jzl76\" (UID: \"9dc0fad1-319d-4a38-a347-924981ab27d0\") " pod="openshift-console/console-5d85b7f7b6-jzl76" Apr 21 15:13:43.703005 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:43.702860 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9dc0fad1-319d-4a38-a347-924981ab27d0-oauth-serving-cert\") pod \"console-5d85b7f7b6-jzl76\" (UID: \"9dc0fad1-319d-4a38-a347-924981ab27d0\") " pod="openshift-console/console-5d85b7f7b6-jzl76" Apr 21 15:13:43.703128 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:43.703008 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9dc0fad1-319d-4a38-a347-924981ab27d0-console-config\") pod \"console-5d85b7f7b6-jzl76\" (UID: \"9dc0fad1-319d-4a38-a347-924981ab27d0\") " pod="openshift-console/console-5d85b7f7b6-jzl76" Apr 21 15:13:43.703128 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:43.703116 
2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9dc0fad1-319d-4a38-a347-924981ab27d0-console-oauth-config\") pod \"console-5d85b7f7b6-jzl76\" (UID: \"9dc0fad1-319d-4a38-a347-924981ab27d0\") " pod="openshift-console/console-5d85b7f7b6-jzl76" Apr 21 15:13:43.703240 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:43.703183 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9dc0fad1-319d-4a38-a347-924981ab27d0-service-ca\") pod \"console-5d85b7f7b6-jzl76\" (UID: \"9dc0fad1-319d-4a38-a347-924981ab27d0\") " pod="openshift-console/console-5d85b7f7b6-jzl76" Apr 21 15:13:43.703240 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:43.703216 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9dc0fad1-319d-4a38-a347-924981ab27d0-trusted-ca-bundle\") pod \"console-5d85b7f7b6-jzl76\" (UID: \"9dc0fad1-319d-4a38-a347-924981ab27d0\") " pod="openshift-console/console-5d85b7f7b6-jzl76" Apr 21 15:13:43.703725 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:43.703647 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9dc0fad1-319d-4a38-a347-924981ab27d0-oauth-serving-cert\") pod \"console-5d85b7f7b6-jzl76\" (UID: \"9dc0fad1-319d-4a38-a347-924981ab27d0\") " pod="openshift-console/console-5d85b7f7b6-jzl76" Apr 21 15:13:43.703971 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:43.703766 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9dc0fad1-319d-4a38-a347-924981ab27d0-console-config\") pod \"console-5d85b7f7b6-jzl76\" (UID: \"9dc0fad1-319d-4a38-a347-924981ab27d0\") " pod="openshift-console/console-5d85b7f7b6-jzl76" Apr 21 15:13:43.703971 ip-10-0-131-11 
kubenswrapper[2575]: I0421 15:13:43.703885 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9dc0fad1-319d-4a38-a347-924981ab27d0-service-ca\") pod \"console-5d85b7f7b6-jzl76\" (UID: \"9dc0fad1-319d-4a38-a347-924981ab27d0\") " pod="openshift-console/console-5d85b7f7b6-jzl76" Apr 21 15:13:43.704080 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:43.704023 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9dc0fad1-319d-4a38-a347-924981ab27d0-trusted-ca-bundle\") pod \"console-5d85b7f7b6-jzl76\" (UID: \"9dc0fad1-319d-4a38-a347-924981ab27d0\") " pod="openshift-console/console-5d85b7f7b6-jzl76" Apr 21 15:13:43.705592 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:43.705563 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9dc0fad1-319d-4a38-a347-924981ab27d0-console-oauth-config\") pod \"console-5d85b7f7b6-jzl76\" (UID: \"9dc0fad1-319d-4a38-a347-924981ab27d0\") " pod="openshift-console/console-5d85b7f7b6-jzl76" Apr 21 15:13:43.706593 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:43.706571 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9dc0fad1-319d-4a38-a347-924981ab27d0-console-serving-cert\") pod \"console-5d85b7f7b6-jzl76\" (UID: \"9dc0fad1-319d-4a38-a347-924981ab27d0\") " pod="openshift-console/console-5d85b7f7b6-jzl76" Apr 21 15:13:43.711594 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:43.711548 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz64l\" (UniqueName: \"kubernetes.io/projected/9dc0fad1-319d-4a38-a347-924981ab27d0-kube-api-access-jz64l\") pod \"console-5d85b7f7b6-jzl76\" (UID: \"9dc0fad1-319d-4a38-a347-924981ab27d0\") " pod="openshift-console/console-5d85b7f7b6-jzl76" Apr 21 
15:13:43.825995 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:43.825959 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d85b7f7b6-jzl76" Apr 21 15:13:43.963792 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:43.963767 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d85b7f7b6-jzl76"] Apr 21 15:13:43.966300 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:13:43.966277 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9dc0fad1_319d_4a38_a347_924981ab27d0.slice/crio-0f3e8c393aff02f3b64bdd559be1b54a5a7984be7cabba58901a8e8251a8f4c2 WatchSource:0}: Error finding container 0f3e8c393aff02f3b64bdd559be1b54a5a7984be7cabba58901a8e8251a8f4c2: Status 404 returned error can't find the container with id 0f3e8c393aff02f3b64bdd559be1b54a5a7984be7cabba58901a8e8251a8f4c2 Apr 21 15:13:44.609359 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:44.609320 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d85b7f7b6-jzl76" event={"ID":"9dc0fad1-319d-4a38-a347-924981ab27d0","Type":"ContainerStarted","Data":"5c766121d56d20a07b6d7d4bc9cac3ad9c51e144a5713eaae0e9bd348e39db46"} Apr 21 15:13:44.609721 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:44.609367 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d85b7f7b6-jzl76" event={"ID":"9dc0fad1-319d-4a38-a347-924981ab27d0","Type":"ContainerStarted","Data":"0f3e8c393aff02f3b64bdd559be1b54a5a7984be7cabba58901a8e8251a8f4c2"} Apr 21 15:13:44.641993 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:44.641946 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5d85b7f7b6-jzl76" podStartSLOduration=1.64192776 podStartE2EDuration="1.64192776s" podCreationTimestamp="2026-04-21 15:13:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:13:44.641272272 +0000 UTC m=+193.236843146" watchObservedRunningTime="2026-04-21 15:13:44.64192776 +0000 UTC m=+193.237498637" Apr 21 15:13:48.720263 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:48.720223 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-58bcf894bd-d59tb" Apr 21 15:13:53.826828 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:53.826788 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5d85b7f7b6-jzl76" Apr 21 15:13:53.826828 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:53.826831 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5d85b7f7b6-jzl76" Apr 21 15:13:53.831641 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:53.831617 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5d85b7f7b6-jzl76" Apr 21 15:13:54.644828 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:54.644799 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5d85b7f7b6-jzl76" Apr 21 15:13:55.888078 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:55.888038 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6fcb5c5f8d-mn66d" podUID="c1af4f4a-515b-4c5b-966c-9bfea44cd4b5" containerName="console" containerID="cri-o://9c863311a5db4055a8a952d8a700efcec00f08d965bc2a80833d738d42e9278a" gracePeriod=15 Apr 21 15:13:56.155870 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:56.155845 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6fcb5c5f8d-mn66d_c1af4f4a-515b-4c5b-966c-9bfea44cd4b5/console/0.log" Apr 21 15:13:56.155968 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:56.155913 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6fcb5c5f8d-mn66d" Apr 21 15:13:56.330040 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:56.330010 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4skj\" (UniqueName: \"kubernetes.io/projected/c1af4f4a-515b-4c5b-966c-9bfea44cd4b5-kube-api-access-q4skj\") pod \"c1af4f4a-515b-4c5b-966c-9bfea44cd4b5\" (UID: \"c1af4f4a-515b-4c5b-966c-9bfea44cd4b5\") " Apr 21 15:13:56.330223 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:56.330050 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c1af4f4a-515b-4c5b-966c-9bfea44cd4b5-console-oauth-config\") pod \"c1af4f4a-515b-4c5b-966c-9bfea44cd4b5\" (UID: \"c1af4f4a-515b-4c5b-966c-9bfea44cd4b5\") " Apr 21 15:13:56.330223 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:56.330137 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c1af4f4a-515b-4c5b-966c-9bfea44cd4b5-console-serving-cert\") pod \"c1af4f4a-515b-4c5b-966c-9bfea44cd4b5\" (UID: \"c1af4f4a-515b-4c5b-966c-9bfea44cd4b5\") " Apr 21 15:13:56.330223 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:56.330173 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c1af4f4a-515b-4c5b-966c-9bfea44cd4b5-service-ca\") pod \"c1af4f4a-515b-4c5b-966c-9bfea44cd4b5\" (UID: \"c1af4f4a-515b-4c5b-966c-9bfea44cd4b5\") " Apr 21 15:13:56.330223 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:56.330203 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c1af4f4a-515b-4c5b-966c-9bfea44cd4b5-oauth-serving-cert\") pod \"c1af4f4a-515b-4c5b-966c-9bfea44cd4b5\" (UID: \"c1af4f4a-515b-4c5b-966c-9bfea44cd4b5\") " Apr 21 15:13:56.330223 
ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:56.330217 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c1af4f4a-515b-4c5b-966c-9bfea44cd4b5-console-config\") pod \"c1af4f4a-515b-4c5b-966c-9bfea44cd4b5\" (UID: \"c1af4f4a-515b-4c5b-966c-9bfea44cd4b5\") "
Apr 21 15:13:56.330637 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:56.330607 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1af4f4a-515b-4c5b-966c-9bfea44cd4b5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c1af4f4a-515b-4c5b-966c-9bfea44cd4b5" (UID: "c1af4f4a-515b-4c5b-966c-9bfea44cd4b5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 15:13:56.330637 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:56.330618 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1af4f4a-515b-4c5b-966c-9bfea44cd4b5-service-ca" (OuterVolumeSpecName: "service-ca") pod "c1af4f4a-515b-4c5b-966c-9bfea44cd4b5" (UID: "c1af4f4a-515b-4c5b-966c-9bfea44cd4b5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 15:13:56.330795 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:56.330674 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1af4f4a-515b-4c5b-966c-9bfea44cd4b5-console-config" (OuterVolumeSpecName: "console-config") pod "c1af4f4a-515b-4c5b-966c-9bfea44cd4b5" (UID: "c1af4f4a-515b-4c5b-966c-9bfea44cd4b5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 15:13:56.332513 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:56.332483 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1af4f4a-515b-4c5b-966c-9bfea44cd4b5-kube-api-access-q4skj" (OuterVolumeSpecName: "kube-api-access-q4skj") pod "c1af4f4a-515b-4c5b-966c-9bfea44cd4b5" (UID: "c1af4f4a-515b-4c5b-966c-9bfea44cd4b5"). InnerVolumeSpecName "kube-api-access-q4skj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 15:13:56.332925 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:56.332908 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1af4f4a-515b-4c5b-966c-9bfea44cd4b5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c1af4f4a-515b-4c5b-966c-9bfea44cd4b5" (UID: "c1af4f4a-515b-4c5b-966c-9bfea44cd4b5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 15:13:56.332978 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:56.332933 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1af4f4a-515b-4c5b-966c-9bfea44cd4b5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c1af4f4a-515b-4c5b-966c-9bfea44cd4b5" (UID: "c1af4f4a-515b-4c5b-966c-9bfea44cd4b5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 15:13:56.431242 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:56.431163 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c1af4f4a-515b-4c5b-966c-9bfea44cd4b5-console-serving-cert\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\""
Apr 21 15:13:56.431242 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:56.431190 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c1af4f4a-515b-4c5b-966c-9bfea44cd4b5-service-ca\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\""
Apr 21 15:13:56.431242 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:56.431200 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c1af4f4a-515b-4c5b-966c-9bfea44cd4b5-oauth-serving-cert\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\""
Apr 21 15:13:56.431242 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:56.431210 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c1af4f4a-515b-4c5b-966c-9bfea44cd4b5-console-config\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\""
Apr 21 15:13:56.431242 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:56.431218 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q4skj\" (UniqueName: \"kubernetes.io/projected/c1af4f4a-515b-4c5b-966c-9bfea44cd4b5-kube-api-access-q4skj\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\""
Apr 21 15:13:56.431242 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:56.431227 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c1af4f4a-515b-4c5b-966c-9bfea44cd4b5-console-oauth-config\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\""
Apr 21 15:13:56.648227 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:56.648193 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6fcb5c5f8d-mn66d_c1af4f4a-515b-4c5b-966c-9bfea44cd4b5/console/0.log"
Apr 21 15:13:56.648418 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:56.648237 2575 generic.go:358] "Generic (PLEG): container finished" podID="c1af4f4a-515b-4c5b-966c-9bfea44cd4b5" containerID="9c863311a5db4055a8a952d8a700efcec00f08d965bc2a80833d738d42e9278a" exitCode=2
Apr 21 15:13:56.648418 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:56.648271 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fcb5c5f8d-mn66d" event={"ID":"c1af4f4a-515b-4c5b-966c-9bfea44cd4b5","Type":"ContainerDied","Data":"9c863311a5db4055a8a952d8a700efcec00f08d965bc2a80833d738d42e9278a"}
Apr 21 15:13:56.648418 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:56.648299 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6fcb5c5f8d-mn66d"
Apr 21 15:13:56.648418 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:56.648312 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fcb5c5f8d-mn66d" event={"ID":"c1af4f4a-515b-4c5b-966c-9bfea44cd4b5","Type":"ContainerDied","Data":"61b43b4a84ea985cd241b6ec28a1b18e676df556f2ad8f9a0a4cbf3866bf3e6d"}
Apr 21 15:13:56.648418 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:56.648328 2575 scope.go:117] "RemoveContainer" containerID="9c863311a5db4055a8a952d8a700efcec00f08d965bc2a80833d738d42e9278a"
Apr 21 15:13:56.656828 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:56.656811 2575 scope.go:117] "RemoveContainer" containerID="9c863311a5db4055a8a952d8a700efcec00f08d965bc2a80833d738d42e9278a"
Apr 21 15:13:56.657091 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:13:56.657072 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c863311a5db4055a8a952d8a700efcec00f08d965bc2a80833d738d42e9278a\": container with ID starting with 9c863311a5db4055a8a952d8a700efcec00f08d965bc2a80833d738d42e9278a not found: ID does not exist" containerID="9c863311a5db4055a8a952d8a700efcec00f08d965bc2a80833d738d42e9278a"
Apr 21 15:13:56.657191 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:56.657099 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c863311a5db4055a8a952d8a700efcec00f08d965bc2a80833d738d42e9278a"} err="failed to get container status \"9c863311a5db4055a8a952d8a700efcec00f08d965bc2a80833d738d42e9278a\": rpc error: code = NotFound desc = could not find container \"9c863311a5db4055a8a952d8a700efcec00f08d965bc2a80833d738d42e9278a\": container with ID starting with 9c863311a5db4055a8a952d8a700efcec00f08d965bc2a80833d738d42e9278a not found: ID does not exist"
Apr 21 15:13:56.668975 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:56.668955 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6fcb5c5f8d-mn66d"]
Apr 21 15:13:56.674267 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:56.674245 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6fcb5c5f8d-mn66d"]
Apr 21 15:13:57.981650 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:57.981616 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1af4f4a-515b-4c5b-966c-9bfea44cd4b5" path="/var/lib/kubelet/pods/c1af4f4a-515b-4c5b-966c-9bfea44cd4b5/volumes"
Apr 21 15:13:58.020791 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:58.020771 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-7fd774b49d-59zfj"
Apr 21 15:13:58.020791 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:13:58.020796 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7fd774b49d-59zfj"
Apr 21 15:14:00.329254 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:00.329188 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" podUID="da72ba61-5c83-47ce-a285-a58cd7b77246" containerName="registry" containerID="cri-o://3c9cab533826ad0d04e6ff0a7d395dc59bd539c8567e029b2501810c022b5d6f" gracePeriod=30
Apr 21 15:14:00.589226 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:00.589167 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq"
Apr 21 15:14:00.663324 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:00.663288 2575 generic.go:358] "Generic (PLEG): container finished" podID="da72ba61-5c83-47ce-a285-a58cd7b77246" containerID="3c9cab533826ad0d04e6ff0a7d395dc59bd539c8567e029b2501810c022b5d6f" exitCode=0
Apr 21 15:14:00.663537 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:00.663357 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq"
Apr 21 15:14:00.663537 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:00.663390 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" event={"ID":"da72ba61-5c83-47ce-a285-a58cd7b77246","Type":"ContainerDied","Data":"3c9cab533826ad0d04e6ff0a7d395dc59bd539c8567e029b2501810c022b5d6f"}
Apr 21 15:14:00.663537 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:00.663440 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-85fb9cc7f7-zl6lq" event={"ID":"da72ba61-5c83-47ce-a285-a58cd7b77246","Type":"ContainerDied","Data":"f391ca4ea8a23ee78c1c16adae2580ae77cf82628b33d79e1a6f89c8f952225b"}
Apr 21 15:14:00.663537 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:00.663462 2575 scope.go:117] "RemoveContainer" containerID="3c9cab533826ad0d04e6ff0a7d395dc59bd539c8567e029b2501810c022b5d6f"
Apr 21 15:14:00.669854 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:00.669829 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/da72ba61-5c83-47ce-a285-a58cd7b77246-installation-pull-secrets\") pod \"da72ba61-5c83-47ce-a285-a58cd7b77246\" (UID: \"da72ba61-5c83-47ce-a285-a58cd7b77246\") "
Apr 21 15:14:00.669961 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:00.669867 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/da72ba61-5c83-47ce-a285-a58cd7b77246-image-registry-private-configuration\") pod \"da72ba61-5c83-47ce-a285-a58cd7b77246\" (UID: \"da72ba61-5c83-47ce-a285-a58cd7b77246\") "
Apr 21 15:14:00.669961 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:00.669905 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/da72ba61-5c83-47ce-a285-a58cd7b77246-ca-trust-extracted\") pod \"da72ba61-5c83-47ce-a285-a58cd7b77246\" (UID: \"da72ba61-5c83-47ce-a285-a58cd7b77246\") "
Apr 21 15:14:00.669961 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:00.669950 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-registry-tls\") pod \"da72ba61-5c83-47ce-a285-a58cd7b77246\" (UID: \"da72ba61-5c83-47ce-a285-a58cd7b77246\") "
Apr 21 15:14:00.670107 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:00.669980 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/da72ba61-5c83-47ce-a285-a58cd7b77246-registry-certificates\") pod \"da72ba61-5c83-47ce-a285-a58cd7b77246\" (UID: \"da72ba61-5c83-47ce-a285-a58cd7b77246\") "
Apr 21 15:14:00.670107 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:00.670009 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-bound-sa-token\") pod \"da72ba61-5c83-47ce-a285-a58cd7b77246\" (UID: \"da72ba61-5c83-47ce-a285-a58cd7b77246\") "
Apr 21 15:14:00.670107 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:00.670057 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7kmm\" (UniqueName: \"kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-kube-api-access-n7kmm\") pod \"da72ba61-5c83-47ce-a285-a58cd7b77246\" (UID: \"da72ba61-5c83-47ce-a285-a58cd7b77246\") "
Apr 21 15:14:00.670246 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:00.670122 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/da72ba61-5c83-47ce-a285-a58cd7b77246-trusted-ca\") pod \"da72ba61-5c83-47ce-a285-a58cd7b77246\" (UID: \"da72ba61-5c83-47ce-a285-a58cd7b77246\") "
Apr 21 15:14:00.671003 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:00.670872 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da72ba61-5c83-47ce-a285-a58cd7b77246-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "da72ba61-5c83-47ce-a285-a58cd7b77246" (UID: "da72ba61-5c83-47ce-a285-a58cd7b77246"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 15:14:00.671003 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:00.670920 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da72ba61-5c83-47ce-a285-a58cd7b77246-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "da72ba61-5c83-47ce-a285-a58cd7b77246" (UID: "da72ba61-5c83-47ce-a285-a58cd7b77246"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 15:14:00.672342 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:00.672050 2575 scope.go:117] "RemoveContainer" containerID="3c9cab533826ad0d04e6ff0a7d395dc59bd539c8567e029b2501810c022b5d6f"
Apr 21 15:14:00.672544 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:14:00.672400 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c9cab533826ad0d04e6ff0a7d395dc59bd539c8567e029b2501810c022b5d6f\": container with ID starting with 3c9cab533826ad0d04e6ff0a7d395dc59bd539c8567e029b2501810c022b5d6f not found: ID does not exist" containerID="3c9cab533826ad0d04e6ff0a7d395dc59bd539c8567e029b2501810c022b5d6f"
Apr 21 15:14:00.672544 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:00.672437 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c9cab533826ad0d04e6ff0a7d395dc59bd539c8567e029b2501810c022b5d6f"} err="failed to get container status \"3c9cab533826ad0d04e6ff0a7d395dc59bd539c8567e029b2501810c022b5d6f\": rpc error: code = NotFound desc = could not find container \"3c9cab533826ad0d04e6ff0a7d395dc59bd539c8567e029b2501810c022b5d6f\": container with ID starting with 3c9cab533826ad0d04e6ff0a7d395dc59bd539c8567e029b2501810c022b5d6f not found: ID does not exist"
Apr 21 15:14:00.673040 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:00.673017 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da72ba61-5c83-47ce-a285-a58cd7b77246-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "da72ba61-5c83-47ce-a285-a58cd7b77246" (UID: "da72ba61-5c83-47ce-a285-a58cd7b77246"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 15:14:00.673129 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:00.673098 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da72ba61-5c83-47ce-a285-a58cd7b77246-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "da72ba61-5c83-47ce-a285-a58cd7b77246" (UID: "da72ba61-5c83-47ce-a285-a58cd7b77246"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 15:14:00.673328 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:00.673312 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "da72ba61-5c83-47ce-a285-a58cd7b77246" (UID: "da72ba61-5c83-47ce-a285-a58cd7b77246"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 15:14:00.673416 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:00.673328 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "da72ba61-5c83-47ce-a285-a58cd7b77246" (UID: "da72ba61-5c83-47ce-a285-a58cd7b77246"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 15:14:00.673477 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:00.673432 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-kube-api-access-n7kmm" (OuterVolumeSpecName: "kube-api-access-n7kmm") pod "da72ba61-5c83-47ce-a285-a58cd7b77246" (UID: "da72ba61-5c83-47ce-a285-a58cd7b77246"). InnerVolumeSpecName "kube-api-access-n7kmm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 15:14:00.679637 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:00.679616 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da72ba61-5c83-47ce-a285-a58cd7b77246-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "da72ba61-5c83-47ce-a285-a58cd7b77246" (UID: "da72ba61-5c83-47ce-a285-a58cd7b77246"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 15:14:00.770985 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:00.770953 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n7kmm\" (UniqueName: \"kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-kube-api-access-n7kmm\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\""
Apr 21 15:14:00.770985 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:00.770982 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/da72ba61-5c83-47ce-a285-a58cd7b77246-trusted-ca\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\""
Apr 21 15:14:00.770985 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:00.770993 2575 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/da72ba61-5c83-47ce-a285-a58cd7b77246-installation-pull-secrets\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\""
Apr 21 15:14:00.771190 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:00.771003 2575 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/da72ba61-5c83-47ce-a285-a58cd7b77246-image-registry-private-configuration\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\""
Apr 21 15:14:00.771190 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:00.771013 2575 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/da72ba61-5c83-47ce-a285-a58cd7b77246-ca-trust-extracted\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\""
Apr 21 15:14:00.771190 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:00.771024 2575 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-registry-tls\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\""
Apr 21 15:14:00.771190 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:00.771033 2575 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/da72ba61-5c83-47ce-a285-a58cd7b77246-registry-certificates\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\""
Apr 21 15:14:00.771190 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:00.771042 2575 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da72ba61-5c83-47ce-a285-a58cd7b77246-bound-sa-token\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\""
Apr 21 15:14:00.984830 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:00.984798 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-85fb9cc7f7-zl6lq"]
Apr 21 15:14:00.988771 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:00.988746 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-85fb9cc7f7-zl6lq"]
Apr 21 15:14:01.979473 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:01.979443 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da72ba61-5c83-47ce-a285-a58cd7b77246" path="/var/lib/kubelet/pods/da72ba61-5c83-47ce-a285-a58cd7b77246/volumes"
Apr 21 15:14:08.479993 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:08.479950 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-58bcf894bd-d59tb" podUID="bd3e92c4-7626-48a3-973f-59eb6e889bb2" containerName="console" containerID="cri-o://e1b52da2dbc7bacfe5df43c6b0ee1ba5c8f4be3bc67008e8854286d3c8c64a50" gracePeriod=15
Apr 21 15:14:08.688705 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:08.688679 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-58bcf894bd-d59tb_bd3e92c4-7626-48a3-973f-59eb6e889bb2/console/0.log"
Apr 21 15:14:08.688851 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:08.688721 2575 generic.go:358] "Generic (PLEG): container finished" podID="bd3e92c4-7626-48a3-973f-59eb6e889bb2" containerID="e1b52da2dbc7bacfe5df43c6b0ee1ba5c8f4be3bc67008e8854286d3c8c64a50" exitCode=2
Apr 21 15:14:08.688851 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:08.688801 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58bcf894bd-d59tb" event={"ID":"bd3e92c4-7626-48a3-973f-59eb6e889bb2","Type":"ContainerDied","Data":"e1b52da2dbc7bacfe5df43c6b0ee1ba5c8f4be3bc67008e8854286d3c8c64a50"}
Apr 21 15:14:08.690223 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:08.690197 2575 generic.go:358] "Generic (PLEG): container finished" podID="b4f29a84-c932-46e5-8d58-0e2fb5ab05f8" containerID="0982c523f6537da4214019e5a0c65f2035bbc9ba58685eed397c4432373938bf" exitCode=0
Apr 21 15:14:08.690350 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:08.690247 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wcz2b" event={"ID":"b4f29a84-c932-46e5-8d58-0e2fb5ab05f8","Type":"ContainerDied","Data":"0982c523f6537da4214019e5a0c65f2035bbc9ba58685eed397c4432373938bf"}
Apr 21 15:14:08.690614 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:08.690600 2575 scope.go:117] "RemoveContainer" containerID="0982c523f6537da4214019e5a0c65f2035bbc9ba58685eed397c4432373938bf"
Apr 21 15:14:08.747229 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:08.747197 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-58bcf894bd-d59tb_bd3e92c4-7626-48a3-973f-59eb6e889bb2/console/0.log"
Apr 21 15:14:08.747352 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:08.747261 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58bcf894bd-d59tb"
Apr 21 15:14:08.840783 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:08.840752 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bd3e92c4-7626-48a3-973f-59eb6e889bb2-console-config\") pod \"bd3e92c4-7626-48a3-973f-59eb6e889bb2\" (UID: \"bd3e92c4-7626-48a3-973f-59eb6e889bb2\") "
Apr 21 15:14:08.840952 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:08.840806 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd3e92c4-7626-48a3-973f-59eb6e889bb2-trusted-ca-bundle\") pod \"bd3e92c4-7626-48a3-973f-59eb6e889bb2\" (UID: \"bd3e92c4-7626-48a3-973f-59eb6e889bb2\") "
Apr 21 15:14:08.840952 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:08.840839 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bd3e92c4-7626-48a3-973f-59eb6e889bb2-console-oauth-config\") pod \"bd3e92c4-7626-48a3-973f-59eb6e889bb2\" (UID: \"bd3e92c4-7626-48a3-973f-59eb6e889bb2\") "
Apr 21 15:14:08.840952 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:08.840867 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srrs7\" (UniqueName: \"kubernetes.io/projected/bd3e92c4-7626-48a3-973f-59eb6e889bb2-kube-api-access-srrs7\") pod \"bd3e92c4-7626-48a3-973f-59eb6e889bb2\" (UID: \"bd3e92c4-7626-48a3-973f-59eb6e889bb2\") "
Apr 21 15:14:08.840952 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:08.840922 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd3e92c4-7626-48a3-973f-59eb6e889bb2-console-serving-cert\") pod \"bd3e92c4-7626-48a3-973f-59eb6e889bb2\" (UID: \"bd3e92c4-7626-48a3-973f-59eb6e889bb2\") "
Apr 21 15:14:08.841148 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:08.840962 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bd3e92c4-7626-48a3-973f-59eb6e889bb2-oauth-serving-cert\") pod \"bd3e92c4-7626-48a3-973f-59eb6e889bb2\" (UID: \"bd3e92c4-7626-48a3-973f-59eb6e889bb2\") "
Apr 21 15:14:08.841148 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:08.840995 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bd3e92c4-7626-48a3-973f-59eb6e889bb2-service-ca\") pod \"bd3e92c4-7626-48a3-973f-59eb6e889bb2\" (UID: \"bd3e92c4-7626-48a3-973f-59eb6e889bb2\") "
Apr 21 15:14:08.841241 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:08.841153 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd3e92c4-7626-48a3-973f-59eb6e889bb2-console-config" (OuterVolumeSpecName: "console-config") pod "bd3e92c4-7626-48a3-973f-59eb6e889bb2" (UID: "bd3e92c4-7626-48a3-973f-59eb6e889bb2"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 15:14:08.841587 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:08.841302 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bd3e92c4-7626-48a3-973f-59eb6e889bb2-console-config\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\""
Apr 21 15:14:08.841587 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:08.841506 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd3e92c4-7626-48a3-973f-59eb6e889bb2-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "bd3e92c4-7626-48a3-973f-59eb6e889bb2" (UID: "bd3e92c4-7626-48a3-973f-59eb6e889bb2"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 15:14:08.841695 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:08.841645 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd3e92c4-7626-48a3-973f-59eb6e889bb2-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "bd3e92c4-7626-48a3-973f-59eb6e889bb2" (UID: "bd3e92c4-7626-48a3-973f-59eb6e889bb2"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 15:14:08.841741 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:08.841694 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd3e92c4-7626-48a3-973f-59eb6e889bb2-service-ca" (OuterVolumeSpecName: "service-ca") pod "bd3e92c4-7626-48a3-973f-59eb6e889bb2" (UID: "bd3e92c4-7626-48a3-973f-59eb6e889bb2"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 15:14:08.843442 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:08.843415 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd3e92c4-7626-48a3-973f-59eb6e889bb2-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "bd3e92c4-7626-48a3-973f-59eb6e889bb2" (UID: "bd3e92c4-7626-48a3-973f-59eb6e889bb2"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 15:14:08.843442 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:08.843433 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd3e92c4-7626-48a3-973f-59eb6e889bb2-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "bd3e92c4-7626-48a3-973f-59eb6e889bb2" (UID: "bd3e92c4-7626-48a3-973f-59eb6e889bb2"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 15:14:08.843689 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:08.843672 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd3e92c4-7626-48a3-973f-59eb6e889bb2-kube-api-access-srrs7" (OuterVolumeSpecName: "kube-api-access-srrs7") pod "bd3e92c4-7626-48a3-973f-59eb6e889bb2" (UID: "bd3e92c4-7626-48a3-973f-59eb6e889bb2"). InnerVolumeSpecName "kube-api-access-srrs7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 15:14:08.942493 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:08.942408 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd3e92c4-7626-48a3-973f-59eb6e889bb2-trusted-ca-bundle\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\""
Apr 21 15:14:08.942493 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:08.942438 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bd3e92c4-7626-48a3-973f-59eb6e889bb2-console-oauth-config\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\""
Apr 21 15:14:08.942493 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:08.942458 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-srrs7\" (UniqueName: \"kubernetes.io/projected/bd3e92c4-7626-48a3-973f-59eb6e889bb2-kube-api-access-srrs7\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\""
Apr 21 15:14:08.942493 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:08.942468 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd3e92c4-7626-48a3-973f-59eb6e889bb2-console-serving-cert\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\""
Apr 21 15:14:08.942493 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:08.942477 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bd3e92c4-7626-48a3-973f-59eb6e889bb2-oauth-serving-cert\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\""
Apr 21 15:14:08.942493 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:08.942486 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bd3e92c4-7626-48a3-973f-59eb6e889bb2-service-ca\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\""
Apr 21 15:14:09.694919 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:09.694833 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wcz2b" event={"ID":"b4f29a84-c932-46e5-8d58-0e2fb5ab05f8","Type":"ContainerStarted","Data":"0348ee61031164d41e5c971f422ef81c70b1a718d1ff16d64ca0752ffbcdfe3d"}
Apr 21 15:14:09.696099 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:09.696083 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-58bcf894bd-d59tb_bd3e92c4-7626-48a3-973f-59eb6e889bb2/console/0.log"
Apr 21 15:14:09.696205 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:09.696163 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58bcf894bd-d59tb" event={"ID":"bd3e92c4-7626-48a3-973f-59eb6e889bb2","Type":"ContainerDied","Data":"1a2117ff622a9310653a54ca602e7d0b5d004821d338eaf34acdc37cf89d1660"}
Apr 21 15:14:09.696205 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:09.696163 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58bcf894bd-d59tb"
Apr 21 15:14:09.696279 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:09.696184 2575 scope.go:117] "RemoveContainer" containerID="e1b52da2dbc7bacfe5df43c6b0ee1ba5c8f4be3bc67008e8854286d3c8c64a50"
Apr 21 15:14:09.764630 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:09.764601 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-58bcf894bd-d59tb"]
Apr 21 15:14:09.782525 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:09.782502 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-58bcf894bd-d59tb"]
Apr 21 15:14:09.979738 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:09.979708 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd3e92c4-7626-48a3-973f-59eb6e889bb2" path="/var/lib/kubelet/pods/bd3e92c4-7626-48a3-973f-59eb6e889bb2/volumes"
Apr 21 15:14:14.715278 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:14.715240 2575 generic.go:358] "Generic (PLEG): container finished" podID="54d54903-009c-4bc7-97cd-f27fea133502" containerID="6f7baf21c795f6906b56b8d4a1530bc649c9e4c16d6cde076efd01828b5dc008" exitCode=0
Apr 21 15:14:14.715694 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:14.715315 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r4nrx" event={"ID":"54d54903-009c-4bc7-97cd-f27fea133502","Type":"ContainerDied","Data":"6f7baf21c795f6906b56b8d4a1530bc649c9e4c16d6cde076efd01828b5dc008"}
Apr 21 15:14:14.715694 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:14.715675 2575 scope.go:117] "RemoveContainer" containerID="6f7baf21c795f6906b56b8d4a1530bc649c9e4c16d6cde076efd01828b5dc008"
Apr 21 15:14:15.719997 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:15.719961 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r4nrx" event={"ID":"54d54903-009c-4bc7-97cd-f27fea133502","Type":"ContainerStarted","Data":"b9baa1f5c383f5cd95f6ba98b6b7153d4c59a3d0f3501778e70494fc34b1c870"}
Apr 21 15:14:18.026381 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:18.026351 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7fd774b49d-59zfj"
Apr 21 15:14:18.030129 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:18.030107 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7fd774b49d-59zfj"
Apr 21 15:14:43.739960 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:43.739915 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a17c6c3f-25ab-4414-92a4-946230c882ea-metrics-certs\") pod \"network-metrics-daemon-96snf\" (UID: \"a17c6c3f-25ab-4414-92a4-946230c882ea\") " pod="openshift-multus/network-metrics-daemon-96snf"
Apr 21 15:14:43.742318 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:43.742296 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a17c6c3f-25ab-4414-92a4-946230c882ea-metrics-certs\") pod \"network-metrics-daemon-96snf\" (UID: \"a17c6c3f-25ab-4414-92a4-946230c882ea\") " pod="openshift-multus/network-metrics-daemon-96snf"
Apr 21 15:14:43.882941 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:43.882909 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zqtzq\""
Apr 21 15:14:43.890237 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:43.890218 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-96snf"
Apr 21 15:14:44.015926 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:44.015902 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-96snf"]
Apr 21 15:14:44.018530 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:14:44.018500 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda17c6c3f_25ab_4414_92a4_946230c882ea.slice/crio-ad5dedc07caeb97f3c9ceaa29592ae809e5a3e9ea8e16171042af929f485e35e WatchSource:0}: Error finding container ad5dedc07caeb97f3c9ceaa29592ae809e5a3e9ea8e16171042af929f485e35e: Status 404 returned error can't find the container with id ad5dedc07caeb97f3c9ceaa29592ae809e5a3e9ea8e16171042af929f485e35e
Apr 21 15:14:44.808870 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:44.808831 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-96snf" event={"ID":"a17c6c3f-25ab-4414-92a4-946230c882ea","Type":"ContainerStarted","Data":"ad5dedc07caeb97f3c9ceaa29592ae809e5a3e9ea8e16171042af929f485e35e"}
Apr 21 15:14:45.813240 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:45.813200 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-96snf" event={"ID":"a17c6c3f-25ab-4414-92a4-946230c882ea","Type":"ContainerStarted","Data":"9b1d8cc27cc7a54c7bbabaea3e7374f1d2a22a60e6bdc0b4bb0c076d8e6fae36"}
Apr 21 15:14:45.813240 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:45.813236 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-96snf" event={"ID":"a17c6c3f-25ab-4414-92a4-946230c882ea","Type":"ContainerStarted","Data":"ca1ede50c9fad0fba9aa0e908c8b2d3ebf5927905d9849f2b72c7355b993bc24"}
Apr 21 15:14:45.835268 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:45.835212 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openshift-multus/network-metrics-daemon-96snf" podStartSLOduration=252.84696076 podStartE2EDuration="4m13.835193128s" podCreationTimestamp="2026-04-21 15:10:32 +0000 UTC" firstStartedPulling="2026-04-21 15:14:44.020516145 +0000 UTC m=+252.616086999" lastFinishedPulling="2026-04-21 15:14:45.008748513 +0000 UTC m=+253.604319367" observedRunningTime="2026-04-21 15:14:45.834649514 +0000 UTC m=+254.430220390" watchObservedRunningTime="2026-04-21 15:14:45.835193128 +0000 UTC m=+254.430764003" Apr 21 15:14:49.811552 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:49.811517 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6b74f6989f-n7465"] Apr 21 15:14:49.811934 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:49.811815 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd3e92c4-7626-48a3-973f-59eb6e889bb2" containerName="console" Apr 21 15:14:49.811934 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:49.811825 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd3e92c4-7626-48a3-973f-59eb6e889bb2" containerName="console" Apr 21 15:14:49.811934 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:49.811843 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c1af4f4a-515b-4c5b-966c-9bfea44cd4b5" containerName="console" Apr 21 15:14:49.811934 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:49.811848 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1af4f4a-515b-4c5b-966c-9bfea44cd4b5" containerName="console" Apr 21 15:14:49.811934 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:49.811859 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da72ba61-5c83-47ce-a285-a58cd7b77246" containerName="registry" Apr 21 15:14:49.811934 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:49.811864 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="da72ba61-5c83-47ce-a285-a58cd7b77246" containerName="registry" Apr 21 15:14:49.811934 
ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:49.811909 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="da72ba61-5c83-47ce-a285-a58cd7b77246" containerName="registry" Apr 21 15:14:49.811934 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:49.811919 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="bd3e92c4-7626-48a3-973f-59eb6e889bb2" containerName="console" Apr 21 15:14:49.811934 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:49.811925 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="c1af4f4a-515b-4c5b-966c-9bfea44cd4b5" containerName="console" Apr 21 15:14:49.814832 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:49.814813 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b74f6989f-n7465" Apr 21 15:14:49.835768 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:49.835742 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b74f6989f-n7465"] Apr 21 15:14:49.888077 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:49.888034 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/19952a24-739d-4153-b21b-c2bb018ef93b-console-config\") pod \"console-6b74f6989f-n7465\" (UID: \"19952a24-739d-4153-b21b-c2bb018ef93b\") " pod="openshift-console/console-6b74f6989f-n7465" Apr 21 15:14:49.888258 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:49.888114 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19952a24-739d-4153-b21b-c2bb018ef93b-trusted-ca-bundle\") pod \"console-6b74f6989f-n7465\" (UID: \"19952a24-739d-4153-b21b-c2bb018ef93b\") " pod="openshift-console/console-6b74f6989f-n7465" Apr 21 15:14:49.888258 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:49.888185 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjlv2\" (UniqueName: \"kubernetes.io/projected/19952a24-739d-4153-b21b-c2bb018ef93b-kube-api-access-hjlv2\") pod \"console-6b74f6989f-n7465\" (UID: \"19952a24-739d-4153-b21b-c2bb018ef93b\") " pod="openshift-console/console-6b74f6989f-n7465" Apr 21 15:14:49.888258 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:49.888232 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/19952a24-739d-4153-b21b-c2bb018ef93b-console-oauth-config\") pod \"console-6b74f6989f-n7465\" (UID: \"19952a24-739d-4153-b21b-c2bb018ef93b\") " pod="openshift-console/console-6b74f6989f-n7465" Apr 21 15:14:49.888446 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:49.888257 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/19952a24-739d-4153-b21b-c2bb018ef93b-service-ca\") pod \"console-6b74f6989f-n7465\" (UID: \"19952a24-739d-4153-b21b-c2bb018ef93b\") " pod="openshift-console/console-6b74f6989f-n7465" Apr 21 15:14:49.888446 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:49.888297 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/19952a24-739d-4153-b21b-c2bb018ef93b-console-serving-cert\") pod \"console-6b74f6989f-n7465\" (UID: \"19952a24-739d-4153-b21b-c2bb018ef93b\") " pod="openshift-console/console-6b74f6989f-n7465" Apr 21 15:14:49.888446 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:49.888326 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/19952a24-739d-4153-b21b-c2bb018ef93b-oauth-serving-cert\") pod \"console-6b74f6989f-n7465\" (UID: \"19952a24-739d-4153-b21b-c2bb018ef93b\") 
" pod="openshift-console/console-6b74f6989f-n7465" Apr 21 15:14:49.989688 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:49.989655 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19952a24-739d-4153-b21b-c2bb018ef93b-trusted-ca-bundle\") pod \"console-6b74f6989f-n7465\" (UID: \"19952a24-739d-4153-b21b-c2bb018ef93b\") " pod="openshift-console/console-6b74f6989f-n7465" Apr 21 15:14:49.989836 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:49.989705 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hjlv2\" (UniqueName: \"kubernetes.io/projected/19952a24-739d-4153-b21b-c2bb018ef93b-kube-api-access-hjlv2\") pod \"console-6b74f6989f-n7465\" (UID: \"19952a24-739d-4153-b21b-c2bb018ef93b\") " pod="openshift-console/console-6b74f6989f-n7465" Apr 21 15:14:49.989836 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:49.989733 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/19952a24-739d-4153-b21b-c2bb018ef93b-console-oauth-config\") pod \"console-6b74f6989f-n7465\" (UID: \"19952a24-739d-4153-b21b-c2bb018ef93b\") " pod="openshift-console/console-6b74f6989f-n7465" Apr 21 15:14:49.989836 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:49.989748 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/19952a24-739d-4153-b21b-c2bb018ef93b-service-ca\") pod \"console-6b74f6989f-n7465\" (UID: \"19952a24-739d-4153-b21b-c2bb018ef93b\") " pod="openshift-console/console-6b74f6989f-n7465" Apr 21 15:14:49.989836 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:49.989771 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/19952a24-739d-4153-b21b-c2bb018ef93b-console-serving-cert\") pod 
\"console-6b74f6989f-n7465\" (UID: \"19952a24-739d-4153-b21b-c2bb018ef93b\") " pod="openshift-console/console-6b74f6989f-n7465" Apr 21 15:14:49.989836 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:49.989792 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/19952a24-739d-4153-b21b-c2bb018ef93b-oauth-serving-cert\") pod \"console-6b74f6989f-n7465\" (UID: \"19952a24-739d-4153-b21b-c2bb018ef93b\") " pod="openshift-console/console-6b74f6989f-n7465" Apr 21 15:14:49.989836 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:49.989822 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/19952a24-739d-4153-b21b-c2bb018ef93b-console-config\") pod \"console-6b74f6989f-n7465\" (UID: \"19952a24-739d-4153-b21b-c2bb018ef93b\") " pod="openshift-console/console-6b74f6989f-n7465" Apr 21 15:14:49.990582 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:49.990545 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/19952a24-739d-4153-b21b-c2bb018ef93b-service-ca\") pod \"console-6b74f6989f-n7465\" (UID: \"19952a24-739d-4153-b21b-c2bb018ef93b\") " pod="openshift-console/console-6b74f6989f-n7465" Apr 21 15:14:49.990582 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:49.990545 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/19952a24-739d-4153-b21b-c2bb018ef93b-console-config\") pod \"console-6b74f6989f-n7465\" (UID: \"19952a24-739d-4153-b21b-c2bb018ef93b\") " pod="openshift-console/console-6b74f6989f-n7465" Apr 21 15:14:49.990723 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:49.990653 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/19952a24-739d-4153-b21b-c2bb018ef93b-trusted-ca-bundle\") pod \"console-6b74f6989f-n7465\" (UID: \"19952a24-739d-4153-b21b-c2bb018ef93b\") " pod="openshift-console/console-6b74f6989f-n7465" Apr 21 15:14:49.990723 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:49.990662 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/19952a24-739d-4153-b21b-c2bb018ef93b-oauth-serving-cert\") pod \"console-6b74f6989f-n7465\" (UID: \"19952a24-739d-4153-b21b-c2bb018ef93b\") " pod="openshift-console/console-6b74f6989f-n7465" Apr 21 15:14:49.992417 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:49.992365 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/19952a24-739d-4153-b21b-c2bb018ef93b-console-oauth-config\") pod \"console-6b74f6989f-n7465\" (UID: \"19952a24-739d-4153-b21b-c2bb018ef93b\") " pod="openshift-console/console-6b74f6989f-n7465" Apr 21 15:14:49.992555 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:49.992531 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/19952a24-739d-4153-b21b-c2bb018ef93b-console-serving-cert\") pod \"console-6b74f6989f-n7465\" (UID: \"19952a24-739d-4153-b21b-c2bb018ef93b\") " pod="openshift-console/console-6b74f6989f-n7465" Apr 21 15:14:50.002191 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:50.002169 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjlv2\" (UniqueName: \"kubernetes.io/projected/19952a24-739d-4153-b21b-c2bb018ef93b-kube-api-access-hjlv2\") pod \"console-6b74f6989f-n7465\" (UID: \"19952a24-739d-4153-b21b-c2bb018ef93b\") " pod="openshift-console/console-6b74f6989f-n7465" Apr 21 15:14:50.123508 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:50.123423 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6b74f6989f-n7465" Apr 21 15:14:50.260998 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:50.260974 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b74f6989f-n7465"] Apr 21 15:14:50.263123 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:14:50.263093 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19952a24_739d_4153_b21b_c2bb018ef93b.slice/crio-5ee18d61bd757c338e8c5c6aea299454c30974b413b986b380433632969d407b WatchSource:0}: Error finding container 5ee18d61bd757c338e8c5c6aea299454c30974b413b986b380433632969d407b: Status 404 returned error can't find the container with id 5ee18d61bd757c338e8c5c6aea299454c30974b413b986b380433632969d407b Apr 21 15:14:50.831124 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:50.831042 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b74f6989f-n7465" event={"ID":"19952a24-739d-4153-b21b-c2bb018ef93b","Type":"ContainerStarted","Data":"1ca337ed36d9c996bbb6c51749b16980c49ba138ee8ab4f5500203fcc79c442f"} Apr 21 15:14:50.831124 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:50.831077 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b74f6989f-n7465" event={"ID":"19952a24-739d-4153-b21b-c2bb018ef93b","Type":"ContainerStarted","Data":"5ee18d61bd757c338e8c5c6aea299454c30974b413b986b380433632969d407b"} Apr 21 15:14:50.853015 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:50.852964 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6b74f6989f-n7465" podStartSLOduration=1.85294969 podStartE2EDuration="1.85294969s" podCreationTimestamp="2026-04-21 15:14:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:14:50.851701769 +0000 UTC m=+259.447272646" 
watchObservedRunningTime="2026-04-21 15:14:50.85294969 +0000 UTC m=+259.448520565" Apr 21 15:14:53.868445 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:53.868409 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 15:14:53.868855 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:53.868830 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="0626f62a-710b-42dd-b9e2-c6d6b5e6f366" containerName="alertmanager" containerID="cri-o://40ccc0a45a60ef0829dc6285a353a513b15d1b0cf1818bfa0f46eb486e757fa9" gracePeriod=120 Apr 21 15:14:53.868910 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:53.868891 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="0626f62a-710b-42dd-b9e2-c6d6b5e6f366" containerName="kube-rbac-proxy-metric" containerID="cri-o://74c32e95717b86694a57a8e969ffa7824026705f900697bc0af5e8d411130cec" gracePeriod=120 Apr 21 15:14:53.868990 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:53.868948 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="0626f62a-710b-42dd-b9e2-c6d6b5e6f366" containerName="kube-rbac-proxy-web" containerID="cri-o://d9b646ee47cbe77c5848fa4af606b2e81f51256ce78ccc2f828594926f05d49d" gracePeriod=120 Apr 21 15:14:53.868990 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:53.868938 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="0626f62a-710b-42dd-b9e2-c6d6b5e6f366" containerName="prom-label-proxy" containerID="cri-o://7c20c269a078c4ebd4bb807deff7a57a782c79512a47c871d2272af881b34c37" gracePeriod=120 Apr 21 15:14:53.869158 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:53.869027 2575 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-monitoring/alertmanager-main-0" podUID="0626f62a-710b-42dd-b9e2-c6d6b5e6f366" containerName="config-reloader" containerID="cri-o://cb4e507912ea1cb2ee445f6368a7b6976410c80b27dfaa03fda9c43600a8618f" gracePeriod=120 Apr 21 15:14:53.869217 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:53.869073 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="0626f62a-710b-42dd-b9e2-c6d6b5e6f366" containerName="kube-rbac-proxy" containerID="cri-o://a5b0ff5531764eb8f8c0ce0713296843dd1bfc98b5ea6a1f39637ad1e0f93141" gracePeriod=120 Apr 21 15:14:54.114206 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.114184 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:54.125461 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.125410 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-web-config\") pod \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " Apr 21 15:14:54.125461 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.125436 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-secret-alertmanager-kube-rbac-proxy\") pod \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " Apr 21 15:14:54.125615 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.125468 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-metrics-client-ca\") pod \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " Apr 21 15:14:54.125615 
ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.125508 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-alertmanager-main-db\") pod \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " Apr 21 15:14:54.125615 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.125524 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-config-out\") pod \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " Apr 21 15:14:54.125615 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.125539 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-alertmanager-trusted-ca-bundle\") pod \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " Apr 21 15:14:54.125615 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.125566 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-config-volume\") pod \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " Apr 21 15:14:54.125615 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.125585 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpt5m\" (UniqueName: \"kubernetes.io/projected/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-kube-api-access-vpt5m\") pod \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " Apr 21 15:14:54.125898 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.125625 2575 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-cluster-tls-config\") pod \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " Apr 21 15:14:54.125898 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.125657 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-secret-alertmanager-main-tls\") pod \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " Apr 21 15:14:54.125898 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.125689 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-tls-assets\") pod \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " Apr 21 15:14:54.125898 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.125724 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-secret-alertmanager-kube-rbac-proxy-metric\") pod \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " Apr 21 15:14:54.125898 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.125756 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-secret-alertmanager-kube-rbac-proxy-web\") pod \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\" (UID: \"0626f62a-710b-42dd-b9e2-c6d6b5e6f366\") " Apr 21 15:14:54.125898 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.125865 2575 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "0626f62a-710b-42dd-b9e2-c6d6b5e6f366" (UID: "0626f62a-710b-42dd-b9e2-c6d6b5e6f366"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:14:54.126173 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.125913 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "0626f62a-710b-42dd-b9e2-c6d6b5e6f366" (UID: "0626f62a-710b-42dd-b9e2-c6d6b5e6f366"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:14:54.126173 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.125953 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "0626f62a-710b-42dd-b9e2-c6d6b5e6f366" (UID: "0626f62a-710b-42dd-b9e2-c6d6b5e6f366"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:14:54.126173 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.126081 2575 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-metrics-client-ca\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\"" Apr 21 15:14:54.126332 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.126222 2575 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-alertmanager-main-db\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\"" Apr 21 15:14:54.126332 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.126267 2575 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\"" Apr 21 15:14:54.128853 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.128819 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-kube-api-access-vpt5m" (OuterVolumeSpecName: "kube-api-access-vpt5m") pod "0626f62a-710b-42dd-b9e2-c6d6b5e6f366" (UID: "0626f62a-710b-42dd-b9e2-c6d6b5e6f366"). InnerVolumeSpecName "kube-api-access-vpt5m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:14:54.129555 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.129520 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "0626f62a-710b-42dd-b9e2-c6d6b5e6f366" (UID: "0626f62a-710b-42dd-b9e2-c6d6b5e6f366"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:14:54.129668 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.129553 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "0626f62a-710b-42dd-b9e2-c6d6b5e6f366" (UID: "0626f62a-710b-42dd-b9e2-c6d6b5e6f366"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:14:54.129668 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.129581 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "0626f62a-710b-42dd-b9e2-c6d6b5e6f366" (UID: "0626f62a-710b-42dd-b9e2-c6d6b5e6f366"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:14:54.129788 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.129757 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-config-out" (OuterVolumeSpecName: "config-out") pod "0626f62a-710b-42dd-b9e2-c6d6b5e6f366" (UID: "0626f62a-710b-42dd-b9e2-c6d6b5e6f366"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:14:54.129788 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.129759 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "0626f62a-710b-42dd-b9e2-c6d6b5e6f366" (UID: "0626f62a-710b-42dd-b9e2-c6d6b5e6f366"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:14:54.129894 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.129806 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-config-volume" (OuterVolumeSpecName: "config-volume") pod "0626f62a-710b-42dd-b9e2-c6d6b5e6f366" (UID: "0626f62a-710b-42dd-b9e2-c6d6b5e6f366"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:14:54.130636 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.130612 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "0626f62a-710b-42dd-b9e2-c6d6b5e6f366" (UID: "0626f62a-710b-42dd-b9e2-c6d6b5e6f366"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:14:54.133238 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.133212 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "0626f62a-710b-42dd-b9e2-c6d6b5e6f366" (UID: "0626f62a-710b-42dd-b9e2-c6d6b5e6f366"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:14:54.141789 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.141765 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-web-config" (OuterVolumeSpecName: "web-config") pod "0626f62a-710b-42dd-b9e2-c6d6b5e6f366" (UID: "0626f62a-710b-42dd-b9e2-c6d6b5e6f366"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:14:54.227412 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.227352 2575 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-config-out\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\"" Apr 21 15:14:54.227412 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.227406 2575 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-config-volume\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\"" Apr 21 15:14:54.227412 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.227417 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vpt5m\" (UniqueName: \"kubernetes.io/projected/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-kube-api-access-vpt5m\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\"" Apr 21 15:14:54.227632 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.227428 2575 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-cluster-tls-config\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\"" Apr 21 15:14:54.227632 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.227436 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-secret-alertmanager-main-tls\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\"" Apr 21 15:14:54.227632 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.227446 2575 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-tls-assets\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\"" Apr 21 15:14:54.227632 ip-10-0-131-11 kubenswrapper[2575]: 
I0421 15:14:54.227456 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\"" Apr 21 15:14:54.227632 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.227465 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\"" Apr 21 15:14:54.227632 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.227475 2575 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-web-config\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\"" Apr 21 15:14:54.227632 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.227483 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0626f62a-710b-42dd-b9e2-c6d6b5e6f366-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\"" Apr 21 15:14:54.845858 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.845826 2575 generic.go:358] "Generic (PLEG): container finished" podID="0626f62a-710b-42dd-b9e2-c6d6b5e6f366" containerID="7c20c269a078c4ebd4bb807deff7a57a782c79512a47c871d2272af881b34c37" exitCode=0 Apr 21 15:14:54.845858 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.845850 2575 generic.go:358] "Generic (PLEG): container finished" podID="0626f62a-710b-42dd-b9e2-c6d6b5e6f366" containerID="74c32e95717b86694a57a8e969ffa7824026705f900697bc0af5e8d411130cec" exitCode=0 Apr 21 15:14:54.845858 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.845859 2575 generic.go:358] "Generic (PLEG): container finished" 
podID="0626f62a-710b-42dd-b9e2-c6d6b5e6f366" containerID="a5b0ff5531764eb8f8c0ce0713296843dd1bfc98b5ea6a1f39637ad1e0f93141" exitCode=0 Apr 21 15:14:54.845858 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.845865 2575 generic.go:358] "Generic (PLEG): container finished" podID="0626f62a-710b-42dd-b9e2-c6d6b5e6f366" containerID="d9b646ee47cbe77c5848fa4af606b2e81f51256ce78ccc2f828594926f05d49d" exitCode=0 Apr 21 15:14:54.845858 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.845870 2575 generic.go:358] "Generic (PLEG): container finished" podID="0626f62a-710b-42dd-b9e2-c6d6b5e6f366" containerID="cb4e507912ea1cb2ee445f6368a7b6976410c80b27dfaa03fda9c43600a8618f" exitCode=0 Apr 21 15:14:54.846155 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.845876 2575 generic.go:358] "Generic (PLEG): container finished" podID="0626f62a-710b-42dd-b9e2-c6d6b5e6f366" containerID="40ccc0a45a60ef0829dc6285a353a513b15d1b0cf1818bfa0f46eb486e757fa9" exitCode=0 Apr 21 15:14:54.846155 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.845909 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0626f62a-710b-42dd-b9e2-c6d6b5e6f366","Type":"ContainerDied","Data":"7c20c269a078c4ebd4bb807deff7a57a782c79512a47c871d2272af881b34c37"} Apr 21 15:14:54.846155 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.845945 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0626f62a-710b-42dd-b9e2-c6d6b5e6f366","Type":"ContainerDied","Data":"74c32e95717b86694a57a8e969ffa7824026705f900697bc0af5e8d411130cec"} Apr 21 15:14:54.846155 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.845921 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:54.846155 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.845957 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0626f62a-710b-42dd-b9e2-c6d6b5e6f366","Type":"ContainerDied","Data":"a5b0ff5531764eb8f8c0ce0713296843dd1bfc98b5ea6a1f39637ad1e0f93141"} Apr 21 15:14:54.846155 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.845966 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0626f62a-710b-42dd-b9e2-c6d6b5e6f366","Type":"ContainerDied","Data":"d9b646ee47cbe77c5848fa4af606b2e81f51256ce78ccc2f828594926f05d49d"} Apr 21 15:14:54.846155 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.845976 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0626f62a-710b-42dd-b9e2-c6d6b5e6f366","Type":"ContainerDied","Data":"cb4e507912ea1cb2ee445f6368a7b6976410c80b27dfaa03fda9c43600a8618f"} Apr 21 15:14:54.846155 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.845985 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0626f62a-710b-42dd-b9e2-c6d6b5e6f366","Type":"ContainerDied","Data":"40ccc0a45a60ef0829dc6285a353a513b15d1b0cf1818bfa0f46eb486e757fa9"} Apr 21 15:14:54.846155 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.845993 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0626f62a-710b-42dd-b9e2-c6d6b5e6f366","Type":"ContainerDied","Data":"37f7d68ff26d326bfab252d02079a6f4582fb101772d1f89cd3e0b6b08cca73c"} Apr 21 15:14:54.846155 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.846007 2575 scope.go:117] "RemoveContainer" containerID="7c20c269a078c4ebd4bb807deff7a57a782c79512a47c871d2272af881b34c37" Apr 21 15:14:54.853427 ip-10-0-131-11 kubenswrapper[2575]: I0421 
15:14:54.853409 2575 scope.go:117] "RemoveContainer" containerID="74c32e95717b86694a57a8e969ffa7824026705f900697bc0af5e8d411130cec" Apr 21 15:14:54.860259 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.860241 2575 scope.go:117] "RemoveContainer" containerID="a5b0ff5531764eb8f8c0ce0713296843dd1bfc98b5ea6a1f39637ad1e0f93141" Apr 21 15:14:54.867258 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.867243 2575 scope.go:117] "RemoveContainer" containerID="d9b646ee47cbe77c5848fa4af606b2e81f51256ce78ccc2f828594926f05d49d" Apr 21 15:14:54.874619 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.874343 2575 scope.go:117] "RemoveContainer" containerID="cb4e507912ea1cb2ee445f6368a7b6976410c80b27dfaa03fda9c43600a8618f" Apr 21 15:14:54.876845 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.876823 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 15:14:54.886332 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.886309 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 15:14:54.893125 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.893105 2575 scope.go:117] "RemoveContainer" containerID="40ccc0a45a60ef0829dc6285a353a513b15d1b0cf1818bfa0f46eb486e757fa9" Apr 21 15:14:54.899870 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.899855 2575 scope.go:117] "RemoveContainer" containerID="d7fdb01dd8d723f5665aadc14662288b35e78b6e6c69463617847ebf90b424ec" Apr 21 15:14:54.906235 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.906218 2575 scope.go:117] "RemoveContainer" containerID="7c20c269a078c4ebd4bb807deff7a57a782c79512a47c871d2272af881b34c37" Apr 21 15:14:54.906518 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:14:54.906497 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c20c269a078c4ebd4bb807deff7a57a782c79512a47c871d2272af881b34c37\": container with ID starting with 
7c20c269a078c4ebd4bb807deff7a57a782c79512a47c871d2272af881b34c37 not found: ID does not exist" containerID="7c20c269a078c4ebd4bb807deff7a57a782c79512a47c871d2272af881b34c37" Apr 21 15:14:54.906583 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.906526 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c20c269a078c4ebd4bb807deff7a57a782c79512a47c871d2272af881b34c37"} err="failed to get container status \"7c20c269a078c4ebd4bb807deff7a57a782c79512a47c871d2272af881b34c37\": rpc error: code = NotFound desc = could not find container \"7c20c269a078c4ebd4bb807deff7a57a782c79512a47c871d2272af881b34c37\": container with ID starting with 7c20c269a078c4ebd4bb807deff7a57a782c79512a47c871d2272af881b34c37 not found: ID does not exist" Apr 21 15:14:54.906583 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.906543 2575 scope.go:117] "RemoveContainer" containerID="74c32e95717b86694a57a8e969ffa7824026705f900697bc0af5e8d411130cec" Apr 21 15:14:54.906796 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:14:54.906777 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74c32e95717b86694a57a8e969ffa7824026705f900697bc0af5e8d411130cec\": container with ID starting with 74c32e95717b86694a57a8e969ffa7824026705f900697bc0af5e8d411130cec not found: ID does not exist" containerID="74c32e95717b86694a57a8e969ffa7824026705f900697bc0af5e8d411130cec" Apr 21 15:14:54.906836 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.906803 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74c32e95717b86694a57a8e969ffa7824026705f900697bc0af5e8d411130cec"} err="failed to get container status \"74c32e95717b86694a57a8e969ffa7824026705f900697bc0af5e8d411130cec\": rpc error: code = NotFound desc = could not find container \"74c32e95717b86694a57a8e969ffa7824026705f900697bc0af5e8d411130cec\": container with ID starting with 
74c32e95717b86694a57a8e969ffa7824026705f900697bc0af5e8d411130cec not found: ID does not exist" Apr 21 15:14:54.906836 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.906825 2575 scope.go:117] "RemoveContainer" containerID="a5b0ff5531764eb8f8c0ce0713296843dd1bfc98b5ea6a1f39637ad1e0f93141" Apr 21 15:14:54.907068 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:14:54.907052 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5b0ff5531764eb8f8c0ce0713296843dd1bfc98b5ea6a1f39637ad1e0f93141\": container with ID starting with a5b0ff5531764eb8f8c0ce0713296843dd1bfc98b5ea6a1f39637ad1e0f93141 not found: ID does not exist" containerID="a5b0ff5531764eb8f8c0ce0713296843dd1bfc98b5ea6a1f39637ad1e0f93141" Apr 21 15:14:54.907111 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.907074 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5b0ff5531764eb8f8c0ce0713296843dd1bfc98b5ea6a1f39637ad1e0f93141"} err="failed to get container status \"a5b0ff5531764eb8f8c0ce0713296843dd1bfc98b5ea6a1f39637ad1e0f93141\": rpc error: code = NotFound desc = could not find container \"a5b0ff5531764eb8f8c0ce0713296843dd1bfc98b5ea6a1f39637ad1e0f93141\": container with ID starting with a5b0ff5531764eb8f8c0ce0713296843dd1bfc98b5ea6a1f39637ad1e0f93141 not found: ID does not exist" Apr 21 15:14:54.907111 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.907090 2575 scope.go:117] "RemoveContainer" containerID="d9b646ee47cbe77c5848fa4af606b2e81f51256ce78ccc2f828594926f05d49d" Apr 21 15:14:54.907305 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:14:54.907290 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9b646ee47cbe77c5848fa4af606b2e81f51256ce78ccc2f828594926f05d49d\": container with ID starting with d9b646ee47cbe77c5848fa4af606b2e81f51256ce78ccc2f828594926f05d49d not found: ID does not exist" 
containerID="d9b646ee47cbe77c5848fa4af606b2e81f51256ce78ccc2f828594926f05d49d" Apr 21 15:14:54.907342 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.907311 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9b646ee47cbe77c5848fa4af606b2e81f51256ce78ccc2f828594926f05d49d"} err="failed to get container status \"d9b646ee47cbe77c5848fa4af606b2e81f51256ce78ccc2f828594926f05d49d\": rpc error: code = NotFound desc = could not find container \"d9b646ee47cbe77c5848fa4af606b2e81f51256ce78ccc2f828594926f05d49d\": container with ID starting with d9b646ee47cbe77c5848fa4af606b2e81f51256ce78ccc2f828594926f05d49d not found: ID does not exist" Apr 21 15:14:54.907342 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.907324 2575 scope.go:117] "RemoveContainer" containerID="cb4e507912ea1cb2ee445f6368a7b6976410c80b27dfaa03fda9c43600a8618f" Apr 21 15:14:54.907670 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:14:54.907650 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb4e507912ea1cb2ee445f6368a7b6976410c80b27dfaa03fda9c43600a8618f\": container with ID starting with cb4e507912ea1cb2ee445f6368a7b6976410c80b27dfaa03fda9c43600a8618f not found: ID does not exist" containerID="cb4e507912ea1cb2ee445f6368a7b6976410c80b27dfaa03fda9c43600a8618f" Apr 21 15:14:54.907746 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.907672 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb4e507912ea1cb2ee445f6368a7b6976410c80b27dfaa03fda9c43600a8618f"} err="failed to get container status \"cb4e507912ea1cb2ee445f6368a7b6976410c80b27dfaa03fda9c43600a8618f\": rpc error: code = NotFound desc = could not find container \"cb4e507912ea1cb2ee445f6368a7b6976410c80b27dfaa03fda9c43600a8618f\": container with ID starting with cb4e507912ea1cb2ee445f6368a7b6976410c80b27dfaa03fda9c43600a8618f not found: ID does not exist" Apr 21 
15:14:54.907746 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.907686 2575 scope.go:117] "RemoveContainer" containerID="40ccc0a45a60ef0829dc6285a353a513b15d1b0cf1818bfa0f46eb486e757fa9" Apr 21 15:14:54.907929 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:14:54.907914 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40ccc0a45a60ef0829dc6285a353a513b15d1b0cf1818bfa0f46eb486e757fa9\": container with ID starting with 40ccc0a45a60ef0829dc6285a353a513b15d1b0cf1818bfa0f46eb486e757fa9 not found: ID does not exist" containerID="40ccc0a45a60ef0829dc6285a353a513b15d1b0cf1818bfa0f46eb486e757fa9" Apr 21 15:14:54.907967 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.907933 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40ccc0a45a60ef0829dc6285a353a513b15d1b0cf1818bfa0f46eb486e757fa9"} err="failed to get container status \"40ccc0a45a60ef0829dc6285a353a513b15d1b0cf1818bfa0f46eb486e757fa9\": rpc error: code = NotFound desc = could not find container \"40ccc0a45a60ef0829dc6285a353a513b15d1b0cf1818bfa0f46eb486e757fa9\": container with ID starting with 40ccc0a45a60ef0829dc6285a353a513b15d1b0cf1818bfa0f46eb486e757fa9 not found: ID does not exist" Apr 21 15:14:54.907967 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.907946 2575 scope.go:117] "RemoveContainer" containerID="d7fdb01dd8d723f5665aadc14662288b35e78b6e6c69463617847ebf90b424ec" Apr 21 15:14:54.908171 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:14:54.908153 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7fdb01dd8d723f5665aadc14662288b35e78b6e6c69463617847ebf90b424ec\": container with ID starting with d7fdb01dd8d723f5665aadc14662288b35e78b6e6c69463617847ebf90b424ec not found: ID does not exist" containerID="d7fdb01dd8d723f5665aadc14662288b35e78b6e6c69463617847ebf90b424ec" Apr 21 15:14:54.908235 
ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.908177 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7fdb01dd8d723f5665aadc14662288b35e78b6e6c69463617847ebf90b424ec"} err="failed to get container status \"d7fdb01dd8d723f5665aadc14662288b35e78b6e6c69463617847ebf90b424ec\": rpc error: code = NotFound desc = could not find container \"d7fdb01dd8d723f5665aadc14662288b35e78b6e6c69463617847ebf90b424ec\": container with ID starting with d7fdb01dd8d723f5665aadc14662288b35e78b6e6c69463617847ebf90b424ec not found: ID does not exist" Apr 21 15:14:54.908235 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.908200 2575 scope.go:117] "RemoveContainer" containerID="7c20c269a078c4ebd4bb807deff7a57a782c79512a47c871d2272af881b34c37" Apr 21 15:14:54.908464 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.908443 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c20c269a078c4ebd4bb807deff7a57a782c79512a47c871d2272af881b34c37"} err="failed to get container status \"7c20c269a078c4ebd4bb807deff7a57a782c79512a47c871d2272af881b34c37\": rpc error: code = NotFound desc = could not find container \"7c20c269a078c4ebd4bb807deff7a57a782c79512a47c871d2272af881b34c37\": container with ID starting with 7c20c269a078c4ebd4bb807deff7a57a782c79512a47c871d2272af881b34c37 not found: ID does not exist" Apr 21 15:14:54.908464 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.908462 2575 scope.go:117] "RemoveContainer" containerID="74c32e95717b86694a57a8e969ffa7824026705f900697bc0af5e8d411130cec" Apr 21 15:14:54.908701 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.908683 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74c32e95717b86694a57a8e969ffa7824026705f900697bc0af5e8d411130cec"} err="failed to get container status \"74c32e95717b86694a57a8e969ffa7824026705f900697bc0af5e8d411130cec\": rpc error: code = NotFound desc = could not find 
container \"74c32e95717b86694a57a8e969ffa7824026705f900697bc0af5e8d411130cec\": container with ID starting with 74c32e95717b86694a57a8e969ffa7824026705f900697bc0af5e8d411130cec not found: ID does not exist" Apr 21 15:14:54.908746 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.908701 2575 scope.go:117] "RemoveContainer" containerID="a5b0ff5531764eb8f8c0ce0713296843dd1bfc98b5ea6a1f39637ad1e0f93141" Apr 21 15:14:54.908900 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.908883 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5b0ff5531764eb8f8c0ce0713296843dd1bfc98b5ea6a1f39637ad1e0f93141"} err="failed to get container status \"a5b0ff5531764eb8f8c0ce0713296843dd1bfc98b5ea6a1f39637ad1e0f93141\": rpc error: code = NotFound desc = could not find container \"a5b0ff5531764eb8f8c0ce0713296843dd1bfc98b5ea6a1f39637ad1e0f93141\": container with ID starting with a5b0ff5531764eb8f8c0ce0713296843dd1bfc98b5ea6a1f39637ad1e0f93141 not found: ID does not exist" Apr 21 15:14:54.908937 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.908901 2575 scope.go:117] "RemoveContainer" containerID="d9b646ee47cbe77c5848fa4af606b2e81f51256ce78ccc2f828594926f05d49d" Apr 21 15:14:54.909048 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.909033 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9b646ee47cbe77c5848fa4af606b2e81f51256ce78ccc2f828594926f05d49d"} err="failed to get container status \"d9b646ee47cbe77c5848fa4af606b2e81f51256ce78ccc2f828594926f05d49d\": rpc error: code = NotFound desc = could not find container \"d9b646ee47cbe77c5848fa4af606b2e81f51256ce78ccc2f828594926f05d49d\": container with ID starting with d9b646ee47cbe77c5848fa4af606b2e81f51256ce78ccc2f828594926f05d49d not found: ID does not exist" Apr 21 15:14:54.909090 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.909049 2575 scope.go:117] "RemoveContainer" 
containerID="cb4e507912ea1cb2ee445f6368a7b6976410c80b27dfaa03fda9c43600a8618f" Apr 21 15:14:54.909258 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.909237 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb4e507912ea1cb2ee445f6368a7b6976410c80b27dfaa03fda9c43600a8618f"} err="failed to get container status \"cb4e507912ea1cb2ee445f6368a7b6976410c80b27dfaa03fda9c43600a8618f\": rpc error: code = NotFound desc = could not find container \"cb4e507912ea1cb2ee445f6368a7b6976410c80b27dfaa03fda9c43600a8618f\": container with ID starting with cb4e507912ea1cb2ee445f6368a7b6976410c80b27dfaa03fda9c43600a8618f not found: ID does not exist" Apr 21 15:14:54.909326 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.909259 2575 scope.go:117] "RemoveContainer" containerID="40ccc0a45a60ef0829dc6285a353a513b15d1b0cf1818bfa0f46eb486e757fa9" Apr 21 15:14:54.909471 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.909452 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40ccc0a45a60ef0829dc6285a353a513b15d1b0cf1818bfa0f46eb486e757fa9"} err="failed to get container status \"40ccc0a45a60ef0829dc6285a353a513b15d1b0cf1818bfa0f46eb486e757fa9\": rpc error: code = NotFound desc = could not find container \"40ccc0a45a60ef0829dc6285a353a513b15d1b0cf1818bfa0f46eb486e757fa9\": container with ID starting with 40ccc0a45a60ef0829dc6285a353a513b15d1b0cf1818bfa0f46eb486e757fa9 not found: ID does not exist" Apr 21 15:14:54.909525 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.909472 2575 scope.go:117] "RemoveContainer" containerID="d7fdb01dd8d723f5665aadc14662288b35e78b6e6c69463617847ebf90b424ec" Apr 21 15:14:54.909712 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.909692 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7fdb01dd8d723f5665aadc14662288b35e78b6e6c69463617847ebf90b424ec"} err="failed to get container status 
\"d7fdb01dd8d723f5665aadc14662288b35e78b6e6c69463617847ebf90b424ec\": rpc error: code = NotFound desc = could not find container \"d7fdb01dd8d723f5665aadc14662288b35e78b6e6c69463617847ebf90b424ec\": container with ID starting with d7fdb01dd8d723f5665aadc14662288b35e78b6e6c69463617847ebf90b424ec not found: ID does not exist" Apr 21 15:14:54.909755 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.909713 2575 scope.go:117] "RemoveContainer" containerID="7c20c269a078c4ebd4bb807deff7a57a782c79512a47c871d2272af881b34c37" Apr 21 15:14:54.909917 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.909899 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c20c269a078c4ebd4bb807deff7a57a782c79512a47c871d2272af881b34c37"} err="failed to get container status \"7c20c269a078c4ebd4bb807deff7a57a782c79512a47c871d2272af881b34c37\": rpc error: code = NotFound desc = could not find container \"7c20c269a078c4ebd4bb807deff7a57a782c79512a47c871d2272af881b34c37\": container with ID starting with 7c20c269a078c4ebd4bb807deff7a57a782c79512a47c871d2272af881b34c37 not found: ID does not exist" Apr 21 15:14:54.909985 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.909918 2575 scope.go:117] "RemoveContainer" containerID="74c32e95717b86694a57a8e969ffa7824026705f900697bc0af5e8d411130cec" Apr 21 15:14:54.910112 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.910097 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74c32e95717b86694a57a8e969ffa7824026705f900697bc0af5e8d411130cec"} err="failed to get container status \"74c32e95717b86694a57a8e969ffa7824026705f900697bc0af5e8d411130cec\": rpc error: code = NotFound desc = could not find container \"74c32e95717b86694a57a8e969ffa7824026705f900697bc0af5e8d411130cec\": container with ID starting with 74c32e95717b86694a57a8e969ffa7824026705f900697bc0af5e8d411130cec not found: ID does not exist" Apr 21 15:14:54.910162 ip-10-0-131-11 
kubenswrapper[2575]: I0421 15:14:54.910113 2575 scope.go:117] "RemoveContainer" containerID="a5b0ff5531764eb8f8c0ce0713296843dd1bfc98b5ea6a1f39637ad1e0f93141" Apr 21 15:14:54.910299 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.910282 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5b0ff5531764eb8f8c0ce0713296843dd1bfc98b5ea6a1f39637ad1e0f93141"} err="failed to get container status \"a5b0ff5531764eb8f8c0ce0713296843dd1bfc98b5ea6a1f39637ad1e0f93141\": rpc error: code = NotFound desc = could not find container \"a5b0ff5531764eb8f8c0ce0713296843dd1bfc98b5ea6a1f39637ad1e0f93141\": container with ID starting with a5b0ff5531764eb8f8c0ce0713296843dd1bfc98b5ea6a1f39637ad1e0f93141 not found: ID does not exist" Apr 21 15:14:54.910340 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.910300 2575 scope.go:117] "RemoveContainer" containerID="d9b646ee47cbe77c5848fa4af606b2e81f51256ce78ccc2f828594926f05d49d" Apr 21 15:14:54.910533 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.910514 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9b646ee47cbe77c5848fa4af606b2e81f51256ce78ccc2f828594926f05d49d"} err="failed to get container status \"d9b646ee47cbe77c5848fa4af606b2e81f51256ce78ccc2f828594926f05d49d\": rpc error: code = NotFound desc = could not find container \"d9b646ee47cbe77c5848fa4af606b2e81f51256ce78ccc2f828594926f05d49d\": container with ID starting with d9b646ee47cbe77c5848fa4af606b2e81f51256ce78ccc2f828594926f05d49d not found: ID does not exist" Apr 21 15:14:54.910616 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.910534 2575 scope.go:117] "RemoveContainer" containerID="cb4e507912ea1cb2ee445f6368a7b6976410c80b27dfaa03fda9c43600a8618f" Apr 21 15:14:54.910733 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.910717 2575 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cb4e507912ea1cb2ee445f6368a7b6976410c80b27dfaa03fda9c43600a8618f"} err="failed to get container status \"cb4e507912ea1cb2ee445f6368a7b6976410c80b27dfaa03fda9c43600a8618f\": rpc error: code = NotFound desc = could not find container \"cb4e507912ea1cb2ee445f6368a7b6976410c80b27dfaa03fda9c43600a8618f\": container with ID starting with cb4e507912ea1cb2ee445f6368a7b6976410c80b27dfaa03fda9c43600a8618f not found: ID does not exist" Apr 21 15:14:54.910777 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.910733 2575 scope.go:117] "RemoveContainer" containerID="40ccc0a45a60ef0829dc6285a353a513b15d1b0cf1818bfa0f46eb486e757fa9" Apr 21 15:14:54.910906 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.910891 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40ccc0a45a60ef0829dc6285a353a513b15d1b0cf1818bfa0f46eb486e757fa9"} err="failed to get container status \"40ccc0a45a60ef0829dc6285a353a513b15d1b0cf1818bfa0f46eb486e757fa9\": rpc error: code = NotFound desc = could not find container \"40ccc0a45a60ef0829dc6285a353a513b15d1b0cf1818bfa0f46eb486e757fa9\": container with ID starting with 40ccc0a45a60ef0829dc6285a353a513b15d1b0cf1818bfa0f46eb486e757fa9 not found: ID does not exist" Apr 21 15:14:54.910952 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.910906 2575 scope.go:117] "RemoveContainer" containerID="d7fdb01dd8d723f5665aadc14662288b35e78b6e6c69463617847ebf90b424ec" Apr 21 15:14:54.911069 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.911054 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7fdb01dd8d723f5665aadc14662288b35e78b6e6c69463617847ebf90b424ec"} err="failed to get container status \"d7fdb01dd8d723f5665aadc14662288b35e78b6e6c69463617847ebf90b424ec\": rpc error: code = NotFound desc = could not find container \"d7fdb01dd8d723f5665aadc14662288b35e78b6e6c69463617847ebf90b424ec\": container with ID starting with 
d7fdb01dd8d723f5665aadc14662288b35e78b6e6c69463617847ebf90b424ec not found: ID does not exist" Apr 21 15:14:54.911113 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.911069 2575 scope.go:117] "RemoveContainer" containerID="7c20c269a078c4ebd4bb807deff7a57a782c79512a47c871d2272af881b34c37" Apr 21 15:14:54.911238 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.911224 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c20c269a078c4ebd4bb807deff7a57a782c79512a47c871d2272af881b34c37"} err="failed to get container status \"7c20c269a078c4ebd4bb807deff7a57a782c79512a47c871d2272af881b34c37\": rpc error: code = NotFound desc = could not find container \"7c20c269a078c4ebd4bb807deff7a57a782c79512a47c871d2272af881b34c37\": container with ID starting with 7c20c269a078c4ebd4bb807deff7a57a782c79512a47c871d2272af881b34c37 not found: ID does not exist" Apr 21 15:14:54.911320 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.911238 2575 scope.go:117] "RemoveContainer" containerID="74c32e95717b86694a57a8e969ffa7824026705f900697bc0af5e8d411130cec" Apr 21 15:14:54.911440 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.911422 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74c32e95717b86694a57a8e969ffa7824026705f900697bc0af5e8d411130cec"} err="failed to get container status \"74c32e95717b86694a57a8e969ffa7824026705f900697bc0af5e8d411130cec\": rpc error: code = NotFound desc = could not find container \"74c32e95717b86694a57a8e969ffa7824026705f900697bc0af5e8d411130cec\": container with ID starting with 74c32e95717b86694a57a8e969ffa7824026705f900697bc0af5e8d411130cec not found: ID does not exist" Apr 21 15:14:54.911508 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.911443 2575 scope.go:117] "RemoveContainer" containerID="a5b0ff5531764eb8f8c0ce0713296843dd1bfc98b5ea6a1f39637ad1e0f93141" Apr 21 15:14:54.911632 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.911616 2575 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5b0ff5531764eb8f8c0ce0713296843dd1bfc98b5ea6a1f39637ad1e0f93141"} err="failed to get container status \"a5b0ff5531764eb8f8c0ce0713296843dd1bfc98b5ea6a1f39637ad1e0f93141\": rpc error: code = NotFound desc = could not find container \"a5b0ff5531764eb8f8c0ce0713296843dd1bfc98b5ea6a1f39637ad1e0f93141\": container with ID starting with a5b0ff5531764eb8f8c0ce0713296843dd1bfc98b5ea6a1f39637ad1e0f93141 not found: ID does not exist" Apr 21 15:14:54.911693 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.911633 2575 scope.go:117] "RemoveContainer" containerID="d9b646ee47cbe77c5848fa4af606b2e81f51256ce78ccc2f828594926f05d49d" Apr 21 15:14:54.911825 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.911807 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9b646ee47cbe77c5848fa4af606b2e81f51256ce78ccc2f828594926f05d49d"} err="failed to get container status \"d9b646ee47cbe77c5848fa4af606b2e81f51256ce78ccc2f828594926f05d49d\": rpc error: code = NotFound desc = could not find container \"d9b646ee47cbe77c5848fa4af606b2e81f51256ce78ccc2f828594926f05d49d\": container with ID starting with d9b646ee47cbe77c5848fa4af606b2e81f51256ce78ccc2f828594926f05d49d not found: ID does not exist" Apr 21 15:14:54.911870 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.911826 2575 scope.go:117] "RemoveContainer" containerID="cb4e507912ea1cb2ee445f6368a7b6976410c80b27dfaa03fda9c43600a8618f" Apr 21 15:14:54.912003 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.911988 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb4e507912ea1cb2ee445f6368a7b6976410c80b27dfaa03fda9c43600a8618f"} err="failed to get container status \"cb4e507912ea1cb2ee445f6368a7b6976410c80b27dfaa03fda9c43600a8618f\": rpc error: code = NotFound desc = could not find container 
\"cb4e507912ea1cb2ee445f6368a7b6976410c80b27dfaa03fda9c43600a8618f\": container with ID starting with cb4e507912ea1cb2ee445f6368a7b6976410c80b27dfaa03fda9c43600a8618f not found: ID does not exist" Apr 21 15:14:54.912045 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.912003 2575 scope.go:117] "RemoveContainer" containerID="40ccc0a45a60ef0829dc6285a353a513b15d1b0cf1818bfa0f46eb486e757fa9" Apr 21 15:14:54.912188 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.912173 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40ccc0a45a60ef0829dc6285a353a513b15d1b0cf1818bfa0f46eb486e757fa9"} err="failed to get container status \"40ccc0a45a60ef0829dc6285a353a513b15d1b0cf1818bfa0f46eb486e757fa9\": rpc error: code = NotFound desc = could not find container \"40ccc0a45a60ef0829dc6285a353a513b15d1b0cf1818bfa0f46eb486e757fa9\": container with ID starting with 40ccc0a45a60ef0829dc6285a353a513b15d1b0cf1818bfa0f46eb486e757fa9 not found: ID does not exist" Apr 21 15:14:54.912228 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.912189 2575 scope.go:117] "RemoveContainer" containerID="d7fdb01dd8d723f5665aadc14662288b35e78b6e6c69463617847ebf90b424ec" Apr 21 15:14:54.912358 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.912342 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7fdb01dd8d723f5665aadc14662288b35e78b6e6c69463617847ebf90b424ec"} err="failed to get container status \"d7fdb01dd8d723f5665aadc14662288b35e78b6e6c69463617847ebf90b424ec\": rpc error: code = NotFound desc = could not find container \"d7fdb01dd8d723f5665aadc14662288b35e78b6e6c69463617847ebf90b424ec\": container with ID starting with d7fdb01dd8d723f5665aadc14662288b35e78b6e6c69463617847ebf90b424ec not found: ID does not exist" Apr 21 15:14:54.912411 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.912357 2575 scope.go:117] "RemoveContainer" 
containerID="7c20c269a078c4ebd4bb807deff7a57a782c79512a47c871d2272af881b34c37" Apr 21 15:14:54.912572 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.912555 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c20c269a078c4ebd4bb807deff7a57a782c79512a47c871d2272af881b34c37"} err="failed to get container status \"7c20c269a078c4ebd4bb807deff7a57a782c79512a47c871d2272af881b34c37\": rpc error: code = NotFound desc = could not find container \"7c20c269a078c4ebd4bb807deff7a57a782c79512a47c871d2272af881b34c37\": container with ID starting with 7c20c269a078c4ebd4bb807deff7a57a782c79512a47c871d2272af881b34c37 not found: ID does not exist" Apr 21 15:14:54.912632 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.912574 2575 scope.go:117] "RemoveContainer" containerID="74c32e95717b86694a57a8e969ffa7824026705f900697bc0af5e8d411130cec" Apr 21 15:14:54.912762 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.912745 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74c32e95717b86694a57a8e969ffa7824026705f900697bc0af5e8d411130cec"} err="failed to get container status \"74c32e95717b86694a57a8e969ffa7824026705f900697bc0af5e8d411130cec\": rpc error: code = NotFound desc = could not find container \"74c32e95717b86694a57a8e969ffa7824026705f900697bc0af5e8d411130cec\": container with ID starting with 74c32e95717b86694a57a8e969ffa7824026705f900697bc0af5e8d411130cec not found: ID does not exist" Apr 21 15:14:54.912805 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.912762 2575 scope.go:117] "RemoveContainer" containerID="a5b0ff5531764eb8f8c0ce0713296843dd1bfc98b5ea6a1f39637ad1e0f93141" Apr 21 15:14:54.912934 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.912920 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5b0ff5531764eb8f8c0ce0713296843dd1bfc98b5ea6a1f39637ad1e0f93141"} err="failed to get container status 
\"a5b0ff5531764eb8f8c0ce0713296843dd1bfc98b5ea6a1f39637ad1e0f93141\": rpc error: code = NotFound desc = could not find container \"a5b0ff5531764eb8f8c0ce0713296843dd1bfc98b5ea6a1f39637ad1e0f93141\": container with ID starting with a5b0ff5531764eb8f8c0ce0713296843dd1bfc98b5ea6a1f39637ad1e0f93141 not found: ID does not exist" Apr 21 15:14:54.912976 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.912934 2575 scope.go:117] "RemoveContainer" containerID="d9b646ee47cbe77c5848fa4af606b2e81f51256ce78ccc2f828594926f05d49d" Apr 21 15:14:54.913097 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.913083 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9b646ee47cbe77c5848fa4af606b2e81f51256ce78ccc2f828594926f05d49d"} err="failed to get container status \"d9b646ee47cbe77c5848fa4af606b2e81f51256ce78ccc2f828594926f05d49d\": rpc error: code = NotFound desc = could not find container \"d9b646ee47cbe77c5848fa4af606b2e81f51256ce78ccc2f828594926f05d49d\": container with ID starting with d9b646ee47cbe77c5848fa4af606b2e81f51256ce78ccc2f828594926f05d49d not found: ID does not exist" Apr 21 15:14:54.913142 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.913097 2575 scope.go:117] "RemoveContainer" containerID="cb4e507912ea1cb2ee445f6368a7b6976410c80b27dfaa03fda9c43600a8618f" Apr 21 15:14:54.913263 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.913247 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb4e507912ea1cb2ee445f6368a7b6976410c80b27dfaa03fda9c43600a8618f"} err="failed to get container status \"cb4e507912ea1cb2ee445f6368a7b6976410c80b27dfaa03fda9c43600a8618f\": rpc error: code = NotFound desc = could not find container \"cb4e507912ea1cb2ee445f6368a7b6976410c80b27dfaa03fda9c43600a8618f\": container with ID starting with cb4e507912ea1cb2ee445f6368a7b6976410c80b27dfaa03fda9c43600a8618f not found: ID does not exist" Apr 21 15:14:54.913263 ip-10-0-131-11 
kubenswrapper[2575]: I0421 15:14:54.913262 2575 scope.go:117] "RemoveContainer" containerID="40ccc0a45a60ef0829dc6285a353a513b15d1b0cf1818bfa0f46eb486e757fa9" Apr 21 15:14:54.913516 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.913458 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40ccc0a45a60ef0829dc6285a353a513b15d1b0cf1818bfa0f46eb486e757fa9"} err="failed to get container status \"40ccc0a45a60ef0829dc6285a353a513b15d1b0cf1818bfa0f46eb486e757fa9\": rpc error: code = NotFound desc = could not find container \"40ccc0a45a60ef0829dc6285a353a513b15d1b0cf1818bfa0f46eb486e757fa9\": container with ID starting with 40ccc0a45a60ef0829dc6285a353a513b15d1b0cf1818bfa0f46eb486e757fa9 not found: ID does not exist" Apr 21 15:14:54.913516 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.913484 2575 scope.go:117] "RemoveContainer" containerID="d7fdb01dd8d723f5665aadc14662288b35e78b6e6c69463617847ebf90b424ec" Apr 21 15:14:54.913732 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.913708 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7fdb01dd8d723f5665aadc14662288b35e78b6e6c69463617847ebf90b424ec"} err="failed to get container status \"d7fdb01dd8d723f5665aadc14662288b35e78b6e6c69463617847ebf90b424ec\": rpc error: code = NotFound desc = could not find container \"d7fdb01dd8d723f5665aadc14662288b35e78b6e6c69463617847ebf90b424ec\": container with ID starting with d7fdb01dd8d723f5665aadc14662288b35e78b6e6c69463617847ebf90b424ec not found: ID does not exist" Apr 21 15:14:54.913779 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.913732 2575 scope.go:117] "RemoveContainer" containerID="7c20c269a078c4ebd4bb807deff7a57a782c79512a47c871d2272af881b34c37" Apr 21 15:14:54.914021 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.913948 2575 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7c20c269a078c4ebd4bb807deff7a57a782c79512a47c871d2272af881b34c37"} err="failed to get container status \"7c20c269a078c4ebd4bb807deff7a57a782c79512a47c871d2272af881b34c37\": rpc error: code = NotFound desc = could not find container \"7c20c269a078c4ebd4bb807deff7a57a782c79512a47c871d2272af881b34c37\": container with ID starting with 7c20c269a078c4ebd4bb807deff7a57a782c79512a47c871d2272af881b34c37 not found: ID does not exist" Apr 21 15:14:54.914021 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.913969 2575 scope.go:117] "RemoveContainer" containerID="74c32e95717b86694a57a8e969ffa7824026705f900697bc0af5e8d411130cec" Apr 21 15:14:54.914178 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.914153 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74c32e95717b86694a57a8e969ffa7824026705f900697bc0af5e8d411130cec"} err="failed to get container status \"74c32e95717b86694a57a8e969ffa7824026705f900697bc0af5e8d411130cec\": rpc error: code = NotFound desc = could not find container \"74c32e95717b86694a57a8e969ffa7824026705f900697bc0af5e8d411130cec\": container with ID starting with 74c32e95717b86694a57a8e969ffa7824026705f900697bc0af5e8d411130cec not found: ID does not exist" Apr 21 15:14:54.914178 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.914169 2575 scope.go:117] "RemoveContainer" containerID="a5b0ff5531764eb8f8c0ce0713296843dd1bfc98b5ea6a1f39637ad1e0f93141" Apr 21 15:14:54.914427 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.914406 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5b0ff5531764eb8f8c0ce0713296843dd1bfc98b5ea6a1f39637ad1e0f93141"} err="failed to get container status \"a5b0ff5531764eb8f8c0ce0713296843dd1bfc98b5ea6a1f39637ad1e0f93141\": rpc error: code = NotFound desc = could not find container \"a5b0ff5531764eb8f8c0ce0713296843dd1bfc98b5ea6a1f39637ad1e0f93141\": container with ID starting with 
a5b0ff5531764eb8f8c0ce0713296843dd1bfc98b5ea6a1f39637ad1e0f93141 not found: ID does not exist" Apr 21 15:14:54.914427 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.914426 2575 scope.go:117] "RemoveContainer" containerID="d9b646ee47cbe77c5848fa4af606b2e81f51256ce78ccc2f828594926f05d49d" Apr 21 15:14:54.914706 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.914689 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9b646ee47cbe77c5848fa4af606b2e81f51256ce78ccc2f828594926f05d49d"} err="failed to get container status \"d9b646ee47cbe77c5848fa4af606b2e81f51256ce78ccc2f828594926f05d49d\": rpc error: code = NotFound desc = could not find container \"d9b646ee47cbe77c5848fa4af606b2e81f51256ce78ccc2f828594926f05d49d\": container with ID starting with d9b646ee47cbe77c5848fa4af606b2e81f51256ce78ccc2f828594926f05d49d not found: ID does not exist" Apr 21 15:14:54.919246 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.914708 2575 scope.go:117] "RemoveContainer" containerID="cb4e507912ea1cb2ee445f6368a7b6976410c80b27dfaa03fda9c43600a8618f" Apr 21 15:14:54.919658 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.919634 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb4e507912ea1cb2ee445f6368a7b6976410c80b27dfaa03fda9c43600a8618f"} err="failed to get container status \"cb4e507912ea1cb2ee445f6368a7b6976410c80b27dfaa03fda9c43600a8618f\": rpc error: code = NotFound desc = could not find container \"cb4e507912ea1cb2ee445f6368a7b6976410c80b27dfaa03fda9c43600a8618f\": container with ID starting with cb4e507912ea1cb2ee445f6368a7b6976410c80b27dfaa03fda9c43600a8618f not found: ID does not exist" Apr 21 15:14:54.919728 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.919659 2575 scope.go:117] "RemoveContainer" containerID="40ccc0a45a60ef0829dc6285a353a513b15d1b0cf1818bfa0f46eb486e757fa9" Apr 21 15:14:54.919891 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.919876 2575 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40ccc0a45a60ef0829dc6285a353a513b15d1b0cf1818bfa0f46eb486e757fa9"} err="failed to get container status \"40ccc0a45a60ef0829dc6285a353a513b15d1b0cf1818bfa0f46eb486e757fa9\": rpc error: code = NotFound desc = could not find container \"40ccc0a45a60ef0829dc6285a353a513b15d1b0cf1818bfa0f46eb486e757fa9\": container with ID starting with 40ccc0a45a60ef0829dc6285a353a513b15d1b0cf1818bfa0f46eb486e757fa9 not found: ID does not exist" Apr 21 15:14:54.919945 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.919892 2575 scope.go:117] "RemoveContainer" containerID="d7fdb01dd8d723f5665aadc14662288b35e78b6e6c69463617847ebf90b424ec" Apr 21 15:14:54.920110 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.920080 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7fdb01dd8d723f5665aadc14662288b35e78b6e6c69463617847ebf90b424ec"} err="failed to get container status \"d7fdb01dd8d723f5665aadc14662288b35e78b6e6c69463617847ebf90b424ec\": rpc error: code = NotFound desc = could not find container \"d7fdb01dd8d723f5665aadc14662288b35e78b6e6c69463617847ebf90b424ec\": container with ID starting with d7fdb01dd8d723f5665aadc14662288b35e78b6e6c69463617847ebf90b424ec not found: ID does not exist" Apr 21 15:14:54.924503 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.924482 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 15:14:54.924896 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.924881 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0626f62a-710b-42dd-b9e2-c6d6b5e6f366" containerName="kube-rbac-proxy" Apr 21 15:14:54.924946 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.924900 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="0626f62a-710b-42dd-b9e2-c6d6b5e6f366" containerName="kube-rbac-proxy" Apr 21 15:14:54.924946 ip-10-0-131-11 
kubenswrapper[2575]: I0421 15:14:54.924920 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0626f62a-710b-42dd-b9e2-c6d6b5e6f366" containerName="alertmanager" Apr 21 15:14:54.924946 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.924930 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="0626f62a-710b-42dd-b9e2-c6d6b5e6f366" containerName="alertmanager" Apr 21 15:14:54.924946 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.924941 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0626f62a-710b-42dd-b9e2-c6d6b5e6f366" containerName="prom-label-proxy" Apr 21 15:14:54.925121 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.924949 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="0626f62a-710b-42dd-b9e2-c6d6b5e6f366" containerName="prom-label-proxy" Apr 21 15:14:54.925121 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.924970 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0626f62a-710b-42dd-b9e2-c6d6b5e6f366" containerName="kube-rbac-proxy-web" Apr 21 15:14:54.925121 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.924979 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="0626f62a-710b-42dd-b9e2-c6d6b5e6f366" containerName="kube-rbac-proxy-web" Apr 21 15:14:54.925121 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.924991 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0626f62a-710b-42dd-b9e2-c6d6b5e6f366" containerName="kube-rbac-proxy-metric" Apr 21 15:14:54.925121 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.924999 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="0626f62a-710b-42dd-b9e2-c6d6b5e6f366" containerName="kube-rbac-proxy-metric" Apr 21 15:14:54.925121 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.925009 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0626f62a-710b-42dd-b9e2-c6d6b5e6f366" 
containerName="config-reloader" Apr 21 15:14:54.925121 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.925016 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="0626f62a-710b-42dd-b9e2-c6d6b5e6f366" containerName="config-reloader" Apr 21 15:14:54.925121 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.925029 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0626f62a-710b-42dd-b9e2-c6d6b5e6f366" containerName="init-config-reloader" Apr 21 15:14:54.925121 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.925036 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="0626f62a-710b-42dd-b9e2-c6d6b5e6f366" containerName="init-config-reloader" Apr 21 15:14:54.925121 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.925094 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="0626f62a-710b-42dd-b9e2-c6d6b5e6f366" containerName="kube-rbac-proxy" Apr 21 15:14:54.925121 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.925103 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="0626f62a-710b-42dd-b9e2-c6d6b5e6f366" containerName="config-reloader" Apr 21 15:14:54.925121 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.925111 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="0626f62a-710b-42dd-b9e2-c6d6b5e6f366" containerName="alertmanager" Apr 21 15:14:54.925121 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.925117 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="0626f62a-710b-42dd-b9e2-c6d6b5e6f366" containerName="kube-rbac-proxy-web" Apr 21 15:14:54.925121 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.925124 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="0626f62a-710b-42dd-b9e2-c6d6b5e6f366" containerName="kube-rbac-proxy-metric" Apr 21 15:14:54.925528 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.925130 2575 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="0626f62a-710b-42dd-b9e2-c6d6b5e6f366" containerName="prom-label-proxy" Apr 21 15:14:54.930769 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.930752 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:54.933076 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.933056 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 21 15:14:54.933203 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.933185 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 21 15:14:54.933267 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.933244 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 21 15:14:54.933541 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.933528 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 21 15:14:54.933591 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.933561 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 21 15:14:54.933894 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.933878 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 21 15:14:54.934054 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.934037 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 21 15:14:54.934343 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.934328 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-6cgth\"" Apr 21 15:14:54.934432 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.934395 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 21 15:14:54.938964 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.938945 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 21 15:14:54.943243 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:54.943223 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 15:14:55.031819 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.031785 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-config-out\") pod \"alertmanager-main-0\" (UID: \"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.032022 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.031824 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.032022 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.031849 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.032022 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.031911 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-config-volume\") pod \"alertmanager-main-0\" (UID: \"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.032022 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.031940 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.032022 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.031962 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.032022 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.031987 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.032022 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.032008 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.032022 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.032023 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-web-config\") pod \"alertmanager-main-0\" (UID: \"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.032256 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.032045 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.032256 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.032071 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89nzl\" (UniqueName: \"kubernetes.io/projected/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-kube-api-access-89nzl\") pod \"alertmanager-main-0\" (UID: \"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.032256 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.032112 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.032256 ip-10-0-131-11 
kubenswrapper[2575]: I0421 15:14:55.032135 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.133563 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.133471 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-config-volume\") pod \"alertmanager-main-0\" (UID: \"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.133563 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.133513 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.133563 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.133529 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.133563 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.133548 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: 
\"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.133563 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.133568 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.133935 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.133591 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-web-config\") pod \"alertmanager-main-0\" (UID: \"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.133935 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.133713 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.133935 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.133762 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-89nzl\" (UniqueName: \"kubernetes.io/projected/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-kube-api-access-89nzl\") pod \"alertmanager-main-0\" (UID: \"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.133935 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.133847 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.133935 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.133875 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.133935 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.133909 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-config-out\") pod \"alertmanager-main-0\" (UID: \"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.134227 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.133939 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.134227 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.133965 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.134725 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.134605 
2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.137331 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.136994 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.137331 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.137037 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.137331 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.137046 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-web-config\") pod \"alertmanager-main-0\" (UID: \"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.137331 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.137060 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-config-out\") pod \"alertmanager-main-0\" (UID: \"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.137331 ip-10-0-131-11 
kubenswrapper[2575]: I0421 15:14:55.137199 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-config-volume\") pod \"alertmanager-main-0\" (UID: \"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.137331 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.137203 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.137331 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.137286 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.137331 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.137295 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.137809 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.137790 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 
15:14:55.137925 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.137904 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.139077 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.139053 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.145364 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.145341 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-89nzl\" (UniqueName: \"kubernetes.io/projected/9ebd8426-ec64-4b31-85fe-7c033ba6ff5c-kube-api-access-89nzl\") pod \"alertmanager-main-0\" (UID: \"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.240358 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.240325 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 15:14:55.373414 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.373386 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 15:14:55.375222 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:14:55.375196 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ebd8426_ec64_4b31_85fe_7c033ba6ff5c.slice/crio-f2a280ae6ba595b9c3f7649dc264d36a06d7e61e8749f006045c9b427ff0df32 WatchSource:0}: Error finding container f2a280ae6ba595b9c3f7649dc264d36a06d7e61e8749f006045c9b427ff0df32: Status 404 returned error can't find the container with id f2a280ae6ba595b9c3f7649dc264d36a06d7e61e8749f006045c9b427ff0df32 Apr 21 15:14:55.851678 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.851646 2575 generic.go:358] "Generic (PLEG): container finished" podID="9ebd8426-ec64-4b31-85fe-7c033ba6ff5c" containerID="5ab1d2f30410cc7abac5ea110cc2913ec5505d2281a2460923513bee36355600" exitCode=0 Apr 21 15:14:55.851843 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.851694 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c","Type":"ContainerDied","Data":"5ab1d2f30410cc7abac5ea110cc2913ec5505d2281a2460923513bee36355600"} Apr 21 15:14:55.851843 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.851720 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c","Type":"ContainerStarted","Data":"f2a280ae6ba595b9c3f7649dc264d36a06d7e61e8749f006045c9b427ff0df32"} Apr 21 15:14:55.981418 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:55.981389 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0626f62a-710b-42dd-b9e2-c6d6b5e6f366" 
path="/var/lib/kubelet/pods/0626f62a-710b-42dd-b9e2-c6d6b5e6f366/volumes" Apr 21 15:14:56.858020 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:56.857985 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c","Type":"ContainerStarted","Data":"a7a32fec837e64ba273c7726c84cb5434faafa2c8950c3ee730c5983ffcd76ef"} Apr 21 15:14:56.858020 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:56.858025 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c","Type":"ContainerStarted","Data":"4c2e53a7314455c8492cf66b24b251340ffef07b98bdd4804a321f588853abbb"} Apr 21 15:14:56.858271 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:56.858040 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c","Type":"ContainerStarted","Data":"81181fa2ee74b1cc3f66db65edd12494dfd26c269a771c44219bb1629722d709"} Apr 21 15:14:56.858271 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:56.858051 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c","Type":"ContainerStarted","Data":"4815f15e120cd41f49b6a4a8199ee8149165e36a62f6794b95b101ec19758cfb"} Apr 21 15:14:56.858271 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:56.858061 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c","Type":"ContainerStarted","Data":"e259218e6068a9ee18e9c9ec571d9e55983ef3bcbe4ceef01eeb128af98c174f"} Apr 21 15:14:56.858271 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:56.858072 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"9ebd8426-ec64-4b31-85fe-7c033ba6ff5c","Type":"ContainerStarted","Data":"85977faa3cf1de2f7fdefb66602361ed1de607a8943d4754a0ba9a45ee4a175c"} Apr 21 15:14:56.894642 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:14:56.894543 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.894529026 podStartE2EDuration="2.894529026s" podCreationTimestamp="2026-04-21 15:14:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:14:56.892169915 +0000 UTC m=+265.487740790" watchObservedRunningTime="2026-04-21 15:14:56.894529026 +0000 UTC m=+265.490099900" Apr 21 15:15:00.124466 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:00.124433 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6b74f6989f-n7465" Apr 21 15:15:00.124466 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:00.124469 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6b74f6989f-n7465" Apr 21 15:15:00.129134 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:00.129110 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6b74f6989f-n7465" Apr 21 15:15:00.874324 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:00.874292 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6b74f6989f-n7465" Apr 21 15:15:00.932698 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:00.932664 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5d85b7f7b6-jzl76"] Apr 21 15:15:25.954151 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:25.954092 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5d85b7f7b6-jzl76" podUID="9dc0fad1-319d-4a38-a347-924981ab27d0" 
containerName="console" containerID="cri-o://5c766121d56d20a07b6d7d4bc9cac3ad9c51e144a5713eaae0e9bd348e39db46" gracePeriod=15 Apr 21 15:15:26.201551 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:26.201531 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d85b7f7b6-jzl76_9dc0fad1-319d-4a38-a347-924981ab27d0/console/0.log" Apr 21 15:15:26.201668 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:26.201590 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d85b7f7b6-jzl76" Apr 21 15:15:26.300069 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:26.299996 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jz64l\" (UniqueName: \"kubernetes.io/projected/9dc0fad1-319d-4a38-a347-924981ab27d0-kube-api-access-jz64l\") pod \"9dc0fad1-319d-4a38-a347-924981ab27d0\" (UID: \"9dc0fad1-319d-4a38-a347-924981ab27d0\") " Apr 21 15:15:26.300069 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:26.300037 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9dc0fad1-319d-4a38-a347-924981ab27d0-trusted-ca-bundle\") pod \"9dc0fad1-319d-4a38-a347-924981ab27d0\" (UID: \"9dc0fad1-319d-4a38-a347-924981ab27d0\") " Apr 21 15:15:26.300069 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:26.300067 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9dc0fad1-319d-4a38-a347-924981ab27d0-console-config\") pod \"9dc0fad1-319d-4a38-a347-924981ab27d0\" (UID: \"9dc0fad1-319d-4a38-a347-924981ab27d0\") " Apr 21 15:15:26.300293 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:26.300104 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9dc0fad1-319d-4a38-a347-924981ab27d0-console-oauth-config\") 
pod \"9dc0fad1-319d-4a38-a347-924981ab27d0\" (UID: \"9dc0fad1-319d-4a38-a347-924981ab27d0\") " Apr 21 15:15:26.300293 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:26.300127 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9dc0fad1-319d-4a38-a347-924981ab27d0-console-serving-cert\") pod \"9dc0fad1-319d-4a38-a347-924981ab27d0\" (UID: \"9dc0fad1-319d-4a38-a347-924981ab27d0\") " Apr 21 15:15:26.300293 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:26.300154 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9dc0fad1-319d-4a38-a347-924981ab27d0-service-ca\") pod \"9dc0fad1-319d-4a38-a347-924981ab27d0\" (UID: \"9dc0fad1-319d-4a38-a347-924981ab27d0\") " Apr 21 15:15:26.300293 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:26.300196 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9dc0fad1-319d-4a38-a347-924981ab27d0-oauth-serving-cert\") pod \"9dc0fad1-319d-4a38-a347-924981ab27d0\" (UID: \"9dc0fad1-319d-4a38-a347-924981ab27d0\") " Apr 21 15:15:26.300671 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:26.300547 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9dc0fad1-319d-4a38-a347-924981ab27d0-console-config" (OuterVolumeSpecName: "console-config") pod "9dc0fad1-319d-4a38-a347-924981ab27d0" (UID: "9dc0fad1-319d-4a38-a347-924981ab27d0"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:15:26.300811 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:26.300739 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9dc0fad1-319d-4a38-a347-924981ab27d0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9dc0fad1-319d-4a38-a347-924981ab27d0" (UID: "9dc0fad1-319d-4a38-a347-924981ab27d0"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:15:26.300811 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:26.300749 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9dc0fad1-319d-4a38-a347-924981ab27d0-service-ca" (OuterVolumeSpecName: "service-ca") pod "9dc0fad1-319d-4a38-a347-924981ab27d0" (UID: "9dc0fad1-319d-4a38-a347-924981ab27d0"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:15:26.301118 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:26.301090 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9dc0fad1-319d-4a38-a347-924981ab27d0-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9dc0fad1-319d-4a38-a347-924981ab27d0" (UID: "9dc0fad1-319d-4a38-a347-924981ab27d0"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:15:26.302592 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:26.302569 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dc0fad1-319d-4a38-a347-924981ab27d0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9dc0fad1-319d-4a38-a347-924981ab27d0" (UID: "9dc0fad1-319d-4a38-a347-924981ab27d0"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:15:26.302682 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:26.302638 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dc0fad1-319d-4a38-a347-924981ab27d0-kube-api-access-jz64l" (OuterVolumeSpecName: "kube-api-access-jz64l") pod "9dc0fad1-319d-4a38-a347-924981ab27d0" (UID: "9dc0fad1-319d-4a38-a347-924981ab27d0"). InnerVolumeSpecName "kube-api-access-jz64l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:15:26.302682 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:26.302644 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9dc0fad1-319d-4a38-a347-924981ab27d0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9dc0fad1-319d-4a38-a347-924981ab27d0" (UID: "9dc0fad1-319d-4a38-a347-924981ab27d0"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:15:26.401409 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:26.401353 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9dc0fad1-319d-4a38-a347-924981ab27d0-service-ca\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\"" Apr 21 15:15:26.401567 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:26.401411 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9dc0fad1-319d-4a38-a347-924981ab27d0-oauth-serving-cert\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\"" Apr 21 15:15:26.401567 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:26.401427 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jz64l\" (UniqueName: \"kubernetes.io/projected/9dc0fad1-319d-4a38-a347-924981ab27d0-kube-api-access-jz64l\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\"" Apr 21 15:15:26.401567 ip-10-0-131-11 
kubenswrapper[2575]: I0421 15:15:26.401440 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9dc0fad1-319d-4a38-a347-924981ab27d0-trusted-ca-bundle\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\"" Apr 21 15:15:26.401567 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:26.401455 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9dc0fad1-319d-4a38-a347-924981ab27d0-console-config\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\"" Apr 21 15:15:26.401567 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:26.401469 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9dc0fad1-319d-4a38-a347-924981ab27d0-console-oauth-config\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\"" Apr 21 15:15:26.401567 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:26.401482 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9dc0fad1-319d-4a38-a347-924981ab27d0-console-serving-cert\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\"" Apr 21 15:15:26.957775 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:26.957746 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d85b7f7b6-jzl76_9dc0fad1-319d-4a38-a347-924981ab27d0/console/0.log" Apr 21 15:15:26.958171 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:26.957786 2575 generic.go:358] "Generic (PLEG): container finished" podID="9dc0fad1-319d-4a38-a347-924981ab27d0" containerID="5c766121d56d20a07b6d7d4bc9cac3ad9c51e144a5713eaae0e9bd348e39db46" exitCode=2 Apr 21 15:15:26.958171 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:26.957819 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d85b7f7b6-jzl76" 
event={"ID":"9dc0fad1-319d-4a38-a347-924981ab27d0","Type":"ContainerDied","Data":"5c766121d56d20a07b6d7d4bc9cac3ad9c51e144a5713eaae0e9bd348e39db46"} Apr 21 15:15:26.958171 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:26.957860 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d85b7f7b6-jzl76" Apr 21 15:15:26.958171 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:26.957869 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d85b7f7b6-jzl76" event={"ID":"9dc0fad1-319d-4a38-a347-924981ab27d0","Type":"ContainerDied","Data":"0f3e8c393aff02f3b64bdd559be1b54a5a7984be7cabba58901a8e8251a8f4c2"} Apr 21 15:15:26.958171 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:26.957889 2575 scope.go:117] "RemoveContainer" containerID="5c766121d56d20a07b6d7d4bc9cac3ad9c51e144a5713eaae0e9bd348e39db46" Apr 21 15:15:26.966227 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:26.966207 2575 scope.go:117] "RemoveContainer" containerID="5c766121d56d20a07b6d7d4bc9cac3ad9c51e144a5713eaae0e9bd348e39db46" Apr 21 15:15:26.966501 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:15:26.966483 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c766121d56d20a07b6d7d4bc9cac3ad9c51e144a5713eaae0e9bd348e39db46\": container with ID starting with 5c766121d56d20a07b6d7d4bc9cac3ad9c51e144a5713eaae0e9bd348e39db46 not found: ID does not exist" containerID="5c766121d56d20a07b6d7d4bc9cac3ad9c51e144a5713eaae0e9bd348e39db46" Apr 21 15:15:26.966564 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:26.966509 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c766121d56d20a07b6d7d4bc9cac3ad9c51e144a5713eaae0e9bd348e39db46"} err="failed to get container status \"5c766121d56d20a07b6d7d4bc9cac3ad9c51e144a5713eaae0e9bd348e39db46\": rpc error: code = NotFound desc = could not find container 
\"5c766121d56d20a07b6d7d4bc9cac3ad9c51e144a5713eaae0e9bd348e39db46\": container with ID starting with 5c766121d56d20a07b6d7d4bc9cac3ad9c51e144a5713eaae0e9bd348e39db46 not found: ID does not exist" Apr 21 15:15:26.981920 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:26.981901 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5d85b7f7b6-jzl76"] Apr 21 15:15:27.003414 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:27.003362 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5d85b7f7b6-jzl76"] Apr 21 15:15:27.981860 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:27.981829 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dc0fad1-319d-4a38-a347-924981ab27d0" path="/var/lib/kubelet/pods/9dc0fad1-319d-4a38-a347-924981ab27d0/volumes" Apr 21 15:15:31.862490 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:31.862455 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bqs6l_2e6ce7e1-9053-481a-924e-dcb6e2859d45/console-operator/1.log" Apr 21 15:15:31.862948 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:31.862759 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bqs6l_2e6ce7e1-9053-481a-924e-dcb6e2859d45/console-operator/1.log" Apr 21 15:15:31.867276 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:31.867250 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbnjg_0bdcacfa-6992-4542-8cbe-df76abfeb25b/ovn-acl-logging/0.log" Apr 21 15:15:31.867518 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:31.867503 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbnjg_0bdcacfa-6992-4542-8cbe-df76abfeb25b/ovn-acl-logging/0.log" Apr 21 15:15:31.873583 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:31.873562 2575 kubelet.go:1628] "Image garbage collection 
succeeded" Apr 21 15:15:44.817387 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:44.817347 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dhrnpc"] Apr 21 15:15:44.819816 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:44.817663 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9dc0fad1-319d-4a38-a347-924981ab27d0" containerName="console" Apr 21 15:15:44.819816 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:44.817674 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dc0fad1-319d-4a38-a347-924981ab27d0" containerName="console" Apr 21 15:15:44.819816 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:44.817734 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="9dc0fad1-319d-4a38-a347-924981ab27d0" containerName="console" Apr 21 15:15:44.820688 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:44.820673 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dhrnpc" Apr 21 15:15:44.823401 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:44.823358 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 15:15:44.823562 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:44.823542 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 15:15:44.824251 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:44.824232 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-tsr8j\"" Apr 21 15:15:44.833414 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:44.833394 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dhrnpc"] Apr 21 15:15:44.857709 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:44.857679 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg4dn\" (UniqueName: \"kubernetes.io/projected/c2583f6f-0d2b-4c11-a421-5eb9867ba322-kube-api-access-cg4dn\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dhrnpc\" (UID: \"c2583f6f-0d2b-4c11-a421-5eb9867ba322\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dhrnpc" Apr 21 15:15:44.857870 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:44.857734 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c2583f6f-0d2b-4c11-a421-5eb9867ba322-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dhrnpc\" (UID: \"c2583f6f-0d2b-4c11-a421-5eb9867ba322\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dhrnpc" 
Apr 21 15:15:44.857870 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:44.857824 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c2583f6f-0d2b-4c11-a421-5eb9867ba322-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dhrnpc\" (UID: \"c2583f6f-0d2b-4c11-a421-5eb9867ba322\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dhrnpc" Apr 21 15:15:44.958221 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:44.958189 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cg4dn\" (UniqueName: \"kubernetes.io/projected/c2583f6f-0d2b-4c11-a421-5eb9867ba322-kube-api-access-cg4dn\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dhrnpc\" (UID: \"c2583f6f-0d2b-4c11-a421-5eb9867ba322\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dhrnpc" Apr 21 15:15:44.958352 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:44.958239 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c2583f6f-0d2b-4c11-a421-5eb9867ba322-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dhrnpc\" (UID: \"c2583f6f-0d2b-4c11-a421-5eb9867ba322\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dhrnpc" Apr 21 15:15:44.958352 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:44.958282 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c2583f6f-0d2b-4c11-a421-5eb9867ba322-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dhrnpc\" (UID: \"c2583f6f-0d2b-4c11-a421-5eb9867ba322\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dhrnpc" Apr 21 15:15:44.958686 ip-10-0-131-11 kubenswrapper[2575]: 
I0421 15:15:44.958670 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c2583f6f-0d2b-4c11-a421-5eb9867ba322-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dhrnpc\" (UID: \"c2583f6f-0d2b-4c11-a421-5eb9867ba322\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dhrnpc" Apr 21 15:15:44.958728 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:44.958687 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c2583f6f-0d2b-4c11-a421-5eb9867ba322-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dhrnpc\" (UID: \"c2583f6f-0d2b-4c11-a421-5eb9867ba322\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dhrnpc" Apr 21 15:15:44.967585 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:44.967562 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg4dn\" (UniqueName: \"kubernetes.io/projected/c2583f6f-0d2b-4c11-a421-5eb9867ba322-kube-api-access-cg4dn\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dhrnpc\" (UID: \"c2583f6f-0d2b-4c11-a421-5eb9867ba322\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dhrnpc" Apr 21 15:15:45.130092 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:45.130014 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dhrnpc" Apr 21 15:15:45.251872 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:45.251842 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dhrnpc"] Apr 21 15:15:45.255407 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:15:45.255340 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2583f6f_0d2b_4c11_a421_5eb9867ba322.slice/crio-e87351a915f1c9d481f1b23e3937880591b63a3bdc54cf69d4e77f12c05ad9cc WatchSource:0}: Error finding container e87351a915f1c9d481f1b23e3937880591b63a3bdc54cf69d4e77f12c05ad9cc: Status 404 returned error can't find the container with id e87351a915f1c9d481f1b23e3937880591b63a3bdc54cf69d4e77f12c05ad9cc Apr 21 15:15:45.257252 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:45.257236 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 15:15:46.013579 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:46.013541 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dhrnpc" event={"ID":"c2583f6f-0d2b-4c11-a421-5eb9867ba322","Type":"ContainerStarted","Data":"e87351a915f1c9d481f1b23e3937880591b63a3bdc54cf69d4e77f12c05ad9cc"} Apr 21 15:15:51.031791 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:51.031750 2575 generic.go:358] "Generic (PLEG): container finished" podID="c2583f6f-0d2b-4c11-a421-5eb9867ba322" containerID="b9a6336448bee51bca95b49d092acc283c9da0fd90ae23dcf1b43440dc29ac05" exitCode=0 Apr 21 15:15:51.032164 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:51.031824 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dhrnpc" 
event={"ID":"c2583f6f-0d2b-4c11-a421-5eb9867ba322","Type":"ContainerDied","Data":"b9a6336448bee51bca95b49d092acc283c9da0fd90ae23dcf1b43440dc29ac05"} Apr 21 15:15:57.049433 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:57.049394 2575 generic.go:358] "Generic (PLEG): container finished" podID="c2583f6f-0d2b-4c11-a421-5eb9867ba322" containerID="c9bc28289a693c53274c8c12cd176dc91830f1d659cc6b998b4c0e013747e44c" exitCode=0 Apr 21 15:15:57.049901 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:15:57.049470 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dhrnpc" event={"ID":"c2583f6f-0d2b-4c11-a421-5eb9867ba322","Type":"ContainerDied","Data":"c9bc28289a693c53274c8c12cd176dc91830f1d659cc6b998b4c0e013747e44c"} Apr 21 15:16:07.079458 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:07.079415 2575 generic.go:358] "Generic (PLEG): container finished" podID="c2583f6f-0d2b-4c11-a421-5eb9867ba322" containerID="5a85a371ebca6888bb741c19f5205a4baf6d93cf9374384ec57cbb9462022251" exitCode=0 Apr 21 15:16:07.079818 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:07.079496 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dhrnpc" event={"ID":"c2583f6f-0d2b-4c11-a421-5eb9867ba322","Type":"ContainerDied","Data":"5a85a371ebca6888bb741c19f5205a4baf6d93cf9374384ec57cbb9462022251"} Apr 21 15:16:08.200138 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:08.200114 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dhrnpc" Apr 21 15:16:08.254704 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:08.254674 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c2583f6f-0d2b-4c11-a421-5eb9867ba322-bundle\") pod \"c2583f6f-0d2b-4c11-a421-5eb9867ba322\" (UID: \"c2583f6f-0d2b-4c11-a421-5eb9867ba322\") " Apr 21 15:16:08.254704 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:08.254707 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c2583f6f-0d2b-4c11-a421-5eb9867ba322-util\") pod \"c2583f6f-0d2b-4c11-a421-5eb9867ba322\" (UID: \"c2583f6f-0d2b-4c11-a421-5eb9867ba322\") " Apr 21 15:16:08.254906 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:08.254728 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cg4dn\" (UniqueName: \"kubernetes.io/projected/c2583f6f-0d2b-4c11-a421-5eb9867ba322-kube-api-access-cg4dn\") pod \"c2583f6f-0d2b-4c11-a421-5eb9867ba322\" (UID: \"c2583f6f-0d2b-4c11-a421-5eb9867ba322\") " Apr 21 15:16:08.255526 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:08.255499 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2583f6f-0d2b-4c11-a421-5eb9867ba322-bundle" (OuterVolumeSpecName: "bundle") pod "c2583f6f-0d2b-4c11-a421-5eb9867ba322" (UID: "c2583f6f-0d2b-4c11-a421-5eb9867ba322"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:16:08.256995 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:08.256973 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2583f6f-0d2b-4c11-a421-5eb9867ba322-kube-api-access-cg4dn" (OuterVolumeSpecName: "kube-api-access-cg4dn") pod "c2583f6f-0d2b-4c11-a421-5eb9867ba322" (UID: "c2583f6f-0d2b-4c11-a421-5eb9867ba322"). InnerVolumeSpecName "kube-api-access-cg4dn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:16:08.259990 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:08.259969 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2583f6f-0d2b-4c11-a421-5eb9867ba322-util" (OuterVolumeSpecName: "util") pod "c2583f6f-0d2b-4c11-a421-5eb9867ba322" (UID: "c2583f6f-0d2b-4c11-a421-5eb9867ba322"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:16:08.355275 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:08.355191 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c2583f6f-0d2b-4c11-a421-5eb9867ba322-bundle\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\"" Apr 21 15:16:08.355275 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:08.355223 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c2583f6f-0d2b-4c11-a421-5eb9867ba322-util\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\"" Apr 21 15:16:08.355275 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:08.355235 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cg4dn\" (UniqueName: \"kubernetes.io/projected/c2583f6f-0d2b-4c11-a421-5eb9867ba322-kube-api-access-cg4dn\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\"" Apr 21 15:16:09.086152 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:09.086121 2575 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dhrnpc" Apr 21 15:16:09.086350 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:09.086120 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19dhrnpc" event={"ID":"c2583f6f-0d2b-4c11-a421-5eb9867ba322","Type":"ContainerDied","Data":"e87351a915f1c9d481f1b23e3937880591b63a3bdc54cf69d4e77f12c05ad9cc"} Apr 21 15:16:09.086350 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:09.086229 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e87351a915f1c9d481f1b23e3937880591b63a3bdc54cf69d4e77f12c05ad9cc" Apr 21 15:16:13.282071 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:13.282031 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-8mvbc"] Apr 21 15:16:13.282775 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:13.282752 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2583f6f-0d2b-4c11-a421-5eb9867ba322" containerName="util" Apr 21 15:16:13.282849 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:13.282778 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2583f6f-0d2b-4c11-a421-5eb9867ba322" containerName="util" Apr 21 15:16:13.282849 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:13.282817 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2583f6f-0d2b-4c11-a421-5eb9867ba322" containerName="pull" Apr 21 15:16:13.282849 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:13.282826 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2583f6f-0d2b-4c11-a421-5eb9867ba322" containerName="pull" Apr 21 15:16:13.282849 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:13.282841 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="c2583f6f-0d2b-4c11-a421-5eb9867ba322" containerName="extract" Apr 21 15:16:13.282849 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:13.282850 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2583f6f-0d2b-4c11-a421-5eb9867ba322" containerName="extract" Apr 21 15:16:13.283080 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:13.282989 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="c2583f6f-0d2b-4c11-a421-5eb9867ba322" containerName="extract" Apr 21 15:16:13.286141 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:13.286119 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-8mvbc" Apr 21 15:16:13.288818 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:13.288786 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 21 15:16:13.288967 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:13.288944 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 21 15:16:13.289051 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:13.289032 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-tpc52\"" Apr 21 15:16:13.298328 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:13.298305 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-8mvbc"] Apr 21 15:16:13.392560 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:13.392519 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e03272e5-6ce1-4d34-84ba-e021bcd412f7-tmp\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-8mvbc\" (UID: 
\"e03272e5-6ce1-4d34-84ba-e021bcd412f7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-8mvbc" Apr 21 15:16:13.392736 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:13.392571 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cchz\" (UniqueName: \"kubernetes.io/projected/e03272e5-6ce1-4d34-84ba-e021bcd412f7-kube-api-access-5cchz\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-8mvbc\" (UID: \"e03272e5-6ce1-4d34-84ba-e021bcd412f7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-8mvbc" Apr 21 15:16:13.493728 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:13.493683 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e03272e5-6ce1-4d34-84ba-e021bcd412f7-tmp\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-8mvbc\" (UID: \"e03272e5-6ce1-4d34-84ba-e021bcd412f7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-8mvbc" Apr 21 15:16:13.493728 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:13.493741 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5cchz\" (UniqueName: \"kubernetes.io/projected/e03272e5-6ce1-4d34-84ba-e021bcd412f7-kube-api-access-5cchz\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-8mvbc\" (UID: \"e03272e5-6ce1-4d34-84ba-e021bcd412f7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-8mvbc" Apr 21 15:16:13.494074 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:13.494052 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e03272e5-6ce1-4d34-84ba-e021bcd412f7-tmp\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-8mvbc\" (UID: \"e03272e5-6ce1-4d34-84ba-e021bcd412f7\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-8mvbc" Apr 21 15:16:13.503681 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:13.503652 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cchz\" (UniqueName: \"kubernetes.io/projected/e03272e5-6ce1-4d34-84ba-e021bcd412f7-kube-api-access-5cchz\") pod \"cert-manager-operator-controller-manager-5b47fd69f5-8mvbc\" (UID: \"e03272e5-6ce1-4d34-84ba-e021bcd412f7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-8mvbc" Apr 21 15:16:13.595779 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:13.595678 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-8mvbc" Apr 21 15:16:13.725351 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:13.725309 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-8mvbc"] Apr 21 15:16:13.729192 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:16:13.729162 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode03272e5_6ce1_4d34_84ba_e021bcd412f7.slice/crio-01ff967ab57f8c74418e8ee1f9c64b20f109bf610aab965236e19c7a47792edb WatchSource:0}: Error finding container 01ff967ab57f8c74418e8ee1f9c64b20f109bf610aab965236e19c7a47792edb: Status 404 returned error can't find the container with id 01ff967ab57f8c74418e8ee1f9c64b20f109bf610aab965236e19c7a47792edb Apr 21 15:16:14.103008 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:14.102969 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-8mvbc" event={"ID":"e03272e5-6ce1-4d34-84ba-e021bcd412f7","Type":"ContainerStarted","Data":"01ff967ab57f8c74418e8ee1f9c64b20f109bf610aab965236e19c7a47792edb"} Apr 21 15:16:16.111752 
ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:16.111717 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-8mvbc" event={"ID":"e03272e5-6ce1-4d34-84ba-e021bcd412f7","Type":"ContainerStarted","Data":"eceb3ddf9cd2f0a3074adf356edd2298babfd9c8d7bef042161817595c23369f"} Apr 21 15:16:16.138434 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:16.138348 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-5b47fd69f5-8mvbc" podStartSLOduration=1.723993906 podStartE2EDuration="3.138332814s" podCreationTimestamp="2026-04-21 15:16:13 +0000 UTC" firstStartedPulling="2026-04-21 15:16:13.731750842 +0000 UTC m=+342.327321695" lastFinishedPulling="2026-04-21 15:16:15.14608974 +0000 UTC m=+343.741660603" observedRunningTime="2026-04-21 15:16:16.135941948 +0000 UTC m=+344.731512826" watchObservedRunningTime="2026-04-21 15:16:16.138332814 +0000 UTC m=+344.733903688" Apr 21 15:16:17.528837 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:17.528802 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-zxbwv"] Apr 21 15:16:17.530918 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:17.530902 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-zxbwv" Apr 21 15:16:17.533719 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:17.533698 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 21 15:16:17.534626 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:17.534607 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-8nqcg\"" Apr 21 15:16:17.534724 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:17.534628 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 21 15:16:17.561239 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:17.561215 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-zxbwv"] Apr 21 15:16:17.630130 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:17.630097 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f6b4f2d5-3a37-40fe-afb2-b5581b4efa6f-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-zxbwv\" (UID: \"f6b4f2d5-3a37-40fe-afb2-b5581b4efa6f\") " pod="cert-manager/cert-manager-webhook-587ccfb98-zxbwv" Apr 21 15:16:17.630286 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:17.630137 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qkrb\" (UniqueName: \"kubernetes.io/projected/f6b4f2d5-3a37-40fe-afb2-b5581b4efa6f-kube-api-access-8qkrb\") pod \"cert-manager-webhook-587ccfb98-zxbwv\" (UID: \"f6b4f2d5-3a37-40fe-afb2-b5581b4efa6f\") " pod="cert-manager/cert-manager-webhook-587ccfb98-zxbwv" Apr 21 15:16:17.730811 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:17.730773 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/f6b4f2d5-3a37-40fe-afb2-b5581b4efa6f-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-zxbwv\" (UID: \"f6b4f2d5-3a37-40fe-afb2-b5581b4efa6f\") " pod="cert-manager/cert-manager-webhook-587ccfb98-zxbwv" Apr 21 15:16:17.730976 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:17.730818 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8qkrb\" (UniqueName: \"kubernetes.io/projected/f6b4f2d5-3a37-40fe-afb2-b5581b4efa6f-kube-api-access-8qkrb\") pod \"cert-manager-webhook-587ccfb98-zxbwv\" (UID: \"f6b4f2d5-3a37-40fe-afb2-b5581b4efa6f\") " pod="cert-manager/cert-manager-webhook-587ccfb98-zxbwv" Apr 21 15:16:17.743023 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:17.742997 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f6b4f2d5-3a37-40fe-afb2-b5581b4efa6f-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-zxbwv\" (UID: \"f6b4f2d5-3a37-40fe-afb2-b5581b4efa6f\") " pod="cert-manager/cert-manager-webhook-587ccfb98-zxbwv" Apr 21 15:16:17.743288 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:17.743267 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qkrb\" (UniqueName: \"kubernetes.io/projected/f6b4f2d5-3a37-40fe-afb2-b5581b4efa6f-kube-api-access-8qkrb\") pod \"cert-manager-webhook-587ccfb98-zxbwv\" (UID: \"f6b4f2d5-3a37-40fe-afb2-b5581b4efa6f\") " pod="cert-manager/cert-manager-webhook-587ccfb98-zxbwv" Apr 21 15:16:17.855636 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:17.855550 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-zxbwv" Apr 21 15:16:17.985718 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:17.981129 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-zxbwv"] Apr 21 15:16:18.119833 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:18.119738 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-zxbwv" event={"ID":"f6b4f2d5-3a37-40fe-afb2-b5581b4efa6f","Type":"ContainerStarted","Data":"1575334ca47f400ff9269edcddf1b84db477ab922adf29218e1362152c224535"} Apr 21 15:16:19.954876 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:19.954844 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-57p2c"] Apr 21 15:16:19.957666 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:19.957645 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-57p2c" Apr 21 15:16:19.962154 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:19.962136 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-2l55h\"" Apr 21 15:16:19.973002 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:19.972976 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-57p2c"] Apr 21 15:16:20.052026 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:20.051990 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjbkz\" (UniqueName: \"kubernetes.io/projected/3b6a114f-98c8-4a80-81c4-2dbdedf9faa4-kube-api-access-tjbkz\") pod \"cert-manager-cainjector-68b757865b-57p2c\" (UID: \"3b6a114f-98c8-4a80-81c4-2dbdedf9faa4\") " pod="cert-manager/cert-manager-cainjector-68b757865b-57p2c" Apr 21 15:16:20.052189 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:20.052099 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3b6a114f-98c8-4a80-81c4-2dbdedf9faa4-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-57p2c\" (UID: \"3b6a114f-98c8-4a80-81c4-2dbdedf9faa4\") " pod="cert-manager/cert-manager-cainjector-68b757865b-57p2c" Apr 21 15:16:20.152977 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:20.152942 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3b6a114f-98c8-4a80-81c4-2dbdedf9faa4-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-57p2c\" (UID: \"3b6a114f-98c8-4a80-81c4-2dbdedf9faa4\") " pod="cert-manager/cert-manager-cainjector-68b757865b-57p2c" Apr 21 15:16:20.153139 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:20.152995 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tjbkz\" (UniqueName: \"kubernetes.io/projected/3b6a114f-98c8-4a80-81c4-2dbdedf9faa4-kube-api-access-tjbkz\") pod \"cert-manager-cainjector-68b757865b-57p2c\" (UID: \"3b6a114f-98c8-4a80-81c4-2dbdedf9faa4\") " pod="cert-manager/cert-manager-cainjector-68b757865b-57p2c" Apr 21 15:16:20.162751 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:20.162728 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3b6a114f-98c8-4a80-81c4-2dbdedf9faa4-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-57p2c\" (UID: \"3b6a114f-98c8-4a80-81c4-2dbdedf9faa4\") " pod="cert-manager/cert-manager-cainjector-68b757865b-57p2c" Apr 21 15:16:20.172064 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:20.172043 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjbkz\" (UniqueName: \"kubernetes.io/projected/3b6a114f-98c8-4a80-81c4-2dbdedf9faa4-kube-api-access-tjbkz\") pod 
\"cert-manager-cainjector-68b757865b-57p2c\" (UID: \"3b6a114f-98c8-4a80-81c4-2dbdedf9faa4\") " pod="cert-manager/cert-manager-cainjector-68b757865b-57p2c" Apr 21 15:16:20.267277 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:20.267195 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-57p2c" Apr 21 15:16:20.396616 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:20.396592 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-57p2c"] Apr 21 15:16:20.399057 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:16:20.399033 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b6a114f_98c8_4a80_81c4_2dbdedf9faa4.slice/crio-9d25e22033222aa7eacb609518bed997814d287bf62eb1bb5c7cffcfcb4205b6 WatchSource:0}: Error finding container 9d25e22033222aa7eacb609518bed997814d287bf62eb1bb5c7cffcfcb4205b6: Status 404 returned error can't find the container with id 9d25e22033222aa7eacb609518bed997814d287bf62eb1bb5c7cffcfcb4205b6 Apr 21 15:16:21.131969 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:21.131927 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-57p2c" event={"ID":"3b6a114f-98c8-4a80-81c4-2dbdedf9faa4","Type":"ContainerStarted","Data":"9d25e22033222aa7eacb609518bed997814d287bf62eb1bb5c7cffcfcb4205b6"} Apr 21 15:16:23.140947 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:23.140858 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-57p2c" event={"ID":"3b6a114f-98c8-4a80-81c4-2dbdedf9faa4","Type":"ContainerStarted","Data":"67c271986414369090ec0f238bd1793ff64d511c7e3ae09fbea5cbab463aae2f"} Apr 21 15:16:23.159884 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:23.159837 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="cert-manager/cert-manager-cainjector-68b757865b-57p2c" podStartSLOduration=1.777367659 podStartE2EDuration="4.159823735s" podCreationTimestamp="2026-04-21 15:16:19 +0000 UTC" firstStartedPulling="2026-04-21 15:16:20.401030622 +0000 UTC m=+348.996601474" lastFinishedPulling="2026-04-21 15:16:22.783486696 +0000 UTC m=+351.379057550" observedRunningTime="2026-04-21 15:16:23.158290621 +0000 UTC m=+351.753861495" watchObservedRunningTime="2026-04-21 15:16:23.159823735 +0000 UTC m=+351.755394609" Apr 21 15:16:33.174998 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:33.174961 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-zxbwv" event={"ID":"f6b4f2d5-3a37-40fe-afb2-b5581b4efa6f","Type":"ContainerStarted","Data":"9938338f84ff96cb2f74d9202bb76b4575c4fd04682a1b0d9b189173a1d9b46b"} Apr 21 15:16:33.175495 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:33.175075 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-zxbwv" Apr 21 15:16:33.199593 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:33.199542 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-zxbwv" podStartSLOduration=1.9410601889999999 podStartE2EDuration="16.199526915s" podCreationTimestamp="2026-04-21 15:16:17 +0000 UTC" firstStartedPulling="2026-04-21 15:16:17.987687278 +0000 UTC m=+346.583258146" lastFinishedPulling="2026-04-21 15:16:32.246154017 +0000 UTC m=+360.841724872" observedRunningTime="2026-04-21 15:16:33.197726775 +0000 UTC m=+361.793297650" watchObservedRunningTime="2026-04-21 15:16:33.199526915 +0000 UTC m=+361.795097790" Apr 21 15:16:36.784462 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:36.784423 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-2ww5b"] Apr 21 15:16:36.786835 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:36.786815 2575 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-2ww5b" Apr 21 15:16:36.789345 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:36.789327 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-b2rd8\"" Apr 21 15:16:36.798237 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:36.798215 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-2ww5b"] Apr 21 15:16:36.898866 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:36.898834 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndq42\" (UniqueName: \"kubernetes.io/projected/e7dd8ee8-d7c1-4446-a0ad-b5e85d827e48-kube-api-access-ndq42\") pod \"cert-manager-79c8d999ff-2ww5b\" (UID: \"e7dd8ee8-d7c1-4446-a0ad-b5e85d827e48\") " pod="cert-manager/cert-manager-79c8d999ff-2ww5b" Apr 21 15:16:36.898866 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:36.898869 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e7dd8ee8-d7c1-4446-a0ad-b5e85d827e48-bound-sa-token\") pod \"cert-manager-79c8d999ff-2ww5b\" (UID: \"e7dd8ee8-d7c1-4446-a0ad-b5e85d827e48\") " pod="cert-manager/cert-manager-79c8d999ff-2ww5b" Apr 21 15:16:36.999243 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:36.999192 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ndq42\" (UniqueName: \"kubernetes.io/projected/e7dd8ee8-d7c1-4446-a0ad-b5e85d827e48-kube-api-access-ndq42\") pod \"cert-manager-79c8d999ff-2ww5b\" (UID: \"e7dd8ee8-d7c1-4446-a0ad-b5e85d827e48\") " pod="cert-manager/cert-manager-79c8d999ff-2ww5b" Apr 21 15:16:36.999243 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:36.999247 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/e7dd8ee8-d7c1-4446-a0ad-b5e85d827e48-bound-sa-token\") pod \"cert-manager-79c8d999ff-2ww5b\" (UID: \"e7dd8ee8-d7c1-4446-a0ad-b5e85d827e48\") " pod="cert-manager/cert-manager-79c8d999ff-2ww5b" Apr 21 15:16:37.016482 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:37.016447 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e7dd8ee8-d7c1-4446-a0ad-b5e85d827e48-bound-sa-token\") pod \"cert-manager-79c8d999ff-2ww5b\" (UID: \"e7dd8ee8-d7c1-4446-a0ad-b5e85d827e48\") " pod="cert-manager/cert-manager-79c8d999ff-2ww5b" Apr 21 15:16:37.019969 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:37.019948 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndq42\" (UniqueName: \"kubernetes.io/projected/e7dd8ee8-d7c1-4446-a0ad-b5e85d827e48-kube-api-access-ndq42\") pod \"cert-manager-79c8d999ff-2ww5b\" (UID: \"e7dd8ee8-d7c1-4446-a0ad-b5e85d827e48\") " pod="cert-manager/cert-manager-79c8d999ff-2ww5b" Apr 21 15:16:37.096056 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:37.095986 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-2ww5b" Apr 21 15:16:37.225617 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:37.225589 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-2ww5b"] Apr 21 15:16:37.228156 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:16:37.228126 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7dd8ee8_d7c1_4446_a0ad_b5e85d827e48.slice/crio-7b18c6cbdf0b5d089dc2d6489849677b9bbe01a10e5887bad702400786ca7d43 WatchSource:0}: Error finding container 7b18c6cbdf0b5d089dc2d6489849677b9bbe01a10e5887bad702400786ca7d43: Status 404 returned error can't find the container with id 7b18c6cbdf0b5d089dc2d6489849677b9bbe01a10e5887bad702400786ca7d43 Apr 21 15:16:38.192965 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:38.192926 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-2ww5b" event={"ID":"e7dd8ee8-d7c1-4446-a0ad-b5e85d827e48","Type":"ContainerStarted","Data":"caf2c30a34d48f3120a902649806a4f2b5e2e1688cd1915c66fd44f7e771a47b"} Apr 21 15:16:38.192965 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:38.192969 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-2ww5b" event={"ID":"e7dd8ee8-d7c1-4446-a0ad-b5e85d827e48","Type":"ContainerStarted","Data":"7b18c6cbdf0b5d089dc2d6489849677b9bbe01a10e5887bad702400786ca7d43"} Apr 21 15:16:38.220963 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:38.220918 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-2ww5b" podStartSLOduration=2.220904705 podStartE2EDuration="2.220904705s" podCreationTimestamp="2026-04-21 15:16:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:16:38.219554328 +0000 UTC m=+366.815125206" 
watchObservedRunningTime="2026-04-21 15:16:38.220904705 +0000 UTC m=+366.816475580" Apr 21 15:16:38.746168 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:38.746138 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emx94l"] Apr 21 15:16:38.761995 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:38.761960 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emx94l" Apr 21 15:16:38.762831 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:38.762807 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emx94l"] Apr 21 15:16:38.764918 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:38.764893 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 21 15:16:38.765013 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:38.764925 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 21 15:16:38.765883 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:38.765865 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-tsr8j\"" Apr 21 15:16:38.917511 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:38.917474 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/401714bb-afd7-4177-a4f1-79001eace1f7-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emx94l\" (UID: \"401714bb-afd7-4177-a4f1-79001eace1f7\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emx94l" Apr 21 15:16:38.917686 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:38.917540 
2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/401714bb-afd7-4177-a4f1-79001eace1f7-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emx94l\" (UID: \"401714bb-afd7-4177-a4f1-79001eace1f7\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emx94l" Apr 21 15:16:38.917686 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:38.917595 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvkvn\" (UniqueName: \"kubernetes.io/projected/401714bb-afd7-4177-a4f1-79001eace1f7-kube-api-access-lvkvn\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emx94l\" (UID: \"401714bb-afd7-4177-a4f1-79001eace1f7\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emx94l" Apr 21 15:16:39.019073 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:39.018978 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/401714bb-afd7-4177-a4f1-79001eace1f7-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emx94l\" (UID: \"401714bb-afd7-4177-a4f1-79001eace1f7\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emx94l" Apr 21 15:16:39.019073 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:39.019023 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lvkvn\" (UniqueName: \"kubernetes.io/projected/401714bb-afd7-4177-a4f1-79001eace1f7-kube-api-access-lvkvn\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emx94l\" (UID: \"401714bb-afd7-4177-a4f1-79001eace1f7\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emx94l" Apr 21 15:16:39.019073 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:39.019075 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/401714bb-afd7-4177-a4f1-79001eace1f7-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emx94l\" (UID: \"401714bb-afd7-4177-a4f1-79001eace1f7\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emx94l" Apr 21 15:16:39.019425 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:39.019401 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/401714bb-afd7-4177-a4f1-79001eace1f7-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emx94l\" (UID: \"401714bb-afd7-4177-a4f1-79001eace1f7\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emx94l" Apr 21 15:16:39.019503 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:39.019431 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/401714bb-afd7-4177-a4f1-79001eace1f7-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emx94l\" (UID: \"401714bb-afd7-4177-a4f1-79001eace1f7\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emx94l" Apr 21 15:16:39.032046 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:39.032010 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvkvn\" (UniqueName: \"kubernetes.io/projected/401714bb-afd7-4177-a4f1-79001eace1f7-kube-api-access-lvkvn\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emx94l\" (UID: \"401714bb-afd7-4177-a4f1-79001eace1f7\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emx94l" Apr 21 15:16:39.072724 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:39.072697 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emx94l" Apr 21 15:16:39.181285 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:39.181256 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-zxbwv" Apr 21 15:16:39.204929 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:39.204908 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emx94l"] Apr 21 15:16:39.207225 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:16:39.207198 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod401714bb_afd7_4177_a4f1_79001eace1f7.slice/crio-43644ccad7df4ce09cf15794caf39f641b8624d579fdbce4229f31c386cc1a43 WatchSource:0}: Error finding container 43644ccad7df4ce09cf15794caf39f641b8624d579fdbce4229f31c386cc1a43: Status 404 returned error can't find the container with id 43644ccad7df4ce09cf15794caf39f641b8624d579fdbce4229f31c386cc1a43 Apr 21 15:16:40.200592 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:40.200559 2575 generic.go:358] "Generic (PLEG): container finished" podID="401714bb-afd7-4177-a4f1-79001eace1f7" containerID="b00392b0fb0beb1278dea14e34e9fc4140af6ec20e3797c605778c0189ea360e" exitCode=0 Apr 21 15:16:40.200592 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:40.200598 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emx94l" event={"ID":"401714bb-afd7-4177-a4f1-79001eace1f7","Type":"ContainerDied","Data":"b00392b0fb0beb1278dea14e34e9fc4140af6ec20e3797c605778c0189ea360e"} Apr 21 15:16:40.200800 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:40.200619 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emx94l" 
event={"ID":"401714bb-afd7-4177-a4f1-79001eace1f7","Type":"ContainerStarted","Data":"43644ccad7df4ce09cf15794caf39f641b8624d579fdbce4229f31c386cc1a43"} Apr 21 15:16:43.213076 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:43.213041 2575 generic.go:358] "Generic (PLEG): container finished" podID="401714bb-afd7-4177-a4f1-79001eace1f7" containerID="ed493fb90140d0a08131a9f3832814d2427ea6ddb0dfae1cddbe9a79e4a28037" exitCode=0 Apr 21 15:16:43.213476 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:43.213124 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emx94l" event={"ID":"401714bb-afd7-4177-a4f1-79001eace1f7","Type":"ContainerDied","Data":"ed493fb90140d0a08131a9f3832814d2427ea6ddb0dfae1cddbe9a79e4a28037"} Apr 21 15:16:44.219434 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:44.219400 2575 generic.go:358] "Generic (PLEG): container finished" podID="401714bb-afd7-4177-a4f1-79001eace1f7" containerID="5bc28fdcb1c72de1ad5ad86dc90c446046b5a79eac97074a70021058129df4d0" exitCode=0 Apr 21 15:16:44.219804 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:44.219446 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emx94l" event={"ID":"401714bb-afd7-4177-a4f1-79001eace1f7","Type":"ContainerDied","Data":"5bc28fdcb1c72de1ad5ad86dc90c446046b5a79eac97074a70021058129df4d0"} Apr 21 15:16:45.347063 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:45.347033 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emx94l" Apr 21 15:16:45.375190 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:45.375162 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvkvn\" (UniqueName: \"kubernetes.io/projected/401714bb-afd7-4177-a4f1-79001eace1f7-kube-api-access-lvkvn\") pod \"401714bb-afd7-4177-a4f1-79001eace1f7\" (UID: \"401714bb-afd7-4177-a4f1-79001eace1f7\") " Apr 21 15:16:45.375354 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:45.375214 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/401714bb-afd7-4177-a4f1-79001eace1f7-util\") pod \"401714bb-afd7-4177-a4f1-79001eace1f7\" (UID: \"401714bb-afd7-4177-a4f1-79001eace1f7\") " Apr 21 15:16:45.375354 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:45.375237 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/401714bb-afd7-4177-a4f1-79001eace1f7-bundle\") pod \"401714bb-afd7-4177-a4f1-79001eace1f7\" (UID: \"401714bb-afd7-4177-a4f1-79001eace1f7\") " Apr 21 15:16:45.375722 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:45.375689 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/401714bb-afd7-4177-a4f1-79001eace1f7-bundle" (OuterVolumeSpecName: "bundle") pod "401714bb-afd7-4177-a4f1-79001eace1f7" (UID: "401714bb-afd7-4177-a4f1-79001eace1f7"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:16:45.377883 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:45.377850 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/401714bb-afd7-4177-a4f1-79001eace1f7-kube-api-access-lvkvn" (OuterVolumeSpecName: "kube-api-access-lvkvn") pod "401714bb-afd7-4177-a4f1-79001eace1f7" (UID: "401714bb-afd7-4177-a4f1-79001eace1f7"). InnerVolumeSpecName "kube-api-access-lvkvn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:16:45.380053 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:45.380030 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/401714bb-afd7-4177-a4f1-79001eace1f7-util" (OuterVolumeSpecName: "util") pod "401714bb-afd7-4177-a4f1-79001eace1f7" (UID: "401714bb-afd7-4177-a4f1-79001eace1f7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 15:16:45.475974 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:45.475887 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lvkvn\" (UniqueName: \"kubernetes.io/projected/401714bb-afd7-4177-a4f1-79001eace1f7-kube-api-access-lvkvn\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\"" Apr 21 15:16:45.475974 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:45.475918 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/401714bb-afd7-4177-a4f1-79001eace1f7-util\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\"" Apr 21 15:16:45.475974 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:45.475928 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/401714bb-afd7-4177-a4f1-79001eace1f7-bundle\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\"" Apr 21 15:16:46.228643 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:46.228611 2575 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emx94l" event={"ID":"401714bb-afd7-4177-a4f1-79001eace1f7","Type":"ContainerDied","Data":"43644ccad7df4ce09cf15794caf39f641b8624d579fdbce4229f31c386cc1a43"} Apr 21 15:16:46.228643 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:46.228647 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43644ccad7df4ce09cf15794caf39f641b8624d579fdbce4229f31c386cc1a43" Apr 21 15:16:46.228845 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:16:46.228725 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78emx94l" Apr 21 15:17:01.714212 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:17:01.714177 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-787f44d59f-rbw54"] Apr 21 15:17:01.714676 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:17:01.714542 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="401714bb-afd7-4177-a4f1-79001eace1f7" containerName="extract" Apr 21 15:17:01.714676 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:17:01.714555 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="401714bb-afd7-4177-a4f1-79001eace1f7" containerName="extract" Apr 21 15:17:01.714676 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:17:01.714574 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="401714bb-afd7-4177-a4f1-79001eace1f7" containerName="util" Apr 21 15:17:01.714676 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:17:01.714581 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="401714bb-afd7-4177-a4f1-79001eace1f7" containerName="util" Apr 21 15:17:01.714676 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:17:01.714594 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="401714bb-afd7-4177-a4f1-79001eace1f7" containerName="pull" Apr 21 15:17:01.714676 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:17:01.714599 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="401714bb-afd7-4177-a4f1-79001eace1f7" containerName="pull" Apr 21 15:17:01.714676 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:17:01.714651 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="401714bb-afd7-4177-a4f1-79001eace1f7" containerName="extract" Apr 21 15:17:01.716612 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:17:01.716593 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-787f44d59f-rbw54" Apr 21 15:17:01.720506 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:17:01.720480 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"kube-root-ca.crt\"" Apr 21 15:17:01.720652 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:17:01.720528 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"webhook-server-cert\"" Apr 21 15:17:01.720652 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:17:01.720593 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"metrics-server-cert\"" Apr 21 15:17:01.720652 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:17:01.720618 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"openshift-service-ca.crt\"" Apr 21 15:17:01.720811 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:17:01.720719 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"jobset-manager-config\"" Apr 21 15:17:01.720811 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:17:01.720732 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-jobset-operator\"/\"jobset-controller-manager-dockercfg-5f9rz\"" Apr 21 15:17:01.731875 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:17:01.731850 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-787f44d59f-rbw54"] Apr 21 15:17:01.816800 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:17:01.816766 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z797h\" (UniqueName: \"kubernetes.io/projected/4d7bdab9-7364-4b67-9ac8-2517eaa48855-kube-api-access-z797h\") pod \"jobset-controller-manager-787f44d59f-rbw54\" (UID: \"4d7bdab9-7364-4b67-9ac8-2517eaa48855\") " pod="openshift-jobset-operator/jobset-controller-manager-787f44d59f-rbw54" Apr 21 15:17:01.816965 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:17:01.816819 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4d7bdab9-7364-4b67-9ac8-2517eaa48855-metrics-certs\") pod \"jobset-controller-manager-787f44d59f-rbw54\" (UID: \"4d7bdab9-7364-4b67-9ac8-2517eaa48855\") " pod="openshift-jobset-operator/jobset-controller-manager-787f44d59f-rbw54" Apr 21 15:17:01.816965 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:17:01.816868 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4d7bdab9-7364-4b67-9ac8-2517eaa48855-cert\") pod \"jobset-controller-manager-787f44d59f-rbw54\" (UID: \"4d7bdab9-7364-4b67-9ac8-2517eaa48855\") " pod="openshift-jobset-operator/jobset-controller-manager-787f44d59f-rbw54" Apr 21 15:17:01.816965 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:17:01.816921 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/4d7bdab9-7364-4b67-9ac8-2517eaa48855-manager-config\") 
pod \"jobset-controller-manager-787f44d59f-rbw54\" (UID: \"4d7bdab9-7364-4b67-9ac8-2517eaa48855\") " pod="openshift-jobset-operator/jobset-controller-manager-787f44d59f-rbw54" Apr 21 15:17:01.917802 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:17:01.917772 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4d7bdab9-7364-4b67-9ac8-2517eaa48855-metrics-certs\") pod \"jobset-controller-manager-787f44d59f-rbw54\" (UID: \"4d7bdab9-7364-4b67-9ac8-2517eaa48855\") " pod="openshift-jobset-operator/jobset-controller-manager-787f44d59f-rbw54" Apr 21 15:17:01.917941 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:17:01.917805 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4d7bdab9-7364-4b67-9ac8-2517eaa48855-cert\") pod \"jobset-controller-manager-787f44d59f-rbw54\" (UID: \"4d7bdab9-7364-4b67-9ac8-2517eaa48855\") " pod="openshift-jobset-operator/jobset-controller-manager-787f44d59f-rbw54" Apr 21 15:17:01.917941 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:17:01.917843 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/4d7bdab9-7364-4b67-9ac8-2517eaa48855-manager-config\") pod \"jobset-controller-manager-787f44d59f-rbw54\" (UID: \"4d7bdab9-7364-4b67-9ac8-2517eaa48855\") " pod="openshift-jobset-operator/jobset-controller-manager-787f44d59f-rbw54" Apr 21 15:17:01.917941 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:17:01.917864 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z797h\" (UniqueName: \"kubernetes.io/projected/4d7bdab9-7364-4b67-9ac8-2517eaa48855-kube-api-access-z797h\") pod \"jobset-controller-manager-787f44d59f-rbw54\" (UID: \"4d7bdab9-7364-4b67-9ac8-2517eaa48855\") " pod="openshift-jobset-operator/jobset-controller-manager-787f44d59f-rbw54" Apr 21 15:17:01.918581 
ip-10-0-131-11 kubenswrapper[2575]: I0421 15:17:01.918554 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/4d7bdab9-7364-4b67-9ac8-2517eaa48855-manager-config\") pod \"jobset-controller-manager-787f44d59f-rbw54\" (UID: \"4d7bdab9-7364-4b67-9ac8-2517eaa48855\") " pod="openshift-jobset-operator/jobset-controller-manager-787f44d59f-rbw54" Apr 21 15:17:01.920288 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:17:01.920266 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4d7bdab9-7364-4b67-9ac8-2517eaa48855-cert\") pod \"jobset-controller-manager-787f44d59f-rbw54\" (UID: \"4d7bdab9-7364-4b67-9ac8-2517eaa48855\") " pod="openshift-jobset-operator/jobset-controller-manager-787f44d59f-rbw54" Apr 21 15:17:01.920405 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:17:01.920361 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4d7bdab9-7364-4b67-9ac8-2517eaa48855-metrics-certs\") pod \"jobset-controller-manager-787f44d59f-rbw54\" (UID: \"4d7bdab9-7364-4b67-9ac8-2517eaa48855\") " pod="openshift-jobset-operator/jobset-controller-manager-787f44d59f-rbw54" Apr 21 15:17:01.933790 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:17:01.933767 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z797h\" (UniqueName: \"kubernetes.io/projected/4d7bdab9-7364-4b67-9ac8-2517eaa48855-kube-api-access-z797h\") pod \"jobset-controller-manager-787f44d59f-rbw54\" (UID: \"4d7bdab9-7364-4b67-9ac8-2517eaa48855\") " pod="openshift-jobset-operator/jobset-controller-manager-787f44d59f-rbw54" Apr 21 15:17:02.026557 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:17:02.026479 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-787f44d59f-rbw54" Apr 21 15:17:02.153699 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:17:02.153669 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-787f44d59f-rbw54"] Apr 21 15:17:02.156722 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:17:02.156693 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d7bdab9_7364_4b67_9ac8_2517eaa48855.slice/crio-22cace5949c8666eab7d7ea3cab3a7918a23d4f0cdc1f6e59d1dd1dc940d7299 WatchSource:0}: Error finding container 22cace5949c8666eab7d7ea3cab3a7918a23d4f0cdc1f6e59d1dd1dc940d7299: Status 404 returned error can't find the container with id 22cace5949c8666eab7d7ea3cab3a7918a23d4f0cdc1f6e59d1dd1dc940d7299 Apr 21 15:17:02.286872 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:17:02.286789 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-787f44d59f-rbw54" event={"ID":"4d7bdab9-7364-4b67-9ac8-2517eaa48855","Type":"ContainerStarted","Data":"22cace5949c8666eab7d7ea3cab3a7918a23d4f0cdc1f6e59d1dd1dc940d7299"} Apr 21 15:17:05.303056 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:17:05.303020 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-787f44d59f-rbw54" event={"ID":"4d7bdab9-7364-4b67-9ac8-2517eaa48855","Type":"ContainerStarted","Data":"16e12a994764fceb09dc44ab098be29e975b1ce6e543ac9bc582ae6def79f7bf"} Apr 21 15:17:05.303546 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:17:05.303149 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-jobset-operator/jobset-controller-manager-787f44d59f-rbw54" Apr 21 15:17:05.326523 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:17:05.326477 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-jobset-operator/jobset-controller-manager-787f44d59f-rbw54" podStartSLOduration=1.6001714059999999 podStartE2EDuration="4.326465715s" podCreationTimestamp="2026-04-21 15:17:01 +0000 UTC" firstStartedPulling="2026-04-21 15:17:02.158472302 +0000 UTC m=+390.754043157" lastFinishedPulling="2026-04-21 15:17:04.88476661 +0000 UTC m=+393.480337466" observedRunningTime="2026-04-21 15:17:05.3247614 +0000 UTC m=+393.920332277" watchObservedRunningTime="2026-04-21 15:17:05.326465715 +0000 UTC m=+393.922036591" Apr 21 15:17:16.312135 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:17:16.312105 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-jobset-operator/jobset-controller-manager-787f44d59f-rbw54" Apr 21 15:18:44.495897 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:18:44.495844 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-78d9f59649-nrqqx"] Apr 21 15:18:44.499041 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:18:44.499023 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-78d9f59649-nrqqx" Apr 21 15:18:44.513768 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:18:44.513745 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-78d9f59649-nrqqx"] Apr 21 15:18:44.592110 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:18:44.592074 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqw7l\" (UniqueName: \"kubernetes.io/projected/7fa46380-b342-4e6b-9ed5-fe4c74ebcac4-kube-api-access-vqw7l\") pod \"console-78d9f59649-nrqqx\" (UID: \"7fa46380-b342-4e6b-9ed5-fe4c74ebcac4\") " pod="openshift-console/console-78d9f59649-nrqqx" Apr 21 15:18:44.592110 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:18:44.592113 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7fa46380-b342-4e6b-9ed5-fe4c74ebcac4-oauth-serving-cert\") pod \"console-78d9f59649-nrqqx\" (UID: \"7fa46380-b342-4e6b-9ed5-fe4c74ebcac4\") " pod="openshift-console/console-78d9f59649-nrqqx" Apr 21 15:18:44.592401 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:18:44.592141 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7fa46380-b342-4e6b-9ed5-fe4c74ebcac4-console-serving-cert\") pod \"console-78d9f59649-nrqqx\" (UID: \"7fa46380-b342-4e6b-9ed5-fe4c74ebcac4\") " pod="openshift-console/console-78d9f59649-nrqqx" Apr 21 15:18:44.592401 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:18:44.592160 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7fa46380-b342-4e6b-9ed5-fe4c74ebcac4-service-ca\") pod \"console-78d9f59649-nrqqx\" (UID: \"7fa46380-b342-4e6b-9ed5-fe4c74ebcac4\") " pod="openshift-console/console-78d9f59649-nrqqx" Apr 21 
15:18:44.592401 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:18:44.592221 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7fa46380-b342-4e6b-9ed5-fe4c74ebcac4-console-oauth-config\") pod \"console-78d9f59649-nrqqx\" (UID: \"7fa46380-b342-4e6b-9ed5-fe4c74ebcac4\") " pod="openshift-console/console-78d9f59649-nrqqx" Apr 21 15:18:44.592401 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:18:44.592298 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7fa46380-b342-4e6b-9ed5-fe4c74ebcac4-console-config\") pod \"console-78d9f59649-nrqqx\" (UID: \"7fa46380-b342-4e6b-9ed5-fe4c74ebcac4\") " pod="openshift-console/console-78d9f59649-nrqqx" Apr 21 15:18:44.592401 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:18:44.592360 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fa46380-b342-4e6b-9ed5-fe4c74ebcac4-trusted-ca-bundle\") pod \"console-78d9f59649-nrqqx\" (UID: \"7fa46380-b342-4e6b-9ed5-fe4c74ebcac4\") " pod="openshift-console/console-78d9f59649-nrqqx" Apr 21 15:18:44.692963 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:18:44.692929 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7fa46380-b342-4e6b-9ed5-fe4c74ebcac4-console-serving-cert\") pod \"console-78d9f59649-nrqqx\" (UID: \"7fa46380-b342-4e6b-9ed5-fe4c74ebcac4\") " pod="openshift-console/console-78d9f59649-nrqqx" Apr 21 15:18:44.692963 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:18:44.692962 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7fa46380-b342-4e6b-9ed5-fe4c74ebcac4-service-ca\") pod 
\"console-78d9f59649-nrqqx\" (UID: \"7fa46380-b342-4e6b-9ed5-fe4c74ebcac4\") " pod="openshift-console/console-78d9f59649-nrqqx" Apr 21 15:18:44.693158 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:18:44.692980 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7fa46380-b342-4e6b-9ed5-fe4c74ebcac4-console-oauth-config\") pod \"console-78d9f59649-nrqqx\" (UID: \"7fa46380-b342-4e6b-9ed5-fe4c74ebcac4\") " pod="openshift-console/console-78d9f59649-nrqqx" Apr 21 15:18:44.693158 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:18:44.693038 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7fa46380-b342-4e6b-9ed5-fe4c74ebcac4-console-config\") pod \"console-78d9f59649-nrqqx\" (UID: \"7fa46380-b342-4e6b-9ed5-fe4c74ebcac4\") " pod="openshift-console/console-78d9f59649-nrqqx" Apr 21 15:18:44.693158 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:18:44.693076 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fa46380-b342-4e6b-9ed5-fe4c74ebcac4-trusted-ca-bundle\") pod \"console-78d9f59649-nrqqx\" (UID: \"7fa46380-b342-4e6b-9ed5-fe4c74ebcac4\") " pod="openshift-console/console-78d9f59649-nrqqx" Apr 21 15:18:44.693158 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:18:44.693133 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vqw7l\" (UniqueName: \"kubernetes.io/projected/7fa46380-b342-4e6b-9ed5-fe4c74ebcac4-kube-api-access-vqw7l\") pod \"console-78d9f59649-nrqqx\" (UID: \"7fa46380-b342-4e6b-9ed5-fe4c74ebcac4\") " pod="openshift-console/console-78d9f59649-nrqqx" Apr 21 15:18:44.693331 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:18:44.693169 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/7fa46380-b342-4e6b-9ed5-fe4c74ebcac4-oauth-serving-cert\") pod \"console-78d9f59649-nrqqx\" (UID: \"7fa46380-b342-4e6b-9ed5-fe4c74ebcac4\") " pod="openshift-console/console-78d9f59649-nrqqx" Apr 21 15:18:44.693872 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:18:44.693840 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7fa46380-b342-4e6b-9ed5-fe4c74ebcac4-service-ca\") pod \"console-78d9f59649-nrqqx\" (UID: \"7fa46380-b342-4e6b-9ed5-fe4c74ebcac4\") " pod="openshift-console/console-78d9f59649-nrqqx" Apr 21 15:18:44.693972 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:18:44.693895 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7fa46380-b342-4e6b-9ed5-fe4c74ebcac4-oauth-serving-cert\") pod \"console-78d9f59649-nrqqx\" (UID: \"7fa46380-b342-4e6b-9ed5-fe4c74ebcac4\") " pod="openshift-console/console-78d9f59649-nrqqx" Apr 21 15:18:44.694065 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:18:44.694038 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7fa46380-b342-4e6b-9ed5-fe4c74ebcac4-console-config\") pod \"console-78d9f59649-nrqqx\" (UID: \"7fa46380-b342-4e6b-9ed5-fe4c74ebcac4\") " pod="openshift-console/console-78d9f59649-nrqqx" Apr 21 15:18:44.694165 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:18:44.694140 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7fa46380-b342-4e6b-9ed5-fe4c74ebcac4-trusted-ca-bundle\") pod \"console-78d9f59649-nrqqx\" (UID: \"7fa46380-b342-4e6b-9ed5-fe4c74ebcac4\") " pod="openshift-console/console-78d9f59649-nrqqx" Apr 21 15:18:44.695685 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:18:44.695664 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/7fa46380-b342-4e6b-9ed5-fe4c74ebcac4-console-serving-cert\") pod \"console-78d9f59649-nrqqx\" (UID: \"7fa46380-b342-4e6b-9ed5-fe4c74ebcac4\") " pod="openshift-console/console-78d9f59649-nrqqx" Apr 21 15:18:44.695808 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:18:44.695789 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7fa46380-b342-4e6b-9ed5-fe4c74ebcac4-console-oauth-config\") pod \"console-78d9f59649-nrqqx\" (UID: \"7fa46380-b342-4e6b-9ed5-fe4c74ebcac4\") " pod="openshift-console/console-78d9f59649-nrqqx" Apr 21 15:18:44.701782 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:18:44.701762 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqw7l\" (UniqueName: \"kubernetes.io/projected/7fa46380-b342-4e6b-9ed5-fe4c74ebcac4-kube-api-access-vqw7l\") pod \"console-78d9f59649-nrqqx\" (UID: \"7fa46380-b342-4e6b-9ed5-fe4c74ebcac4\") " pod="openshift-console/console-78d9f59649-nrqqx" Apr 21 15:18:44.808298 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:18:44.808239 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-78d9f59649-nrqqx" Apr 21 15:18:44.934432 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:18:44.934353 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-78d9f59649-nrqqx"] Apr 21 15:18:44.936781 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:18:44.936752 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fa46380_b342_4e6b_9ed5_fe4c74ebcac4.slice/crio-f2af507074337d3df306f7ff07acf295ff69d83be88f91f84b9bbf6d1fd24713 WatchSource:0}: Error finding container f2af507074337d3df306f7ff07acf295ff69d83be88f91f84b9bbf6d1fd24713: Status 404 returned error can't find the container with id f2af507074337d3df306f7ff07acf295ff69d83be88f91f84b9bbf6d1fd24713 Apr 21 15:18:45.642536 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:18:45.642497 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78d9f59649-nrqqx" event={"ID":"7fa46380-b342-4e6b-9ed5-fe4c74ebcac4","Type":"ContainerStarted","Data":"77c8f83e5920d6094306c2a42d5aa68658e7c0ade1bdcc7b8af82994d2e95305"} Apr 21 15:18:45.642536 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:18:45.642539 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78d9f59649-nrqqx" event={"ID":"7fa46380-b342-4e6b-9ed5-fe4c74ebcac4","Type":"ContainerStarted","Data":"f2af507074337d3df306f7ff07acf295ff69d83be88f91f84b9bbf6d1fd24713"} Apr 21 15:18:45.664703 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:18:45.664656 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-78d9f59649-nrqqx" podStartSLOduration=1.664642046 podStartE2EDuration="1.664642046s" podCreationTimestamp="2026-04-21 15:18:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:18:45.663098193 +0000 UTC m=+494.258669069" 
watchObservedRunningTime="2026-04-21 15:18:45.664642046 +0000 UTC m=+494.260212921" Apr 21 15:18:54.808513 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:18:54.808459 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-78d9f59649-nrqqx" Apr 21 15:18:54.808973 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:18:54.808529 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-78d9f59649-nrqqx" Apr 21 15:18:54.813650 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:18:54.813626 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-78d9f59649-nrqqx" Apr 21 15:18:55.682649 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:18:55.682618 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-78d9f59649-nrqqx" Apr 21 15:18:55.749651 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:18:55.749615 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b74f6989f-n7465"] Apr 21 15:19:20.775864 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:19:20.775753 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6b74f6989f-n7465" podUID="19952a24-739d-4153-b21b-c2bb018ef93b" containerName="console" containerID="cri-o://1ca337ed36d9c996bbb6c51749b16980c49ba138ee8ab4f5500203fcc79c442f" gracePeriod=15 Apr 21 15:19:20.871700 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:19:20.871661 2575 patch_prober.go:28] interesting pod/console-6b74f6989f-n7465 container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.132.0.26:8443/health\": dial tcp 10.132.0.26:8443: connect: connection refused" start-of-body= Apr 21 15:19:20.871846 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:19:20.871720 2575 prober.go:120] "Probe failed" probeType="Readiness" 
pod="openshift-console/console-6b74f6989f-n7465" podUID="19952a24-739d-4153-b21b-c2bb018ef93b" containerName="console" probeResult="failure" output="Get \"https://10.132.0.26:8443/health\": dial tcp 10.132.0.26:8443: connect: connection refused" Apr 21 15:19:21.008496 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:19:21.008473 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b74f6989f-n7465_19952a24-739d-4153-b21b-c2bb018ef93b/console/0.log" Apr 21 15:19:21.008618 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:19:21.008532 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b74f6989f-n7465" Apr 21 15:19:21.116115 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:19:21.116034 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19952a24-739d-4153-b21b-c2bb018ef93b-trusted-ca-bundle\") pod \"19952a24-739d-4153-b21b-c2bb018ef93b\" (UID: \"19952a24-739d-4153-b21b-c2bb018ef93b\") " Apr 21 15:19:21.116115 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:19:21.116088 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/19952a24-739d-4153-b21b-c2bb018ef93b-console-config\") pod \"19952a24-739d-4153-b21b-c2bb018ef93b\" (UID: \"19952a24-739d-4153-b21b-c2bb018ef93b\") " Apr 21 15:19:21.116321 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:19:21.116127 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/19952a24-739d-4153-b21b-c2bb018ef93b-oauth-serving-cert\") pod \"19952a24-739d-4153-b21b-c2bb018ef93b\" (UID: \"19952a24-739d-4153-b21b-c2bb018ef93b\") " Apr 21 15:19:21.116321 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:19:21.116153 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-hjlv2\" (UniqueName: \"kubernetes.io/projected/19952a24-739d-4153-b21b-c2bb018ef93b-kube-api-access-hjlv2\") pod \"19952a24-739d-4153-b21b-c2bb018ef93b\" (UID: \"19952a24-739d-4153-b21b-c2bb018ef93b\") " Apr 21 15:19:21.116321 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:19:21.116184 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/19952a24-739d-4153-b21b-c2bb018ef93b-console-serving-cert\") pod \"19952a24-739d-4153-b21b-c2bb018ef93b\" (UID: \"19952a24-739d-4153-b21b-c2bb018ef93b\") " Apr 21 15:19:21.116321 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:19:21.116218 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/19952a24-739d-4153-b21b-c2bb018ef93b-service-ca\") pod \"19952a24-739d-4153-b21b-c2bb018ef93b\" (UID: \"19952a24-739d-4153-b21b-c2bb018ef93b\") " Apr 21 15:19:21.116321 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:19:21.116242 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/19952a24-739d-4153-b21b-c2bb018ef93b-console-oauth-config\") pod \"19952a24-739d-4153-b21b-c2bb018ef93b\" (UID: \"19952a24-739d-4153-b21b-c2bb018ef93b\") " Apr 21 15:19:21.116597 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:19:21.116533 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19952a24-739d-4153-b21b-c2bb018ef93b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "19952a24-739d-4153-b21b-c2bb018ef93b" (UID: "19952a24-739d-4153-b21b-c2bb018ef93b"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:19:21.116683 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:19:21.116655 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19952a24-739d-4153-b21b-c2bb018ef93b-service-ca" (OuterVolumeSpecName: "service-ca") pod "19952a24-739d-4153-b21b-c2bb018ef93b" (UID: "19952a24-739d-4153-b21b-c2bb018ef93b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:19:21.116737 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:19:21.116664 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19952a24-739d-4153-b21b-c2bb018ef93b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "19952a24-739d-4153-b21b-c2bb018ef93b" (UID: "19952a24-739d-4153-b21b-c2bb018ef93b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:19:21.116809 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:19:21.116767 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19952a24-739d-4153-b21b-c2bb018ef93b-console-config" (OuterVolumeSpecName: "console-config") pod "19952a24-739d-4153-b21b-c2bb018ef93b" (UID: "19952a24-739d-4153-b21b-c2bb018ef93b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 15:19:21.118600 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:19:21.118572 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19952a24-739d-4153-b21b-c2bb018ef93b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "19952a24-739d-4153-b21b-c2bb018ef93b" (UID: "19952a24-739d-4153-b21b-c2bb018ef93b"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:19:21.119046 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:19:21.119009 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19952a24-739d-4153-b21b-c2bb018ef93b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "19952a24-739d-4153-b21b-c2bb018ef93b" (UID: "19952a24-739d-4153-b21b-c2bb018ef93b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 15:19:21.119046 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:19:21.119023 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19952a24-739d-4153-b21b-c2bb018ef93b-kube-api-access-hjlv2" (OuterVolumeSpecName: "kube-api-access-hjlv2") pod "19952a24-739d-4153-b21b-c2bb018ef93b" (UID: "19952a24-739d-4153-b21b-c2bb018ef93b"). InnerVolumeSpecName "kube-api-access-hjlv2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:19:21.217645 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:19:21.217595 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/19952a24-739d-4153-b21b-c2bb018ef93b-console-config\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\"" Apr 21 15:19:21.217645 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:19:21.217641 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/19952a24-739d-4153-b21b-c2bb018ef93b-oauth-serving-cert\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\"" Apr 21 15:19:21.217645 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:19:21.217651 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hjlv2\" (UniqueName: \"kubernetes.io/projected/19952a24-739d-4153-b21b-c2bb018ef93b-kube-api-access-hjlv2\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\"" Apr 21 15:19:21.217645 ip-10-0-131-11 
kubenswrapper[2575]: I0421 15:19:21.217665 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/19952a24-739d-4153-b21b-c2bb018ef93b-console-serving-cert\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\"" Apr 21 15:19:21.217920 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:19:21.217674 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/19952a24-739d-4153-b21b-c2bb018ef93b-service-ca\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\"" Apr 21 15:19:21.217920 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:19:21.217682 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/19952a24-739d-4153-b21b-c2bb018ef93b-console-oauth-config\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\"" Apr 21 15:19:21.217920 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:19:21.217691 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19952a24-739d-4153-b21b-c2bb018ef93b-trusted-ca-bundle\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\"" Apr 21 15:19:21.767857 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:19:21.767831 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b74f6989f-n7465_19952a24-739d-4153-b21b-c2bb018ef93b/console/0.log" Apr 21 15:19:21.768055 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:19:21.767873 2575 generic.go:358] "Generic (PLEG): container finished" podID="19952a24-739d-4153-b21b-c2bb018ef93b" containerID="1ca337ed36d9c996bbb6c51749b16980c49ba138ee8ab4f5500203fcc79c442f" exitCode=2 Apr 21 15:19:21.768055 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:19:21.767913 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b74f6989f-n7465" 
event={"ID":"19952a24-739d-4153-b21b-c2bb018ef93b","Type":"ContainerDied","Data":"1ca337ed36d9c996bbb6c51749b16980c49ba138ee8ab4f5500203fcc79c442f"} Apr 21 15:19:21.768055 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:19:21.767954 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b74f6989f-n7465" Apr 21 15:19:21.768055 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:19:21.767961 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b74f6989f-n7465" event={"ID":"19952a24-739d-4153-b21b-c2bb018ef93b","Type":"ContainerDied","Data":"5ee18d61bd757c338e8c5c6aea299454c30974b413b986b380433632969d407b"} Apr 21 15:19:21.768055 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:19:21.767982 2575 scope.go:117] "RemoveContainer" containerID="1ca337ed36d9c996bbb6c51749b16980c49ba138ee8ab4f5500203fcc79c442f" Apr 21 15:19:21.776592 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:19:21.776413 2575 scope.go:117] "RemoveContainer" containerID="1ca337ed36d9c996bbb6c51749b16980c49ba138ee8ab4f5500203fcc79c442f" Apr 21 15:19:21.776796 ip-10-0-131-11 kubenswrapper[2575]: E0421 15:19:21.776669 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ca337ed36d9c996bbb6c51749b16980c49ba138ee8ab4f5500203fcc79c442f\": container with ID starting with 1ca337ed36d9c996bbb6c51749b16980c49ba138ee8ab4f5500203fcc79c442f not found: ID does not exist" containerID="1ca337ed36d9c996bbb6c51749b16980c49ba138ee8ab4f5500203fcc79c442f" Apr 21 15:19:21.776796 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:19:21.776697 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ca337ed36d9c996bbb6c51749b16980c49ba138ee8ab4f5500203fcc79c442f"} err="failed to get container status \"1ca337ed36d9c996bbb6c51749b16980c49ba138ee8ab4f5500203fcc79c442f\": rpc error: code = NotFound desc = could not find container 
\"1ca337ed36d9c996bbb6c51749b16980c49ba138ee8ab4f5500203fcc79c442f\": container with ID starting with 1ca337ed36d9c996bbb6c51749b16980c49ba138ee8ab4f5500203fcc79c442f not found: ID does not exist" Apr 21 15:19:21.789025 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:19:21.789000 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b74f6989f-n7465"] Apr 21 15:19:21.793907 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:19:21.793878 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6b74f6989f-n7465"] Apr 21 15:19:21.980133 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:19:21.980103 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19952a24-739d-4153-b21b-c2bb018ef93b" path="/var/lib/kubelet/pods/19952a24-739d-4153-b21b-c2bb018ef93b/volumes" Apr 21 15:20:31.887918 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:20:31.887890 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bqs6l_2e6ce7e1-9053-481a-924e-dcb6e2859d45/console-operator/1.log" Apr 21 15:20:31.889405 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:20:31.889360 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bqs6l_2e6ce7e1-9053-481a-924e-dcb6e2859d45/console-operator/1.log" Apr 21 15:20:31.892109 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:20:31.892091 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbnjg_0bdcacfa-6992-4542-8cbe-df76abfeb25b/ovn-acl-logging/0.log" Apr 21 15:20:31.893471 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:20:31.893452 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbnjg_0bdcacfa-6992-4542-8cbe-df76abfeb25b/ovn-acl-logging/0.log" Apr 21 15:22:28.587391 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:22:28.587290 2575 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["rhai-e2e-progression-92tmr/progression-job-failure-node-0-0-krzls"] Apr 21 15:22:28.587782 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:22:28.587649 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="19952a24-739d-4153-b21b-c2bb018ef93b" containerName="console" Apr 21 15:22:28.587782 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:22:28.587663 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="19952a24-739d-4153-b21b-c2bb018ef93b" containerName="console" Apr 21 15:22:28.587782 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:22:28.587714 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="19952a24-739d-4153-b21b-c2bb018ef93b" containerName="console" Apr 21 15:22:28.590995 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:22:28.590973 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-92tmr/progression-job-failure-node-0-0-krzls" Apr 21 15:22:28.593720 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:22:28.593700 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-92tmr\"/\"default-dockercfg-hkrtq\"" Apr 21 15:22:28.593844 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:22:28.593701 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-92tmr\"/\"kube-root-ca.crt\"" Apr 21 15:22:28.593844 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:22:28.593744 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-92tmr\"/\"openshift-service-ca.crt\"" Apr 21 15:22:28.599656 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:22:28.599633 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-92tmr/progression-job-failure-node-0-0-krzls"] Apr 21 15:22:28.707536 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:22:28.707503 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-5tf8l\" (UniqueName: \"kubernetes.io/projected/25673be4-7840-48d4-93cd-0b3add3a4539-kube-api-access-5tf8l\") pod \"progression-job-failure-node-0-0-krzls\" (UID: \"25673be4-7840-48d4-93cd-0b3add3a4539\") " pod="rhai-e2e-progression-92tmr/progression-job-failure-node-0-0-krzls" Apr 21 15:22:28.808919 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:22:28.808872 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5tf8l\" (UniqueName: \"kubernetes.io/projected/25673be4-7840-48d4-93cd-0b3add3a4539-kube-api-access-5tf8l\") pod \"progression-job-failure-node-0-0-krzls\" (UID: \"25673be4-7840-48d4-93cd-0b3add3a4539\") " pod="rhai-e2e-progression-92tmr/progression-job-failure-node-0-0-krzls" Apr 21 15:22:28.817334 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:22:28.817307 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tf8l\" (UniqueName: \"kubernetes.io/projected/25673be4-7840-48d4-93cd-0b3add3a4539-kube-api-access-5tf8l\") pod \"progression-job-failure-node-0-0-krzls\" (UID: \"25673be4-7840-48d4-93cd-0b3add3a4539\") " pod="rhai-e2e-progression-92tmr/progression-job-failure-node-0-0-krzls" Apr 21 15:22:28.901102 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:22:28.901029 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="rhai-e2e-progression-92tmr/progression-job-failure-node-0-0-krzls" Apr 21 15:22:29.024610 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:22:29.024584 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-92tmr/progression-job-failure-node-0-0-krzls"] Apr 21 15:22:29.027136 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:22:29.027097 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25673be4_7840_48d4_93cd_0b3add3a4539.slice/crio-9458ac401e264de7d6a3f0da357aa20afb4abf5487ed06631d838de1dd26403e WatchSource:0}: Error finding container 9458ac401e264de7d6a3f0da357aa20afb4abf5487ed06631d838de1dd26403e: Status 404 returned error can't find the container with id 9458ac401e264de7d6a3f0da357aa20afb4abf5487ed06631d838de1dd26403e Apr 21 15:22:29.029454 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:22:29.029438 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 15:22:29.401647 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:22:29.401607 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-92tmr/progression-job-failure-node-0-0-krzls" event={"ID":"25673be4-7840-48d4-93cd-0b3add3a4539","Type":"ContainerStarted","Data":"9458ac401e264de7d6a3f0da357aa20afb4abf5487ed06631d838de1dd26403e"} Apr 21 15:24:22.837220 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:24:22.837181 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-92tmr/progression-job-failure-node-0-0-krzls" event={"ID":"25673be4-7840-48d4-93cd-0b3add3a4539","Type":"ContainerStarted","Data":"18d737c54fb3baca00bb38db7f3aa9e6aed85203fe5f1e9b6bb2ea3217080f86"} Apr 21 15:24:22.837656 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:24:22.837345 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-92tmr/progression-job-failure-node-0-0-krzls" Apr 
21 15:24:22.856444 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:24:22.856391 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-92tmr/progression-job-failure-node-0-0-krzls" podStartSLOduration=1.600382689 podStartE2EDuration="1m54.856359409s" podCreationTimestamp="2026-04-21 15:22:28 +0000 UTC" firstStartedPulling="2026-04-21 15:22:29.029583339 +0000 UTC m=+717.625154193" lastFinishedPulling="2026-04-21 15:24:22.285560047 +0000 UTC m=+830.881130913" observedRunningTime="2026-04-21 15:24:22.853958643 +0000 UTC m=+831.449529589" watchObservedRunningTime="2026-04-21 15:24:22.856359409 +0000 UTC m=+831.451930283" Apr 21 15:24:24.844107 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:24:24.844076 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="rhai-e2e-progression-92tmr/progression-job-failure-node-0-0-krzls" Apr 21 15:24:31.841711 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:24:31.841668 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-92tmr/progression-job-failure-node-0-0-krzls" podUID="25673be4-7840-48d4-93cd-0b3add3a4539" containerName="node" probeResult="failure" output="Get \"http://10.132.0.36:28080/metrics\": dial tcp 10.132.0.36:28080: connect: connection refused" Apr 21 15:24:31.875280 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:24:31.875249 2575 generic.go:358] "Generic (PLEG): container finished" podID="25673be4-7840-48d4-93cd-0b3add3a4539" containerID="18d737c54fb3baca00bb38db7f3aa9e6aed85203fe5f1e9b6bb2ea3217080f86" exitCode=1 Apr 21 15:24:31.875428 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:24:31.875299 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-92tmr/progression-job-failure-node-0-0-krzls" event={"ID":"25673be4-7840-48d4-93cd-0b3add3a4539","Type":"ContainerDied","Data":"18d737c54fb3baca00bb38db7f3aa9e6aed85203fe5f1e9b6bb2ea3217080f86"} Apr 21 15:24:32.999835 ip-10-0-131-11 kubenswrapper[2575]: I0421 
15:24:32.999812 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-92tmr/progression-job-failure-node-0-0-krzls" Apr 21 15:24:33.029071 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:24:33.029040 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tf8l\" (UniqueName: \"kubernetes.io/projected/25673be4-7840-48d4-93cd-0b3add3a4539-kube-api-access-5tf8l\") pod \"25673be4-7840-48d4-93cd-0b3add3a4539\" (UID: \"25673be4-7840-48d4-93cd-0b3add3a4539\") " Apr 21 15:24:33.031468 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:24:33.031438 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25673be4-7840-48d4-93cd-0b3add3a4539-kube-api-access-5tf8l" (OuterVolumeSpecName: "kube-api-access-5tf8l") pod "25673be4-7840-48d4-93cd-0b3add3a4539" (UID: "25673be4-7840-48d4-93cd-0b3add3a4539"). InnerVolumeSpecName "kube-api-access-5tf8l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 15:24:33.129716 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:24:33.129641 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5tf8l\" (UniqueName: \"kubernetes.io/projected/25673be4-7840-48d4-93cd-0b3add3a4539-kube-api-access-5tf8l\") on node \"ip-10-0-131-11.ec2.internal\" DevicePath \"\"" Apr 21 15:24:33.883002 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:24:33.882972 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="rhai-e2e-progression-92tmr/progression-job-failure-node-0-0-krzls" Apr 21 15:24:33.883002 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:24:33.882982 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-92tmr/progression-job-failure-node-0-0-krzls" event={"ID":"25673be4-7840-48d4-93cd-0b3add3a4539","Type":"ContainerDied","Data":"9458ac401e264de7d6a3f0da357aa20afb4abf5487ed06631d838de1dd26403e"} Apr 21 15:24:33.883002 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:24:33.883010 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9458ac401e264de7d6a3f0da357aa20afb4abf5487ed06631d838de1dd26403e" Apr 21 15:24:51.637004 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:24:51.636963 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-92tmr/progression-job-failure-node-0-0-krzls"] Apr 21 15:24:51.640412 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:24:51.640383 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-92tmr/progression-job-failure-node-0-0-krzls"] Apr 21 15:24:51.979813 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:24:51.979778 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25673be4-7840-48d4-93cd-0b3add3a4539" path="/var/lib/kubelet/pods/25673be4-7840-48d4-93cd-0b3add3a4539/volumes" Apr 21 15:25:31.915408 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:31.915315 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bqs6l_2e6ce7e1-9053-481a-924e-dcb6e2859d45/console-operator/1.log" Apr 21 15:25:31.918118 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:31.918095 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bqs6l_2e6ce7e1-9053-481a-924e-dcb6e2859d45/console-operator/1.log" Apr 21 15:25:31.919719 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:31.919701 2575 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbnjg_0bdcacfa-6992-4542-8cbe-df76abfeb25b/ovn-acl-logging/0.log" Apr 21 15:25:31.922149 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:31.922128 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbnjg_0bdcacfa-6992-4542-8cbe-df76abfeb25b/ovn-acl-logging/0.log" Apr 21 15:25:43.751328 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:43.751293 2575 ???:1] "http: TLS handshake error from 10.0.137.168:57436: EOF" Apr 21 15:25:43.754140 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:43.754120 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-p62vr_01bee1dc-5579-465f-b2a5-53d7e0d89cae/global-pull-secret-syncer/0.log" Apr 21 15:25:43.895298 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:43.895268 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-rw7pj_3b4c8c1d-205b-43ec-9a00-3a7d4989ef9e/konnectivity-agent/0.log" Apr 21 15:25:43.914906 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:43.914877 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-11.ec2.internal_f339e642e0590241739bde2677ce9dbf/haproxy/0.log" Apr 21 15:25:46.892729 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:46.892702 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_9ebd8426-ec64-4b31-85fe-7c033ba6ff5c/alertmanager/0.log" Apr 21 15:25:46.923270 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:46.923242 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_9ebd8426-ec64-4b31-85fe-7c033ba6ff5c/config-reloader/0.log" Apr 21 15:25:46.962857 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:46.962822 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_9ebd8426-ec64-4b31-85fe-7c033ba6ff5c/kube-rbac-proxy-web/0.log"
Apr 21 15:25:46.997773 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:46.997751 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_9ebd8426-ec64-4b31-85fe-7c033ba6ff5c/kube-rbac-proxy/0.log"
Apr 21 15:25:47.034521 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:47.034499 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_9ebd8426-ec64-4b31-85fe-7c033ba6ff5c/kube-rbac-proxy-metric/0.log"
Apr 21 15:25:47.067623 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:47.067603 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_9ebd8426-ec64-4b31-85fe-7c033ba6ff5c/prom-label-proxy/0.log"
Apr 21 15:25:47.093860 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:47.093835 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_9ebd8426-ec64-4b31-85fe-7c033ba6ff5c/init-config-reloader/0.log"
Apr 21 15:25:47.133705 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:47.133674 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-7q6q5_7af4b8c2-dc3e-4dbc-8156-428c4db62671/cluster-monitoring-operator/0.log"
Apr 21 15:25:47.167484 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:47.167402 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-5rlzv_f72e9f53-15f9-4ca0-9463-60b025086a02/kube-state-metrics/0.log"
Apr 21 15:25:47.199144 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:47.199119 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-5rlzv_f72e9f53-15f9-4ca0-9463-60b025086a02/kube-rbac-proxy-main/0.log"
Apr 21 15:25:47.254118 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:47.254089 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-5rlzv_f72e9f53-15f9-4ca0-9463-60b025086a02/kube-rbac-proxy-self/0.log"
Apr 21 15:25:47.299657 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:47.299627 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-7fd774b49d-59zfj_c9f97ac5-48ff-4c64-a04a-cb1d469f81ed/metrics-server/0.log"
Apr 21 15:25:47.379452 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:47.379428 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-h5wdv_387913de-81bd-4750-b6c9-7e10d0d68401/node-exporter/0.log"
Apr 21 15:25:47.414487 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:47.414456 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-h5wdv_387913de-81bd-4750-b6c9-7e10d0d68401/kube-rbac-proxy/0.log"
Apr 21 15:25:47.440856 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:47.440802 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-h5wdv_387913de-81bd-4750-b6c9-7e10d0d68401/init-textfile/0.log"
Apr 21 15:25:47.940198 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:47.940167 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-6m8bq_064e9703-0c22-4ecf-8b43-2473e8986b8b/prometheus-operator/0.log"
Apr 21 15:25:47.982227 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:47.982203 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-6m8bq_064e9703-0c22-4ecf-8b43-2473e8986b8b/kube-rbac-proxy/0.log"
Apr 21 15:25:48.027586 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:48.027562 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-cgs42_9de978e2-783c-4dfd-ac14-798e4da5e14a/prometheus-operator-admission-webhook/0.log"
Apr 21 15:25:49.441665 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:49.441630 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-gdmsq_b8e66f46-305b-420c-9537-07986d9fd92a/networking-console-plugin/0.log"
Apr 21 15:25:49.808626 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:49.808591 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bqs6l_2e6ce7e1-9053-481a-924e-dcb6e2859d45/console-operator/1.log"
Apr 21 15:25:49.813467 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:49.813445 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-bqs6l_2e6ce7e1-9053-481a-924e-dcb6e2859d45/console-operator/2.log"
Apr 21 15:25:50.128146 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:50.128077 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-p2xz6/perf-node-gather-daemonset-4r772"]
Apr 21 15:25:50.128436 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:50.128422 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25673be4-7840-48d4-93cd-0b3add3a4539" containerName="node"
Apr 21 15:25:50.128485 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:50.128437 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="25673be4-7840-48d4-93cd-0b3add3a4539" containerName="node"
Apr 21 15:25:50.128527 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:50.128518 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="25673be4-7840-48d4-93cd-0b3add3a4539" containerName="node"
Apr 21 15:25:50.130246 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:50.130231 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-4r772"
Apr 21 15:25:50.134246 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:50.134222 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-p2xz6\"/\"openshift-service-ca.crt\""
Apr 21 15:25:50.135322 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:50.135304 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-p2xz6\"/\"default-dockercfg-flhhr\""
Apr 21 15:25:50.135491 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:50.135344 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-p2xz6\"/\"kube-root-ca.crt\""
Apr 21 15:25:50.144038 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:50.144017 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p2xz6/perf-node-gather-daemonset-4r772"]
Apr 21 15:25:50.223340 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:50.223309 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-78d9f59649-nrqqx_7fa46380-b342-4e6b-9ed5-fe4c74ebcac4/console/0.log"
Apr 21 15:25:50.252577 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:50.252544 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-t25vf_abead490-d43d-4f40-bf13-41e9c8573f7f/download-server/0.log"
Apr 21 15:25:50.255733 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:50.255713 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6b7b5938-78c5-4615-8c06-da0f25426011-sys\") pod \"perf-node-gather-daemonset-4r772\" (UID: \"6b7b5938-78c5-4615-8c06-da0f25426011\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-4r772"
Apr 21 15:25:50.255816 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:50.255749 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7wvd\" (UniqueName: \"kubernetes.io/projected/6b7b5938-78c5-4615-8c06-da0f25426011-kube-api-access-r7wvd\") pod \"perf-node-gather-daemonset-4r772\" (UID: \"6b7b5938-78c5-4615-8c06-da0f25426011\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-4r772"
Apr 21 15:25:50.255816 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:50.255768 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6b7b5938-78c5-4615-8c06-da0f25426011-podres\") pod \"perf-node-gather-daemonset-4r772\" (UID: \"6b7b5938-78c5-4615-8c06-da0f25426011\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-4r772"
Apr 21 15:25:50.255891 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:50.255820 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6b7b5938-78c5-4615-8c06-da0f25426011-lib-modules\") pod \"perf-node-gather-daemonset-4r772\" (UID: \"6b7b5938-78c5-4615-8c06-da0f25426011\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-4r772"
Apr 21 15:25:50.255891 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:50.255855 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6b7b5938-78c5-4615-8c06-da0f25426011-proc\") pod \"perf-node-gather-daemonset-4r772\" (UID: \"6b7b5938-78c5-4615-8c06-da0f25426011\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-4r772"
Apr 21 15:25:50.356667 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:50.356623 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r7wvd\" (UniqueName: \"kubernetes.io/projected/6b7b5938-78c5-4615-8c06-da0f25426011-kube-api-access-r7wvd\") pod \"perf-node-gather-daemonset-4r772\" (UID: \"6b7b5938-78c5-4615-8c06-da0f25426011\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-4r772"
Apr 21 15:25:50.356667 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:50.356666 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6b7b5938-78c5-4615-8c06-da0f25426011-podres\") pod \"perf-node-gather-daemonset-4r772\" (UID: \"6b7b5938-78c5-4615-8c06-da0f25426011\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-4r772"
Apr 21 15:25:50.356908 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:50.356686 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6b7b5938-78c5-4615-8c06-da0f25426011-lib-modules\") pod \"perf-node-gather-daemonset-4r772\" (UID: \"6b7b5938-78c5-4615-8c06-da0f25426011\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-4r772"
Apr 21 15:25:50.356908 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:50.356711 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6b7b5938-78c5-4615-8c06-da0f25426011-proc\") pod \"perf-node-gather-daemonset-4r772\" (UID: \"6b7b5938-78c5-4615-8c06-da0f25426011\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-4r772"
Apr 21 15:25:50.356908 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:50.356796 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6b7b5938-78c5-4615-8c06-da0f25426011-sys\") pod \"perf-node-gather-daemonset-4r772\" (UID: \"6b7b5938-78c5-4615-8c06-da0f25426011\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-4r772"
Apr 21 15:25:50.356908 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:50.356826 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6b7b5938-78c5-4615-8c06-da0f25426011-lib-modules\") pod \"perf-node-gather-daemonset-4r772\" (UID: \"6b7b5938-78c5-4615-8c06-da0f25426011\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-4r772"
Apr 21 15:25:50.356908 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:50.356828 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6b7b5938-78c5-4615-8c06-da0f25426011-podres\") pod \"perf-node-gather-daemonset-4r772\" (UID: \"6b7b5938-78c5-4615-8c06-da0f25426011\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-4r772"
Apr 21 15:25:50.356908 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:50.356864 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6b7b5938-78c5-4615-8c06-da0f25426011-proc\") pod \"perf-node-gather-daemonset-4r772\" (UID: \"6b7b5938-78c5-4615-8c06-da0f25426011\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-4r772"
Apr 21 15:25:50.356908 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:50.356864 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6b7b5938-78c5-4615-8c06-da0f25426011-sys\") pod \"perf-node-gather-daemonset-4r772\" (UID: \"6b7b5938-78c5-4615-8c06-da0f25426011\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-4r772"
Apr 21 15:25:50.367593 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:50.367566 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7wvd\" (UniqueName: \"kubernetes.io/projected/6b7b5938-78c5-4615-8c06-da0f25426011-kube-api-access-r7wvd\") pod \"perf-node-gather-daemonset-4r772\" (UID: \"6b7b5938-78c5-4615-8c06-da0f25426011\") " pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-4r772"
Apr 21 15:25:50.439917 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:50.439826 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-4r772"
Apr 21 15:25:50.614441 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:50.614402 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-p2xz6/perf-node-gather-daemonset-4r772"]
Apr 21 15:25:50.617745 ip-10-0-131-11 kubenswrapper[2575]: W0421 15:25:50.617712 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6b7b5938_78c5_4615_8c06_da0f25426011.slice/crio-b6c3fd79a0a72dc136a0fc79249d237a67cdcf49161f0b92d6eaf025f17d751f WatchSource:0}: Error finding container b6c3fd79a0a72dc136a0fc79249d237a67cdcf49161f0b92d6eaf025f17d751f: Status 404 returned error can't find the container with id b6c3fd79a0a72dc136a0fc79249d237a67cdcf49161f0b92d6eaf025f17d751f
Apr 21 15:25:50.707415 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:50.707393 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-k5mdv_00b7c8c2-18f1-4ec5-a6f8-97f9dd462a77/volume-data-source-validator/0.log"
Apr 21 15:25:51.154050 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:51.153954 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-4r772" event={"ID":"6b7b5938-78c5-4615-8c06-da0f25426011","Type":"ContainerStarted","Data":"35304f3b981c6d7a494e04811a7d57763fbb5a6fcd1f64f7f799e8c904862928"}
Apr 21 15:25:51.154050 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:51.154000 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-4r772" event={"ID":"6b7b5938-78c5-4615-8c06-da0f25426011","Type":"ContainerStarted","Data":"b6c3fd79a0a72dc136a0fc79249d237a67cdcf49161f0b92d6eaf025f17d751f"}
Apr 21 15:25:51.154050 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:51.154043 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-4r772"
Apr 21 15:25:51.172396 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:51.172332 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-4r772" podStartSLOduration=1.172316699 podStartE2EDuration="1.172316699s" podCreationTimestamp="2026-04-21 15:25:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 15:25:51.171223074 +0000 UTC m=+919.766793965" watchObservedRunningTime="2026-04-21 15:25:51.172316699 +0000 UTC m=+919.767887577"
Apr 21 15:25:51.379987 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:51.379961 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fr77k_a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be/dns/0.log"
Apr 21 15:25:51.402426 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:51.402362 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fr77k_a8a95ba4-2cdf-4ab6-8c6f-5269bfdca8be/kube-rbac-proxy/0.log"
Apr 21 15:25:51.497184 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:51.497157 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hdbc9_71299419-e249-4660-891c-24ba490f5c36/dns-node-resolver/0.log"
Apr 21 15:25:51.911932 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:51.911852 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4hwvs_cfb26da3-8175-4742-a038-7b5d5d082af2/node-ca/0.log"
Apr 21 15:25:52.951951 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:52.951921 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-dmzt8_578cec2c-16fd-469e-931a-b7cf421795a1/serve-healthcheck-canary/0.log"
Apr 21 15:25:53.395307 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:53.395220 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-46kkk_a8c54d77-8e37-4e0f-8831-80a62651bcaa/kube-rbac-proxy/0.log"
Apr 21 15:25:53.421432 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:53.421405 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-46kkk_a8c54d77-8e37-4e0f-8831-80a62651bcaa/exporter/0.log"
Apr 21 15:25:53.450279 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:53.450253 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-46kkk_a8c54d77-8e37-4e0f-8831-80a62651bcaa/extractor/0.log"
Apr 21 15:25:55.256978 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:55.256951 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-jobset-operator_jobset-controller-manager-787f44d59f-rbw54_4d7bdab9-7364-4b67-9ac8-2517eaa48855/manager/0.log"
Apr 21 15:25:57.167442 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:57.167413 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-p2xz6/perf-node-gather-daemonset-4r772"
Apr 21 15:25:58.908149 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:58.908116 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-wcz2b_b4f29a84-c932-46e5-8d58-0e2fb5ab05f8/kube-storage-version-migrator-operator/1.log"
Apr 21 15:25:58.909137 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:25:58.909120 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-wcz2b_b4f29a84-c932-46e5-8d58-0e2fb5ab05f8/kube-storage-version-migrator-operator/0.log"
Apr 21 15:26:00.145849 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:26:00.145779 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m4ccx_0a58254f-d46b-4b42-b89e-5f65cdf19d34/kube-multus-additional-cni-plugins/0.log"
Apr 21 15:26:00.179936 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:26:00.179905 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m4ccx_0a58254f-d46b-4b42-b89e-5f65cdf19d34/egress-router-binary-copy/0.log"
Apr 21 15:26:00.209665 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:26:00.209639 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m4ccx_0a58254f-d46b-4b42-b89e-5f65cdf19d34/cni-plugins/0.log"
Apr 21 15:26:00.244431 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:26:00.244402 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m4ccx_0a58254f-d46b-4b42-b89e-5f65cdf19d34/bond-cni-plugin/0.log"
Apr 21 15:26:00.269975 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:26:00.269955 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m4ccx_0a58254f-d46b-4b42-b89e-5f65cdf19d34/routeoverride-cni/0.log"
Apr 21 15:26:00.293572 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:26:00.293550 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m4ccx_0a58254f-d46b-4b42-b89e-5f65cdf19d34/whereabouts-cni-bincopy/0.log"
Apr 21 15:26:00.315748 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:26:00.315721 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-m4ccx_0a58254f-d46b-4b42-b89e-5f65cdf19d34/whereabouts-cni/0.log"
Apr 21 15:26:00.577262 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:26:00.577230 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jkh68_c1596851-f71d-43c8-b7f4-f92f0a29bb06/kube-multus/0.log"
Apr 21 15:26:00.706410 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:26:00.706347 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-96snf_a17c6c3f-25ab-4414-92a4-946230c882ea/network-metrics-daemon/0.log"
Apr 21 15:26:00.728254 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:26:00.728225 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-96snf_a17c6c3f-25ab-4414-92a4-946230c882ea/kube-rbac-proxy/0.log"
Apr 21 15:26:02.274161 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:26:02.274133 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbnjg_0bdcacfa-6992-4542-8cbe-df76abfeb25b/ovn-controller/0.log"
Apr 21 15:26:02.315667 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:26:02.315636 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbnjg_0bdcacfa-6992-4542-8cbe-df76abfeb25b/ovn-acl-logging/0.log"
Apr 21 15:26:02.318797 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:26:02.318776 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbnjg_0bdcacfa-6992-4542-8cbe-df76abfeb25b/ovn-acl-logging/1.log"
Apr 21 15:26:02.343686 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:26:02.343666 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbnjg_0bdcacfa-6992-4542-8cbe-df76abfeb25b/kube-rbac-proxy-node/0.log"
Apr 21 15:26:02.373955 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:26:02.373932 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbnjg_0bdcacfa-6992-4542-8cbe-df76abfeb25b/kube-rbac-proxy-ovn-metrics/0.log"
Apr 21 15:26:02.404415 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:26:02.404364 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbnjg_0bdcacfa-6992-4542-8cbe-df76abfeb25b/northd/0.log"
Apr 21 15:26:02.431922 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:26:02.431892 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbnjg_0bdcacfa-6992-4542-8cbe-df76abfeb25b/nbdb/0.log"
Apr 21 15:26:02.460088 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:26:02.460064 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbnjg_0bdcacfa-6992-4542-8cbe-df76abfeb25b/sbdb/0.log"
Apr 21 15:26:02.567804 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:26:02.567730 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mbnjg_0bdcacfa-6992-4542-8cbe-df76abfeb25b/ovnkube-controller/0.log"
Apr 21 15:26:03.643429 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:26:03.643398 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-hfcw6_9562db30-abde-4e93-96e5-77429f548f83/network-check-target-container/0.log"
Apr 21 15:26:04.597916 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:26:04.597886 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-rjhlf_981aa28d-2a57-4f14-8411-4d80c9ed2911/iptables-alerter/0.log"
Apr 21 15:26:05.402676 ip-10-0-131-11 kubenswrapper[2575]: I0421 15:26:05.402648 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-h47wt_85bafbc7-2166-40e1-825d-81c20339ab1e/tuned/0.log"