Apr 17 20:48:26.653280 ip-10-0-132-12 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 20:48:26.653289 ip-10-0-132-12 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 20:48:26.653296 ip-10-0-132-12 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 20:48:26.653514 ip-10-0-132-12 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 20:48:36.821751 ip-10-0-132-12 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 20:48:36.821768 ip-10-0-132-12 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot e1c45205a3e2462ea40169cc6a00f3f2 --
Apr 17 20:50:55.416505 ip-10-0-132-12 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 20:50:55.850403 ip-10-0-132-12 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 20:50:55.850403 ip-10-0-132-12 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 20:50:55.850403 ip-10-0-132-12 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 20:50:55.850403 ip-10-0-132-12 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 20:50:55.850403 ip-10-0-132-12 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 20:50:55.851091 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.850473 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 20:50:55.856896 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.856875 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 20:50:55.856896 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.856892 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 20:50:55.856896 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.856897 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 20:50:55.856995 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.856900 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 20:50:55.856995 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.856903 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 20:50:55.856995 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.856906 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 20:50:55.856995 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.856909 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 20:50:55.856995 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.856912 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 20:50:55.856995 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.856914 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 20:50:55.856995 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.856917 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 20:50:55.856995 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.856920 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 20:50:55.856995 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.856923 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 20:50:55.856995 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.856926 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 20:50:55.856995 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.856928 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 20:50:55.856995 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.856931 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 20:50:55.856995 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.856933 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 20:50:55.856995 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.856936 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 20:50:55.856995 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.856939 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 20:50:55.856995 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.856942 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 20:50:55.856995 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.856945 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 20:50:55.856995 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.856957 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 20:50:55.856995 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.856969 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 20:50:55.857453 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.856972 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 20:50:55.857453 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.856975 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 20:50:55.857453 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.856977 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 20:50:55.857453 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.856980 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 20:50:55.857453 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.856982 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 20:50:55.857453 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.856985 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 20:50:55.857453 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.856987 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 20:50:55.857453 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.856990 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 20:50:55.857453 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.856993 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 20:50:55.857453 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.856995 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 20:50:55.857453 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.856998 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 20:50:55.857453 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857001 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 20:50:55.857453 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857004 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 20:50:55.857453 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857007 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 20:50:55.857453 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857010 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 20:50:55.857453 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857012 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 20:50:55.857453 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857015 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 20:50:55.857453 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857018 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 20:50:55.857453 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857023 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 20:50:55.857913 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857027 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 20:50:55.857913 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857030 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 20:50:55.857913 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857033 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 20:50:55.857913 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857036 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 20:50:55.857913 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857038 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 20:50:55.857913 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857041 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 20:50:55.857913 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857044 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 20:50:55.857913 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857047 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 20:50:55.857913 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857049 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 20:50:55.857913 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857052 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 20:50:55.857913 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857054 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 20:50:55.857913 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857057 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 20:50:55.857913 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857059 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 20:50:55.857913 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857062 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 20:50:55.857913 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857066 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 20:50:55.857913 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857068 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 20:50:55.857913 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857071 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 20:50:55.857913 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857073 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 20:50:55.857913 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857076 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 20:50:55.857913 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857079 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 20:50:55.858406 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857081 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 20:50:55.858406 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857084 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 20:50:55.858406 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857086 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 20:50:55.858406 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857089 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 20:50:55.858406 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857091 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 20:50:55.858406 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857094 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 20:50:55.858406 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857096 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 20:50:55.858406 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857099 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 20:50:55.858406 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857101 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 20:50:55.858406 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857104 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 20:50:55.858406 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857108 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 20:50:55.858406 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857112 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 20:50:55.858406 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857115 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 20:50:55.858406 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857118 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 20:50:55.858406 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857120 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 20:50:55.858406 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857123 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 20:50:55.858406 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857125 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 20:50:55.858406 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857128 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 20:50:55.858406 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857131 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 20:50:55.858936 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857133 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 20:50:55.858936 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857136 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 20:50:55.858936 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857138 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 20:50:55.858936 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857141 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 20:50:55.858936 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857143 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 20:50:55.858936 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857146 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 20:50:55.858936 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857561 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 20:50:55.858936 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857569 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 20:50:55.858936 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857573 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 20:50:55.858936 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857577 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 20:50:55.858936 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857581 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 20:50:55.858936 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857584 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 20:50:55.858936 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857587 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 20:50:55.858936 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857590 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 20:50:55.858936 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857593 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 20:50:55.858936 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857596 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 20:50:55.858936 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857599 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 20:50:55.858936 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857601 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 20:50:55.858936 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857604 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 20:50:55.859429 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857607 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 20:50:55.859429 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857609 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 20:50:55.859429 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857611 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 20:50:55.859429 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857614 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 20:50:55.859429 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857616 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 20:50:55.859429 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857619 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 20:50:55.859429 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857622 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 20:50:55.859429 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857624 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 20:50:55.859429 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857627 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 20:50:55.859429 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857629 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 20:50:55.859429 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857631 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 20:50:55.859429 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857634 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 20:50:55.859429 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857636 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 20:50:55.859429 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857639 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 20:50:55.859429 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857641 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 20:50:55.859429 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857644 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 20:50:55.859429 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857646 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 20:50:55.859429 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857649 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 20:50:55.859429 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857652 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 20:50:55.859429 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857654 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 20:50:55.859949 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857658 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 20:50:55.859949 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857661 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 20:50:55.859949 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857663 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 20:50:55.859949 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857666 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 20:50:55.859949 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857670 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 20:50:55.859949 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857672 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 20:50:55.859949 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857675 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 20:50:55.859949 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857678 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 20:50:55.859949 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857680 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 20:50:55.859949 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857683 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 20:50:55.859949 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857685 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 20:50:55.859949 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857687 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 20:50:55.859949 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857690 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 20:50:55.859949 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857692 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 20:50:55.859949 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857695 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 20:50:55.859949 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857697 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 20:50:55.859949 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857700 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 20:50:55.859949 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857703 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 20:50:55.859949 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857706 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 20:50:55.860428 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857708 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 20:50:55.860428 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857711 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 20:50:55.860428 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857713 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 20:50:55.860428 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857716 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 20:50:55.860428 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857718 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 20:50:55.860428 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857721 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 20:50:55.860428 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857724 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 20:50:55.860428 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857726 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 20:50:55.860428 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857728 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 20:50:55.860428 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857731 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 20:50:55.860428 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857733 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 20:50:55.860428 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857736 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 20:50:55.860428 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857738 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 20:50:55.860428 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857741 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 20:50:55.860428 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857744 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 20:50:55.860428 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857747 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 20:50:55.860428 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857749 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 20:50:55.860428 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857752 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 20:50:55.860428 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857754 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 20:50:55.860428 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857757 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 20:50:55.860910 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857759 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 20:50:55.860910 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857762 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 20:50:55.860910 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857765 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 20:50:55.860910 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857767 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 20:50:55.860910 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857769 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 20:50:55.860910 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857772 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 20:50:55.860910 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857775 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 20:50:55.860910 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857777 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 20:50:55.860910 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857780 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 20:50:55.860910 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857783 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 20:50:55.860910 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857785 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 20:50:55.860910 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857788 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 20:50:55.860910 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857791 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 20:50:55.860910 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.857793 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 20:50:55.860910 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.857860 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 20:50:55.860910 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.857867 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 20:50:55.860910 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.857883 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 20:50:55.860910 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.857888 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 20:50:55.860910 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.857893 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 20:50:55.860910 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.857896 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 20:50:55.860910 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.857900 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 20:50:55.861440 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.857905 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 20:50:55.861440 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.857908 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 20:50:55.861440 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.857911 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 20:50:55.861440 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.857915 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 20:50:55.861440 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.857919 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 20:50:55.861440 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.857922 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 20:50:55.861440 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.857926 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 17 20:50:55.861440 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.857929 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 20:50:55.861440 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.857933 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 17 20:50:55.861440 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.857936 2575 flags.go:64] FLAG: --cloud-config=""
Apr 17 20:50:55.861440 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.857939 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 20:50:55.861440 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.857942 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 20:50:55.861440 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.857946 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 17 20:50:55.861440 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.857949 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 20:50:55.861440 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.857952 2575 flags.go:64] FLAG: --config-dir=""
Apr 17 20:50:55.861440 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.857955 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 20:50:55.861440 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.857959 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 20:50:55.861440 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.857962 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 20:50:55.861440 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.857965 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 20:50:55.861440 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.857968 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 20:50:55.861440 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.857972 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 20:50:55.861440 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.857975 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 20:50:55.861440 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.857978 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 20:50:55.861440 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.857981 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 20:50:55.862019 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.857984 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 20:50:55.862019 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.857987 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 20:50:55.862019 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.857991 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 20:50:55.862019 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.857995 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 20:50:55.862019 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.857998 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 20:50:55.862019 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858001 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 20:50:55.862019 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858004 2575 flags.go:64] FLAG: --enable-server="true"
Apr 17 20:50:55.862019 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858007 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 20:50:55.862019 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858012 2575 flags.go:64] FLAG: --event-burst="100"
Apr 17 20:50:55.862019 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858015 2575 flags.go:64] FLAG: --event-qps="50"
Apr 17 20:50:55.862019 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858018 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 20:50:55.862019 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858021 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 20:50:55.862019 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858024 2575 flags.go:64] FLAG: --eviction-hard=""
Apr 17 20:50:55.862019 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858028 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 20:50:55.862019 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858031 2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 20:50:55.862019 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858034 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 20:50:55.862019 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858037 2575 flags.go:64] FLAG: --eviction-soft=""
Apr 17 20:50:55.862019 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858040 2575 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 20:50:55.862019 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858043 2575 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 20:50:55.862019 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858046 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 20:50:55.862019 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858049 2575 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 20:50:55.862019 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858052 2575 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 20:50:55.862019 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858055 2575 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 20:50:55.862019 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858058 2575 flags.go:64] FLAG: --feature-gates=""
Apr 17 20:50:55.862019 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858061 2575 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 20:50:55.862627 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858064 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 20:50:55.862627 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858070 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 17 20:50:55.862627 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858073 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 17 20:50:55.862627 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858076 2575 flags.go:64] FLAG:
--healthz-port="10248" Apr 17 20:50:55.862627 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858079 2575 flags.go:64] FLAG: --help="false" Apr 17 20:50:55.862627 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858082 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-132-12.ec2.internal" Apr 17 20:50:55.862627 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858085 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 20:50:55.862627 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858088 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 20:50:55.862627 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858091 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 20:50:55.862627 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858094 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 20:50:55.862627 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858098 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 20:50:55.862627 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858100 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 20:50:55.862627 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858103 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 20:50:55.862627 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858106 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 20:50:55.862627 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858112 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 20:50:55.862627 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858115 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 20:50:55.862627 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858118 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 20:50:55.862627 ip-10-0-132-12 kubenswrapper[2575]: 
I0417 20:50:55.858121 2575 flags.go:64] FLAG: --kube-reserved="" Apr 17 20:50:55.862627 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858124 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 20:50:55.862627 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858127 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 20:50:55.862627 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858130 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 20:50:55.862627 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858133 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 20:50:55.862627 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858136 2575 flags.go:64] FLAG: --lock-file="" Apr 17 20:50:55.862627 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858148 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 20:50:55.863205 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858151 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 20:50:55.863205 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858154 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 20:50:55.863205 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858164 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 20:50:55.863205 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858166 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 20:50:55.863205 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858169 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 20:50:55.863205 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858172 2575 flags.go:64] FLAG: --logging-format="text" Apr 17 20:50:55.863205 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858175 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 20:50:55.863205 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858179 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 
20:50:55.863205 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858182 2575 flags.go:64] FLAG: --manifest-url="" Apr 17 20:50:55.863205 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858186 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 17 20:50:55.863205 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858190 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 20:50:55.863205 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858193 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 20:50:55.863205 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858197 2575 flags.go:64] FLAG: --max-pods="110" Apr 17 20:50:55.863205 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858200 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 20:50:55.863205 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858203 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 20:50:55.863205 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858206 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 20:50:55.863205 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858209 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 20:50:55.863205 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858212 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 20:50:55.863205 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858215 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 20:50:55.863205 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858218 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 20:50:55.863205 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858239 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 20:50:55.863205 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858242 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 20:50:55.863205 ip-10-0-132-12 kubenswrapper[2575]: I0417 
20:50:55.858246 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 20:50:55.863205 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858250 2575 flags.go:64] FLAG: --pod-cidr="" Apr 17 20:50:55.863807 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858253 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 20:50:55.863807 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858258 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 20:50:55.863807 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858261 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 20:50:55.863807 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858264 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 17 20:50:55.863807 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858267 2575 flags.go:64] FLAG: --port="10250" Apr 17 20:50:55.863807 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858270 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 20:50:55.863807 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858273 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-00aed9f95fffb690e" Apr 17 20:50:55.863807 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858277 2575 flags.go:64] FLAG: --qos-reserved="" Apr 17 20:50:55.863807 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858282 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 17 20:50:55.863807 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858285 2575 flags.go:64] FLAG: --register-node="true" Apr 17 20:50:55.863807 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858288 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 17 20:50:55.863807 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858291 2575 flags.go:64] FLAG: --register-with-taints="" Apr 17 20:50:55.863807 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858295 2575 flags.go:64] FLAG: 
--registry-burst="10" Apr 17 20:50:55.863807 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858298 2575 flags.go:64] FLAG: --registry-qps="5" Apr 17 20:50:55.863807 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858300 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 17 20:50:55.863807 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858303 2575 flags.go:64] FLAG: --reserved-memory="" Apr 17 20:50:55.863807 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858307 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 20:50:55.863807 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858311 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 20:50:55.863807 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858314 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 20:50:55.863807 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858317 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 20:50:55.863807 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858320 2575 flags.go:64] FLAG: --runonce="false" Apr 17 20:50:55.863807 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858323 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 20:50:55.863807 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858326 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 20:50:55.863807 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858329 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 17 20:50:55.863807 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858332 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 20:50:55.864424 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858335 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 20:50:55.864424 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858338 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 20:50:55.864424 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858341 2575 
flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 20:50:55.864424 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858344 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 20:50:55.864424 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858347 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 20:50:55.864424 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858350 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 20:50:55.864424 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858354 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 20:50:55.864424 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858357 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 20:50:55.864424 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858360 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 20:50:55.864424 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858363 2575 flags.go:64] FLAG: --system-cgroups="" Apr 17 20:50:55.864424 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858366 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 20:50:55.864424 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858372 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 20:50:55.864424 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858375 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 17 20:50:55.864424 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858378 2575 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 20:50:55.864424 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858382 2575 flags.go:64] FLAG: --tls-min-version="" Apr 17 20:50:55.864424 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858385 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 20:50:55.864424 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858388 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 20:50:55.864424 ip-10-0-132-12 
kubenswrapper[2575]: I0417 20:50:55.858391 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 20:50:55.864424 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858394 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 20:50:55.864424 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858397 2575 flags.go:64] FLAG: --v="2" Apr 17 20:50:55.864424 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858401 2575 flags.go:64] FLAG: --version="false" Apr 17 20:50:55.864424 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858405 2575 flags.go:64] FLAG: --vmodule="" Apr 17 20:50:55.864424 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858409 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 20:50:55.864424 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.858412 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 20:50:55.864424 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858502 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 20:50:55.865464 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858506 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 20:50:55.865464 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858509 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 20:50:55.865464 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858512 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 20:50:55.865464 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858515 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 20:50:55.865464 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858517 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 20:50:55.865464 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858520 2575 feature_gate.go:328] 
unrecognized feature gate: MachineConfigNodes Apr 17 20:50:55.865464 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858522 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 20:50:55.865464 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858525 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 20:50:55.865464 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858528 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 20:50:55.865464 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858531 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 20:50:55.865464 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858534 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 20:50:55.865464 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858536 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 20:50:55.865464 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858540 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 20:50:55.865464 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858545 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 20:50:55.865464 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858549 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 20:50:55.865464 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858552 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 20:50:55.865464 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858554 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 20:50:55.865464 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858557 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 20:50:55.865464 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858560 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 20:50:55.866068 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858562 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 20:50:55.866068 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858565 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 20:50:55.866068 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858568 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 20:50:55.866068 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858570 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 20:50:55.866068 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858573 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 20:50:55.866068 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858576 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 20:50:55.866068 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858578 2575 feature_gate.go:328] unrecognized feature 
gate: MachineAPIMigration Apr 17 20:50:55.866068 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858581 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 20:50:55.866068 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858584 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 20:50:55.866068 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858586 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 20:50:55.866068 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858589 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 20:50:55.866068 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858592 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 20:50:55.866068 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858595 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 20:50:55.866068 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858598 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 20:50:55.866068 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858600 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 20:50:55.866068 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858603 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 20:50:55.866068 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858605 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 20:50:55.866068 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858608 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 20:50:55.866068 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858610 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 20:50:55.866068 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858613 2575 feature_gate.go:328] 
unrecognized feature gate: ManagedBootImagesAWS Apr 17 20:50:55.866682 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858615 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 20:50:55.866682 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858618 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 20:50:55.866682 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858621 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 20:50:55.866682 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858623 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 20:50:55.866682 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858626 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 20:50:55.866682 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858628 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 17 20:50:55.866682 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858632 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 20:50:55.866682 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858635 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 20:50:55.866682 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858637 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 20:50:55.866682 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858640 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 20:50:55.866682 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858642 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 20:50:55.866682 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858645 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 20:50:55.866682 ip-10-0-132-12 
kubenswrapper[2575]: W0417 20:50:55.858648 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 20:50:55.866682 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858650 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 20:50:55.866682 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858653 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 20:50:55.866682 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858655 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 20:50:55.866682 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858658 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 20:50:55.866682 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858661 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 20:50:55.866682 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858663 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 20:50:55.867150 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858666 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 20:50:55.867150 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858669 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 20:50:55.867150 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858671 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 20:50:55.867150 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858674 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 20:50:55.867150 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858678 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 20:50:55.867150 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858683 2575 feature_gate.go:349] 
Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 20:50:55.867150 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858687 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 20:50:55.867150 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858690 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 20:50:55.867150 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858692 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 20:50:55.867150 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858695 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 20:50:55.867150 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858697 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 20:50:55.867150 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858700 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 20:50:55.867150 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858702 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 20:50:55.867150 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858705 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 20:50:55.867150 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858707 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 20:50:55.867150 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858710 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 20:50:55.867150 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858713 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 20:50:55.867150 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858715 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 20:50:55.867150 ip-10-0-132-12 kubenswrapper[2575]: W0417 
20:50:55.858718 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 20:50:55.867150 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858722 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 20:50:55.867663 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858724 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 20:50:55.867663 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858727 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 20:50:55.867663 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858729 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 20:50:55.867663 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858732 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 20:50:55.867663 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858735 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 20:50:55.867663 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858737 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 20:50:55.867663 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.858740 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 20:50:55.867663 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.859402 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 20:50:55.867663 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.867408 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 20:50:55.867663 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.867526 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 20:50:55.867663 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867577 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 20:50:55.867663 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867582 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 20:50:55.867663 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867586 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 20:50:55.867663 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867589 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 20:50:55.867663 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867593 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 20:50:55.867663 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867596 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 20:50:55.868074 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867598 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 20:50:55.868074 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867601 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 20:50:55.868074 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867604 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 20:50:55.868074 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867607 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 20:50:55.868074 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867609 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 20:50:55.868074 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867613 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 20:50:55.868074 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867615 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 20:50:55.868074 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867618 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 20:50:55.868074 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867621 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 20:50:55.868074 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867623 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 20:50:55.868074 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867626 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 20:50:55.868074 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867629 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 20:50:55.868074 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867631 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 20:50:55.868074 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867634 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 20:50:55.868074 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867636 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 20:50:55.868074 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867639 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 20:50:55.868074 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867642 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 20:50:55.868074 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867644 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 20:50:55.868074 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867646 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 20:50:55.868074 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867654 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 20:50:55.868676 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867657 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 20:50:55.868676 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867660 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 20:50:55.868676 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867663 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 20:50:55.868676 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867666 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 20:50:55.868676 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867668 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 20:50:55.868676 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867672 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 20:50:55.868676 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867675 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 20:50:55.868676 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867678 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 20:50:55.868676 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867680 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 20:50:55.868676 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867683 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 20:50:55.868676 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867687 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 20:50:55.868676 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867691 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 20:50:55.868676 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867695 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 20:50:55.868676 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867697 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 20:50:55.868676 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867700 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 20:50:55.868676 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867703 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 20:50:55.868676 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867706 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 20:50:55.868676 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867708 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 20:50:55.868676 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867711 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 20:50:55.869180 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867714 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 20:50:55.869180 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867717 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 20:50:55.869180 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867719 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 20:50:55.869180 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867722 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 20:50:55.869180 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867725 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 20:50:55.869180 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867727 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 20:50:55.869180 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867730 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 20:50:55.869180 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867733 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 20:50:55.869180 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867735 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 20:50:55.869180 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867737 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 20:50:55.869180 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867740 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 20:50:55.869180 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867743 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 20:50:55.869180 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867746 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 20:50:55.869180 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867749 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 20:50:55.869180 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867752 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 20:50:55.869180 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867754 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 20:50:55.869180 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867757 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 20:50:55.869180 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867759 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 20:50:55.869180 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867762 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 20:50:55.869180 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867765 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 20:50:55.869680 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867768 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 20:50:55.869680 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867770 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 20:50:55.869680 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867772 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 20:50:55.869680 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867775 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 20:50:55.869680 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867777 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 20:50:55.869680 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867779 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 20:50:55.869680 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867782 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 20:50:55.869680 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867784 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 20:50:55.869680 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867787 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 20:50:55.869680 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867789 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 20:50:55.869680 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867792 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 20:50:55.869680 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867795 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 20:50:55.869680 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867800 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 20:50:55.869680 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867804 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 20:50:55.869680 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867807 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 20:50:55.869680 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867810 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 20:50:55.869680 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867813 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 20:50:55.869680 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867815 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 20:50:55.869680 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867818 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 20:50:55.869680 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867820 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 20:50:55.870156 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867823 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 20:50:55.870156 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.867828 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 20:50:55.870156 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867931 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 20:50:55.870156 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867936 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 20:50:55.870156 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867939 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 20:50:55.870156 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867942 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 20:50:55.870156 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867945 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 20:50:55.870156 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867948 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 20:50:55.870156 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867951 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 20:50:55.870156 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867954 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 20:50:55.870156 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867957 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 20:50:55.870156 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867959 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 20:50:55.870156 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867962 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 20:50:55.870156 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867964 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 20:50:55.870156 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867967 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 20:50:55.870553 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867970 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 20:50:55.870553 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867972 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 20:50:55.870553 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867975 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 20:50:55.870553 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867977 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 20:50:55.870553 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867980 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 20:50:55.870553 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867982 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 20:50:55.870553 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867985 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 20:50:55.870553 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867988 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 20:50:55.870553 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867991 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 20:50:55.870553 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867996 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 20:50:55.870553 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.867999 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 20:50:55.870553 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868001 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 20:50:55.870553 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868004 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 20:50:55.870553 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868007 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 20:50:55.870553 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868009 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 20:50:55.870553 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868012 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 20:50:55.870553 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868014 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 20:50:55.870553 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868017 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 20:50:55.870553 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868019 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 20:50:55.871042 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868022 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 20:50:55.871042 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868024 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 20:50:55.871042 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868027 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 20:50:55.871042 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868029 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 20:50:55.871042 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868032 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 20:50:55.871042 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868035 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 20:50:55.871042 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868037 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 20:50:55.871042 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868040 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 20:50:55.871042 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868042 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 20:50:55.871042 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868045 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 20:50:55.871042 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868047 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 20:50:55.871042 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868050 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 20:50:55.871042 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868053 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 20:50:55.871042 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868056 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 20:50:55.871042 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868060 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 20:50:55.871042 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868063 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 20:50:55.871042 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868065 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 20:50:55.871042 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868068 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 20:50:55.871042 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868070 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 20:50:55.871559 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868073 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 20:50:55.871559 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868075 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 20:50:55.871559 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868078 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 20:50:55.871559 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868081 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 20:50:55.871559 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868084 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 20:50:55.871559 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868087 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 20:50:55.871559 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868089 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 20:50:55.871559 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868092 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 20:50:55.871559 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868095 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 20:50:55.871559 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868097 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 20:50:55.871559 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868100 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 20:50:55.871559 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868102 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 20:50:55.871559 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868104 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 20:50:55.871559 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868107 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 20:50:55.871559 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868110 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 20:50:55.871559 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868112 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 20:50:55.871559 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868114 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 20:50:55.871559 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868117 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 20:50:55.871559 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868119 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 20:50:55.871559 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868122 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 20:50:55.872041 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868125 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 20:50:55.872041 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868128 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 20:50:55.872041 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868130 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 20:50:55.872041 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868133 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 20:50:55.872041 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868135 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 20:50:55.872041 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868138 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 20:50:55.872041 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868140 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 20:50:55.872041 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868142 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 20:50:55.872041 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868145 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 20:50:55.872041 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868147 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 20:50:55.872041 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868150 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 20:50:55.872041 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868152 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 20:50:55.872041 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868154 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 20:50:55.872041 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868157 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 20:50:55.872041 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.868159 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 20:50:55.872041 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.868164 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 20:50:55.872537 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.868292 2575 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 20:50:55.872537 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.870388 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 20:50:55.872537 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.871337 2575 server.go:1019] "Starting client certificate rotation"
Apr 17 20:50:55.872537 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.871428 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 20:50:55.872537 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.871462 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 20:50:55.894634 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.894611 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 20:50:55.897495 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.897473 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 20:50:55.910330 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.910309 2575 log.go:25] "Validated CRI v1 runtime API"
Apr 17 20:50:55.916698 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.916675 2575 log.go:25] "Validated CRI v1 image API"
Apr 17 20:50:55.918026 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.918011 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 20:50:55.920445 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.920427 2575 fs.go:135] Filesystem UUIDs: map[3be89f19-8f35-4606-9b16-bfc82fac9907:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 eda96f84-5ac2-4a95-b775-f2f6b94f3191:/dev/nvme0n1p4]
Apr 17 20:50:55.920494 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.920446 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 20:50:55.926046 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.925935 2575 manager.go:217] Machine: {Timestamp:2026-04-17 20:50:55.924133997 +0000 UTC m=+0.391574922 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3092494 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2365ea54cc74b4c56c7555e1025239 SystemUUID:ec2365ea-54cc-74b4-c56c-7555e1025239 BootID:e1c45205-a3e2-462e-a401-69cc6a00f3f2 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:1a:82:23:bd:09 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:1a:82:23:bd:09 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:6e:0a:6c:98:80:87 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 20:50:55.926234 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.926207 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 20:50:55.926678 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.926667 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 20:50:55.926825 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.926806 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 20:50:55.929088 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.929062 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 20:50:55.929254 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.929092 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-132-12.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 20:50:55.929304 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.929264 2575 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 20:50:55.929304 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.929272 2575 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 20:50:55.929304 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.929294 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 20:50:55.930251 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.930239 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 20:50:55.931614 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.931604 2575 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 20:50:55.931720 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.931711 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 20:50:55.934132 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.934122 2575 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 20:50:55.934170 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.934136 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 20:50:55.934170 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.934147 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 20:50:55.934170 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.934157 2575 kubelet.go:397] "Adding apiserver pod source"
Apr 17 20:50:55.934170 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.934165 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 20:50:55.935450 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.935439 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 20:50:55.935496 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.935457 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 20:50:55.938805 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.938783 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 17 20:50:55.941120 ip-10-0-132-12
kubenswrapper[2575]: I0417 20:50:55.941103 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 20:50:55.942413 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.942398 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 20:50:55.942498 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.942418 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 20:50:55.942498 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.942427 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 20:50:55.942498 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.942434 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 20:50:55.942498 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.942443 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 20:50:55.942498 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.942452 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 20:50:55.942498 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.942460 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 20:50:55.942498 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.942468 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 20:50:55.942498 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.942479 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 20:50:55.942498 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.942488 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 20:50:55.942498 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.942501 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 20:50:55.942778 
ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.942515 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 20:50:55.943401 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.943390 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 20:50:55.943450 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.943403 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 20:50:55.945330 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:55.945294 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-132-12.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 20:50:55.945450 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:55.945363 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 20:50:55.946629 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.946609 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-x2kg2" Apr 17 20:50:55.947717 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.947701 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-132-12.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 20:50:55.947811 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.947798 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 20:50:55.947870 ip-10-0-132-12 
kubenswrapper[2575]: I0417 20:50:55.947845 2575 server.go:1295] "Started kubelet" Apr 17 20:50:55.947935 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.947908 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 20:50:55.948018 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.947976 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 20:50:55.948113 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.948044 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 20:50:55.948734 ip-10-0-132-12 systemd[1]: Started Kubernetes Kubelet. Apr 17 20:50:55.949950 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.949930 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 20:50:55.950666 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.950653 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 17 20:50:55.955414 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.955288 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-x2kg2" Apr 17 20:50:55.957196 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.957181 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 20:50:55.957196 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.957191 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 20:50:55.958113 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.958094 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 20:50:55.958113 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:55.955968 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{ip-10-0-132-12.ec2.internal.18a740196bcb8e61 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-12.ec2.internal,UID:ip-10-0-132-12.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-132-12.ec2.internal,},FirstTimestamp:2026-04-17 20:50:55.947812449 +0000 UTC m=+0.415253383,LastTimestamp:2026-04-17 20:50:55.947812449 +0000 UTC m=+0.415253383,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-12.ec2.internal,}" Apr 17 20:50:55.958289 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.958121 2575 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 20:50:55.958289 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.958150 2575 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 20:50:55.958380 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.958324 2575 reconstruct.go:97] "Volume reconstruction finished" Apr 17 20:50:55.958380 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.958333 2575 reconciler.go:26] "Reconciler: start to sync state" Apr 17 20:50:55.959026 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.958477 2575 factory.go:55] Registering systemd factory Apr 17 20:50:55.959026 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.958503 2575 factory.go:223] Registration of the systemd container factory successfully Apr 17 20:50:55.959026 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.958803 2575 factory.go:153] Registering CRI-O factory Apr 17 20:50:55.959026 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.958840 2575 factory.go:223] Registration of the crio container factory successfully Apr 17 20:50:55.959026 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.958887 2575 factory.go:221] Registration of the containerd 
container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 20:50:55.959026 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.958912 2575 factory.go:103] Registering Raw factory Apr 17 20:50:55.959026 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.958933 2575 manager.go:1196] Started watching for new ooms in manager Apr 17 20:50:55.959576 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:55.959345 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-12.ec2.internal\" not found" Apr 17 20:50:55.959576 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.959481 2575 manager.go:319] Starting recovery of all containers Apr 17 20:50:55.961931 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:55.961877 2575 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 20:50:55.965902 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.965883 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:50:55.968977 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:55.968957 2575 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-132-12.ec2.internal\" not found" node="ip-10-0-132-12.ec2.internal" Apr 17 20:50:55.969357 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:55.969338 2575 helpers.go:245] readString: Failed to read "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service/memory.min": read /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service/memory.min: no such device Apr 17 20:50:55.970185 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.970174 2575 manager.go:324] Recovery completed Apr 17 20:50:55.974114 ip-10-0-132-12 
kubenswrapper[2575]: I0417 20:50:55.974103 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:50:55.976395 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.976377 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-12.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:50:55.976464 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.976408 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-12.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:50:55.976464 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.976419 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-12.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:50:55.976897 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.976883 2575 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 20:50:55.976897 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.976894 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 20:50:55.976978 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.976928 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 17 20:50:55.980045 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.980034 2575 policy_none.go:49] "None policy: Start" Apr 17 20:50:55.980081 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.980049 2575 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 20:50:55.980081 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:55.980060 2575 state_mem.go:35] "Initializing new in-memory state store" Apr 17 20:50:56.023815 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.023798 2575 manager.go:341] "Starting Device Plugin manager" Apr 17 20:50:56.045992 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:56.023862 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 
20:50:56.045992 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.023873 2575 server.go:85] "Starting device plugin registration server" Apr 17 20:50:56.045992 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.024096 2575 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 20:50:56.045992 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.024106 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 20:50:56.045992 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.024203 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 20:50:56.045992 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.024310 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 20:50:56.045992 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.024321 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 20:50:56.045992 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:56.025084 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 17 20:50:56.045992 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:56.025118 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-132-12.ec2.internal\" not found" Apr 17 20:50:56.124860 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.124778 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:50:56.125897 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.125877 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-12.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:50:56.126003 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.125908 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-12.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:50:56.126003 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.125918 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-12.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:50:56.126003 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.125944 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-132-12.ec2.internal" Apr 17 20:50:56.127210 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.127189 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 20:50:56.128520 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.128492 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 20:50:56.128520 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.128520 2575 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 20:50:56.128664 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.128541 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 17 20:50:56.128664 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.128548 2575 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 20:50:56.128664 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:56.128621 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 20:50:56.131985 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.131968 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:50:56.133857 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.133837 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-132-12.ec2.internal" Apr 17 20:50:56.133984 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:56.133862 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-132-12.ec2.internal\": node \"ip-10-0-132-12.ec2.internal\" not found" Apr 17 20:50:56.184682 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:56.184648 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-12.ec2.internal\" not found" Apr 17 20:50:56.229781 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.229721 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-12.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-132-12.ec2.internal"] Apr 17 20:50:56.229943 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.229831 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:50:56.231350 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.231329 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-12.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:50:56.231436 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.231364 2575 kubelet_node_status.go:736] "Recording event 
message for node" node="ip-10-0-132-12.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:50:56.231436 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.231375 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-12.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:50:56.233629 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.233616 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:50:56.233783 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.233768 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-12.ec2.internal" Apr 17 20:50:56.233816 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.233797 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:50:56.234389 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.234374 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-12.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:50:56.234458 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.234401 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-12.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:50:56.234458 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.234411 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-12.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:50:56.234458 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.234374 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-12.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:50:56.234543 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.234470 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-12.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 
20:50:56.234543 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.234484 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-12.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:50:56.236573 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.236559 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-12.ec2.internal" Apr 17 20:50:56.236622 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.236584 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 20:50:56.237260 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.237244 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-12.ec2.internal" event="NodeHasSufficientMemory" Apr 17 20:50:56.237324 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.237276 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-12.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 20:50:56.237324 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.237288 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-12.ec2.internal" event="NodeHasSufficientPID" Apr 17 20:50:56.256793 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:56.256774 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-12.ec2.internal\" not found" node="ip-10-0-132-12.ec2.internal" Apr 17 20:50:56.258900 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.258883 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/77f82645937a0b11c10ce2a57afcb470-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-12.ec2.internal\" (UID: \"77f82645937a0b11c10ce2a57afcb470\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-12.ec2.internal" 
Apr 17 20:50:56.258979 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.258910 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/77f82645937a0b11c10ce2a57afcb470-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-12.ec2.internal\" (UID: \"77f82645937a0b11c10ce2a57afcb470\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-12.ec2.internal" Apr 17 20:50:56.258979 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.258935 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/dbe6d7617e05cfcc664fd92a25ec45f3-config\") pod \"kube-apiserver-proxy-ip-10-0-132-12.ec2.internal\" (UID: \"dbe6d7617e05cfcc664fd92a25ec45f3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-12.ec2.internal" Apr 17 20:50:56.260863 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:56.260849 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-12.ec2.internal\" not found" node="ip-10-0-132-12.ec2.internal" Apr 17 20:50:56.285553 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:56.285531 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-12.ec2.internal\" not found" Apr 17 20:50:56.359293 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.359244 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/77f82645937a0b11c10ce2a57afcb470-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-12.ec2.internal\" (UID: \"77f82645937a0b11c10ce2a57afcb470\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-12.ec2.internal" Apr 17 20:50:56.359411 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.359306 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/dbe6d7617e05cfcc664fd92a25ec45f3-config\") pod \"kube-apiserver-proxy-ip-10-0-132-12.ec2.internal\" (UID: \"dbe6d7617e05cfcc664fd92a25ec45f3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-12.ec2.internal" Apr 17 20:50:56.359411 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.359323 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/77f82645937a0b11c10ce2a57afcb470-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-12.ec2.internal\" (UID: \"77f82645937a0b11c10ce2a57afcb470\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-12.ec2.internal" Apr 17 20:50:56.359411 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.359351 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/77f82645937a0b11c10ce2a57afcb470-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-12.ec2.internal\" (UID: \"77f82645937a0b11c10ce2a57afcb470\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-12.ec2.internal" Apr 17 20:50:56.359411 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.359272 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/77f82645937a0b11c10ce2a57afcb470-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-12.ec2.internal\" (UID: \"77f82645937a0b11c10ce2a57afcb470\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-12.ec2.internal" Apr 17 20:50:56.359546 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.359428 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/dbe6d7617e05cfcc664fd92a25ec45f3-config\") pod \"kube-apiserver-proxy-ip-10-0-132-12.ec2.internal\" (UID: 
\"dbe6d7617e05cfcc664fd92a25ec45f3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-12.ec2.internal"
Apr 17 20:50:56.386417 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:56.386353    2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-12.ec2.internal\" not found"
Apr 17 20:50:56.487131 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:56.487093    2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-12.ec2.internal\" not found"
Apr 17 20:50:56.558344 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.558320    2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-12.ec2.internal"
Apr 17 20:50:56.563028 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.563008    2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-12.ec2.internal"
Apr 17 20:50:56.587753 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:56.587718    2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-12.ec2.internal\" not found"
Apr 17 20:50:56.688380 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:56.688277    2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-12.ec2.internal\" not found"
Apr 17 20:50:56.788778 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:56.788731    2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-12.ec2.internal\" not found"
Apr 17 20:50:56.871267 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.871240    2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 20:50:56.871879 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.871378    2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 20:50:56.871879 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.871406    2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 20:50:56.889403 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:56.889375    2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-12.ec2.internal\" not found"
Apr 17 20:50:56.953967 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.953907    2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 20:50:56.958130 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.958105    2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-12.ec2.internal"
Apr 17 20:50:56.958279 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.958128    2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 20:50:56.959186 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.959164    2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 20:45:55 +0000 UTC" deadline="2027-12-25 08:01:12.741809688 +0000 UTC"
Apr 17 20:50:56.959284 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.959187    2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14795h10m15.782626611s"
Apr 17 20:50:56.969516 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.969494    2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 20:50:56.975084 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.975065    2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 20:50:56.976769 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.976755    2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-12.ec2.internal"
Apr 17 20:50:56.984665 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.984645    2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 20:50:56.997735 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:56.997718    2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-j9jhz"
Apr 17 20:50:57.005501 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.005483    2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-j9jhz"
Apr 17 20:50:57.126937 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:57.126706    2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbe6d7617e05cfcc664fd92a25ec45f3.slice/crio-6df96bab63612633850ee241d85a93295200929c30b5220c712c67b3fe62e351 WatchSource:0}: Error finding container 6df96bab63612633850ee241d85a93295200929c30b5220c712c67b3fe62e351: Status 404 returned error can't find the container with id 6df96bab63612633850ee241d85a93295200929c30b5220c712c67b3fe62e351
Apr 17 20:50:57.127188 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:57.127169    2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77f82645937a0b11c10ce2a57afcb470.slice/crio-233c9a4f68ba762d03359285ff92c08eb18f6525e6e8fa55d17fb59f757eac9d WatchSource:0}: Error finding container 233c9a4f68ba762d03359285ff92c08eb18f6525e6e8fa55d17fb59f757eac9d: Status 404 returned error can't find the container with id 233c9a4f68ba762d03359285ff92c08eb18f6525e6e8fa55d17fb59f757eac9d
Apr 17 20:50:57.131047 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.131030    2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 20:50:57.131249 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.131196    2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-12.ec2.internal" event={"ID":"dbe6d7617e05cfcc664fd92a25ec45f3","Type":"ContainerStarted","Data":"6df96bab63612633850ee241d85a93295200929c30b5220c712c67b3fe62e351"}
Apr 17 20:50:57.132160 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.132140    2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-12.ec2.internal" event={"ID":"77f82645937a0b11c10ce2a57afcb470","Type":"ContainerStarted","Data":"233c9a4f68ba762d03359285ff92c08eb18f6525e6e8fa55d17fb59f757eac9d"}
Apr 17 20:50:57.166635 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.166612    2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 20:50:57.819612 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.819414    2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 20:50:57.935806 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.935776    2575 apiserver.go:52] "Watching apiserver"
Apr 17 20:50:57.943628 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.943597    2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 20:50:57.944752 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.944725    2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-gn7zb","openshift-network-diagnostics/network-check-target-zgp64","openshift-network-operator/iptables-alerter-fh4xj","openshift-ovn-kubernetes/ovnkube-node-j8fq5","kube-system/konnectivity-agent-c796r","kube-system/kube-apiserver-proxy-ip-10-0-132-12.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m2kpk","openshift-dns/node-resolver-7rtzd","openshift-image-registry/node-ca-ddpb6","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-12.ec2.internal","openshift-multus/multus-47m56","openshift-multus/multus-additional-cni-plugins-tgvzd","openshift-cluster-node-tuning-operator/tuned-84567"]
Apr 17 20:50:57.947298 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.947277    2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gn7zb"
Apr 17 20:50:57.947378 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:57.947344    2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gn7zb" podUID="03974a46-a9e1-4161-8f82-8e72fdfcb759"
Apr 17 20:50:57.951538 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.951514    2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgp64"
Apr 17 20:50:57.951648 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:57.951592    2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zgp64" podUID="5c537771-95e6-4644-8ce6-c3997543ce01"
Apr 17 20:50:57.952740 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.952311    2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-fh4xj"
Apr 17 20:50:57.954836 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.954808    2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 20:50:57.954836 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.954825    2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 20:50:57.954985 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.954825    2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 20:50:57.955374 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.955353    2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-2gpzj\""
Apr 17 20:50:57.956133 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.956111    2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5"
Apr 17 20:50:57.958786 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.958552    2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-c796r"
Apr 17 20:50:57.958786 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.958735    2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 17 20:50:57.959381 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.959362    2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 17 20:50:57.959381 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.959377    2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 17 20:50:57.959530 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.959393    2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 17 20:50:57.959666 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.959651    2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-hkpd2\""
Apr 17 20:50:57.959666 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.959660    2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 17 20:50:57.959843 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.959830    2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 17 20:50:57.960870 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.960505    2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 20:50:57.960870 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.960712    2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-bxtgt\""
Apr 17 20:50:57.960870 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.960726    2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 20:50:57.961119 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.961103    2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m2kpk"
Apr 17 20:50:57.963168 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.963150    2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7rtzd"
Apr 17 20:50:57.963524 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.963505    2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 20:50:57.963615 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.963598    2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 20:50:57.963982 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.963962    2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-7qbl7\""
Apr 17 20:50:57.964182 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.964165    2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 20:50:57.965405 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.965384    2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-cf6db\""
Apr 17 20:50:57.965516 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.965386    2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 17 20:50:57.965516 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.965498    2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ddpb6"
Apr 17 20:50:57.965699 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.965680    2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 17 20:50:57.967754 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.967735    2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 20:50:57.967864 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.967778    2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 20:50:57.967945 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.967919    2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-47m56"
Apr 17 20:50:57.968043 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.968016    2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 20:50:57.968205 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.968187    2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-scxbk\""
Apr 17 20:50:57.968484 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.968461    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-host-run-ovn-kubernetes\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5"
Apr 17 20:50:57.968579 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.968497    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-host-cni-netd\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5"
Apr 17 20:50:57.968579 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.968523    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03974a46-a9e1-4161-8f82-8e72fdfcb759-metrics-certs\") pod \"network-metrics-daemon-gn7zb\" (UID: \"03974a46-a9e1-4161-8f82-8e72fdfcb759\") " pod="openshift-multus/network-metrics-daemon-gn7zb"
Apr 17 20:50:57.968579 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.968564    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-run-openvswitch\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5"
Apr 17 20:50:57.968730 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.968587    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-host-slash\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5"
Apr 17 20:50:57.968730 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.968610    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc7zz\" (UniqueName: \"kubernetes.io/projected/03974a46-a9e1-4161-8f82-8e72fdfcb759-kube-api-access-sc7zz\") pod \"network-metrics-daemon-gn7zb\" (UID: \"03974a46-a9e1-4161-8f82-8e72fdfcb759\") " pod="openshift-multus/network-metrics-daemon-gn7zb"
Apr 17 20:50:57.968730 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.968646    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-node-log\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5"
Apr 17 20:50:57.968730 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.968677    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e7fec351-05e2-48e2-8266-8f2093ebb3fe-env-overrides\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5"
Apr 17 20:50:57.968730 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.968704    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e7fec351-05e2-48e2-8266-8f2093ebb3fe-ovn-node-metrics-cert\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5"
Apr 17 20:50:57.968957 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.968746    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-etc-openvswitch\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5"
Apr 17 20:50:57.968957 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.968783    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-log-socket\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5"
Apr 17 20:50:57.968957 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.968811    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5"
Apr 17 20:50:57.968957 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.968837    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e7fec351-05e2-48e2-8266-8f2093ebb3fe-ovnkube-script-lib\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5"
Apr 17 20:50:57.968957 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.968888    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/76b2eed9-aba9-4fc0-8c47-32a44d9073bf-host-slash\") pod \"iptables-alerter-fh4xj\" (UID: \"76b2eed9-aba9-4fc0-8c47-32a44d9073bf\") " pod="openshift-network-operator/iptables-alerter-fh4xj"
Apr 17 20:50:57.969152 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.968978    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7s9w\" (UniqueName: \"kubernetes.io/projected/76b2eed9-aba9-4fc0-8c47-32a44d9073bf-kube-api-access-d7s9w\") pod \"iptables-alerter-fh4xj\" (UID: \"76b2eed9-aba9-4fc0-8c47-32a44d9073bf\") " pod="openshift-network-operator/iptables-alerter-fh4xj"
Apr 17 20:50:57.969152 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.969007    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-host-kubelet\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5"
Apr 17 20:50:57.969152 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.969031    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-host-cni-bin\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5"
Apr 17 20:50:57.969152 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.969054    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwr7l\" (UniqueName: \"kubernetes.io/projected/e7fec351-05e2-48e2-8266-8f2093ebb3fe-kube-api-access-mwr7l\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5"
Apr 17 20:50:57.969152 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.969094    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/743a363d-0753-4e3b-9c99-a494c15dcf32-agent-certs\") pod \"konnectivity-agent-c796r\" (UID: \"743a363d-0753-4e3b-9c99-a494c15dcf32\") " pod="kube-system/konnectivity-agent-c796r"
Apr 17 20:50:57.969152 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.969118    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/743a363d-0753-4e3b-9c99-a494c15dcf32-konnectivity-ca\") pod \"konnectivity-agent-c796r\" (UID: \"743a363d-0753-4e3b-9c99-a494c15dcf32\") " pod="kube-system/konnectivity-agent-c796r"
Apr 17 20:50:57.969152 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.969144    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggb7k\" (UniqueName: \"kubernetes.io/projected/5c537771-95e6-4644-8ce6-c3997543ce01-kube-api-access-ggb7k\") pod \"network-check-target-zgp64\" (UID: \"5c537771-95e6-4644-8ce6-c3997543ce01\") " pod="openshift-network-diagnostics/network-check-target-zgp64"
Apr 17 20:50:57.969519 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.969238    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/76b2eed9-aba9-4fc0-8c47-32a44d9073bf-iptables-alerter-script\") pod \"iptables-alerter-fh4xj\" (UID: \"76b2eed9-aba9-4fc0-8c47-32a44d9073bf\") " pod="openshift-network-operator/iptables-alerter-fh4xj"
Apr 17 20:50:57.969519 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.969269    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-var-lib-openvswitch\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5"
Apr 17 20:50:57.969519 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.969293    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-host-run-netns\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5"
Apr 17 20:50:57.969519 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.969339    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e7fec351-05e2-48e2-8266-8f2093ebb3fe-ovnkube-config\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5"
Apr 17 20:50:57.969519 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.969363    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-systemd-units\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5"
Apr 17 20:50:57.969519 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.969388    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-run-systemd\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5"
Apr 17 20:50:57.969519 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.969411    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-run-ovn\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5"
Apr 17 20:50:57.970215 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.970194    2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 20:50:57.970215 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.970210    2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-w29pc\""
Apr 17 20:50:57.970958 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.970435    2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 20:50:57.970958 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.970463    2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 20:50:57.970958 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.970564    2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tgvzd"
Apr 17 20:50:57.970958 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.970654    2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 20:50:57.972651 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.972612    2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-ggbzf\""
Apr 17 20:50:57.972651 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.972613    2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 20:50:57.972778 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.972613    2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 20:50:57.972879 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.972865    2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-84567"
Apr 17 20:50:57.974948 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.974927    2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 20:50:57.975152 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.975134    2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-bv6n4\""
Apr 17 20:50:57.975267 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:57.975189    2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 20:50:58.006204 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.006173    2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 20:45:56 +0000 UTC" deadline="2027-10-15 20:48:56.913151872 +0000 UTC"
Apr 17 20:50:58.006204 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.006202    2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13103h57m58.906953978s"
Apr 17 20:50:58.059021 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.058985    2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 17 20:50:58.070006 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.069930    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/710e13ba-1421-4758-a855-0e9df651899c-etc-selinux\") pod \"aws-ebs-csi-driver-node-m2kpk\" (UID: \"710e13ba-1421-4758-a855-0e9df651899c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m2kpk"
Apr 17 20:50:58.070006 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.069972    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/38d8ca4c-2571-491a-bb52-21191288881d-os-release\") pod \"multus-additional-cni-plugins-tgvzd\" (UID: \"38d8ca4c-2571-491a-bb52-21191288881d\") " pod="openshift-multus/multus-additional-cni-plugins-tgvzd"
Apr 17 20:50:58.070006 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.070001    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-system-cni-dir\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56"
Apr 17 20:50:58.070239 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.070028    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6560b30d-aca7-45a9-b8b5-3fb4711c4650-cni-binary-copy\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56"
Apr 17 20:50:58.070239 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.070070    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-host-run-multus-certs\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56"
Apr 17 20:50:58.070239 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.070118    2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-run-ovn\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5"
Apr 17 20:50:58.070239 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.070139    2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-host-run-ovn-kubernetes\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5"
Apr 17 20:50:58.070239 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.070176    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-run\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567"
Apr 17 20:50:58.070239 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.070192    2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-run-ovn\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5"
Apr 17 20:50:58.070239 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.070205    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-etc-tuned\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567"
Apr 17 20:50:58.070518 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.070244    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-os-release\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56"
Apr 17 20:50:58.070518 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.070266    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-host-run-k8s-cni-cncf-io\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56"
Apr 17 20:50:58.070518 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.070305    2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-host-run-ovn-kubernetes\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5"
Apr 17 20:50:58.070518 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.070337    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-etc-kubernetes\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56"
Apr 17 20:50:58.070518 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.070377    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/710e13ba-1421-4758-a855-0e9df651899c-device-dir\") pod \"aws-ebs-csi-driver-node-m2kpk\" (UID: \"710e13ba-1421-4758-a855-0e9df651899c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m2kpk"
Apr 17 20:50:58.070518 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.070428    2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-host-slash\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5"
Apr 17 20:50:58.070518 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.070460    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-etc-systemd\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567"
Apr 17 20:50:58.070518 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.070481    2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-node-log\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5"
Apr 17 20:50:58.070518 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.070498    2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e7fec351-05e2-48e2-8266-8f2093ebb3fe-ovn-node-metrics-cert\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5"
Apr 17 20:50:58.070518 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.070516    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-lib-modules\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567"
Apr 17 20:50:58.070934 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.070530    2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-multus-socket-dir-parent\")
pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56" Apr 17 20:50:58.070934 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.070530 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-node-log\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" Apr 17 20:50:58.070934 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.070545 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bsw5\" (UniqueName: \"kubernetes.io/projected/6560b30d-aca7-45a9-b8b5-3fb4711c4650-kube-api-access-7bsw5\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56" Apr 17 20:50:58.070934 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.070570 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-host-slash\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" Apr 17 20:50:58.070934 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.070565 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-etc-openvswitch\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" Apr 17 20:50:58.070934 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.070622 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" Apr 17 20:50:58.070934 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.070628 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-etc-openvswitch\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" Apr 17 20:50:58.070934 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.070726 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" Apr 17 20:50:58.070934 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.070730 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e7fec351-05e2-48e2-8266-8f2093ebb3fe-ovnkube-script-lib\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" Apr 17 20:50:58.070934 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.070764 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/38d8ca4c-2571-491a-bb52-21191288881d-cnibin\") pod \"multus-additional-cni-plugins-tgvzd\" (UID: \"38d8ca4c-2571-491a-bb52-21191288881d\") " pod="openshift-multus/multus-additional-cni-plugins-tgvzd" Apr 17 20:50:58.070934 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.070790 
2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-host-cni-bin\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" Apr 17 20:50:58.070934 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.070816 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mwr7l\" (UniqueName: \"kubernetes.io/projected/e7fec351-05e2-48e2-8266-8f2093ebb3fe-kube-api-access-mwr7l\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" Apr 17 20:50:58.070934 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.070841 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-etc-sysctl-d\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567" Apr 17 20:50:58.070934 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.070865 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-host-cni-bin\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" Apr 17 20:50:58.070934 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.070866 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-host-var-lib-kubelet\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56" Apr 17 20:50:58.070934 ip-10-0-132-12 
kubenswrapper[2575]: I0417 20:50:58.070901 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/76b2eed9-aba9-4fc0-8c47-32a44d9073bf-iptables-alerter-script\") pod \"iptables-alerter-fh4xj\" (UID: \"76b2eed9-aba9-4fc0-8c47-32a44d9073bf\") " pod="openshift-network-operator/iptables-alerter-fh4xj" Apr 17 20:50:58.070934 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.070930 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/38d8ca4c-2571-491a-bb52-21191288881d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-tgvzd\" (UID: \"38d8ca4c-2571-491a-bb52-21191288881d\") " pod="openshift-multus/multus-additional-cni-plugins-tgvzd" Apr 17 20:50:58.071695 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.070956 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-cnibin\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56" Apr 17 20:50:58.071695 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.070980 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-host-run-netns\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" Apr 17 20:50:58.071695 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071005 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e7fec351-05e2-48e2-8266-8f2093ebb3fe-ovnkube-config\") pod \"ovnkube-node-j8fq5\" (UID: 
\"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" Apr 17 20:50:58.071695 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071032 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-host-run-netns\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" Apr 17 20:50:58.071695 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071034 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2d2c4aed-da8f-4e54-8d9e-75c4dc6a6236-tmp-dir\") pod \"node-resolver-7rtzd\" (UID: \"2d2c4aed-da8f-4e54-8d9e-75c4dc6a6236\") " pod="openshift-dns/node-resolver-7rtzd" Apr 17 20:50:58.071695 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071083 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zswl2\" (UniqueName: \"kubernetes.io/projected/2d2c4aed-da8f-4e54-8d9e-75c4dc6a6236-kube-api-access-zswl2\") pod \"node-resolver-7rtzd\" (UID: \"2d2c4aed-da8f-4e54-8d9e-75c4dc6a6236\") " pod="openshift-dns/node-resolver-7rtzd" Apr 17 20:50:58.071695 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071083 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 20:50:58.071695 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071103 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-sys\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567" Apr 17 20:50:58.071695 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071117 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsx9v\" (UniqueName: \"kubernetes.io/projected/5f57b5b6-848e-48b0-b1ca-e8a8c29c446c-kube-api-access-rsx9v\") pod \"node-ca-ddpb6\" (UID: \"5f57b5b6-848e-48b0-b1ca-e8a8c29c446c\") " pod="openshift-image-registry/node-ca-ddpb6" Apr 17 20:50:58.071695 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071134 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-systemd-units\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" Apr 17 20:50:58.071695 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071148 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-run-systemd\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" Apr 17 20:50:58.071695 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071162 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-host-cni-netd\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" Apr 17 20:50:58.071695 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071177 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6560b30d-aca7-45a9-b8b5-3fb4711c4650-multus-daemon-config\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56" Apr 17 20:50:58.071695 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071214 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03974a46-a9e1-4161-8f82-8e72fdfcb759-metrics-certs\") pod \"network-metrics-daemon-gn7zb\" (UID: \"03974a46-a9e1-4161-8f82-8e72fdfcb759\") " pod="openshift-multus/network-metrics-daemon-gn7zb" Apr 17 20:50:58.071695 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071280 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-run-openvswitch\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" Apr 17 20:50:58.071695 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071296 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-host-cni-netd\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" Apr 17 20:50:58.071695 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071310 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/38d8ca4c-2571-491a-bb52-21191288881d-system-cni-dir\") pod \"multus-additional-cni-plugins-tgvzd\" (UID: \"38d8ca4c-2571-491a-bb52-21191288881d\") " pod="openshift-multus/multus-additional-cni-plugins-tgvzd" Apr 17 20:50:58.072504 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071337 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-etc-kubernetes\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567" Apr 17 20:50:58.072504 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:58.071352 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:50:58.072504 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071387 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-systemd-units\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" Apr 17 20:50:58.072504 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:58.071418 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03974a46-a9e1-4161-8f82-8e72fdfcb759-metrics-certs podName:03974a46-a9e1-4161-8f82-8e72fdfcb759 nodeName:}" failed. No retries permitted until 2026-04-17 20:50:58.571398365 +0000 UTC m=+3.038839280 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/03974a46-a9e1-4161-8f82-8e72fdfcb759-metrics-certs") pod "network-metrics-daemon-gn7zb" (UID: "03974a46-a9e1-4161-8f82-8e72fdfcb759") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:50:58.072504 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071429 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-run-systemd\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" Apr 17 20:50:58.072504 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071434 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e7fec351-05e2-48e2-8266-8f2093ebb3fe-ovnkube-script-lib\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" Apr 17 20:50:58.072504 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071438 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f57b5b6-848e-48b0-b1ca-e8a8c29c446c-host\") pod \"node-ca-ddpb6\" (UID: \"5f57b5b6-848e-48b0-b1ca-e8a8c29c446c\") " pod="openshift-image-registry/node-ca-ddpb6" Apr 17 20:50:58.072504 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071465 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/76b2eed9-aba9-4fc0-8c47-32a44d9073bf-iptables-alerter-script\") pod \"iptables-alerter-fh4xj\" (UID: \"76b2eed9-aba9-4fc0-8c47-32a44d9073bf\") " pod="openshift-network-operator/iptables-alerter-fh4xj" Apr 17 20:50:58.072504 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071440 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-run-openvswitch\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" Apr 17 20:50:58.072504 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071500 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-multus-cni-dir\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56" Apr 17 20:50:58.072504 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071526 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-hostroot\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56" Apr 17 20:50:58.072504 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071568 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-multus-conf-dir\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56" Apr 17 20:50:58.072504 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071575 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e7fec351-05e2-48e2-8266-8f2093ebb3fe-ovnkube-config\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" Apr 17 20:50:58.072504 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071600 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/710e13ba-1421-4758-a855-0e9df651899c-registration-dir\") pod \"aws-ebs-csi-driver-node-m2kpk\" (UID: \"710e13ba-1421-4758-a855-0e9df651899c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m2kpk" Apr 17 20:50:58.072504 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071662 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/710e13ba-1421-4758-a855-0e9df651899c-sys-fs\") pod \"aws-ebs-csi-driver-node-m2kpk\" (UID: \"710e13ba-1421-4758-a855-0e9df651899c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m2kpk" Apr 17 20:50:58.072504 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071687 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-246xd\" (UniqueName: \"kubernetes.io/projected/710e13ba-1421-4758-a855-0e9df651899c-kube-api-access-246xd\") pod \"aws-ebs-csi-driver-node-m2kpk\" (UID: \"710e13ba-1421-4758-a855-0e9df651899c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m2kpk" Apr 17 20:50:58.072504 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071710 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-etc-sysctl-conf\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567" Apr 17 20:50:58.073351 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071734 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rchct\" (UniqueName: \"kubernetes.io/projected/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-kube-api-access-rchct\") pod 
\"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567" Apr 17 20:50:58.073351 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071756 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-host-run-netns\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56" Apr 17 20:50:58.073351 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071792 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sc7zz\" (UniqueName: \"kubernetes.io/projected/03974a46-a9e1-4161-8f82-8e72fdfcb759-kube-api-access-sc7zz\") pod \"network-metrics-daemon-gn7zb\" (UID: \"03974a46-a9e1-4161-8f82-8e72fdfcb759\") " pod="openshift-multus/network-metrics-daemon-gn7zb" Apr 17 20:50:58.073351 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071841 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e7fec351-05e2-48e2-8266-8f2093ebb3fe-env-overrides\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" Apr 17 20:50:58.073351 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071874 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-var-lib-kubelet\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567" Apr 17 20:50:58.073351 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071905 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-log-socket\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" Apr 17 20:50:58.073351 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071933 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/710e13ba-1421-4758-a855-0e9df651899c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-m2kpk\" (UID: \"710e13ba-1421-4758-a855-0e9df651899c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m2kpk" Apr 17 20:50:58.073351 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071959 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/710e13ba-1421-4758-a855-0e9df651899c-socket-dir\") pod \"aws-ebs-csi-driver-node-m2kpk\" (UID: \"710e13ba-1421-4758-a855-0e9df651899c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m2kpk" Apr 17 20:50:58.073351 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.071970 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-log-socket\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" Apr 17 20:50:58.073351 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.072016 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2d2c4aed-da8f-4e54-8d9e-75c4dc6a6236-hosts-file\") pod \"node-resolver-7rtzd\" (UID: \"2d2c4aed-da8f-4e54-8d9e-75c4dc6a6236\") " pod="openshift-dns/node-resolver-7rtzd" Apr 17 20:50:58.073351 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.072068 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/38d8ca4c-2571-491a-bb52-21191288881d-cni-binary-copy\") pod \"multus-additional-cni-plugins-tgvzd\" (UID: \"38d8ca4c-2571-491a-bb52-21191288881d\") " pod="openshift-multus/multus-additional-cni-plugins-tgvzd"
Apr 17 20:50:58.073351 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.072106 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56tnw\" (UniqueName: \"kubernetes.io/projected/38d8ca4c-2571-491a-bb52-21191288881d-kube-api-access-56tnw\") pod \"multus-additional-cni-plugins-tgvzd\" (UID: \"38d8ca4c-2571-491a-bb52-21191288881d\") " pod="openshift-multus/multus-additional-cni-plugins-tgvzd"
Apr 17 20:50:58.073351 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.072134 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5f57b5b6-848e-48b0-b1ca-e8a8c29c446c-serviceca\") pod \"node-ca-ddpb6\" (UID: \"5f57b5b6-848e-48b0-b1ca-e8a8c29c446c\") " pod="openshift-image-registry/node-ca-ddpb6"
Apr 17 20:50:58.073351 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.072174 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/76b2eed9-aba9-4fc0-8c47-32a44d9073bf-host-slash\") pod \"iptables-alerter-fh4xj\" (UID: \"76b2eed9-aba9-4fc0-8c47-32a44d9073bf\") " pod="openshift-network-operator/iptables-alerter-fh4xj"
Apr 17 20:50:58.073351 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.072212 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d7s9w\" (UniqueName: \"kubernetes.io/projected/76b2eed9-aba9-4fc0-8c47-32a44d9073bf-kube-api-access-d7s9w\") pod \"iptables-alerter-fh4xj\" (UID: \"76b2eed9-aba9-4fc0-8c47-32a44d9073bf\") " pod="openshift-network-operator/iptables-alerter-fh4xj"
Apr 17 20:50:58.073351 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.072214 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/76b2eed9-aba9-4fc0-8c47-32a44d9073bf-host-slash\") pod \"iptables-alerter-fh4xj\" (UID: \"76b2eed9-aba9-4fc0-8c47-32a44d9073bf\") " pod="openshift-network-operator/iptables-alerter-fh4xj"
Apr 17 20:50:58.073351 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.072279 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-host-kubelet\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5"
Apr 17 20:50:58.074131 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.072300 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e7fec351-05e2-48e2-8266-8f2093ebb3fe-env-overrides\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5"
Apr 17 20:50:58.074131 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.072390 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/743a363d-0753-4e3b-9c99-a494c15dcf32-agent-certs\") pod \"konnectivity-agent-c796r\" (UID: \"743a363d-0753-4e3b-9c99-a494c15dcf32\") " pod="kube-system/konnectivity-agent-c796r"
Apr 17 20:50:58.074131 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.072414 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-host-kubelet\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5"
Apr 17 20:50:58.074131 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.072454 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/743a363d-0753-4e3b-9c99-a494c15dcf32-konnectivity-ca\") pod \"konnectivity-agent-c796r\" (UID: \"743a363d-0753-4e3b-9c99-a494c15dcf32\") " pod="kube-system/konnectivity-agent-c796r"
Apr 17 20:50:58.074131 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.072562 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-etc-modprobe-d\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567"
Apr 17 20:50:58.074131 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.072614 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-host\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567"
Apr 17 20:50:58.074131 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.072685 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-tmp\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567"
Apr 17 20:50:58.074131 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.072722 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggb7k\" (UniqueName: \"kubernetes.io/projected/5c537771-95e6-4644-8ce6-c3997543ce01-kube-api-access-ggb7k\") pod \"network-check-target-zgp64\" (UID: \"5c537771-95e6-4644-8ce6-c3997543ce01\") " pod="openshift-network-diagnostics/network-check-target-zgp64"
Apr 17 20:50:58.074131 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.072762 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-var-lib-openvswitch\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5"
Apr 17 20:50:58.074131 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.072804 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/38d8ca4c-2571-491a-bb52-21191288881d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tgvzd\" (UID: \"38d8ca4c-2571-491a-bb52-21191288881d\") " pod="openshift-multus/multus-additional-cni-plugins-tgvzd"
Apr 17 20:50:58.074131 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.072832 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/38d8ca4c-2571-491a-bb52-21191288881d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tgvzd\" (UID: \"38d8ca4c-2571-491a-bb52-21191288881d\") " pod="openshift-multus/multus-additional-cni-plugins-tgvzd"
Apr 17 20:50:58.074131 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.072858 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-etc-sysconfig\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567"
Apr 17 20:50:58.074131 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.072881 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-host-var-lib-cni-bin\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56"
Apr 17 20:50:58.074131 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.072906 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-host-var-lib-cni-multus\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56"
Apr 17 20:50:58.074131 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.072904 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e7fec351-05e2-48e2-8266-8f2093ebb3fe-var-lib-openvswitch\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5"
Apr 17 20:50:58.074131 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.073042 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/743a363d-0753-4e3b-9c99-a494c15dcf32-konnectivity-ca\") pod \"konnectivity-agent-c796r\" (UID: \"743a363d-0753-4e3b-9c99-a494c15dcf32\") " pod="kube-system/konnectivity-agent-c796r"
Apr 17 20:50:58.074882 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.074606 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e7fec351-05e2-48e2-8266-8f2093ebb3fe-ovn-node-metrics-cert\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5"
Apr 17 20:50:58.075098 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.075080 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/743a363d-0753-4e3b-9c99-a494c15dcf32-agent-certs\") pod \"konnectivity-agent-c796r\" (UID: \"743a363d-0753-4e3b-9c99-a494c15dcf32\") " pod="kube-system/konnectivity-agent-c796r"
Apr 17 20:50:58.088455 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:58.088431 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 20:50:58.088455 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:58.088455 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 20:50:58.088648 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:58.088468 2575 projected.go:194] Error preparing data for projected volume kube-api-access-ggb7k for pod openshift-network-diagnostics/network-check-target-zgp64: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 20:50:58.088648 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:58.088544 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c537771-95e6-4644-8ce6-c3997543ce01-kube-api-access-ggb7k podName:5c537771-95e6-4644-8ce6-c3997543ce01 nodeName:}" failed. No retries permitted until 2026-04-17 20:50:58.588525451 +0000 UTC m=+3.055966378 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-ggb7k" (UniqueName: "kubernetes.io/projected/5c537771-95e6-4644-8ce6-c3997543ce01-kube-api-access-ggb7k") pod "network-check-target-zgp64" (UID: "5c537771-95e6-4644-8ce6-c3997543ce01") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 20:50:58.090337 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.090308 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwr7l\" (UniqueName: \"kubernetes.io/projected/e7fec351-05e2-48e2-8266-8f2093ebb3fe-kube-api-access-mwr7l\") pod \"ovnkube-node-j8fq5\" (UID: \"e7fec351-05e2-48e2-8266-8f2093ebb3fe\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5"
Apr 17 20:50:58.091036 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.091018 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7s9w\" (UniqueName: \"kubernetes.io/projected/76b2eed9-aba9-4fc0-8c47-32a44d9073bf-kube-api-access-d7s9w\") pod \"iptables-alerter-fh4xj\" (UID: \"76b2eed9-aba9-4fc0-8c47-32a44d9073bf\") " pod="openshift-network-operator/iptables-alerter-fh4xj"
Apr 17 20:50:58.091144 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.091127 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc7zz\" (UniqueName: \"kubernetes.io/projected/03974a46-a9e1-4161-8f82-8e72fdfcb759-kube-api-access-sc7zz\") pod \"network-metrics-daemon-gn7zb\" (UID: \"03974a46-a9e1-4161-8f82-8e72fdfcb759\") " pod="openshift-multus/network-metrics-daemon-gn7zb"
Apr 17 20:50:58.173878 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.173838 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-host-run-multus-certs\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56"
Apr 17 20:50:58.174051 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.173893 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-run\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567"
Apr 17 20:50:58.174051 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.173923 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-etc-tuned\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567"
Apr 17 20:50:58.174051 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.173930 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-host-run-multus-certs\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56"
Apr 17 20:50:58.174051 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.173948 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-os-release\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56"
Apr 17 20:50:58.174051 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.173999 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-host-run-k8s-cni-cncf-io\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56"
Apr 17 20:50:58.174051 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.174050 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-etc-kubernetes\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56"
Apr 17 20:50:58.174371 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.174079 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/710e13ba-1421-4758-a855-0e9df651899c-device-dir\") pod \"aws-ebs-csi-driver-node-m2kpk\" (UID: \"710e13ba-1421-4758-a855-0e9df651899c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m2kpk"
Apr 17 20:50:58.174371 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.174099 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-host-run-k8s-cni-cncf-io\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56"
Apr 17 20:50:58.174371 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.174118 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-run\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567"
Apr 17 20:50:58.174371 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.174104 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-etc-systemd\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567"
Apr 17 20:50:58.174371 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.174173 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-lib-modules\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567"
Apr 17 20:50:58.174371 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.174182 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-etc-systemd\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567"
Apr 17 20:50:58.174371 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.174201 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-multus-socket-dir-parent\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56"
Apr 17 20:50:58.174371 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.174212 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-os-release\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56"
Apr 17 20:50:58.174371 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.174247 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7bsw5\" (UniqueName: \"kubernetes.io/projected/6560b30d-aca7-45a9-b8b5-3fb4711c4650-kube-api-access-7bsw5\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56"
Apr 17 20:50:58.174371 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.174276 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/710e13ba-1421-4758-a855-0e9df651899c-device-dir\") pod \"aws-ebs-csi-driver-node-m2kpk\" (UID: \"710e13ba-1421-4758-a855-0e9df651899c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m2kpk"
Apr 17 20:50:58.174371 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.174279 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/38d8ca4c-2571-491a-bb52-21191288881d-cnibin\") pod \"multus-additional-cni-plugins-tgvzd\" (UID: \"38d8ca4c-2571-491a-bb52-21191288881d\") " pod="openshift-multus/multus-additional-cni-plugins-tgvzd"
Apr 17 20:50:58.174371 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.174291 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-multus-socket-dir-parent\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56"
Apr 17 20:50:58.174371 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.174247 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-etc-kubernetes\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56"
Apr 17 20:50:58.174371 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.174309 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-etc-sysctl-d\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567"
Apr 17 20:50:58.174957 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.174343 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/38d8ca4c-2571-491a-bb52-21191288881d-cnibin\") pod \"multus-additional-cni-plugins-tgvzd\" (UID: \"38d8ca4c-2571-491a-bb52-21191288881d\") " pod="openshift-multus/multus-additional-cni-plugins-tgvzd"
Apr 17 20:50:58.174957 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.174792 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-etc-sysctl-d\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567"
Apr 17 20:50:58.174957 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.174912 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-host-var-lib-kubelet\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56"
Apr 17 20:50:58.175088 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.174984 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/38d8ca4c-2571-491a-bb52-21191288881d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-tgvzd\" (UID: \"38d8ca4c-2571-491a-bb52-21191288881d\") " pod="openshift-multus/multus-additional-cni-plugins-tgvzd"
Apr 17 20:50:58.175088 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.175044 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-host-var-lib-kubelet\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56"
Apr 17 20:50:58.175088 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.175050 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-cnibin\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56"
Apr 17 20:50:58.175217 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.175102 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2d2c4aed-da8f-4e54-8d9e-75c4dc6a6236-tmp-dir\") pod \"node-resolver-7rtzd\" (UID: \"2d2c4aed-da8f-4e54-8d9e-75c4dc6a6236\") " pod="openshift-dns/node-resolver-7rtzd"
Apr 17 20:50:58.175217 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.175123 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-cnibin\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56"
Apr 17 20:50:58.175217 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.175123 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-lib-modules\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567"
Apr 17 20:50:58.175217 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.175137 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zswl2\" (UniqueName: \"kubernetes.io/projected/2d2c4aed-da8f-4e54-8d9e-75c4dc6a6236-kube-api-access-zswl2\") pod \"node-resolver-7rtzd\" (UID: \"2d2c4aed-da8f-4e54-8d9e-75c4dc6a6236\") " pod="openshift-dns/node-resolver-7rtzd"
Apr 17 20:50:58.175472 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.175450 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2d2c4aed-da8f-4e54-8d9e-75c4dc6a6236-tmp-dir\") pod \"node-resolver-7rtzd\" (UID: \"2d2c4aed-da8f-4e54-8d9e-75c4dc6a6236\") " pod="openshift-dns/node-resolver-7rtzd"
Apr 17 20:50:58.175767 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.175743 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-sys\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567"
Apr 17 20:50:58.175854 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.175796 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rsx9v\" (UniqueName: \"kubernetes.io/projected/5f57b5b6-848e-48b0-b1ca-e8a8c29c446c-kube-api-access-rsx9v\") pod \"node-ca-ddpb6\" (UID: \"5f57b5b6-848e-48b0-b1ca-e8a8c29c446c\") " pod="openshift-image-registry/node-ca-ddpb6"
Apr 17 20:50:58.175854 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.175831 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6560b30d-aca7-45a9-b8b5-3fb4711c4650-multus-daemon-config\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56"
Apr 17 20:50:58.175947 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.175875 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/38d8ca4c-2571-491a-bb52-21191288881d-system-cni-dir\") pod \"multus-additional-cni-plugins-tgvzd\" (UID: \"38d8ca4c-2571-491a-bb52-21191288881d\") " pod="openshift-multus/multus-additional-cni-plugins-tgvzd"
Apr 17 20:50:58.175947 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.175905 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-etc-kubernetes\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567"
Apr 17 20:50:58.175947 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.175935 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f57b5b6-848e-48b0-b1ca-e8a8c29c446c-host\") pod \"node-ca-ddpb6\" (UID: \"5f57b5b6-848e-48b0-b1ca-e8a8c29c446c\") " pod="openshift-image-registry/node-ca-ddpb6"
Apr 17 20:50:58.176065 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.175967 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-multus-cni-dir\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56"
Apr 17 20:50:58.176065 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.175992 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-hostroot\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56"
Apr 17 20:50:58.176065 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.176023 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-multus-conf-dir\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56"
Apr 17 20:50:58.176184 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.176064 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/710e13ba-1421-4758-a855-0e9df651899c-registration-dir\") pod \"aws-ebs-csi-driver-node-m2kpk\" (UID: \"710e13ba-1421-4758-a855-0e9df651899c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m2kpk"
Apr 17 20:50:58.176184 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.176094 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/710e13ba-1421-4758-a855-0e9df651899c-sys-fs\") pod \"aws-ebs-csi-driver-node-m2kpk\" (UID: \"710e13ba-1421-4758-a855-0e9df651899c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m2kpk"
Apr 17 20:50:58.176184 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.176128 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-246xd\" (UniqueName: \"kubernetes.io/projected/710e13ba-1421-4758-a855-0e9df651899c-kube-api-access-246xd\") pod \"aws-ebs-csi-driver-node-m2kpk\" (UID: \"710e13ba-1421-4758-a855-0e9df651899c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m2kpk"
Apr 17 20:50:58.177810 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.177790 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-etc-sysctl-conf\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567"
Apr 17 20:50:58.177923 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.177832 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rchct\" (UniqueName: \"kubernetes.io/projected/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-kube-api-access-rchct\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567"
Apr 17 20:50:58.177923 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.177866 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-host-run-netns\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56"
Apr 17 20:50:58.177923 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.177900 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-var-lib-kubelet\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567"
Apr 17 20:50:58.178069 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.177935 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/710e13ba-1421-4758-a855-0e9df651899c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-m2kpk\" (UID: \"710e13ba-1421-4758-a855-0e9df651899c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m2kpk"
Apr 17 20:50:58.178069 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.177963 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/710e13ba-1421-4758-a855-0e9df651899c-socket-dir\") pod \"aws-ebs-csi-driver-node-m2kpk\" (UID: \"710e13ba-1421-4758-a855-0e9df651899c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m2kpk"
Apr 17 20:50:58.178069 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.178023 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2d2c4aed-da8f-4e54-8d9e-75c4dc6a6236-hosts-file\") pod \"node-resolver-7rtzd\" (UID: \"2d2c4aed-da8f-4e54-8d9e-75c4dc6a6236\") " pod="openshift-dns/node-resolver-7rtzd"
Apr 17 20:50:58.178069 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.178059 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/38d8ca4c-2571-491a-bb52-21191288881d-cni-binary-copy\") pod \"multus-additional-cni-plugins-tgvzd\" (UID: \"38d8ca4c-2571-491a-bb52-21191288881d\") " pod="openshift-multus/multus-additional-cni-plugins-tgvzd"
Apr 17 20:50:58.178280 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.178093 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-56tnw\" (UniqueName: \"kubernetes.io/projected/38d8ca4c-2571-491a-bb52-21191288881d-kube-api-access-56tnw\") pod \"multus-additional-cni-plugins-tgvzd\" (UID: \"38d8ca4c-2571-491a-bb52-21191288881d\") " pod="openshift-multus/multus-additional-cni-plugins-tgvzd"
Apr 17 20:50:58.178280 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.178129 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5f57b5b6-848e-48b0-b1ca-e8a8c29c446c-serviceca\") pod \"node-ca-ddpb6\" (UID: \"5f57b5b6-848e-48b0-b1ca-e8a8c29c446c\") " pod="openshift-image-registry/node-ca-ddpb6"
Apr 17 20:50:58.178280 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.178168 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-etc-modprobe-d\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567"
Apr 17 20:50:58.178280 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.176573 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-sys\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567"
Apr 17 20:50:58.178280 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.178187 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/38d8ca4c-2571-491a-bb52-21191288881d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-tgvzd\" (UID: \"38d8ca4c-2571-491a-bb52-21191288881d\") " pod="openshift-multus/multus-additional-cni-plugins-tgvzd"
Apr 17 20:50:58.178280 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.178198 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-host\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567"
Apr 17 20:50:58.178280 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.176518 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5f57b5b6-848e-48b0-b1ca-e8a8c29c446c-host\") pod \"node-ca-ddpb6\" (UID: \"5f57b5b6-848e-48b0-b1ca-e8a8c29c446c\") " pod="openshift-image-registry/node-ca-ddpb6"
Apr 17 20:50:58.178280 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.176653 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-multus-conf-dir\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56"
Apr 17 20:50:58.178280 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.178248 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-tmp\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567"
Apr 17 20:50:58.178280 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.176696 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-multus-cni-dir\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56"
Apr 17 20:50:58.178723 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.178316 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/38d8ca4c-2571-491a-bb52-21191288881d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tgvzd\" (UID: \"38d8ca4c-2571-491a-bb52-21191288881d\") " pod="openshift-multus/multus-additional-cni-plugins-tgvzd"
Apr 17 20:50:58.178723 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.178356 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/38d8ca4c-2571-491a-bb52-21191288881d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tgvzd\" (UID: \"38d8ca4c-2571-491a-bb52-21191288881d\") " pod="openshift-multus/multus-additional-cni-plugins-tgvzd"
Apr 17 20:50:58.178723 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.178389 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-etc-sysconfig\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567"
Apr 17 20:50:58.178723 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.178424 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-host-var-lib-cni-bin\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56"
Apr 17 20:50:58.178723 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.178456 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-host-var-lib-cni-multus\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56"
Apr 17 20:50:58.178723 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.176609 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-etc-tuned\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567"
Apr 17 20:50:58.178723 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.178482 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/710e13ba-1421-4758-a855-0e9df651899c-etc-selinux\") pod \"aws-ebs-csi-driver-node-m2kpk\" (UID: \"710e13ba-1421-4758-a855-0e9df651899c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m2kpk"
Apr 17 20:50:58.178723 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.178541 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/710e13ba-1421-4758-a855-0e9df651899c-etc-selinux\") pod \"aws-ebs-csi-driver-node-m2kpk\" (UID: \"710e13ba-1421-4758-a855-0e9df651899c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m2kpk"
Apr 17 20:50:58.178723 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.178591 2575 operation_generator.go:615] "MountVolume.SetUp
succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/710e13ba-1421-4758-a855-0e9df651899c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-m2kpk\" (UID: \"710e13ba-1421-4758-a855-0e9df651899c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m2kpk" Apr 17 20:50:58.179143 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.178722 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/38d8ca4c-2571-491a-bb52-21191288881d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tgvzd\" (UID: \"38d8ca4c-2571-491a-bb52-21191288881d\") " pod="openshift-multus/multus-additional-cni-plugins-tgvzd" Apr 17 20:50:58.179143 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.176751 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/38d8ca4c-2571-491a-bb52-21191288881d-system-cni-dir\") pod \"multus-additional-cni-plugins-tgvzd\" (UID: \"38d8ca4c-2571-491a-bb52-21191288881d\") " pod="openshift-multus/multus-additional-cni-plugins-tgvzd" Apr 17 20:50:58.179143 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.178857 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/710e13ba-1421-4758-a855-0e9df651899c-socket-dir\") pod \"aws-ebs-csi-driver-node-m2kpk\" (UID: \"710e13ba-1421-4758-a855-0e9df651899c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m2kpk" Apr 17 20:50:58.179143 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.176948 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/710e13ba-1421-4758-a855-0e9df651899c-sys-fs\") pod \"aws-ebs-csi-driver-node-m2kpk\" (UID: \"710e13ba-1421-4758-a855-0e9df651899c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m2kpk" Apr 17 20:50:58.179143 ip-10-0-132-12 
kubenswrapper[2575]: I0417 20:50:58.178914 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2d2c4aed-da8f-4e54-8d9e-75c4dc6a6236-hosts-file\") pod \"node-resolver-7rtzd\" (UID: \"2d2c4aed-da8f-4e54-8d9e-75c4dc6a6236\") " pod="openshift-dns/node-resolver-7rtzd" Apr 17 20:50:58.179143 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.176834 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-etc-kubernetes\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567" Apr 17 20:50:58.179446 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.179196 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/38d8ca4c-2571-491a-bb52-21191288881d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tgvzd\" (UID: \"38d8ca4c-2571-491a-bb52-21191288881d\") " pod="openshift-multus/multus-additional-cni-plugins-tgvzd" Apr 17 20:50:58.179446 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.179285 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-etc-sysconfig\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567" Apr 17 20:50:58.179446 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.179335 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-host-var-lib-cni-bin\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56" Apr 17 20:50:58.179446 ip-10-0-132-12 
kubenswrapper[2575]: I0417 20:50:58.179342 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5f57b5b6-848e-48b0-b1ca-e8a8c29c446c-serviceca\") pod \"node-ca-ddpb6\" (UID: \"5f57b5b6-848e-48b0-b1ca-e8a8c29c446c\") " pod="openshift-image-registry/node-ca-ddpb6" Apr 17 20:50:58.179446 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.176722 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-hostroot\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56" Apr 17 20:50:58.179446 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.177165 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6560b30d-aca7-45a9-b8b5-3fb4711c4650-multus-daemon-config\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56" Apr 17 20:50:58.179446 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.179393 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/38d8ca4c-2571-491a-bb52-21191288881d-cni-binary-copy\") pod \"multus-additional-cni-plugins-tgvzd\" (UID: \"38d8ca4c-2571-491a-bb52-21191288881d\") " pod="openshift-multus/multus-additional-cni-plugins-tgvzd" Apr 17 20:50:58.179446 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.176888 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/710e13ba-1421-4758-a855-0e9df651899c-registration-dir\") pod \"aws-ebs-csi-driver-node-m2kpk\" (UID: \"710e13ba-1421-4758-a855-0e9df651899c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m2kpk" Apr 17 20:50:58.179446 ip-10-0-132-12 kubenswrapper[2575]: I0417 
20:50:58.179418 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-host-var-lib-cni-multus\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56" Apr 17 20:50:58.179837 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.179459 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-host-run-netns\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56" Apr 17 20:50:58.179837 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.179472 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-etc-modprobe-d\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567" Apr 17 20:50:58.179837 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.179501 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-host\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567" Apr 17 20:50:58.179837 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.179509 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-var-lib-kubelet\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567" Apr 17 20:50:58.179837 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.179538 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/38d8ca4c-2571-491a-bb52-21191288881d-os-release\") pod \"multus-additional-cni-plugins-tgvzd\" (UID: \"38d8ca4c-2571-491a-bb52-21191288881d\") " pod="openshift-multus/multus-additional-cni-plugins-tgvzd" Apr 17 20:50:58.179837 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.179545 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-etc-sysctl-conf\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567" Apr 17 20:50:58.179837 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.179586 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-system-cni-dir\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56" Apr 17 20:50:58.179837 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.179627 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/38d8ca4c-2571-491a-bb52-21191288881d-os-release\") pod \"multus-additional-cni-plugins-tgvzd\" (UID: \"38d8ca4c-2571-491a-bb52-21191288881d\") " pod="openshift-multus/multus-additional-cni-plugins-tgvzd" Apr 17 20:50:58.179837 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.179638 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6560b30d-aca7-45a9-b8b5-3fb4711c4650-cni-binary-copy\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56" Apr 17 20:50:58.179837 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.179643 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6560b30d-aca7-45a9-b8b5-3fb4711c4650-system-cni-dir\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56" Apr 17 20:50:58.180640 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.180614 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6560b30d-aca7-45a9-b8b5-3fb4711c4650-cni-binary-copy\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56" Apr 17 20:50:58.180762 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.180741 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-tmp\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567" Apr 17 20:50:58.183191 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.183166 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zswl2\" (UniqueName: \"kubernetes.io/projected/2d2c4aed-da8f-4e54-8d9e-75c4dc6a6236-kube-api-access-zswl2\") pod \"node-resolver-7rtzd\" (UID: \"2d2c4aed-da8f-4e54-8d9e-75c4dc6a6236\") " pod="openshift-dns/node-resolver-7rtzd" Apr 17 20:50:58.183310 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.183295 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bsw5\" (UniqueName: \"kubernetes.io/projected/6560b30d-aca7-45a9-b8b5-3fb4711c4650-kube-api-access-7bsw5\") pod \"multus-47m56\" (UID: \"6560b30d-aca7-45a9-b8b5-3fb4711c4650\") " pod="openshift-multus/multus-47m56" Apr 17 20:50:58.184238 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.184182 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsx9v\" (UniqueName: 
\"kubernetes.io/projected/5f57b5b6-848e-48b0-b1ca-e8a8c29c446c-kube-api-access-rsx9v\") pod \"node-ca-ddpb6\" (UID: \"5f57b5b6-848e-48b0-b1ca-e8a8c29c446c\") " pod="openshift-image-registry/node-ca-ddpb6" Apr 17 20:50:58.185086 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.185059 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-246xd\" (UniqueName: \"kubernetes.io/projected/710e13ba-1421-4758-a855-0e9df651899c-kube-api-access-246xd\") pod \"aws-ebs-csi-driver-node-m2kpk\" (UID: \"710e13ba-1421-4758-a855-0e9df651899c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m2kpk" Apr 17 20:50:58.185599 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.185578 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rchct\" (UniqueName: \"kubernetes.io/projected/8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7-kube-api-access-rchct\") pod \"tuned-84567\" (UID: \"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7\") " pod="openshift-cluster-node-tuning-operator/tuned-84567" Apr 17 20:50:58.185902 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.185881 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-56tnw\" (UniqueName: \"kubernetes.io/projected/38d8ca4c-2571-491a-bb52-21191288881d-kube-api-access-56tnw\") pod \"multus-additional-cni-plugins-tgvzd\" (UID: \"38d8ca4c-2571-491a-bb52-21191288881d\") " pod="openshift-multus/multus-additional-cni-plugins-tgvzd" Apr 17 20:50:58.212664 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.212640 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 20:50:58.266089 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.266055 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-fh4xj" Apr 17 20:50:58.272981 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.272947 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" Apr 17 20:50:58.280662 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.280639 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-c796r" Apr 17 20:50:58.286322 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.286294 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m2kpk" Apr 17 20:50:58.292889 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.292852 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7rtzd" Apr 17 20:50:58.299468 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.299449 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-ddpb6" Apr 17 20:50:58.306018 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.305989 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-47m56" Apr 17 20:50:58.313421 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.313398 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tgvzd" Apr 17 20:50:58.318034 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.318008 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-84567" Apr 17 20:50:58.583300 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.583257 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03974a46-a9e1-4161-8f82-8e72fdfcb759-metrics-certs\") pod \"network-metrics-daemon-gn7zb\" (UID: \"03974a46-a9e1-4161-8f82-8e72fdfcb759\") " pod="openshift-multus/network-metrics-daemon-gn7zb" Apr 17 20:50:58.583480 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:58.583396 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:50:58.583480 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:58.583460 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03974a46-a9e1-4161-8f82-8e72fdfcb759-metrics-certs podName:03974a46-a9e1-4161-8f82-8e72fdfcb759 nodeName:}" failed. No retries permitted until 2026-04-17 20:50:59.583443432 +0000 UTC m=+4.050884347 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/03974a46-a9e1-4161-8f82-8e72fdfcb759-metrics-certs") pod "network-metrics-daemon-gn7zb" (UID: "03974a46-a9e1-4161-8f82-8e72fdfcb759") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:50:58.684597 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:58.684559 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggb7k\" (UniqueName: \"kubernetes.io/projected/5c537771-95e6-4644-8ce6-c3997543ce01-kube-api-access-ggb7k\") pod \"network-check-target-zgp64\" (UID: \"5c537771-95e6-4644-8ce6-c3997543ce01\") " pod="openshift-network-diagnostics/network-check-target-zgp64" Apr 17 20:50:58.684802 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:58.684736 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:50:58.684802 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:58.684762 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:50:58.684802 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:58.684776 2575 projected.go:194] Error preparing data for projected volume kube-api-access-ggb7k for pod openshift-network-diagnostics/network-check-target-zgp64: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:50:58.684956 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:58.684841 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c537771-95e6-4644-8ce6-c3997543ce01-kube-api-access-ggb7k podName:5c537771-95e6-4644-8ce6-c3997543ce01 nodeName:}" failed. 
No retries permitted until 2026-04-17 20:50:59.684822339 +0000 UTC m=+4.152263272 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-ggb7k" (UniqueName: "kubernetes.io/projected/5c537771-95e6-4644-8ce6-c3997543ce01-kube-api-access-ggb7k") pod "network-check-target-zgp64" (UID: "5c537771-95e6-4644-8ce6-c3997543ce01") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:50:58.799260 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:58.799234 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c8e47f1_f9e0_4eaf_aee0_87ec4156f3d7.slice/crio-d793706dc34db7d15922a8d1f3074ecdf28d56434ab395c3eb6bddbcf355b8e3 WatchSource:0}: Error finding container d793706dc34db7d15922a8d1f3074ecdf28d56434ab395c3eb6bddbcf355b8e3: Status 404 returned error can't find the container with id d793706dc34db7d15922a8d1f3074ecdf28d56434ab395c3eb6bddbcf355b8e3 Apr 17 20:50:58.800669 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:58.800648 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6560b30d_aca7_45a9_b8b5_3fb4711c4650.slice/crio-2f763b60d8d7dd8a949756792e56a73d8932e60d73739dd24344a92a97adfd6a WatchSource:0}: Error finding container 2f763b60d8d7dd8a949756792e56a73d8932e60d73739dd24344a92a97adfd6a: Status 404 returned error can't find the container with id 2f763b60d8d7dd8a949756792e56a73d8932e60d73739dd24344a92a97adfd6a Apr 17 20:50:58.802064 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:58.802042 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod743a363d_0753_4e3b_9c99_a494c15dcf32.slice/crio-73c474af427b3d36be395eab37a9e6659d62001389981215ce9d0c5d6add2dfb WatchSource:0}: Error finding container 
73c474af427b3d36be395eab37a9e6659d62001389981215ce9d0c5d6add2dfb: Status 404 returned error can't find the container with id 73c474af427b3d36be395eab37a9e6659d62001389981215ce9d0c5d6add2dfb Apr 17 20:50:58.805145 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:58.805121 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7fec351_05e2_48e2_8266_8f2093ebb3fe.slice/crio-688f2418f7a2e1b13ea8c21685eb3244c281c288beed3b088f180e4f20ae0e1a WatchSource:0}: Error finding container 688f2418f7a2e1b13ea8c21685eb3244c281c288beed3b088f180e4f20ae0e1a: Status 404 returned error can't find the container with id 688f2418f7a2e1b13ea8c21685eb3244c281c288beed3b088f180e4f20ae0e1a Apr 17 20:50:58.805880 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:58.805855 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76b2eed9_aba9_4fc0_8c47_32a44d9073bf.slice/crio-db7f8d5049a2aa722ade1b86a0d1423a03dd23d7d23829c93419e47d7c546df8 WatchSource:0}: Error finding container db7f8d5049a2aa722ade1b86a0d1423a03dd23d7d23829c93419e47d7c546df8: Status 404 returned error can't find the container with id db7f8d5049a2aa722ade1b86a0d1423a03dd23d7d23829c93419e47d7c546df8 Apr 17 20:50:58.806791 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:58.806765 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod710e13ba_1421_4758_a855_0e9df651899c.slice/crio-a4bc834a704e8f8c104649c4f5a7dac55f4923625dc83e5def5d944e09c48654 WatchSource:0}: Error finding container a4bc834a704e8f8c104649c4f5a7dac55f4923625dc83e5def5d944e09c48654: Status 404 returned error can't find the container with id a4bc834a704e8f8c104649c4f5a7dac55f4923625dc83e5def5d944e09c48654 Apr 17 20:50:58.807322 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:58.807217 2575 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38d8ca4c_2571_491a_bb52_21191288881d.slice/crio-d74db90c23acff1b2deb475c9d578e680da0e22ae9579dc2bf674f1516005ff7 WatchSource:0}: Error finding container d74db90c23acff1b2deb475c9d578e680da0e22ae9579dc2bf674f1516005ff7: Status 404 returned error can't find the container with id d74db90c23acff1b2deb475c9d578e680da0e22ae9579dc2bf674f1516005ff7 Apr 17 20:50:58.811096 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:50:58.810974 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f57b5b6_848e_48b0_b1ca_e8a8c29c446c.slice/crio-4f86f134b8678d02c740746ca9ae261ff524d3f9f87b4a3da101828fae4a8ebb WatchSource:0}: Error finding container 4f86f134b8678d02c740746ca9ae261ff524d3f9f87b4a3da101828fae4a8ebb: Status 404 returned error can't find the container with id 4f86f134b8678d02c740746ca9ae261ff524d3f9f87b4a3da101828fae4a8ebb Apr 17 20:50:59.006651 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:59.006611 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 20:45:56 +0000 UTC" deadline="2027-12-03 12:09:27.315909562 +0000 UTC" Apr 17 20:50:59.006651 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:59.006643 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14271h18m28.309268952s" Apr 17 20:50:59.141595 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:59.141519 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7rtzd" event={"ID":"2d2c4aed-da8f-4e54-8d9e-75c4dc6a6236","Type":"ContainerStarted","Data":"d6582f6ff5e48218586da38fafdeba1a772b422f668ca384c5c62f1b8c843f51"} Apr 17 20:50:59.144788 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:59.144759 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-tgvzd" event={"ID":"38d8ca4c-2571-491a-bb52-21191288881d","Type":"ContainerStarted","Data":"d74db90c23acff1b2deb475c9d578e680da0e22ae9579dc2bf674f1516005ff7"} Apr 17 20:50:59.147827 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:59.147802 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m2kpk" event={"ID":"710e13ba-1421-4758-a855-0e9df651899c","Type":"ContainerStarted","Data":"a4bc834a704e8f8c104649c4f5a7dac55f4923625dc83e5def5d944e09c48654"} Apr 17 20:50:59.148832 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:59.148808 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" event={"ID":"e7fec351-05e2-48e2-8266-8f2093ebb3fe","Type":"ContainerStarted","Data":"688f2418f7a2e1b13ea8c21685eb3244c281c288beed3b088f180e4f20ae0e1a"} Apr 17 20:50:59.151518 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:59.151496 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-47m56" event={"ID":"6560b30d-aca7-45a9-b8b5-3fb4711c4650","Type":"ContainerStarted","Data":"2f763b60d8d7dd8a949756792e56a73d8932e60d73739dd24344a92a97adfd6a"} Apr 17 20:50:59.152408 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:59.152383 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-fh4xj" event={"ID":"76b2eed9-aba9-4fc0-8c47-32a44d9073bf","Type":"ContainerStarted","Data":"db7f8d5049a2aa722ade1b86a0d1423a03dd23d7d23829c93419e47d7c546df8"} Apr 17 20:50:59.153368 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:59.153342 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-c796r" event={"ID":"743a363d-0753-4e3b-9c99-a494c15dcf32","Type":"ContainerStarted","Data":"73c474af427b3d36be395eab37a9e6659d62001389981215ce9d0c5d6add2dfb"} Apr 17 20:50:59.154252 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:59.154218 2575 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-84567" event={"ID":"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7","Type":"ContainerStarted","Data":"d793706dc34db7d15922a8d1f3074ecdf28d56434ab395c3eb6bddbcf355b8e3"} Apr 17 20:50:59.155703 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:59.155680 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-12.ec2.internal" event={"ID":"dbe6d7617e05cfcc664fd92a25ec45f3","Type":"ContainerStarted","Data":"b60f62e9cf0f4bd0436615840f70722688f89d993576d46fbac304af1f10da13"} Apr 17 20:50:59.156636 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:59.156618 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ddpb6" event={"ID":"5f57b5b6-848e-48b0-b1ca-e8a8c29c446c","Type":"ContainerStarted","Data":"4f86f134b8678d02c740746ca9ae261ff524d3f9f87b4a3da101828fae4a8ebb"} Apr 17 20:50:59.591887 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:59.591849 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03974a46-a9e1-4161-8f82-8e72fdfcb759-metrics-certs\") pod \"network-metrics-daemon-gn7zb\" (UID: \"03974a46-a9e1-4161-8f82-8e72fdfcb759\") " pod="openshift-multus/network-metrics-daemon-gn7zb" Apr 17 20:50:59.592087 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:59.591992 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:50:59.592087 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:59.592061 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03974a46-a9e1-4161-8f82-8e72fdfcb759-metrics-certs podName:03974a46-a9e1-4161-8f82-8e72fdfcb759 nodeName:}" failed. No retries permitted until 2026-04-17 20:51:01.592041095 +0000 UTC m=+6.059482026 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/03974a46-a9e1-4161-8f82-8e72fdfcb759-metrics-certs") pod "network-metrics-daemon-gn7zb" (UID: "03974a46-a9e1-4161-8f82-8e72fdfcb759") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:50:59.692971 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:50:59.692925 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggb7k\" (UniqueName: \"kubernetes.io/projected/5c537771-95e6-4644-8ce6-c3997543ce01-kube-api-access-ggb7k\") pod \"network-check-target-zgp64\" (UID: \"5c537771-95e6-4644-8ce6-c3997543ce01\") " pod="openshift-network-diagnostics/network-check-target-zgp64" Apr 17 20:50:59.693275 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:59.693256 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:50:59.693366 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:59.693287 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:50:59.693366 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:59.693304 2575 projected.go:194] Error preparing data for projected volume kube-api-access-ggb7k for pod openshift-network-diagnostics/network-check-target-zgp64: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:50:59.693366 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:50:59.693365 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c537771-95e6-4644-8ce6-c3997543ce01-kube-api-access-ggb7k podName:5c537771-95e6-4644-8ce6-c3997543ce01 nodeName:}" failed. 
No retries permitted until 2026-04-17 20:51:01.693346687 +0000 UTC m=+6.160787616 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-ggb7k" (UniqueName: "kubernetes.io/projected/5c537771-95e6-4644-8ce6-c3997543ce01-kube-api-access-ggb7k") pod "network-check-target-zgp64" (UID: "5c537771-95e6-4644-8ce6-c3997543ce01") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:51:00.131900 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:00.131869 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gn7zb" Apr 17 20:51:00.132384 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:00.132002 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gn7zb" podUID="03974a46-a9e1-4161-8f82-8e72fdfcb759" Apr 17 20:51:00.132458 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:00.132433 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgp64" Apr 17 20:51:00.132544 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:00.132521 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zgp64" podUID="5c537771-95e6-4644-8ce6-c3997543ce01" Apr 17 20:51:00.176912 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:00.176320 2575 generic.go:358] "Generic (PLEG): container finished" podID="77f82645937a0b11c10ce2a57afcb470" containerID="71aae59a2eb9f65313d810af9dacee3636da2546cc32988f11411a6d1bd9a5ee" exitCode=0 Apr 17 20:51:00.176912 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:00.176850 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-12.ec2.internal" event={"ID":"77f82645937a0b11c10ce2a57afcb470","Type":"ContainerDied","Data":"71aae59a2eb9f65313d810af9dacee3636da2546cc32988f11411a6d1bd9a5ee"} Apr 17 20:51:00.194561 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:00.194484 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-12.ec2.internal" podStartSLOduration=4.194465233 podStartE2EDuration="4.194465233s" podCreationTimestamp="2026-04-17 20:50:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:50:59.169867119 +0000 UTC m=+3.637308049" watchObservedRunningTime="2026-04-17 20:51:00.194465233 +0000 UTC m=+4.661906164" Apr 17 20:51:01.194980 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:01.194937 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-12.ec2.internal" event={"ID":"77f82645937a0b11c10ce2a57afcb470","Type":"ContainerStarted","Data":"bed185882735855895c125ce431bc4963b60489fe12e01dc9e8b81d59d5b0e2f"} Apr 17 20:51:01.608382 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:01.608341 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03974a46-a9e1-4161-8f82-8e72fdfcb759-metrics-certs\") pod 
\"network-metrics-daemon-gn7zb\" (UID: \"03974a46-a9e1-4161-8f82-8e72fdfcb759\") " pod="openshift-multus/network-metrics-daemon-gn7zb" Apr 17 20:51:01.608553 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:01.608494 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:51:01.608613 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:01.608560 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03974a46-a9e1-4161-8f82-8e72fdfcb759-metrics-certs podName:03974a46-a9e1-4161-8f82-8e72fdfcb759 nodeName:}" failed. No retries permitted until 2026-04-17 20:51:05.608540837 +0000 UTC m=+10.075981764 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/03974a46-a9e1-4161-8f82-8e72fdfcb759-metrics-certs") pod "network-metrics-daemon-gn7zb" (UID: "03974a46-a9e1-4161-8f82-8e72fdfcb759") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:51:01.710048 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:01.709449 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggb7k\" (UniqueName: \"kubernetes.io/projected/5c537771-95e6-4644-8ce6-c3997543ce01-kube-api-access-ggb7k\") pod \"network-check-target-zgp64\" (UID: \"5c537771-95e6-4644-8ce6-c3997543ce01\") " pod="openshift-network-diagnostics/network-check-target-zgp64" Apr 17 20:51:01.710048 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:01.709621 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:51:01.710048 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:01.709640 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 
17 20:51:01.710048 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:01.709653 2575 projected.go:194] Error preparing data for projected volume kube-api-access-ggb7k for pod openshift-network-diagnostics/network-check-target-zgp64: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:51:01.710048 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:01.709712 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c537771-95e6-4644-8ce6-c3997543ce01-kube-api-access-ggb7k podName:5c537771-95e6-4644-8ce6-c3997543ce01 nodeName:}" failed. No retries permitted until 2026-04-17 20:51:05.709694489 +0000 UTC m=+10.177135415 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-ggb7k" (UniqueName: "kubernetes.io/projected/5c537771-95e6-4644-8ce6-c3997543ce01-kube-api-access-ggb7k") pod "network-check-target-zgp64" (UID: "5c537771-95e6-4644-8ce6-c3997543ce01") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:51:02.129995 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:02.129962 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgp64" Apr 17 20:51:02.130166 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:02.130097 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zgp64" podUID="5c537771-95e6-4644-8ce6-c3997543ce01" Apr 17 20:51:02.130166 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:02.130119 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gn7zb" Apr 17 20:51:02.130316 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:02.130266 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gn7zb" podUID="03974a46-a9e1-4161-8f82-8e72fdfcb759" Apr 17 20:51:04.128861 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:04.128819 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgp64" Apr 17 20:51:04.129394 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:04.128954 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zgp64" podUID="5c537771-95e6-4644-8ce6-c3997543ce01" Apr 17 20:51:04.129394 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:04.128819 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gn7zb" Apr 17 20:51:04.129394 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:04.129134 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gn7zb" podUID="03974a46-a9e1-4161-8f82-8e72fdfcb759" Apr 17 20:51:05.642722 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:05.642612 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03974a46-a9e1-4161-8f82-8e72fdfcb759-metrics-certs\") pod \"network-metrics-daemon-gn7zb\" (UID: \"03974a46-a9e1-4161-8f82-8e72fdfcb759\") " pod="openshift-multus/network-metrics-daemon-gn7zb" Apr 17 20:51:05.643311 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:05.642775 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:51:05.643311 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:05.642853 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03974a46-a9e1-4161-8f82-8e72fdfcb759-metrics-certs podName:03974a46-a9e1-4161-8f82-8e72fdfcb759 nodeName:}" failed. No retries permitted until 2026-04-17 20:51:13.642831052 +0000 UTC m=+18.110271978 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/03974a46-a9e1-4161-8f82-8e72fdfcb759-metrics-certs") pod "network-metrics-daemon-gn7zb" (UID: "03974a46-a9e1-4161-8f82-8e72fdfcb759") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:51:05.743890 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:05.743853 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggb7k\" (UniqueName: \"kubernetes.io/projected/5c537771-95e6-4644-8ce6-c3997543ce01-kube-api-access-ggb7k\") pod \"network-check-target-zgp64\" (UID: \"5c537771-95e6-4644-8ce6-c3997543ce01\") " pod="openshift-network-diagnostics/network-check-target-zgp64" Apr 17 20:51:05.744074 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:05.744047 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:51:05.744074 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:05.744072 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:51:05.744181 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:05.744087 2575 projected.go:194] Error preparing data for projected volume kube-api-access-ggb7k for pod openshift-network-diagnostics/network-check-target-zgp64: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:51:05.744181 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:05.744148 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c537771-95e6-4644-8ce6-c3997543ce01-kube-api-access-ggb7k podName:5c537771-95e6-4644-8ce6-c3997543ce01 nodeName:}" failed. 
No retries permitted until 2026-04-17 20:51:13.744129414 +0000 UTC m=+18.211570339 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-ggb7k" (UniqueName: "kubernetes.io/projected/5c537771-95e6-4644-8ce6-c3997543ce01-kube-api-access-ggb7k") pod "network-check-target-zgp64" (UID: "5c537771-95e6-4644-8ce6-c3997543ce01") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:51:06.130606 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:06.129976 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gn7zb" Apr 17 20:51:06.130606 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:06.130106 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gn7zb" podUID="03974a46-a9e1-4161-8f82-8e72fdfcb759" Apr 17 20:51:06.130606 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:06.130471 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgp64" Apr 17 20:51:06.130606 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:06.130564 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zgp64" podUID="5c537771-95e6-4644-8ce6-c3997543ce01" Apr 17 20:51:08.129603 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:08.129567 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgp64" Apr 17 20:51:08.130052 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:08.129683 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zgp64" podUID="5c537771-95e6-4644-8ce6-c3997543ce01" Apr 17 20:51:08.130052 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:08.129753 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gn7zb" Apr 17 20:51:08.130052 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:08.129879 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gn7zb" podUID="03974a46-a9e1-4161-8f82-8e72fdfcb759" Apr 17 20:51:10.131839 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:10.131798 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgp64" Apr 17 20:51:10.132235 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:10.131809 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gn7zb" Apr 17 20:51:10.132235 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:10.131907 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zgp64" podUID="5c537771-95e6-4644-8ce6-c3997543ce01" Apr 17 20:51:10.132235 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:10.132006 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gn7zb" podUID="03974a46-a9e1-4161-8f82-8e72fdfcb759" Apr 17 20:51:12.132238 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:12.132190 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgp64" Apr 17 20:51:12.132686 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:12.132199 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gn7zb" Apr 17 20:51:12.132686 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:12.132336 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zgp64" podUID="5c537771-95e6-4644-8ce6-c3997543ce01" Apr 17 20:51:12.132686 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:12.132427 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gn7zb" podUID="03974a46-a9e1-4161-8f82-8e72fdfcb759" Apr 17 20:51:13.701819 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:13.701599 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03974a46-a9e1-4161-8f82-8e72fdfcb759-metrics-certs\") pod \"network-metrics-daemon-gn7zb\" (UID: \"03974a46-a9e1-4161-8f82-8e72fdfcb759\") " pod="openshift-multus/network-metrics-daemon-gn7zb" Apr 17 20:51:13.701819 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:13.701795 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:51:13.702307 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:13.701871 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03974a46-a9e1-4161-8f82-8e72fdfcb759-metrics-certs podName:03974a46-a9e1-4161-8f82-8e72fdfcb759 nodeName:}" failed. No retries permitted until 2026-04-17 20:51:29.701847882 +0000 UTC m=+34.169288802 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/03974a46-a9e1-4161-8f82-8e72fdfcb759-metrics-certs") pod "network-metrics-daemon-gn7zb" (UID: "03974a46-a9e1-4161-8f82-8e72fdfcb759") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 20:51:13.803064 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:13.803028 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggb7k\" (UniqueName: \"kubernetes.io/projected/5c537771-95e6-4644-8ce6-c3997543ce01-kube-api-access-ggb7k\") pod \"network-check-target-zgp64\" (UID: \"5c537771-95e6-4644-8ce6-c3997543ce01\") " pod="openshift-network-diagnostics/network-check-target-zgp64" Apr 17 20:51:13.803261 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:13.803210 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:51:13.803261 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:13.803248 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:51:13.803261 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:13.803259 2575 projected.go:194] Error preparing data for projected volume kube-api-access-ggb7k for pod openshift-network-diagnostics/network-check-target-zgp64: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:51:13.803395 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:13.803317 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c537771-95e6-4644-8ce6-c3997543ce01-kube-api-access-ggb7k podName:5c537771-95e6-4644-8ce6-c3997543ce01 nodeName:}" failed. 
No retries permitted until 2026-04-17 20:51:29.80330033 +0000 UTC m=+34.270741249 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-ggb7k" (UniqueName: "kubernetes.io/projected/5c537771-95e6-4644-8ce6-c3997543ce01-kube-api-access-ggb7k") pod "network-check-target-zgp64" (UID: "5c537771-95e6-4644-8ce6-c3997543ce01") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:51:14.129278 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:14.129245 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gn7zb" Apr 17 20:51:14.129423 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:14.129246 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgp64" Apr 17 20:51:14.129423 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:14.129356 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gn7zb" podUID="03974a46-a9e1-4161-8f82-8e72fdfcb759" Apr 17 20:51:14.129516 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:14.129445 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zgp64" podUID="5c537771-95e6-4644-8ce6-c3997543ce01" Apr 17 20:51:16.130517 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:16.130215 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gn7zb" Apr 17 20:51:16.130517 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:16.130373 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gn7zb" podUID="03974a46-a9e1-4161-8f82-8e72fdfcb759" Apr 17 20:51:16.131015 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:16.130694 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgp64" Apr 17 20:51:16.131015 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:16.130775 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zgp64" podUID="5c537771-95e6-4644-8ce6-c3997543ce01" Apr 17 20:51:17.222424 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:17.222132 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-84567" event={"ID":"8c8e47f1-f9e0-4eaf-aee0-87ec4156f3d7","Type":"ContainerStarted","Data":"f78f2d397d3d10005c62f0f2552d0c738e884b219f0baaff579dacc29c226abe"} Apr 17 20:51:17.223696 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:17.223665 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-ddpb6" event={"ID":"5f57b5b6-848e-48b0-b1ca-e8a8c29c446c","Type":"ContainerStarted","Data":"16c48b40e8e22941ec45c26a43549ec20f86c505d05ba489e9da386b4453d4e4"} Apr 17 20:51:17.225083 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:17.225058 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7rtzd" event={"ID":"2d2c4aed-da8f-4e54-8d9e-75c4dc6a6236","Type":"ContainerStarted","Data":"0974eedc95e72792932cf27fc5a6649670727c58bb032f9fe769a86c52ed8d75"} Apr 17 20:51:17.226593 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:17.226568 2575 generic.go:358] "Generic (PLEG): container finished" podID="38d8ca4c-2571-491a-bb52-21191288881d" containerID="ee78bc3ad41e88dc51c33e7a680ad678b793b469f69722e4e21594e0ecc85ec9" exitCode=0 Apr 17 20:51:17.226695 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:17.226641 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tgvzd" event={"ID":"38d8ca4c-2571-491a-bb52-21191288881d","Type":"ContainerDied","Data":"ee78bc3ad41e88dc51c33e7a680ad678b793b469f69722e4e21594e0ecc85ec9"} Apr 17 20:51:17.228165 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:17.228143 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m2kpk" 
event={"ID":"710e13ba-1421-4758-a855-0e9df651899c","Type":"ContainerStarted","Data":"f54eff53f7e368c9feffe2badb9e5eb5834e7963075a6612d06759fe4b3dd9de"} Apr 17 20:51:17.231070 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:17.231041 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8fq5_e7fec351-05e2-48e2-8266-8f2093ebb3fe/ovn-acl-logging/0.log" Apr 17 20:51:17.231512 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:17.231474 2575 generic.go:358] "Generic (PLEG): container finished" podID="e7fec351-05e2-48e2-8266-8f2093ebb3fe" containerID="c6737340bfdacd51e7767c6404f95d743e9909adbb35e4a78066cb91d37dbf0c" exitCode=1 Apr 17 20:51:17.231602 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:17.231507 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" event={"ID":"e7fec351-05e2-48e2-8266-8f2093ebb3fe","Type":"ContainerStarted","Data":"ebb001fbb06155d9ea7b8dc180831a9211fed679ca43c1bc1d642bcf5ed88dcb"} Apr 17 20:51:17.231602 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:17.231534 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" event={"ID":"e7fec351-05e2-48e2-8266-8f2093ebb3fe","Type":"ContainerStarted","Data":"dd248d9ce30444a82eb050dd1e3a0276b28d3350bb22b99ed61a88c6fa684b4d"} Apr 17 20:51:17.231602 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:17.231549 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" event={"ID":"e7fec351-05e2-48e2-8266-8f2093ebb3fe","Type":"ContainerStarted","Data":"15ec2a73713466628806d0956aa58b4ee9450a4332442196ba1fbb7c2429a4bd"} Apr 17 20:51:17.231602 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:17.231561 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" 
event={"ID":"e7fec351-05e2-48e2-8266-8f2093ebb3fe","Type":"ContainerStarted","Data":"302c5b11899ebbd1b00708f244cc54585dde7bb2227170f6dcae2994ecb4c844"} Apr 17 20:51:17.231602 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:17.231575 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" event={"ID":"e7fec351-05e2-48e2-8266-8f2093ebb3fe","Type":"ContainerDied","Data":"c6737340bfdacd51e7767c6404f95d743e9909adbb35e4a78066cb91d37dbf0c"} Apr 17 20:51:17.231602 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:17.231588 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" event={"ID":"e7fec351-05e2-48e2-8266-8f2093ebb3fe","Type":"ContainerStarted","Data":"d17b8ac3106b5cf5bc318d8f193c88c68adf458fb041aba13aa528e50e89d012"} Apr 17 20:51:17.232975 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:17.232950 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-47m56" event={"ID":"6560b30d-aca7-45a9-b8b5-3fb4711c4650","Type":"ContainerStarted","Data":"b13a7118aabdc9aa663b2e7ec2c7173472e76653bd827751c5fbde77d55b7d00"} Apr 17 20:51:17.234458 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:17.234434 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-c796r" event={"ID":"743a363d-0753-4e3b-9c99-a494c15dcf32","Type":"ContainerStarted","Data":"4df1f3856da060557bfcac3f027a00f075514b82817fc00d22d2a0dccea20b31"} Apr 17 20:51:17.237380 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:17.237340 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-12.ec2.internal" podStartSLOduration=21.237326444 podStartE2EDuration="21.237326444s" podCreationTimestamp="2026-04-17 20:50:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:51:01.209503233 
+0000 UTC m=+5.676944169" watchObservedRunningTime="2026-04-17 20:51:17.237326444 +0000 UTC m=+21.704767380" Apr 17 20:51:17.237687 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:17.237649 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-84567" podStartSLOduration=3.941596212 podStartE2EDuration="21.237637726s" podCreationTimestamp="2026-04-17 20:50:56 +0000 UTC" firstStartedPulling="2026-04-17 20:50:58.800864075 +0000 UTC m=+3.268305002" lastFinishedPulling="2026-04-17 20:51:16.096905599 +0000 UTC m=+20.564346516" observedRunningTime="2026-04-17 20:51:17.237468575 +0000 UTC m=+21.704909500" watchObservedRunningTime="2026-04-17 20:51:17.237637726 +0000 UTC m=+21.705078663" Apr 17 20:51:17.251452 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:17.251411 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-c796r" podStartSLOduration=3.959910126 podStartE2EDuration="21.251395664s" podCreationTimestamp="2026-04-17 20:50:56 +0000 UTC" firstStartedPulling="2026-04-17 20:50:58.804189889 +0000 UTC m=+3.271630809" lastFinishedPulling="2026-04-17 20:51:16.095675431 +0000 UTC m=+20.563116347" observedRunningTime="2026-04-17 20:51:17.250798547 +0000 UTC m=+21.718239480" watchObservedRunningTime="2026-04-17 20:51:17.251395664 +0000 UTC m=+21.718836608" Apr 17 20:51:17.263205 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:17.263174 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-ddpb6" podStartSLOduration=12.148638477 podStartE2EDuration="21.263164436s" podCreationTimestamp="2026-04-17 20:50:56 +0000 UTC" firstStartedPulling="2026-04-17 20:50:58.812635982 +0000 UTC m=+3.280076895" lastFinishedPulling="2026-04-17 20:51:07.927161938 +0000 UTC m=+12.394602854" observedRunningTime="2026-04-17 20:51:17.263131596 +0000 UTC m=+21.730572530" watchObservedRunningTime="2026-04-17 
20:51:17.263164436 +0000 UTC m=+21.730605368" Apr 17 20:51:17.278651 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:17.278618 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-47m56" podStartSLOduration=3.97560979 podStartE2EDuration="21.27860837s" podCreationTimestamp="2026-04-17 20:50:56 +0000 UTC" firstStartedPulling="2026-04-17 20:50:58.802485583 +0000 UTC m=+3.269926509" lastFinishedPulling="2026-04-17 20:51:16.105484167 +0000 UTC m=+20.572925089" observedRunningTime="2026-04-17 20:51:17.278185994 +0000 UTC m=+21.745626929" watchObservedRunningTime="2026-04-17 20:51:17.27860837 +0000 UTC m=+21.746049303" Apr 17 20:51:17.293114 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:17.293080 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-7rtzd" podStartSLOduration=4.00911628 podStartE2EDuration="21.293070884s" podCreationTimestamp="2026-04-17 20:50:56 +0000 UTC" firstStartedPulling="2026-04-17 20:50:58.810499889 +0000 UTC m=+3.277940800" lastFinishedPulling="2026-04-17 20:51:16.094454486 +0000 UTC m=+20.561895404" observedRunningTime="2026-04-17 20:51:17.292668218 +0000 UTC m=+21.760109151" watchObservedRunningTime="2026-04-17 20:51:17.293070884 +0000 UTC m=+21.760511817" Apr 17 20:51:17.796263 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:17.796105 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 20:51:18.038109 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:18.038025 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T20:51:17.796257148Z","UUID":"ada2e888-3ecf-4144-bfa4-77f4e8cb779f","Handler":null,"Name":"","Endpoint":""} Apr 17 20:51:18.040974 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:18.040953 2575 
csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 20:51:18.040974 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:18.040980 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 20:51:18.129462 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:18.129429 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gn7zb" Apr 17 20:51:18.129654 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:18.129429 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgp64" Apr 17 20:51:18.129654 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:18.129567 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gn7zb" podUID="03974a46-a9e1-4161-8f82-8e72fdfcb759" Apr 17 20:51:18.129654 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:18.129630 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zgp64" podUID="5c537771-95e6-4644-8ce6-c3997543ce01" Apr 17 20:51:18.241943 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:18.241870 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-fh4xj" event={"ID":"76b2eed9-aba9-4fc0-8c47-32a44d9073bf","Type":"ContainerStarted","Data":"16efd9c157ed17bd6ffd6ca85ed0e1eee812643759daf9b00e04378645d523da"} Apr 17 20:51:18.244491 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:18.244455 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m2kpk" event={"ID":"710e13ba-1421-4758-a855-0e9df651899c","Type":"ContainerStarted","Data":"0d4f33ac50698ddc7e6d98114292ac8ebfdc325cc8396f44b8d0284ef10e9a30"} Apr 17 20:51:18.255605 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:18.255554 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-fh4xj" podStartSLOduration=4.998115688 podStartE2EDuration="22.255538418s" podCreationTimestamp="2026-04-17 20:50:56 +0000 UTC" firstStartedPulling="2026-04-17 20:50:58.808140183 +0000 UTC m=+3.275581100" lastFinishedPulling="2026-04-17 20:51:16.065562915 +0000 UTC m=+20.533003830" observedRunningTime="2026-04-17 20:51:18.255019707 +0000 UTC m=+22.722460640" watchObservedRunningTime="2026-04-17 20:51:18.255538418 +0000 UTC m=+22.722979351" Apr 17 20:51:19.249024 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:19.248950 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m2kpk" event={"ID":"710e13ba-1421-4758-a855-0e9df651899c","Type":"ContainerStarted","Data":"42dbb455b1f4ee2ed99a212fba0cf3be144922cd52edfabfbd4e42a5adfde6f4"} Apr 17 20:51:19.251546 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:19.251525 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8fq5_e7fec351-05e2-48e2-8266-8f2093ebb3fe/ovn-acl-logging/0.log" Apr 17 20:51:19.251951 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:19.251921 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" event={"ID":"e7fec351-05e2-48e2-8266-8f2093ebb3fe","Type":"ContainerStarted","Data":"852b0b796b89eb364e49e47cabd3af067a2a1176b7598c2323a86b9e29b16ab2"} Apr 17 20:51:19.264122 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:19.264090 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m2kpk" podStartSLOduration=3.059100929 podStartE2EDuration="23.264077818s" podCreationTimestamp="2026-04-17 20:50:56 +0000 UTC" firstStartedPulling="2026-04-17 20:50:58.808420887 +0000 UTC m=+3.275861797" lastFinishedPulling="2026-04-17 20:51:19.01339777 +0000 UTC m=+23.480838686" observedRunningTime="2026-04-17 20:51:19.264034797 +0000 UTC m=+23.731475730" watchObservedRunningTime="2026-04-17 20:51:19.264077818 +0000 UTC m=+23.731518750" Apr 17 20:51:20.129178 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:20.129145 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgp64" Apr 17 20:51:20.129380 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:20.129184 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gn7zb" Apr 17 20:51:20.129380 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:20.129294 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zgp64" podUID="5c537771-95e6-4644-8ce6-c3997543ce01" Apr 17 20:51:20.129459 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:20.129417 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gn7zb" podUID="03974a46-a9e1-4161-8f82-8e72fdfcb759" Apr 17 20:51:20.771704 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:20.771665 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-c796r" Apr 17 20:51:20.772471 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:20.772454 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-c796r" Apr 17 20:51:21.255802 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:21.255771 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-c796r" Apr 17 20:51:21.256354 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:21.256335 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-c796r" Apr 17 20:51:22.129457 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:22.129422 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgp64" Apr 17 20:51:22.129946 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:22.129555 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zgp64" podUID="5c537771-95e6-4644-8ce6-c3997543ce01" Apr 17 20:51:22.129946 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:22.129607 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gn7zb" Apr 17 20:51:22.129946 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:22.129719 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gn7zb" podUID="03974a46-a9e1-4161-8f82-8e72fdfcb759" Apr 17 20:51:23.260685 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:23.260503 2575 generic.go:358] "Generic (PLEG): container finished" podID="38d8ca4c-2571-491a-bb52-21191288881d" containerID="a78c62a3edef505713ab1188bb653fb0c8cc30ed1a5bdf340af6b6335bc744a4" exitCode=0 Apr 17 20:51:23.261158 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:23.260592 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tgvzd" event={"ID":"38d8ca4c-2571-491a-bb52-21191288881d","Type":"ContainerDied","Data":"a78c62a3edef505713ab1188bb653fb0c8cc30ed1a5bdf340af6b6335bc744a4"} Apr 17 20:51:23.263779 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:23.263764 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8fq5_e7fec351-05e2-48e2-8266-8f2093ebb3fe/ovn-acl-logging/0.log" Apr 17 20:51:23.264165 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:23.264125 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" 
event={"ID":"e7fec351-05e2-48e2-8266-8f2093ebb3fe","Type":"ContainerStarted","Data":"ee3094699f099c2f09e16bc6354808a06c0f87402167096a1ea59ea533c7cc7e"} Apr 17 20:51:23.264593 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:23.264572 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" Apr 17 20:51:23.264679 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:23.264604 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" Apr 17 20:51:23.264736 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:23.264715 2575 scope.go:117] "RemoveContainer" containerID="c6737340bfdacd51e7767c6404f95d743e9909adbb35e4a78066cb91d37dbf0c" Apr 17 20:51:23.279298 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:23.279274 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" Apr 17 20:51:24.129322 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:24.129286 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gn7zb" Apr 17 20:51:24.129322 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:24.129303 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgp64" Apr 17 20:51:24.129525 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:24.129390 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gn7zb" podUID="03974a46-a9e1-4161-8f82-8e72fdfcb759" Apr 17 20:51:24.129525 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:24.129503 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zgp64" podUID="5c537771-95e6-4644-8ce6-c3997543ce01" Apr 17 20:51:24.269806 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:24.269781 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8fq5_e7fec351-05e2-48e2-8266-8f2093ebb3fe/ovn-acl-logging/0.log" Apr 17 20:51:24.270185 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:24.270132 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" event={"ID":"e7fec351-05e2-48e2-8266-8f2093ebb3fe","Type":"ContainerStarted","Data":"7181d2a9b25507b63dd973eeee904d041fc326bc890671f3765a3775a375d055"} Apr 17 20:51:24.270390 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:24.270373 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" Apr 17 20:51:24.284167 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:24.284143 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" Apr 17 20:51:24.297145 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:24.297101 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" podStartSLOduration=10.947864783 podStartE2EDuration="28.29708785s" podCreationTimestamp="2026-04-17 20:50:56 +0000 UTC" firstStartedPulling="2026-04-17 20:50:58.806917557 +0000 UTC m=+3.274358467" 
lastFinishedPulling="2026-04-17 20:51:16.156140607 +0000 UTC m=+20.623581534" observedRunningTime="2026-04-17 20:51:24.297088709 +0000 UTC m=+28.764529641" watchObservedRunningTime="2026-04-17 20:51:24.29708785 +0000 UTC m=+28.764528780" Apr 17 20:51:24.743734 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:24.743446 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zgp64"] Apr 17 20:51:24.743889 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:24.743797 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgp64" Apr 17 20:51:24.743889 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:24.743845 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gn7zb"] Apr 17 20:51:24.743984 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:24.743907 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zgp64" podUID="5c537771-95e6-4644-8ce6-c3997543ce01" Apr 17 20:51:24.743984 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:24.743945 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gn7zb" Apr 17 20:51:24.744065 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:24.744036 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gn7zb" podUID="03974a46-a9e1-4161-8f82-8e72fdfcb759" Apr 17 20:51:25.273479 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:25.273451 2575 generic.go:358] "Generic (PLEG): container finished" podID="38d8ca4c-2571-491a-bb52-21191288881d" containerID="fc60517a49ccc137be70b05c1613839abe869789aaec216db02923b17543a995" exitCode=0 Apr 17 20:51:25.273909 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:25.273526 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tgvzd" event={"ID":"38d8ca4c-2571-491a-bb52-21191288881d","Type":"ContainerDied","Data":"fc60517a49ccc137be70b05c1613839abe869789aaec216db02923b17543a995"} Apr 17 20:51:26.131305 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:26.131285 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgp64" Apr 17 20:51:26.131404 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:26.131370 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zgp64" podUID="5c537771-95e6-4644-8ce6-c3997543ce01" Apr 17 20:51:26.277404 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:26.277374 2575 generic.go:358] "Generic (PLEG): container finished" podID="38d8ca4c-2571-491a-bb52-21191288881d" containerID="78794a256826c71599ea6d76e542d5760a1c0e64ebbb00286baac95b591b29af" exitCode=0 Apr 17 20:51:26.277963 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:26.277472 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tgvzd" event={"ID":"38d8ca4c-2571-491a-bb52-21191288881d","Type":"ContainerDied","Data":"78794a256826c71599ea6d76e542d5760a1c0e64ebbb00286baac95b591b29af"} Apr 17 20:51:27.129490 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:27.129459 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gn7zb" Apr 17 20:51:27.129650 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:27.129586 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gn7zb" podUID="03974a46-a9e1-4161-8f82-8e72fdfcb759" Apr 17 20:51:28.129238 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:28.129188 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgp64" Apr 17 20:51:28.129666 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:28.129331 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zgp64" podUID="5c537771-95e6-4644-8ce6-c3997543ce01" Apr 17 20:51:28.900727 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:28.900698 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-12.ec2.internal" event="NodeReady" Apr 17 20:51:28.900880 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:28.900843 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 20:51:28.940963 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:28.940930 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-qbn97"] Apr 17 20:51:28.968838 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:28.968810 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-rxb59"] Apr 17 20:51:28.969036 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:28.969012 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-qbn97" Apr 17 20:51:28.971787 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:28.971760 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 20:51:28.972037 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:28.972020 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-tswpv\"" Apr 17 20:51:28.972149 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:28.972039 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 20:51:28.987590 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:28.987567 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qbn97"] Apr 17 20:51:28.987676 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:28.987597 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rxb59"] Apr 17 20:51:28.987737 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:28.987706 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rxb59" Apr 17 20:51:28.989987 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:28.989965 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 20:51:28.990092 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:28.989965 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-jvfcm\"" Apr 17 20:51:28.990327 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:28.990308 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 20:51:28.990416 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:28.990326 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 20:51:29.114192 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:29.114161 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/65c9c4b6-99be-4373-933b-d44dfd308d32-config-volume\") pod \"dns-default-qbn97\" (UID: \"65c9c4b6-99be-4373-933b-d44dfd308d32\") " pod="openshift-dns/dns-default-qbn97" Apr 17 20:51:29.114192 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:29.114197 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvjlt\" (UniqueName: \"kubernetes.io/projected/65c9c4b6-99be-4373-933b-d44dfd308d32-kube-api-access-rvjlt\") pod \"dns-default-qbn97\" (UID: \"65c9c4b6-99be-4373-933b-d44dfd308d32\") " pod="openshift-dns/dns-default-qbn97" Apr 17 20:51:29.114449 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:29.114238 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpmd4\" (UniqueName: 
\"kubernetes.io/projected/b8b94abe-b5ba-4b0f-ae1f-63575ffbb062-kube-api-access-wpmd4\") pod \"ingress-canary-rxb59\" (UID: \"b8b94abe-b5ba-4b0f-ae1f-63575ffbb062\") " pod="openshift-ingress-canary/ingress-canary-rxb59" Apr 17 20:51:29.114449 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:29.114362 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/65c9c4b6-99be-4373-933b-d44dfd308d32-metrics-tls\") pod \"dns-default-qbn97\" (UID: \"65c9c4b6-99be-4373-933b-d44dfd308d32\") " pod="openshift-dns/dns-default-qbn97" Apr 17 20:51:29.114449 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:29.114386 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8b94abe-b5ba-4b0f-ae1f-63575ffbb062-cert\") pod \"ingress-canary-rxb59\" (UID: \"b8b94abe-b5ba-4b0f-ae1f-63575ffbb062\") " pod="openshift-ingress-canary/ingress-canary-rxb59" Apr 17 20:51:29.114449 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:29.114431 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/65c9c4b6-99be-4373-933b-d44dfd308d32-tmp-dir\") pod \"dns-default-qbn97\" (UID: \"65c9c4b6-99be-4373-933b-d44dfd308d32\") " pod="openshift-dns/dns-default-qbn97" Apr 17 20:51:29.129264 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:29.129212 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gn7zb" Apr 17 20:51:29.132118 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:29.132098 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 20:51:29.132254 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:29.132106 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-f22jz\"" Apr 17 20:51:29.215386 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:29.215352 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/65c9c4b6-99be-4373-933b-d44dfd308d32-metrics-tls\") pod \"dns-default-qbn97\" (UID: \"65c9c4b6-99be-4373-933b-d44dfd308d32\") " pod="openshift-dns/dns-default-qbn97" Apr 17 20:51:29.215563 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:29.215394 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8b94abe-b5ba-4b0f-ae1f-63575ffbb062-cert\") pod \"ingress-canary-rxb59\" (UID: \"b8b94abe-b5ba-4b0f-ae1f-63575ffbb062\") " pod="openshift-ingress-canary/ingress-canary-rxb59" Apr 17 20:51:29.215563 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:29.215441 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/65c9c4b6-99be-4373-933b-d44dfd308d32-tmp-dir\") pod \"dns-default-qbn97\" (UID: \"65c9c4b6-99be-4373-933b-d44dfd308d32\") " pod="openshift-dns/dns-default-qbn97" Apr 17 20:51:29.215563 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:29.215482 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/65c9c4b6-99be-4373-933b-d44dfd308d32-config-volume\") pod \"dns-default-qbn97\" (UID: \"65c9c4b6-99be-4373-933b-d44dfd308d32\") " 
pod="openshift-dns/dns-default-qbn97" Apr 17 20:51:29.215563 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:29.215506 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvjlt\" (UniqueName: \"kubernetes.io/projected/65c9c4b6-99be-4373-933b-d44dfd308d32-kube-api-access-rvjlt\") pod \"dns-default-qbn97\" (UID: \"65c9c4b6-99be-4373-933b-d44dfd308d32\") " pod="openshift-dns/dns-default-qbn97" Apr 17 20:51:29.215563 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:29.215531 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wpmd4\" (UniqueName: \"kubernetes.io/projected/b8b94abe-b5ba-4b0f-ae1f-63575ffbb062-kube-api-access-wpmd4\") pod \"ingress-canary-rxb59\" (UID: \"b8b94abe-b5ba-4b0f-ae1f-63575ffbb062\") " pod="openshift-ingress-canary/ingress-canary-rxb59" Apr 17 20:51:29.215563 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:29.215531 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:51:29.215838 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:29.215636 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65c9c4b6-99be-4373-933b-d44dfd308d32-metrics-tls podName:65c9c4b6-99be-4373-933b-d44dfd308d32 nodeName:}" failed. No retries permitted until 2026-04-17 20:51:29.715616504 +0000 UTC m=+34.183057417 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/65c9c4b6-99be-4373-933b-d44dfd308d32-metrics-tls") pod "dns-default-qbn97" (UID: "65c9c4b6-99be-4373-933b-d44dfd308d32") : secret "dns-default-metrics-tls" not found Apr 17 20:51:29.215838 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:29.215771 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:51:29.215838 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:29.215813 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/65c9c4b6-99be-4373-933b-d44dfd308d32-tmp-dir\") pod \"dns-default-qbn97\" (UID: \"65c9c4b6-99be-4373-933b-d44dfd308d32\") " pod="openshift-dns/dns-default-qbn97" Apr 17 20:51:29.215962 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:29.215855 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8b94abe-b5ba-4b0f-ae1f-63575ffbb062-cert podName:b8b94abe-b5ba-4b0f-ae1f-63575ffbb062 nodeName:}" failed. No retries permitted until 2026-04-17 20:51:29.715831532 +0000 UTC m=+34.183272464 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b8b94abe-b5ba-4b0f-ae1f-63575ffbb062-cert") pod "ingress-canary-rxb59" (UID: "b8b94abe-b5ba-4b0f-ae1f-63575ffbb062") : secret "canary-serving-cert" not found Apr 17 20:51:29.216321 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:29.216295 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/65c9c4b6-99be-4373-933b-d44dfd308d32-config-volume\") pod \"dns-default-qbn97\" (UID: \"65c9c4b6-99be-4373-933b-d44dfd308d32\") " pod="openshift-dns/dns-default-qbn97" Apr 17 20:51:29.225848 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:29.225816 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvjlt\" (UniqueName: \"kubernetes.io/projected/65c9c4b6-99be-4373-933b-d44dfd308d32-kube-api-access-rvjlt\") pod \"dns-default-qbn97\" (UID: \"65c9c4b6-99be-4373-933b-d44dfd308d32\") " pod="openshift-dns/dns-default-qbn97" Apr 17 20:51:29.226159 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:29.226136 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpmd4\" (UniqueName: \"kubernetes.io/projected/b8b94abe-b5ba-4b0f-ae1f-63575ffbb062-kube-api-access-wpmd4\") pod \"ingress-canary-rxb59\" (UID: \"b8b94abe-b5ba-4b0f-ae1f-63575ffbb062\") " pod="openshift-ingress-canary/ingress-canary-rxb59" Apr 17 20:51:29.718985 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:29.718950 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/65c9c4b6-99be-4373-933b-d44dfd308d32-metrics-tls\") pod \"dns-default-qbn97\" (UID: \"65c9c4b6-99be-4373-933b-d44dfd308d32\") " pod="openshift-dns/dns-default-qbn97" Apr 17 20:51:29.718985 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:29.718992 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/b8b94abe-b5ba-4b0f-ae1f-63575ffbb062-cert\") pod \"ingress-canary-rxb59\" (UID: \"b8b94abe-b5ba-4b0f-ae1f-63575ffbb062\") " pod="openshift-ingress-canary/ingress-canary-rxb59" Apr 17 20:51:29.719239 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:29.719015 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03974a46-a9e1-4161-8f82-8e72fdfcb759-metrics-certs\") pod \"network-metrics-daemon-gn7zb\" (UID: \"03974a46-a9e1-4161-8f82-8e72fdfcb759\") " pod="openshift-multus/network-metrics-daemon-gn7zb" Apr 17 20:51:29.719239 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:29.719129 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:51:29.719239 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:29.719153 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 20:51:29.719239 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:29.719157 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:51:29.719239 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:29.719237 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03974a46-a9e1-4161-8f82-8e72fdfcb759-metrics-certs podName:03974a46-a9e1-4161-8f82-8e72fdfcb759 nodeName:}" failed. No retries permitted until 2026-04-17 20:52:01.719199249 +0000 UTC m=+66.186640174 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/03974a46-a9e1-4161-8f82-8e72fdfcb759-metrics-certs") pod "network-metrics-daemon-gn7zb" (UID: "03974a46-a9e1-4161-8f82-8e72fdfcb759") : secret "metrics-daemon-secret" not found Apr 17 20:51:29.719448 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:29.719259 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65c9c4b6-99be-4373-933b-d44dfd308d32-metrics-tls podName:65c9c4b6-99be-4373-933b-d44dfd308d32 nodeName:}" failed. No retries permitted until 2026-04-17 20:51:30.719249241 +0000 UTC m=+35.186690153 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/65c9c4b6-99be-4373-933b-d44dfd308d32-metrics-tls") pod "dns-default-qbn97" (UID: "65c9c4b6-99be-4373-933b-d44dfd308d32") : secret "dns-default-metrics-tls" not found Apr 17 20:51:29.719448 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:29.719274 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8b94abe-b5ba-4b0f-ae1f-63575ffbb062-cert podName:b8b94abe-b5ba-4b0f-ae1f-63575ffbb062 nodeName:}" failed. No retries permitted until 2026-04-17 20:51:30.719266623 +0000 UTC m=+35.186707534 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b8b94abe-b5ba-4b0f-ae1f-63575ffbb062-cert") pod "ingress-canary-rxb59" (UID: "b8b94abe-b5ba-4b0f-ae1f-63575ffbb062") : secret "canary-serving-cert" not found Apr 17 20:51:29.819961 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:29.819925 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggb7k\" (UniqueName: \"kubernetes.io/projected/5c537771-95e6-4644-8ce6-c3997543ce01-kube-api-access-ggb7k\") pod \"network-check-target-zgp64\" (UID: \"5c537771-95e6-4644-8ce6-c3997543ce01\") " pod="openshift-network-diagnostics/network-check-target-zgp64" Apr 17 20:51:29.820145 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:29.820108 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 20:51:29.820145 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:29.820133 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 20:51:29.820145 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:29.820147 2575 projected.go:194] Error preparing data for projected volume kube-api-access-ggb7k for pod openshift-network-diagnostics/network-check-target-zgp64: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:51:29.820303 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:29.820208 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5c537771-95e6-4644-8ce6-c3997543ce01-kube-api-access-ggb7k podName:5c537771-95e6-4644-8ce6-c3997543ce01 nodeName:}" failed. No retries permitted until 2026-04-17 20:52:01.820189273 +0000 UTC m=+66.287630184 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-ggb7k" (UniqueName: "kubernetes.io/projected/5c537771-95e6-4644-8ce6-c3997543ce01-kube-api-access-ggb7k") pod "network-check-target-zgp64" (UID: "5c537771-95e6-4644-8ce6-c3997543ce01") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 20:51:30.129013 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:30.128970 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgp64" Apr 17 20:51:30.131763 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:30.131732 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 20:51:30.132174 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:30.131836 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 20:51:30.132174 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:30.131739 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jrdfc\"" Apr 17 20:51:30.727647 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:30.727604 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/65c9c4b6-99be-4373-933b-d44dfd308d32-metrics-tls\") pod \"dns-default-qbn97\" (UID: \"65c9c4b6-99be-4373-933b-d44dfd308d32\") " pod="openshift-dns/dns-default-qbn97" Apr 17 20:51:30.727932 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:30.727656 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8b94abe-b5ba-4b0f-ae1f-63575ffbb062-cert\") pod \"ingress-canary-rxb59\" (UID: \"b8b94abe-b5ba-4b0f-ae1f-63575ffbb062\") 
" pod="openshift-ingress-canary/ingress-canary-rxb59" Apr 17 20:51:30.727932 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:30.727763 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:51:30.727932 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:30.727810 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:51:30.727932 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:30.727843 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65c9c4b6-99be-4373-933b-d44dfd308d32-metrics-tls podName:65c9c4b6-99be-4373-933b-d44dfd308d32 nodeName:}" failed. No retries permitted until 2026-04-17 20:51:32.72782464 +0000 UTC m=+37.195265554 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/65c9c4b6-99be-4373-933b-d44dfd308d32-metrics-tls") pod "dns-default-qbn97" (UID: "65c9c4b6-99be-4373-933b-d44dfd308d32") : secret "dns-default-metrics-tls" not found Apr 17 20:51:30.727932 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:30.727867 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8b94abe-b5ba-4b0f-ae1f-63575ffbb062-cert podName:b8b94abe-b5ba-4b0f-ae1f-63575ffbb062 nodeName:}" failed. No retries permitted until 2026-04-17 20:51:32.727851663 +0000 UTC m=+37.195292591 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b8b94abe-b5ba-4b0f-ae1f-63575ffbb062-cert") pod "ingress-canary-rxb59" (UID: "b8b94abe-b5ba-4b0f-ae1f-63575ffbb062") : secret "canary-serving-cert" not found Apr 17 20:51:32.743647 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:32.743396 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/65c9c4b6-99be-4373-933b-d44dfd308d32-metrics-tls\") pod \"dns-default-qbn97\" (UID: \"65c9c4b6-99be-4373-933b-d44dfd308d32\") " pod="openshift-dns/dns-default-qbn97" Apr 17 20:51:32.743647 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:32.743606 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8b94abe-b5ba-4b0f-ae1f-63575ffbb062-cert\") pod \"ingress-canary-rxb59\" (UID: \"b8b94abe-b5ba-4b0f-ae1f-63575ffbb062\") " pod="openshift-ingress-canary/ingress-canary-rxb59" Apr 17 20:51:32.744127 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:32.743544 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:51:32.744127 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:32.743730 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:51:32.744127 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:32.743733 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65c9c4b6-99be-4373-933b-d44dfd308d32-metrics-tls podName:65c9c4b6-99be-4373-933b-d44dfd308d32 nodeName:}" failed. No retries permitted until 2026-04-17 20:51:36.743710921 +0000 UTC m=+41.211151845 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/65c9c4b6-99be-4373-933b-d44dfd308d32-metrics-tls") pod "dns-default-qbn97" (UID: "65c9c4b6-99be-4373-933b-d44dfd308d32") : secret "dns-default-metrics-tls" not found Apr 17 20:51:32.744127 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:32.743779 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8b94abe-b5ba-4b0f-ae1f-63575ffbb062-cert podName:b8b94abe-b5ba-4b0f-ae1f-63575ffbb062 nodeName:}" failed. No retries permitted until 2026-04-17 20:51:36.743768317 +0000 UTC m=+41.211209228 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b8b94abe-b5ba-4b0f-ae1f-63575ffbb062-cert") pod "ingress-canary-rxb59" (UID: "b8b94abe-b5ba-4b0f-ae1f-63575ffbb062") : secret "canary-serving-cert" not found Apr 17 20:51:33.291535 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:33.291502 2575 generic.go:358] "Generic (PLEG): container finished" podID="38d8ca4c-2571-491a-bb52-21191288881d" containerID="1fc0f0f2c6f1df7116fa3282e16b5242cbb705c9aa5f35bb61885fcb22f61e88" exitCode=0 Apr 17 20:51:33.291703 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:33.291563 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tgvzd" event={"ID":"38d8ca4c-2571-491a-bb52-21191288881d","Type":"ContainerDied","Data":"1fc0f0f2c6f1df7116fa3282e16b5242cbb705c9aa5f35bb61885fcb22f61e88"} Apr 17 20:51:34.296318 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:34.296279 2575 generic.go:358] "Generic (PLEG): container finished" podID="38d8ca4c-2571-491a-bb52-21191288881d" containerID="20a8c1bf2645750f8cc4c9d4b762f445da26d38f325953c83c10f2a0fc2f7771" exitCode=0 Apr 17 20:51:34.296675 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:34.296338 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tgvzd" 
event={"ID":"38d8ca4c-2571-491a-bb52-21191288881d","Type":"ContainerDied","Data":"20a8c1bf2645750f8cc4c9d4b762f445da26d38f325953c83c10f2a0fc2f7771"} Apr 17 20:51:35.300615 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:35.300578 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tgvzd" event={"ID":"38d8ca4c-2571-491a-bb52-21191288881d","Type":"ContainerStarted","Data":"c31f2c1c018e74c5ee9b693129f4bc25321ce6485084c4a2eccc31e7349b3986"} Apr 17 20:51:35.320928 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:35.320869 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-tgvzd" podStartSLOduration=5.981679197 podStartE2EDuration="39.320852706s" podCreationTimestamp="2026-04-17 20:50:56 +0000 UTC" firstStartedPulling="2026-04-17 20:50:58.809651365 +0000 UTC m=+3.277092281" lastFinishedPulling="2026-04-17 20:51:32.148824879 +0000 UTC m=+36.616265790" observedRunningTime="2026-04-17 20:51:35.319627471 +0000 UTC m=+39.787068404" watchObservedRunningTime="2026-04-17 20:51:35.320852706 +0000 UTC m=+39.788293639" Apr 17 20:51:36.770185 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:36.770144 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/65c9c4b6-99be-4373-933b-d44dfd308d32-metrics-tls\") pod \"dns-default-qbn97\" (UID: \"65c9c4b6-99be-4373-933b-d44dfd308d32\") " pod="openshift-dns/dns-default-qbn97" Apr 17 20:51:36.770185 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:36.770185 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8b94abe-b5ba-4b0f-ae1f-63575ffbb062-cert\") pod \"ingress-canary-rxb59\" (UID: \"b8b94abe-b5ba-4b0f-ae1f-63575ffbb062\") " pod="openshift-ingress-canary/ingress-canary-rxb59" Apr 17 20:51:36.770680 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:36.770313 2575 
secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:51:36.770680 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:36.770333 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:51:36.770680 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:36.770379 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65c9c4b6-99be-4373-933b-d44dfd308d32-metrics-tls podName:65c9c4b6-99be-4373-933b-d44dfd308d32 nodeName:}" failed. No retries permitted until 2026-04-17 20:51:44.77036194 +0000 UTC m=+49.237802855 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/65c9c4b6-99be-4373-933b-d44dfd308d32-metrics-tls") pod "dns-default-qbn97" (UID: "65c9c4b6-99be-4373-933b-d44dfd308d32") : secret "dns-default-metrics-tls" not found Apr 17 20:51:36.770680 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:36.770393 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8b94abe-b5ba-4b0f-ae1f-63575ffbb062-cert podName:b8b94abe-b5ba-4b0f-ae1f-63575ffbb062 nodeName:}" failed. No retries permitted until 2026-04-17 20:51:44.770387062 +0000 UTC m=+49.237827972 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b8b94abe-b5ba-4b0f-ae1f-63575ffbb062-cert") pod "ingress-canary-rxb59" (UID: "b8b94abe-b5ba-4b0f-ae1f-63575ffbb062") : secret "canary-serving-cert" not found Apr 17 20:51:44.826474 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:44.826430 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/65c9c4b6-99be-4373-933b-d44dfd308d32-metrics-tls\") pod \"dns-default-qbn97\" (UID: \"65c9c4b6-99be-4373-933b-d44dfd308d32\") " pod="openshift-dns/dns-default-qbn97" Apr 17 20:51:44.826474 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:44.826469 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8b94abe-b5ba-4b0f-ae1f-63575ffbb062-cert\") pod \"ingress-canary-rxb59\" (UID: \"b8b94abe-b5ba-4b0f-ae1f-63575ffbb062\") " pod="openshift-ingress-canary/ingress-canary-rxb59" Apr 17 20:51:44.826878 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:44.826589 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:51:44.826878 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:44.826639 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:51:44.826878 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:44.826655 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65c9c4b6-99be-4373-933b-d44dfd308d32-metrics-tls podName:65c9c4b6-99be-4373-933b-d44dfd308d32 nodeName:}" failed. No retries permitted until 2026-04-17 20:52:00.826638553 +0000 UTC m=+65.294079478 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/65c9c4b6-99be-4373-933b-d44dfd308d32-metrics-tls") pod "dns-default-qbn97" (UID: "65c9c4b6-99be-4373-933b-d44dfd308d32") : secret "dns-default-metrics-tls" not found Apr 17 20:51:44.826878 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:51:44.826686 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8b94abe-b5ba-4b0f-ae1f-63575ffbb062-cert podName:b8b94abe-b5ba-4b0f-ae1f-63575ffbb062 nodeName:}" failed. No retries permitted until 2026-04-17 20:52:00.82667503 +0000 UTC m=+65.294115941 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b8b94abe-b5ba-4b0f-ae1f-63575ffbb062-cert") pod "ingress-canary-rxb59" (UID: "b8b94abe-b5ba-4b0f-ae1f-63575ffbb062") : secret "canary-serving-cert" not found Apr 17 20:51:56.288062 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:51:56.288030 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j8fq5" Apr 17 20:52:00.838001 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:00.837941 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/65c9c4b6-99be-4373-933b-d44dfd308d32-metrics-tls\") pod \"dns-default-qbn97\" (UID: \"65c9c4b6-99be-4373-933b-d44dfd308d32\") " pod="openshift-dns/dns-default-qbn97" Apr 17 20:52:00.838001 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:00.837991 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8b94abe-b5ba-4b0f-ae1f-63575ffbb062-cert\") pod \"ingress-canary-rxb59\" (UID: \"b8b94abe-b5ba-4b0f-ae1f-63575ffbb062\") " pod="openshift-ingress-canary/ingress-canary-rxb59" Apr 17 20:52:00.838450 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:00.838087 2575 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:52:00.838450 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:00.838101 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:52:00.838450 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:00.838143 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8b94abe-b5ba-4b0f-ae1f-63575ffbb062-cert podName:b8b94abe-b5ba-4b0f-ae1f-63575ffbb062 nodeName:}" failed. No retries permitted until 2026-04-17 20:52:32.838127638 +0000 UTC m=+97.305568552 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b8b94abe-b5ba-4b0f-ae1f-63575ffbb062-cert") pod "ingress-canary-rxb59" (UID: "b8b94abe-b5ba-4b0f-ae1f-63575ffbb062") : secret "canary-serving-cert" not found Apr 17 20:52:00.838450 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:00.838168 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65c9c4b6-99be-4373-933b-d44dfd308d32-metrics-tls podName:65c9c4b6-99be-4373-933b-d44dfd308d32 nodeName:}" failed. No retries permitted until 2026-04-17 20:52:32.838154257 +0000 UTC m=+97.305595167 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/65c9c4b6-99be-4373-933b-d44dfd308d32-metrics-tls") pod "dns-default-qbn97" (UID: "65c9c4b6-99be-4373-933b-d44dfd308d32") : secret "dns-default-metrics-tls" not found Apr 17 20:52:01.743606 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:01.743559 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03974a46-a9e1-4161-8f82-8e72fdfcb759-metrics-certs\") pod \"network-metrics-daemon-gn7zb\" (UID: \"03974a46-a9e1-4161-8f82-8e72fdfcb759\") " pod="openshift-multus/network-metrics-daemon-gn7zb" Apr 17 20:52:01.743781 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:01.743710 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 20:52:01.743820 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:01.743781 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03974a46-a9e1-4161-8f82-8e72fdfcb759-metrics-certs podName:03974a46-a9e1-4161-8f82-8e72fdfcb759 nodeName:}" failed. No retries permitted until 2026-04-17 20:53:05.743764823 +0000 UTC m=+130.211205751 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/03974a46-a9e1-4161-8f82-8e72fdfcb759-metrics-certs") pod "network-metrics-daemon-gn7zb" (UID: "03974a46-a9e1-4161-8f82-8e72fdfcb759") : secret "metrics-daemon-secret" not found Apr 17 20:52:01.844711 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:01.844677 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggb7k\" (UniqueName: \"kubernetes.io/projected/5c537771-95e6-4644-8ce6-c3997543ce01-kube-api-access-ggb7k\") pod \"network-check-target-zgp64\" (UID: \"5c537771-95e6-4644-8ce6-c3997543ce01\") " pod="openshift-network-diagnostics/network-check-target-zgp64" Apr 17 20:52:01.847275 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:01.847256 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 20:52:01.857167 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:01.857145 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 20:52:01.869118 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:01.869091 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggb7k\" (UniqueName: \"kubernetes.io/projected/5c537771-95e6-4644-8ce6-c3997543ce01-kube-api-access-ggb7k\") pod \"network-check-target-zgp64\" (UID: \"5c537771-95e6-4644-8ce6-c3997543ce01\") " pod="openshift-network-diagnostics/network-check-target-zgp64" Apr 17 20:52:01.941735 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:01.941704 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jrdfc\"" Apr 17 20:52:01.949957 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:01.949938 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zgp64" Apr 17 20:52:02.068388 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:02.068350 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zgp64"] Apr 17 20:52:02.071625 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:52:02.071595 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c537771_95e6_4644_8ce6_c3997543ce01.slice/crio-aabc8e6da8e3f870f463581451e1f37ef759518f736c04b50b36e64ed3b60d60 WatchSource:0}: Error finding container aabc8e6da8e3f870f463581451e1f37ef759518f736c04b50b36e64ed3b60d60: Status 404 returned error can't find the container with id aabc8e6da8e3f870f463581451e1f37ef759518f736c04b50b36e64ed3b60d60 Apr 17 20:52:02.348850 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:02.348814 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zgp64" event={"ID":"5c537771-95e6-4644-8ce6-c3997543ce01","Type":"ContainerStarted","Data":"aabc8e6da8e3f870f463581451e1f37ef759518f736c04b50b36e64ed3b60d60"} Apr 17 20:52:05.355835 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:05.355801 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zgp64" event={"ID":"5c537771-95e6-4644-8ce6-c3997543ce01","Type":"ContainerStarted","Data":"4507b72087013900d7601ca36198f970e21f54059d2eb5b5e6a3eea3140e3149"} Apr 17 20:52:05.356212 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:05.355938 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-zgp64" Apr 17 20:52:05.370645 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:05.370602 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-zgp64" 
podStartSLOduration=66.467866444 podStartE2EDuration="1m9.370589232s" podCreationTimestamp="2026-04-17 20:50:56 +0000 UTC" firstStartedPulling="2026-04-17 20:52:02.075567264 +0000 UTC m=+66.543008188" lastFinishedPulling="2026-04-17 20:52:04.97829006 +0000 UTC m=+69.445730976" observedRunningTime="2026-04-17 20:52:05.37019874 +0000 UTC m=+69.837639672" watchObservedRunningTime="2026-04-17 20:52:05.370589232 +0000 UTC m=+69.838030156" Apr 17 20:52:32.842849 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:32.842793 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/65c9c4b6-99be-4373-933b-d44dfd308d32-metrics-tls\") pod \"dns-default-qbn97\" (UID: \"65c9c4b6-99be-4373-933b-d44dfd308d32\") " pod="openshift-dns/dns-default-qbn97" Apr 17 20:52:32.842849 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:32.842853 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8b94abe-b5ba-4b0f-ae1f-63575ffbb062-cert\") pod \"ingress-canary-rxb59\" (UID: \"b8b94abe-b5ba-4b0f-ae1f-63575ffbb062\") " pod="openshift-ingress-canary/ingress-canary-rxb59" Apr 17 20:52:32.843445 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:32.842932 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 20:52:32.843445 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:32.842987 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 20:52:32.843445 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:32.843000 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/65c9c4b6-99be-4373-933b-d44dfd308d32-metrics-tls podName:65c9c4b6-99be-4373-933b-d44dfd308d32 nodeName:}" failed. No retries permitted until 2026-04-17 20:53:36.842985 +0000 UTC m=+161.310425915 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/65c9c4b6-99be-4373-933b-d44dfd308d32-metrics-tls") pod "dns-default-qbn97" (UID: "65c9c4b6-99be-4373-933b-d44dfd308d32") : secret "dns-default-metrics-tls" not found Apr 17 20:52:32.843445 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:32.843052 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b8b94abe-b5ba-4b0f-ae1f-63575ffbb062-cert podName:b8b94abe-b5ba-4b0f-ae1f-63575ffbb062 nodeName:}" failed. No retries permitted until 2026-04-17 20:53:36.843035715 +0000 UTC m=+161.310476632 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b8b94abe-b5ba-4b0f-ae1f-63575ffbb062-cert") pod "ingress-canary-rxb59" (UID: "b8b94abe-b5ba-4b0f-ae1f-63575ffbb062") : secret "canary-serving-cert" not found Apr 17 20:52:36.359482 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:36.359451 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-zgp64" Apr 17 20:52:45.371675 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.371637 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g7ctq"] Apr 17 20:52:45.376696 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.376673 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-v47zc"] Apr 17 20:52:45.376840 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.376823 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g7ctq" Apr 17 20:52:45.379294 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.379272 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 17 20:52:45.379971 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.379953 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-v47zc" Apr 17 20:52:45.380115 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.380098 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 17 20:52:45.380115 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.380120 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 17 20:52:45.380263 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.380121 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-2ljwr\"" Apr 17 20:52:45.382409 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.382379 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 17 20:52:45.382502 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.382386 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 17 20:52:45.382652 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.382637 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 17 20:52:45.382767 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.382747 2575 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-cx2rl\"" Apr 17 20:52:45.382855 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.382781 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 17 20:52:45.386526 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.386508 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g7ctq"] Apr 17 20:52:45.387787 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.387770 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 17 20:52:45.389697 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.389678 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-v47zc"] Apr 17 20:52:45.432092 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.432060 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/43e2babb-8445-4ad9-92e2-5a48de5e20ec-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-g7ctq\" (UID: \"43e2babb-8445-4ad9-92e2-5a48de5e20ec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g7ctq" Apr 17 20:52:45.432285 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.432105 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d30ac7a2-edef-43e4-a645-fbf9445df632-trusted-ca\") pod \"console-operator-9d4b6777b-v47zc\" (UID: \"d30ac7a2-edef-43e4-a645-fbf9445df632\") " pod="openshift-console-operator/console-operator-9d4b6777b-v47zc" Apr 17 20:52:45.432285 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.432154 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d30ac7a2-edef-43e4-a645-fbf9445df632-serving-cert\") pod \"console-operator-9d4b6777b-v47zc\" (UID: \"d30ac7a2-edef-43e4-a645-fbf9445df632\") " pod="openshift-console-operator/console-operator-9d4b6777b-v47zc" Apr 17 20:52:45.432285 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.432204 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r58c\" (UniqueName: \"kubernetes.io/projected/43e2babb-8445-4ad9-92e2-5a48de5e20ec-kube-api-access-2r58c\") pod \"cluster-samples-operator-6dc5bdb6b4-g7ctq\" (UID: \"43e2babb-8445-4ad9-92e2-5a48de5e20ec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g7ctq" Apr 17 20:52:45.432285 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.432279 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d30ac7a2-edef-43e4-a645-fbf9445df632-config\") pod \"console-operator-9d4b6777b-v47zc\" (UID: \"d30ac7a2-edef-43e4-a645-fbf9445df632\") " pod="openshift-console-operator/console-operator-9d4b6777b-v47zc" Apr 17 20:52:45.432422 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.432296 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh92r\" (UniqueName: \"kubernetes.io/projected/d30ac7a2-edef-43e4-a645-fbf9445df632-kube-api-access-fh92r\") pod \"console-operator-9d4b6777b-v47zc\" (UID: \"d30ac7a2-edef-43e4-a645-fbf9445df632\") " pod="openshift-console-operator/console-operator-9d4b6777b-v47zc" Apr 17 20:52:45.472386 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.472357 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-svg9p"] Apr 17 20:52:45.475692 
ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.475671 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-75858d479b-s6j25"] Apr 17 20:52:45.475824 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.475805 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-svg9p" Apr 17 20:52:45.478533 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.478512 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-kj8hg\"" Apr 17 20:52:45.478877 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.478857 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 17 20:52:45.478965 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.478894 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-75858d479b-s6j25" Apr 17 20:52:45.479393 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.479375 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 17 20:52:45.481200 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.481182 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 17 20:52:45.481691 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.481672 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 17 20:52:45.482002 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.481987 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 20:52:45.482406 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.482383 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 17 20:52:45.483038 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.483022 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-vq7ww\"" Apr 17 20:52:45.483290 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.483272 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 20:52:45.483399 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.483380 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 17 20:52:45.483486 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.483468 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-svg9p"] Apr 17 
20:52:45.492953 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.492931 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-75858d479b-s6j25"] Apr 17 20:52:45.532684 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.532643 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkqxd\" (UniqueName: \"kubernetes.io/projected/4a34696e-71c8-4053-bb98-047cc6c77df2-kube-api-access-pkqxd\") pod \"volume-data-source-validator-7c6cbb6c87-svg9p\" (UID: \"4a34696e-71c8-4053-bb98-047cc6c77df2\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-svg9p" Apr 17 20:52:45.532834 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.532700 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d30ac7a2-edef-43e4-a645-fbf9445df632-config\") pod \"console-operator-9d4b6777b-v47zc\" (UID: \"d30ac7a2-edef-43e4-a645-fbf9445df632\") " pod="openshift-console-operator/console-operator-9d4b6777b-v47zc" Apr 17 20:52:45.532834 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.532728 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fh92r\" (UniqueName: \"kubernetes.io/projected/d30ac7a2-edef-43e4-a645-fbf9445df632-kube-api-access-fh92r\") pod \"console-operator-9d4b6777b-v47zc\" (UID: \"d30ac7a2-edef-43e4-a645-fbf9445df632\") " pod="openshift-console-operator/console-operator-9d4b6777b-v47zc" Apr 17 20:52:45.532834 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.532822 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/591063be-61cb-4346-b946-b6ad0d833153-metrics-certs\") pod \"router-default-75858d479b-s6j25\" (UID: \"591063be-61cb-4346-b946-b6ad0d833153\") " pod="openshift-ingress/router-default-75858d479b-s6j25" Apr 17 
20:52:45.532976 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.532852 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4284\" (UniqueName: \"kubernetes.io/projected/591063be-61cb-4346-b946-b6ad0d833153-kube-api-access-d4284\") pod \"router-default-75858d479b-s6j25\" (UID: \"591063be-61cb-4346-b946-b6ad0d833153\") " pod="openshift-ingress/router-default-75858d479b-s6j25" Apr 17 20:52:45.532976 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.532877 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/43e2babb-8445-4ad9-92e2-5a48de5e20ec-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-g7ctq\" (UID: \"43e2babb-8445-4ad9-92e2-5a48de5e20ec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g7ctq" Apr 17 20:52:45.532976 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.532900 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/591063be-61cb-4346-b946-b6ad0d833153-default-certificate\") pod \"router-default-75858d479b-s6j25\" (UID: \"591063be-61cb-4346-b946-b6ad0d833153\") " pod="openshift-ingress/router-default-75858d479b-s6j25" Apr 17 20:52:45.532976 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.532923 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/591063be-61cb-4346-b946-b6ad0d833153-stats-auth\") pod \"router-default-75858d479b-s6j25\" (UID: \"591063be-61cb-4346-b946-b6ad0d833153\") " pod="openshift-ingress/router-default-75858d479b-s6j25" Apr 17 20:52:45.532976 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.532940 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/591063be-61cb-4346-b946-b6ad0d833153-service-ca-bundle\") pod \"router-default-75858d479b-s6j25\" (UID: \"591063be-61cb-4346-b946-b6ad0d833153\") " pod="openshift-ingress/router-default-75858d479b-s6j25" Apr 17 20:52:45.532976 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.532965 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d30ac7a2-edef-43e4-a645-fbf9445df632-trusted-ca\") pod \"console-operator-9d4b6777b-v47zc\" (UID: \"d30ac7a2-edef-43e4-a645-fbf9445df632\") " pod="openshift-console-operator/console-operator-9d4b6777b-v47zc" Apr 17 20:52:45.533273 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:45.533002 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 20:52:45.533273 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.533030 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d30ac7a2-edef-43e4-a645-fbf9445df632-serving-cert\") pod \"console-operator-9d4b6777b-v47zc\" (UID: \"d30ac7a2-edef-43e4-a645-fbf9445df632\") " pod="openshift-console-operator/console-operator-9d4b6777b-v47zc" Apr 17 20:52:45.533273 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:45.533081 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43e2babb-8445-4ad9-92e2-5a48de5e20ec-samples-operator-tls podName:43e2babb-8445-4ad9-92e2-5a48de5e20ec nodeName:}" failed. No retries permitted until 2026-04-17 20:52:46.033059642 +0000 UTC m=+110.500500566 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/43e2babb-8445-4ad9-92e2-5a48de5e20ec-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-g7ctq" (UID: "43e2babb-8445-4ad9-92e2-5a48de5e20ec") : secret "samples-operator-tls" not found Apr 17 20:52:45.533273 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.533240 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2r58c\" (UniqueName: \"kubernetes.io/projected/43e2babb-8445-4ad9-92e2-5a48de5e20ec-kube-api-access-2r58c\") pod \"cluster-samples-operator-6dc5bdb6b4-g7ctq\" (UID: \"43e2babb-8445-4ad9-92e2-5a48de5e20ec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g7ctq" Apr 17 20:52:45.533560 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.533533 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d30ac7a2-edef-43e4-a645-fbf9445df632-config\") pod \"console-operator-9d4b6777b-v47zc\" (UID: \"d30ac7a2-edef-43e4-a645-fbf9445df632\") " pod="openshift-console-operator/console-operator-9d4b6777b-v47zc" Apr 17 20:52:45.533958 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.533939 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d30ac7a2-edef-43e4-a645-fbf9445df632-trusted-ca\") pod \"console-operator-9d4b6777b-v47zc\" (UID: \"d30ac7a2-edef-43e4-a645-fbf9445df632\") " pod="openshift-console-operator/console-operator-9d4b6777b-v47zc" Apr 17 20:52:45.535402 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.535377 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d30ac7a2-edef-43e4-a645-fbf9445df632-serving-cert\") pod \"console-operator-9d4b6777b-v47zc\" (UID: \"d30ac7a2-edef-43e4-a645-fbf9445df632\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-v47zc" Apr 17 20:52:45.541948 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.541919 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh92r\" (UniqueName: \"kubernetes.io/projected/d30ac7a2-edef-43e4-a645-fbf9445df632-kube-api-access-fh92r\") pod \"console-operator-9d4b6777b-v47zc\" (UID: \"d30ac7a2-edef-43e4-a645-fbf9445df632\") " pod="openshift-console-operator/console-operator-9d4b6777b-v47zc" Apr 17 20:52:45.542820 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.542804 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r58c\" (UniqueName: \"kubernetes.io/projected/43e2babb-8445-4ad9-92e2-5a48de5e20ec-kube-api-access-2r58c\") pod \"cluster-samples-operator-6dc5bdb6b4-g7ctq\" (UID: \"43e2babb-8445-4ad9-92e2-5a48de5e20ec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g7ctq" Apr 17 20:52:45.581071 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.581038 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-ktp5n"] Apr 17 20:52:45.585490 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.585474 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ktp5n" Apr 17 20:52:45.588002 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.587976 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 17 20:52:45.588156 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.588084 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-pglw7\"" Apr 17 20:52:45.588156 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.588090 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 20:52:45.589122 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.589108 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 20:52:45.589206 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.589112 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 17 20:52:45.608626 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.608598 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-ktp5n"] Apr 17 20:52:45.634308 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.634208 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/591063be-61cb-4346-b946-b6ad0d833153-metrics-certs\") pod \"router-default-75858d479b-s6j25\" (UID: \"591063be-61cb-4346-b946-b6ad0d833153\") " pod="openshift-ingress/router-default-75858d479b-s6j25" Apr 17 20:52:45.634308 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.634259 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4284\" 
(UniqueName: \"kubernetes.io/projected/591063be-61cb-4346-b946-b6ad0d833153-kube-api-access-d4284\") pod \"router-default-75858d479b-s6j25\" (UID: \"591063be-61cb-4346-b946-b6ad0d833153\") " pod="openshift-ingress/router-default-75858d479b-s6j25" Apr 17 20:52:45.634527 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:45.634355 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 20:52:45.634527 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.634413 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/591063be-61cb-4346-b946-b6ad0d833153-default-certificate\") pod \"router-default-75858d479b-s6j25\" (UID: \"591063be-61cb-4346-b946-b6ad0d833153\") " pod="openshift-ingress/router-default-75858d479b-s6j25" Apr 17 20:52:45.634527 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:45.634420 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/591063be-61cb-4346-b946-b6ad0d833153-metrics-certs podName:591063be-61cb-4346-b946-b6ad0d833153 nodeName:}" failed. No retries permitted until 2026-04-17 20:52:46.134404848 +0000 UTC m=+110.601845764 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/591063be-61cb-4346-b946-b6ad0d833153-metrics-certs") pod "router-default-75858d479b-s6j25" (UID: "591063be-61cb-4346-b946-b6ad0d833153") : secret "router-metrics-certs-default" not found Apr 17 20:52:45.634527 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.634446 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/591063be-61cb-4346-b946-b6ad0d833153-stats-auth\") pod \"router-default-75858d479b-s6j25\" (UID: \"591063be-61cb-4346-b946-b6ad0d833153\") " pod="openshift-ingress/router-default-75858d479b-s6j25" Apr 17 20:52:45.634527 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.634474 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/591063be-61cb-4346-b946-b6ad0d833153-service-ca-bundle\") pod \"router-default-75858d479b-s6j25\" (UID: \"591063be-61cb-4346-b946-b6ad0d833153\") " pod="openshift-ingress/router-default-75858d479b-s6j25" Apr 17 20:52:45.634527 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.634521 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/99cf7a02-6c85-4f4e-b81d-d984d3c34a8c-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-ktp5n\" (UID: \"99cf7a02-6c85-4f4e-b81d-d984d3c34a8c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ktp5n" Apr 17 20:52:45.634744 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.634572 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcntz\" (UniqueName: \"kubernetes.io/projected/99cf7a02-6c85-4f4e-b81d-d984d3c34a8c-kube-api-access-wcntz\") pod \"cluster-monitoring-operator-75587bd455-ktp5n\" (UID: \"99cf7a02-6c85-4f4e-b81d-d984d3c34a8c\") 
" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ktp5n" Apr 17 20:52:45.634744 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:45.634643 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/591063be-61cb-4346-b946-b6ad0d833153-service-ca-bundle podName:591063be-61cb-4346-b946-b6ad0d833153 nodeName:}" failed. No retries permitted until 2026-04-17 20:52:46.134631085 +0000 UTC m=+110.602071995 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/591063be-61cb-4346-b946-b6ad0d833153-service-ca-bundle") pod "router-default-75858d479b-s6j25" (UID: "591063be-61cb-4346-b946-b6ad0d833153") : configmap references non-existent config key: service-ca.crt Apr 17 20:52:45.634744 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.634668 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pkqxd\" (UniqueName: \"kubernetes.io/projected/4a34696e-71c8-4053-bb98-047cc6c77df2-kube-api-access-pkqxd\") pod \"volume-data-source-validator-7c6cbb6c87-svg9p\" (UID: \"4a34696e-71c8-4053-bb98-047cc6c77df2\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-svg9p" Apr 17 20:52:45.634744 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.634693 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/99cf7a02-6c85-4f4e-b81d-d984d3c34a8c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-ktp5n\" (UID: \"99cf7a02-6c85-4f4e-b81d-d984d3c34a8c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ktp5n" Apr 17 20:52:45.637537 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.637514 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/591063be-61cb-4346-b946-b6ad0d833153-stats-auth\") pod \"router-default-75858d479b-s6j25\" (UID: \"591063be-61cb-4346-b946-b6ad0d833153\") " pod="openshift-ingress/router-default-75858d479b-s6j25" Apr 17 20:52:45.637537 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.637525 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/591063be-61cb-4346-b946-b6ad0d833153-default-certificate\") pod \"router-default-75858d479b-s6j25\" (UID: \"591063be-61cb-4346-b946-b6ad0d833153\") " pod="openshift-ingress/router-default-75858d479b-s6j25" Apr 17 20:52:45.652883 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.652860 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4284\" (UniqueName: \"kubernetes.io/projected/591063be-61cb-4346-b946-b6ad0d833153-kube-api-access-d4284\") pod \"router-default-75858d479b-s6j25\" (UID: \"591063be-61cb-4346-b946-b6ad0d833153\") " pod="openshift-ingress/router-default-75858d479b-s6j25" Apr 17 20:52:45.653200 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.653182 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkqxd\" (UniqueName: \"kubernetes.io/projected/4a34696e-71c8-4053-bb98-047cc6c77df2-kube-api-access-pkqxd\") pod \"volume-data-source-validator-7c6cbb6c87-svg9p\" (UID: \"4a34696e-71c8-4053-bb98-047cc6c77df2\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-svg9p" Apr 17 20:52:45.693039 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.693005 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-v47zc" Apr 17 20:52:45.735746 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.735714 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/99cf7a02-6c85-4f4e-b81d-d984d3c34a8c-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-ktp5n\" (UID: \"99cf7a02-6c85-4f4e-b81d-d984d3c34a8c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ktp5n" Apr 17 20:52:45.735906 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.735764 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wcntz\" (UniqueName: \"kubernetes.io/projected/99cf7a02-6c85-4f4e-b81d-d984d3c34a8c-kube-api-access-wcntz\") pod \"cluster-monitoring-operator-75587bd455-ktp5n\" (UID: \"99cf7a02-6c85-4f4e-b81d-d984d3c34a8c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ktp5n" Apr 17 20:52:45.735906 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.735813 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/99cf7a02-6c85-4f4e-b81d-d984d3c34a8c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-ktp5n\" (UID: \"99cf7a02-6c85-4f4e-b81d-d984d3c34a8c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ktp5n" Apr 17 20:52:45.735974 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:45.735933 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 20:52:45.736007 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:45.735989 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99cf7a02-6c85-4f4e-b81d-d984d3c34a8c-cluster-monitoring-operator-tls podName:99cf7a02-6c85-4f4e-b81d-d984d3c34a8c 
nodeName:}" failed. No retries permitted until 2026-04-17 20:52:46.23597585 +0000 UTC m=+110.703416761 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/99cf7a02-6c85-4f4e-b81d-d984d3c34a8c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-ktp5n" (UID: "99cf7a02-6c85-4f4e-b81d-d984d3c34a8c") : secret "cluster-monitoring-operator-tls" not found Apr 17 20:52:45.736883 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.736864 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/99cf7a02-6c85-4f4e-b81d-d984d3c34a8c-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-ktp5n\" (UID: \"99cf7a02-6c85-4f4e-b81d-d984d3c34a8c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ktp5n" Apr 17 20:52:45.749052 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.748979 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcntz\" (UniqueName: \"kubernetes.io/projected/99cf7a02-6c85-4f4e-b81d-d984d3c34a8c-kube-api-access-wcntz\") pod \"cluster-monitoring-operator-75587bd455-ktp5n\" (UID: \"99cf7a02-6c85-4f4e-b81d-d984d3c34a8c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ktp5n" Apr 17 20:52:45.786476 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.786368 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-svg9p" Apr 17 20:52:45.827925 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.827897 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-v47zc"] Apr 17 20:52:45.830514 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:52:45.830487 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd30ac7a2_edef_43e4_a645_fbf9445df632.slice/crio-2ebe9a15319e5bd8c6fc629cc9c5ed37e0131032c7aa3c0144c1cebb458ca0d9 WatchSource:0}: Error finding container 2ebe9a15319e5bd8c6fc629cc9c5ed37e0131032c7aa3c0144c1cebb458ca0d9: Status 404 returned error can't find the container with id 2ebe9a15319e5bd8c6fc629cc9c5ed37e0131032c7aa3c0144c1cebb458ca0d9 Apr 17 20:52:45.897042 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:45.896945 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-svg9p"] Apr 17 20:52:45.900419 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:52:45.900388 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a34696e_71c8_4053_bb98_047cc6c77df2.slice/crio-1cd7574db616e2b30147c64b8d9298c43811572a696ba3df0e372ff527e00885 WatchSource:0}: Error finding container 1cd7574db616e2b30147c64b8d9298c43811572a696ba3df0e372ff527e00885: Status 404 returned error can't find the container with id 1cd7574db616e2b30147c64b8d9298c43811572a696ba3df0e372ff527e00885 Apr 17 20:52:46.037609 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:46.037570 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/43e2babb-8445-4ad9-92e2-5a48de5e20ec-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-g7ctq\" (UID: 
\"43e2babb-8445-4ad9-92e2-5a48de5e20ec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g7ctq" Apr 17 20:52:46.037802 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:46.037720 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 20:52:46.037802 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:46.037794 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43e2babb-8445-4ad9-92e2-5a48de5e20ec-samples-operator-tls podName:43e2babb-8445-4ad9-92e2-5a48de5e20ec nodeName:}" failed. No retries permitted until 2026-04-17 20:52:47.037774886 +0000 UTC m=+111.505215813 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/43e2babb-8445-4ad9-92e2-5a48de5e20ec-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-g7ctq" (UID: "43e2babb-8445-4ad9-92e2-5a48de5e20ec") : secret "samples-operator-tls" not found Apr 17 20:52:46.138506 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:46.138471 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/591063be-61cb-4346-b946-b6ad0d833153-metrics-certs\") pod \"router-default-75858d479b-s6j25\" (UID: \"591063be-61cb-4346-b946-b6ad0d833153\") " pod="openshift-ingress/router-default-75858d479b-s6j25" Apr 17 20:52:46.138684 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:46.138519 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/591063be-61cb-4346-b946-b6ad0d833153-service-ca-bundle\") pod \"router-default-75858d479b-s6j25\" (UID: \"591063be-61cb-4346-b946-b6ad0d833153\") " pod="openshift-ingress/router-default-75858d479b-s6j25" Apr 17 20:52:46.138684 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:46.138610 2575 secret.go:189] Couldn't get 
secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 20:52:46.138684 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:46.138641 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/591063be-61cb-4346-b946-b6ad0d833153-service-ca-bundle podName:591063be-61cb-4346-b946-b6ad0d833153 nodeName:}" failed. No retries permitted until 2026-04-17 20:52:47.138625389 +0000 UTC m=+111.606066313 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/591063be-61cb-4346-b946-b6ad0d833153-service-ca-bundle") pod "router-default-75858d479b-s6j25" (UID: "591063be-61cb-4346-b946-b6ad0d833153") : configmap references non-existent config key: service-ca.crt Apr 17 20:52:46.138684 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:46.138665 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/591063be-61cb-4346-b946-b6ad0d833153-metrics-certs podName:591063be-61cb-4346-b946-b6ad0d833153 nodeName:}" failed. No retries permitted until 2026-04-17 20:52:47.138650908 +0000 UTC m=+111.606091822 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/591063be-61cb-4346-b946-b6ad0d833153-metrics-certs") pod "router-default-75858d479b-s6j25" (UID: "591063be-61cb-4346-b946-b6ad0d833153") : secret "router-metrics-certs-default" not found Apr 17 20:52:46.239495 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:46.239411 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/99cf7a02-6c85-4f4e-b81d-d984d3c34a8c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-ktp5n\" (UID: \"99cf7a02-6c85-4f4e-b81d-d984d3c34a8c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ktp5n" Apr 17 20:52:46.239644 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:46.239516 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 20:52:46.239644 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:46.239565 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99cf7a02-6c85-4f4e-b81d-d984d3c34a8c-cluster-monitoring-operator-tls podName:99cf7a02-6c85-4f4e-b81d-d984d3c34a8c nodeName:}" failed. No retries permitted until 2026-04-17 20:52:47.239551664 +0000 UTC m=+111.706992575 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/99cf7a02-6c85-4f4e-b81d-d984d3c34a8c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-ktp5n" (UID: "99cf7a02-6c85-4f4e-b81d-d984d3c34a8c") : secret "cluster-monitoring-operator-tls" not found Apr 17 20:52:46.430978 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:46.430931 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-svg9p" event={"ID":"4a34696e-71c8-4053-bb98-047cc6c77df2","Type":"ContainerStarted","Data":"1cd7574db616e2b30147c64b8d9298c43811572a696ba3df0e372ff527e00885"} Apr 17 20:52:46.432075 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:46.432049 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-v47zc" event={"ID":"d30ac7a2-edef-43e4-a645-fbf9445df632","Type":"ContainerStarted","Data":"2ebe9a15319e5bd8c6fc629cc9c5ed37e0131032c7aa3c0144c1cebb458ca0d9"} Apr 17 20:52:47.047069 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:47.047024 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/43e2babb-8445-4ad9-92e2-5a48de5e20ec-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-g7ctq\" (UID: \"43e2babb-8445-4ad9-92e2-5a48de5e20ec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g7ctq" Apr 17 20:52:47.047274 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:47.047192 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 20:52:47.047355 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:47.047297 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43e2babb-8445-4ad9-92e2-5a48de5e20ec-samples-operator-tls 
podName:43e2babb-8445-4ad9-92e2-5a48de5e20ec nodeName:}" failed. No retries permitted until 2026-04-17 20:52:49.047271027 +0000 UTC m=+113.514711938 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/43e2babb-8445-4ad9-92e2-5a48de5e20ec-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-g7ctq" (UID: "43e2babb-8445-4ad9-92e2-5a48de5e20ec") : secret "samples-operator-tls" not found Apr 17 20:52:47.148122 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:47.148077 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/591063be-61cb-4346-b946-b6ad0d833153-metrics-certs\") pod \"router-default-75858d479b-s6j25\" (UID: \"591063be-61cb-4346-b946-b6ad0d833153\") " pod="openshift-ingress/router-default-75858d479b-s6j25" Apr 17 20:52:47.148322 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:47.148144 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/591063be-61cb-4346-b946-b6ad0d833153-service-ca-bundle\") pod \"router-default-75858d479b-s6j25\" (UID: \"591063be-61cb-4346-b946-b6ad0d833153\") " pod="openshift-ingress/router-default-75858d479b-s6j25" Apr 17 20:52:47.148322 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:47.148238 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 20:52:47.148322 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:47.148312 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/591063be-61cb-4346-b946-b6ad0d833153-metrics-certs podName:591063be-61cb-4346-b946-b6ad0d833153 nodeName:}" failed. No retries permitted until 2026-04-17 20:52:49.148294574 +0000 UTC m=+113.615735498 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/591063be-61cb-4346-b946-b6ad0d833153-metrics-certs") pod "router-default-75858d479b-s6j25" (UID: "591063be-61cb-4346-b946-b6ad0d833153") : secret "router-metrics-certs-default" not found Apr 17 20:52:47.148459 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:47.148325 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/591063be-61cb-4346-b946-b6ad0d833153-service-ca-bundle podName:591063be-61cb-4346-b946-b6ad0d833153 nodeName:}" failed. No retries permitted until 2026-04-17 20:52:49.148319744 +0000 UTC m=+113.615760655 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/591063be-61cb-4346-b946-b6ad0d833153-service-ca-bundle") pod "router-default-75858d479b-s6j25" (UID: "591063be-61cb-4346-b946-b6ad0d833153") : configmap references non-existent config key: service-ca.crt Apr 17 20:52:47.248923 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:47.248886 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/99cf7a02-6c85-4f4e-b81d-d984d3c34a8c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-ktp5n\" (UID: \"99cf7a02-6c85-4f4e-b81d-d984d3c34a8c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ktp5n" Apr 17 20:52:47.249114 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:47.249049 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 20:52:47.249184 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:47.249129 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99cf7a02-6c85-4f4e-b81d-d984d3c34a8c-cluster-monitoring-operator-tls podName:99cf7a02-6c85-4f4e-b81d-d984d3c34a8c nodeName:}" failed. 
No retries permitted until 2026-04-17 20:52:49.249109038 +0000 UTC m=+113.716549961 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/99cf7a02-6c85-4f4e-b81d-d984d3c34a8c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-ktp5n" (UID: "99cf7a02-6c85-4f4e-b81d-d984d3c34a8c") : secret "cluster-monitoring-operator-tls" not found Apr 17 20:52:48.437669 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:48.437631 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-svg9p" event={"ID":"4a34696e-71c8-4053-bb98-047cc6c77df2","Type":"ContainerStarted","Data":"f8c1dc81e0c9adee6ddaeee5962825f8678784606ed7807bbe5e5edc4f2e8f9f"} Apr 17 20:52:48.438990 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:48.438971 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v47zc_d30ac7a2-edef-43e4-a645-fbf9445df632/console-operator/0.log" Apr 17 20:52:48.439085 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:48.439010 2575 generic.go:358] "Generic (PLEG): container finished" podID="d30ac7a2-edef-43e4-a645-fbf9445df632" containerID="4f0855d227268a4ef0b3b8b961d56e86a3cd122ed972bfe2aadecb4e582da0cb" exitCode=255 Apr 17 20:52:48.439085 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:48.439038 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-v47zc" event={"ID":"d30ac7a2-edef-43e4-a645-fbf9445df632","Type":"ContainerDied","Data":"4f0855d227268a4ef0b3b8b961d56e86a3cd122ed972bfe2aadecb4e582da0cb"} Apr 17 20:52:48.439267 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:48.439254 2575 scope.go:117] "RemoveContainer" containerID="4f0855d227268a4ef0b3b8b961d56e86a3cd122ed972bfe2aadecb4e582da0cb" Apr 17 20:52:48.449375 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:48.449337 2575 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-svg9p" podStartSLOduration=1.6886680379999999 podStartE2EDuration="3.449324831s" podCreationTimestamp="2026-04-17 20:52:45 +0000 UTC" firstStartedPulling="2026-04-17 20:52:45.902207111 +0000 UTC m=+110.369648025" lastFinishedPulling="2026-04-17 20:52:47.662863891 +0000 UTC m=+112.130304818" observedRunningTime="2026-04-17 20:52:48.448872702 +0000 UTC m=+112.916313634" watchObservedRunningTime="2026-04-17 20:52:48.449324831 +0000 UTC m=+112.916765764" Apr 17 20:52:49.062151 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:49.062103 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/43e2babb-8445-4ad9-92e2-5a48de5e20ec-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-g7ctq\" (UID: \"43e2babb-8445-4ad9-92e2-5a48de5e20ec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g7ctq" Apr 17 20:52:49.062355 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:49.062274 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 20:52:49.062355 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:49.062343 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43e2babb-8445-4ad9-92e2-5a48de5e20ec-samples-operator-tls podName:43e2babb-8445-4ad9-92e2-5a48de5e20ec nodeName:}" failed. No retries permitted until 2026-04-17 20:52:53.062327471 +0000 UTC m=+117.529768382 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/43e2babb-8445-4ad9-92e2-5a48de5e20ec-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-g7ctq" (UID: "43e2babb-8445-4ad9-92e2-5a48de5e20ec") : secret "samples-operator-tls" not found Apr 17 20:52:49.162791 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:49.162745 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/591063be-61cb-4346-b946-b6ad0d833153-metrics-certs\") pod \"router-default-75858d479b-s6j25\" (UID: \"591063be-61cb-4346-b946-b6ad0d833153\") " pod="openshift-ingress/router-default-75858d479b-s6j25" Apr 17 20:52:49.162904 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:49.162812 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/591063be-61cb-4346-b946-b6ad0d833153-service-ca-bundle\") pod \"router-default-75858d479b-s6j25\" (UID: \"591063be-61cb-4346-b946-b6ad0d833153\") " pod="openshift-ingress/router-default-75858d479b-s6j25" Apr 17 20:52:49.162958 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:49.162915 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 20:52:49.162958 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:49.162933 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/591063be-61cb-4346-b946-b6ad0d833153-service-ca-bundle podName:591063be-61cb-4346-b946-b6ad0d833153 nodeName:}" failed. No retries permitted until 2026-04-17 20:52:53.162914319 +0000 UTC m=+117.630355233 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/591063be-61cb-4346-b946-b6ad0d833153-service-ca-bundle") pod "router-default-75858d479b-s6j25" (UID: "591063be-61cb-4346-b946-b6ad0d833153") : configmap references non-existent config key: service-ca.crt Apr 17 20:52:49.163049 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:49.162977 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/591063be-61cb-4346-b946-b6ad0d833153-metrics-certs podName:591063be-61cb-4346-b946-b6ad0d833153 nodeName:}" failed. No retries permitted until 2026-04-17 20:52:53.162959778 +0000 UTC m=+117.630400706 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/591063be-61cb-4346-b946-b6ad0d833153-metrics-certs") pod "router-default-75858d479b-s6j25" (UID: "591063be-61cb-4346-b946-b6ad0d833153") : secret "router-metrics-certs-default" not found Apr 17 20:52:49.264060 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:49.264021 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/99cf7a02-6c85-4f4e-b81d-d984d3c34a8c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-ktp5n\" (UID: \"99cf7a02-6c85-4f4e-b81d-d984d3c34a8c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ktp5n" Apr 17 20:52:49.264187 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:49.264165 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 20:52:49.264265 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:49.264254 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99cf7a02-6c85-4f4e-b81d-d984d3c34a8c-cluster-monitoring-operator-tls podName:99cf7a02-6c85-4f4e-b81d-d984d3c34a8c nodeName:}" failed. 
No retries permitted until 2026-04-17 20:52:53.264237542 +0000 UTC m=+117.731678466 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/99cf7a02-6c85-4f4e-b81d-d984d3c34a8c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-ktp5n" (UID: "99cf7a02-6c85-4f4e-b81d-d984d3c34a8c") : secret "cluster-monitoring-operator-tls" not found Apr 17 20:52:49.441820 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:49.441748 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v47zc_d30ac7a2-edef-43e4-a645-fbf9445df632/console-operator/1.log" Apr 17 20:52:49.442193 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:49.442109 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v47zc_d30ac7a2-edef-43e4-a645-fbf9445df632/console-operator/0.log" Apr 17 20:52:49.442193 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:49.442140 2575 generic.go:358] "Generic (PLEG): container finished" podID="d30ac7a2-edef-43e4-a645-fbf9445df632" containerID="5cbea29e348c53a6ee579a46679d18cf5661f2998c10d14fc92e2eb766c0b853" exitCode=255 Apr 17 20:52:49.442193 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:49.442172 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-v47zc" event={"ID":"d30ac7a2-edef-43e4-a645-fbf9445df632","Type":"ContainerDied","Data":"5cbea29e348c53a6ee579a46679d18cf5661f2998c10d14fc92e2eb766c0b853"} Apr 17 20:52:49.442323 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:49.442212 2575 scope.go:117] "RemoveContainer" containerID="4f0855d227268a4ef0b3b8b961d56e86a3cd122ed972bfe2aadecb4e582da0cb" Apr 17 20:52:49.442493 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:49.442474 2575 scope.go:117] "RemoveContainer" containerID="5cbea29e348c53a6ee579a46679d18cf5661f2998c10d14fc92e2eb766c0b853" Apr 
17 20:52:49.442685 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:49.442663 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-v47zc_openshift-console-operator(d30ac7a2-edef-43e4-a645-fbf9445df632)\"" pod="openshift-console-operator/console-operator-9d4b6777b-v47zc" podUID="d30ac7a2-edef-43e4-a645-fbf9445df632" Apr 17 20:52:50.445309 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:50.445279 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v47zc_d30ac7a2-edef-43e4-a645-fbf9445df632/console-operator/1.log" Apr 17 20:52:50.445682 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:50.445632 2575 scope.go:117] "RemoveContainer" containerID="5cbea29e348c53a6ee579a46679d18cf5661f2998c10d14fc92e2eb766c0b853" Apr 17 20:52:50.445798 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:50.445781 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-v47zc_openshift-console-operator(d30ac7a2-edef-43e4-a645-fbf9445df632)\"" pod="openshift-console-operator/console-operator-9d4b6777b-v47zc" podUID="d30ac7a2-edef-43e4-a645-fbf9445df632" Apr 17 20:52:51.137872 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:51.137839 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-4lcdx"] Apr 17 20:52:51.141852 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:51.141836 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4lcdx" Apr 17 20:52:51.144042 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:51.144020 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-q8hfb\"" Apr 17 20:52:51.150366 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:51.150347 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-4lcdx"] Apr 17 20:52:51.181848 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:51.181805 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpvrj\" (UniqueName: \"kubernetes.io/projected/8b78fb10-ec75-4579-95cd-a89556a6bc0f-kube-api-access-gpvrj\") pod \"network-check-source-8894fc9bd-4lcdx\" (UID: \"8b78fb10-ec75-4579-95cd-a89556a6bc0f\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4lcdx" Apr 17 20:52:51.282871 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:51.282836 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gpvrj\" (UniqueName: \"kubernetes.io/projected/8b78fb10-ec75-4579-95cd-a89556a6bc0f-kube-api-access-gpvrj\") pod \"network-check-source-8894fc9bd-4lcdx\" (UID: \"8b78fb10-ec75-4579-95cd-a89556a6bc0f\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4lcdx" Apr 17 20:52:51.290810 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:51.290779 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpvrj\" (UniqueName: \"kubernetes.io/projected/8b78fb10-ec75-4579-95cd-a89556a6bc0f-kube-api-access-gpvrj\") pod \"network-check-source-8894fc9bd-4lcdx\" (UID: \"8b78fb10-ec75-4579-95cd-a89556a6bc0f\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4lcdx" Apr 17 20:52:51.449736 ip-10-0-132-12 kubenswrapper[2575]: I0417 
20:52:51.449665 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4lcdx" Apr 17 20:52:51.562916 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:51.562884 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-4lcdx"] Apr 17 20:52:51.566403 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:52:51.566377 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b78fb10_ec75_4579_95cd_a89556a6bc0f.slice/crio-ccaf0f4c3312d59a25213a84b715b3fb3c9fd388bd41d93cd1325fe9ace7fc09 WatchSource:0}: Error finding container ccaf0f4c3312d59a25213a84b715b3fb3c9fd388bd41d93cd1325fe9ace7fc09: Status 404 returned error can't find the container with id ccaf0f4c3312d59a25213a84b715b3fb3c9fd388bd41d93cd1325fe9ace7fc09 Apr 17 20:52:52.008176 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:52.008138 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-dxl2j"] Apr 17 20:52:52.011421 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:52.011402 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-dxl2j" Apr 17 20:52:52.013788 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:52.013761 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 17 20:52:52.013917 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:52.013764 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-qnrgx\"" Apr 17 20:52:52.013917 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:52.013795 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 17 20:52:52.014741 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:52.014725 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 17 20:52:52.014804 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:52.014757 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 17 20:52:52.020281 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:52.020261 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-dxl2j"] Apr 17 20:52:52.087808 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:52.087769 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7dj5\" (UniqueName: \"kubernetes.io/projected/aace9850-df38-4122-ae2a-8f7e9d1a7373-kube-api-access-h7dj5\") pod \"service-ca-865cb79987-dxl2j\" (UID: \"aace9850-df38-4122-ae2a-8f7e9d1a7373\") " pod="openshift-service-ca/service-ca-865cb79987-dxl2j" Apr 17 20:52:52.087999 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:52.087838 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/aace9850-df38-4122-ae2a-8f7e9d1a7373-signing-key\") pod \"service-ca-865cb79987-dxl2j\" (UID: \"aace9850-df38-4122-ae2a-8f7e9d1a7373\") " pod="openshift-service-ca/service-ca-865cb79987-dxl2j" Apr 17 20:52:52.087999 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:52.087903 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/aace9850-df38-4122-ae2a-8f7e9d1a7373-signing-cabundle\") pod \"service-ca-865cb79987-dxl2j\" (UID: \"aace9850-df38-4122-ae2a-8f7e9d1a7373\") " pod="openshift-service-ca/service-ca-865cb79987-dxl2j" Apr 17 20:52:52.188793 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:52.188752 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/aace9850-df38-4122-ae2a-8f7e9d1a7373-signing-cabundle\") pod \"service-ca-865cb79987-dxl2j\" (UID: \"aace9850-df38-4122-ae2a-8f7e9d1a7373\") " pod="openshift-service-ca/service-ca-865cb79987-dxl2j" Apr 17 20:52:52.188961 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:52.188855 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h7dj5\" (UniqueName: \"kubernetes.io/projected/aace9850-df38-4122-ae2a-8f7e9d1a7373-kube-api-access-h7dj5\") pod \"service-ca-865cb79987-dxl2j\" (UID: \"aace9850-df38-4122-ae2a-8f7e9d1a7373\") " pod="openshift-service-ca/service-ca-865cb79987-dxl2j" Apr 17 20:52:52.188961 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:52.188916 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/aace9850-df38-4122-ae2a-8f7e9d1a7373-signing-key\") pod \"service-ca-865cb79987-dxl2j\" (UID: \"aace9850-df38-4122-ae2a-8f7e9d1a7373\") " pod="openshift-service-ca/service-ca-865cb79987-dxl2j" Apr 17 20:52:52.189540 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:52.189521 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/aace9850-df38-4122-ae2a-8f7e9d1a7373-signing-cabundle\") pod \"service-ca-865cb79987-dxl2j\" (UID: \"aace9850-df38-4122-ae2a-8f7e9d1a7373\") " pod="openshift-service-ca/service-ca-865cb79987-dxl2j" Apr 17 20:52:52.191201 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:52.191184 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/aace9850-df38-4122-ae2a-8f7e9d1a7373-signing-key\") pod \"service-ca-865cb79987-dxl2j\" (UID: \"aace9850-df38-4122-ae2a-8f7e9d1a7373\") " pod="openshift-service-ca/service-ca-865cb79987-dxl2j" Apr 17 20:52:52.197667 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:52.197648 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7dj5\" (UniqueName: \"kubernetes.io/projected/aace9850-df38-4122-ae2a-8f7e9d1a7373-kube-api-access-h7dj5\") pod \"service-ca-865cb79987-dxl2j\" (UID: \"aace9850-df38-4122-ae2a-8f7e9d1a7373\") " pod="openshift-service-ca/service-ca-865cb79987-dxl2j" Apr 17 20:52:52.321163 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:52.321125 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-dxl2j" Apr 17 20:52:52.435337 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:52.435306 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-dxl2j"] Apr 17 20:52:52.438058 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:52:52.438032 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaace9850_df38_4122_ae2a_8f7e9d1a7373.slice/crio-2c77ff08b3677d4f14f9ccfb020b528955c4c0ef5b3cefc5dfe788c68f370c1c WatchSource:0}: Error finding container 2c77ff08b3677d4f14f9ccfb020b528955c4c0ef5b3cefc5dfe788c68f370c1c: Status 404 returned error can't find the container with id 2c77ff08b3677d4f14f9ccfb020b528955c4c0ef5b3cefc5dfe788c68f370c1c Apr 17 20:52:52.449365 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:52.449341 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-dxl2j" event={"ID":"aace9850-df38-4122-ae2a-8f7e9d1a7373","Type":"ContainerStarted","Data":"2c77ff08b3677d4f14f9ccfb020b528955c4c0ef5b3cefc5dfe788c68f370c1c"} Apr 17 20:52:52.450487 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:52.450464 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4lcdx" event={"ID":"8b78fb10-ec75-4579-95cd-a89556a6bc0f","Type":"ContainerStarted","Data":"c56d58326d2cc302ecf017160c543dba892635d3951cf9d743a3d5ffcf31aeb4"} Apr 17 20:52:52.450783 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:52.450496 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4lcdx" event={"ID":"8b78fb10-ec75-4579-95cd-a89556a6bc0f","Type":"ContainerStarted","Data":"ccaf0f4c3312d59a25213a84b715b3fb3c9fd388bd41d93cd1325fe9ace7fc09"} Apr 17 20:52:52.465507 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:52.465467 2575 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4lcdx" podStartSLOduration=1.465453459 podStartE2EDuration="1.465453459s" podCreationTimestamp="2026-04-17 20:52:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:52:52.464669035 +0000 UTC m=+116.932109971" watchObservedRunningTime="2026-04-17 20:52:52.465453459 +0000 UTC m=+116.932894392" Apr 17 20:52:52.541459 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:52.541429 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7rtzd_2d2c4aed-da8f-4e54-8d9e-75c4dc6a6236/dns-node-resolver/0.log" Apr 17 20:52:53.096468 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:53.096432 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/43e2babb-8445-4ad9-92e2-5a48de5e20ec-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-g7ctq\" (UID: \"43e2babb-8445-4ad9-92e2-5a48de5e20ec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g7ctq" Apr 17 20:52:53.096645 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:53.096619 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 20:52:53.096704 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:53.096686 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43e2babb-8445-4ad9-92e2-5a48de5e20ec-samples-operator-tls podName:43e2babb-8445-4ad9-92e2-5a48de5e20ec nodeName:}" failed. No retries permitted until 2026-04-17 20:53:01.096668212 +0000 UTC m=+125.564109128 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/43e2babb-8445-4ad9-92e2-5a48de5e20ec-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-g7ctq" (UID: "43e2babb-8445-4ad9-92e2-5a48de5e20ec") : secret "samples-operator-tls" not found Apr 17 20:52:53.197432 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:53.197392 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/591063be-61cb-4346-b946-b6ad0d833153-metrics-certs\") pod \"router-default-75858d479b-s6j25\" (UID: \"591063be-61cb-4346-b946-b6ad0d833153\") " pod="openshift-ingress/router-default-75858d479b-s6j25" Apr 17 20:52:53.197614 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:53.197450 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/591063be-61cb-4346-b946-b6ad0d833153-service-ca-bundle\") pod \"router-default-75858d479b-s6j25\" (UID: \"591063be-61cb-4346-b946-b6ad0d833153\") " pod="openshift-ingress/router-default-75858d479b-s6j25" Apr 17 20:52:53.197614 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:53.197534 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 20:52:53.197614 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:53.197602 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/591063be-61cb-4346-b946-b6ad0d833153-metrics-certs podName:591063be-61cb-4346-b946-b6ad0d833153 nodeName:}" failed. No retries permitted until 2026-04-17 20:53:01.197585951 +0000 UTC m=+125.665026879 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/591063be-61cb-4346-b946-b6ad0d833153-metrics-certs") pod "router-default-75858d479b-s6j25" (UID: "591063be-61cb-4346-b946-b6ad0d833153") : secret "router-metrics-certs-default" not found Apr 17 20:52:53.197768 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:53.197620 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/591063be-61cb-4346-b946-b6ad0d833153-service-ca-bundle podName:591063be-61cb-4346-b946-b6ad0d833153 nodeName:}" failed. No retries permitted until 2026-04-17 20:53:01.197612903 +0000 UTC m=+125.665053814 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/591063be-61cb-4346-b946-b6ad0d833153-service-ca-bundle") pod "router-default-75858d479b-s6j25" (UID: "591063be-61cb-4346-b946-b6ad0d833153") : configmap references non-existent config key: service-ca.crt Apr 17 20:52:53.298178 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:53.298145 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/99cf7a02-6c85-4f4e-b81d-d984d3c34a8c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-ktp5n\" (UID: \"99cf7a02-6c85-4f4e-b81d-d984d3c34a8c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ktp5n" Apr 17 20:52:53.298353 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:53.298300 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 20:52:53.298406 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:53.298371 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99cf7a02-6c85-4f4e-b81d-d984d3c34a8c-cluster-monitoring-operator-tls podName:99cf7a02-6c85-4f4e-b81d-d984d3c34a8c nodeName:}" failed. 
No retries permitted until 2026-04-17 20:53:01.298352155 +0000 UTC m=+125.765793065 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/99cf7a02-6c85-4f4e-b81d-d984d3c34a8c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-ktp5n" (UID: "99cf7a02-6c85-4f4e-b81d-d984d3c34a8c") : secret "cluster-monitoring-operator-tls" not found Apr 17 20:52:53.541739 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:53.541705 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ddpb6_5f57b5b6-848e-48b0-b1ca-e8a8c29c446c/node-ca/0.log" Apr 17 20:52:54.457623 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:54.457588 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-dxl2j" event={"ID":"aace9850-df38-4122-ae2a-8f7e9d1a7373","Type":"ContainerStarted","Data":"0a8fdf41b3878f6c873bb1551b61a79941b35ef0121fc68f738c70135f215b08"} Apr 17 20:52:54.472197 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:54.472138 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-dxl2j" podStartSLOduration=1.687396787 podStartE2EDuration="3.472120696s" podCreationTimestamp="2026-04-17 20:52:51 +0000 UTC" firstStartedPulling="2026-04-17 20:52:52.439819108 +0000 UTC m=+116.907260020" lastFinishedPulling="2026-04-17 20:52:54.224543007 +0000 UTC m=+118.691983929" observedRunningTime="2026-04-17 20:52:54.471059756 +0000 UTC m=+118.938500689" watchObservedRunningTime="2026-04-17 20:52:54.472120696 +0000 UTC m=+118.939561632" Apr 17 20:52:55.693337 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:55.693295 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-v47zc" Apr 17 20:52:55.693337 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:55.693348 2575 kubelet.go:2658] 
"SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-v47zc" Apr 17 20:52:55.693870 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:52:55.693787 2575 scope.go:117] "RemoveContainer" containerID="5cbea29e348c53a6ee579a46679d18cf5661f2998c10d14fc92e2eb766c0b853" Apr 17 20:52:55.694021 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:52:55.693997 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-v47zc_openshift-console-operator(d30ac7a2-edef-43e4-a645-fbf9445df632)\"" pod="openshift-console-operator/console-operator-9d4b6777b-v47zc" podUID="d30ac7a2-edef-43e4-a645-fbf9445df632" Apr 17 20:53:01.160301 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:01.160267 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/43e2babb-8445-4ad9-92e2-5a48de5e20ec-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-g7ctq\" (UID: \"43e2babb-8445-4ad9-92e2-5a48de5e20ec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g7ctq" Apr 17 20:53:01.162773 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:01.162741 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/43e2babb-8445-4ad9-92e2-5a48de5e20ec-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-g7ctq\" (UID: \"43e2babb-8445-4ad9-92e2-5a48de5e20ec\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g7ctq" Apr 17 20:53:01.261479 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:01.261437 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/591063be-61cb-4346-b946-b6ad0d833153-metrics-certs\") pod \"router-default-75858d479b-s6j25\" (UID: \"591063be-61cb-4346-b946-b6ad0d833153\") " pod="openshift-ingress/router-default-75858d479b-s6j25" Apr 17 20:53:01.261675 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:01.261489 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/591063be-61cb-4346-b946-b6ad0d833153-service-ca-bundle\") pod \"router-default-75858d479b-s6j25\" (UID: \"591063be-61cb-4346-b946-b6ad0d833153\") " pod="openshift-ingress/router-default-75858d479b-s6j25" Apr 17 20:53:01.261675 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:53:01.261661 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/591063be-61cb-4346-b946-b6ad0d833153-service-ca-bundle podName:591063be-61cb-4346-b946-b6ad0d833153 nodeName:}" failed. No retries permitted until 2026-04-17 20:53:17.261641187 +0000 UTC m=+141.729082117 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/591063be-61cb-4346-b946-b6ad0d833153-service-ca-bundle") pod "router-default-75858d479b-s6j25" (UID: "591063be-61cb-4346-b946-b6ad0d833153") : configmap references non-existent config key: service-ca.crt Apr 17 20:53:01.263793 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:01.263768 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/591063be-61cb-4346-b946-b6ad0d833153-metrics-certs\") pod \"router-default-75858d479b-s6j25\" (UID: \"591063be-61cb-4346-b946-b6ad0d833153\") " pod="openshift-ingress/router-default-75858d479b-s6j25" Apr 17 20:53:01.291008 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:01.290982 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-2ljwr\"" Apr 17 20:53:01.298987 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:01.298968 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g7ctq" Apr 17 20:53:01.362446 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:01.362331 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/99cf7a02-6c85-4f4e-b81d-d984d3c34a8c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-ktp5n\" (UID: \"99cf7a02-6c85-4f4e-b81d-d984d3c34a8c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ktp5n" Apr 17 20:53:01.362604 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:53:01.362448 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 20:53:01.362604 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:53:01.362533 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99cf7a02-6c85-4f4e-b81d-d984d3c34a8c-cluster-monitoring-operator-tls podName:99cf7a02-6c85-4f4e-b81d-d984d3c34a8c nodeName:}" failed. No retries permitted until 2026-04-17 20:53:17.362513239 +0000 UTC m=+141.829954150 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/99cf7a02-6c85-4f4e-b81d-d984d3c34a8c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-ktp5n" (UID: "99cf7a02-6c85-4f4e-b81d-d984d3c34a8c") : secret "cluster-monitoring-operator-tls" not found Apr 17 20:53:01.415808 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:01.415785 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g7ctq"] Apr 17 20:53:01.476333 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:01.476304 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g7ctq" event={"ID":"43e2babb-8445-4ad9-92e2-5a48de5e20ec","Type":"ContainerStarted","Data":"d6da075476ca32e2e2e878b653087294531efe6f496260f091441491f5e3e77d"} Apr 17 20:53:03.483788 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:03.483743 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g7ctq" event={"ID":"43e2babb-8445-4ad9-92e2-5a48de5e20ec","Type":"ContainerStarted","Data":"afcaa67450190bb83dc75f37075e3c8a52493320c806b88feb1190fff1056357"} Apr 17 20:53:03.483788 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:03.483785 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g7ctq" event={"ID":"43e2babb-8445-4ad9-92e2-5a48de5e20ec","Type":"ContainerStarted","Data":"b5ab2f00ae50d4f13e6ae9c5f7960fec3dc120a2002b92f54190fbc5f8ee58c6"} Apr 17 20:53:03.498433 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:03.498386 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-g7ctq" podStartSLOduration=17.004476851 podStartE2EDuration="18.498372913s" 
podCreationTimestamp="2026-04-17 20:52:45 +0000 UTC" firstStartedPulling="2026-04-17 20:53:01.452440229 +0000 UTC m=+125.919881157" lastFinishedPulling="2026-04-17 20:53:02.946336305 +0000 UTC m=+127.413777219" observedRunningTime="2026-04-17 20:53:03.498070582 +0000 UTC m=+127.965511515" watchObservedRunningTime="2026-04-17 20:53:03.498372913 +0000 UTC m=+127.965813879" Apr 17 20:53:05.799466 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:05.799435 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03974a46-a9e1-4161-8f82-8e72fdfcb759-metrics-certs\") pod \"network-metrics-daemon-gn7zb\" (UID: \"03974a46-a9e1-4161-8f82-8e72fdfcb759\") " pod="openshift-multus/network-metrics-daemon-gn7zb" Apr 17 20:53:05.803356 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:05.803328 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/03974a46-a9e1-4161-8f82-8e72fdfcb759-metrics-certs\") pod \"network-metrics-daemon-gn7zb\" (UID: \"03974a46-a9e1-4161-8f82-8e72fdfcb759\") " pod="openshift-multus/network-metrics-daemon-gn7zb" Apr 17 20:53:06.042079 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:06.042039 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-f22jz\"" Apr 17 20:53:06.049824 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:06.049762 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gn7zb" Apr 17 20:53:06.166638 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:06.166604 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gn7zb"] Apr 17 20:53:06.169583 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:53:06.169559 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03974a46_a9e1_4161_8f82_8e72fdfcb759.slice/crio-47d5cdf4ad489f56cc803da33485e4f6373c74d21dc03bcb6f35732611abd6a4 WatchSource:0}: Error finding container 47d5cdf4ad489f56cc803da33485e4f6373c74d21dc03bcb6f35732611abd6a4: Status 404 returned error can't find the container with id 47d5cdf4ad489f56cc803da33485e4f6373c74d21dc03bcb6f35732611abd6a4 Apr 17 20:53:06.491642 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:06.491557 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gn7zb" event={"ID":"03974a46-a9e1-4161-8f82-8e72fdfcb759","Type":"ContainerStarted","Data":"47d5cdf4ad489f56cc803da33485e4f6373c74d21dc03bcb6f35732611abd6a4"} Apr 17 20:53:07.500057 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:07.499972 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gn7zb" event={"ID":"03974a46-a9e1-4161-8f82-8e72fdfcb759","Type":"ContainerStarted","Data":"7dcf38454c655d8f17788317ae819dd5aaaa172b6a5b5181f51b83350fc7e9c7"} Apr 17 20:53:07.500057 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:07.500011 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gn7zb" event={"ID":"03974a46-a9e1-4161-8f82-8e72fdfcb759","Type":"ContainerStarted","Data":"711bb37b4f6965bbd1ef7949a7e9e0d07272ffafedd1bfb2cf9841517aa82aae"} Apr 17 20:53:07.518270 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:07.518214 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-gn7zb" podStartSLOduration=130.547907464 podStartE2EDuration="2m11.51820065s" podCreationTimestamp="2026-04-17 20:50:56 +0000 UTC" firstStartedPulling="2026-04-17 20:53:06.171772373 +0000 UTC m=+130.639213284" lastFinishedPulling="2026-04-17 20:53:07.142065551 +0000 UTC m=+131.609506470" observedRunningTime="2026-04-17 20:53:07.517365537 +0000 UTC m=+131.984806469" watchObservedRunningTime="2026-04-17 20:53:07.51820065 +0000 UTC m=+131.985641574" Apr 17 20:53:10.129703 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:10.129664 2575 scope.go:117] "RemoveContainer" containerID="5cbea29e348c53a6ee579a46679d18cf5661f2998c10d14fc92e2eb766c0b853" Apr 17 20:53:10.510322 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:10.510240 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v47zc_d30ac7a2-edef-43e4-a645-fbf9445df632/console-operator/2.log" Apr 17 20:53:10.510633 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:10.510615 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v47zc_d30ac7a2-edef-43e4-a645-fbf9445df632/console-operator/1.log" Apr 17 20:53:10.510685 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:10.510650 2575 generic.go:358] "Generic (PLEG): container finished" podID="d30ac7a2-edef-43e4-a645-fbf9445df632" containerID="cd851ab049011ecb9c4e5c9a396e70f6d7e3da2092cb20c6ba4e71154b397812" exitCode=255 Apr 17 20:53:10.510685 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:10.510676 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-v47zc" event={"ID":"d30ac7a2-edef-43e4-a645-fbf9445df632","Type":"ContainerDied","Data":"cd851ab049011ecb9c4e5c9a396e70f6d7e3da2092cb20c6ba4e71154b397812"} Apr 17 20:53:10.510745 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:10.510701 2575 scope.go:117] "RemoveContainer" 
containerID="5cbea29e348c53a6ee579a46679d18cf5661f2998c10d14fc92e2eb766c0b853" Apr 17 20:53:10.511013 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:10.510997 2575 scope.go:117] "RemoveContainer" containerID="cd851ab049011ecb9c4e5c9a396e70f6d7e3da2092cb20c6ba4e71154b397812" Apr 17 20:53:10.511200 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:53:10.511181 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-v47zc_openshift-console-operator(d30ac7a2-edef-43e4-a645-fbf9445df632)\"" pod="openshift-console-operator/console-operator-9d4b6777b-v47zc" podUID="d30ac7a2-edef-43e4-a645-fbf9445df632" Apr 17 20:53:10.748078 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:10.748042 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-6jl5b"] Apr 17 20:53:10.751446 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:10.751422 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6jl5b" Apr 17 20:53:10.754236 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:10.754204 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 20:53:10.754872 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:10.754853 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 20:53:10.755272 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:10.755258 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 20:53:10.755540 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:10.755518 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 20:53:10.755540 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:10.755535 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-j2pjg\"" Apr 17 20:53:10.761156 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:10.761093 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6jl5b"] Apr 17 20:53:10.842496 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:10.842463 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5a0eab0c-9c9b-4c99-9117-ed93ab036378-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6jl5b\" (UID: \"5a0eab0c-9c9b-4c99-9117-ed93ab036378\") " pod="openshift-insights/insights-runtime-extractor-6jl5b" Apr 17 20:53:10.842496 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:10.842499 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5a0eab0c-9c9b-4c99-9117-ed93ab036378-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6jl5b\" (UID: \"5a0eab0c-9c9b-4c99-9117-ed93ab036378\") " pod="openshift-insights/insights-runtime-extractor-6jl5b" Apr 17 20:53:10.842735 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:10.842615 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5a0eab0c-9c9b-4c99-9117-ed93ab036378-crio-socket\") pod \"insights-runtime-extractor-6jl5b\" (UID: \"5a0eab0c-9c9b-4c99-9117-ed93ab036378\") " pod="openshift-insights/insights-runtime-extractor-6jl5b" Apr 17 20:53:10.842735 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:10.842650 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbvrr\" (UniqueName: \"kubernetes.io/projected/5a0eab0c-9c9b-4c99-9117-ed93ab036378-kube-api-access-kbvrr\") pod \"insights-runtime-extractor-6jl5b\" (UID: \"5a0eab0c-9c9b-4c99-9117-ed93ab036378\") " pod="openshift-insights/insights-runtime-extractor-6jl5b" Apr 17 20:53:10.842735 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:10.842718 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5a0eab0c-9c9b-4c99-9117-ed93ab036378-data-volume\") pod \"insights-runtime-extractor-6jl5b\" (UID: \"5a0eab0c-9c9b-4c99-9117-ed93ab036378\") " pod="openshift-insights/insights-runtime-extractor-6jl5b" Apr 17 20:53:10.943924 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:10.943891 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5a0eab0c-9c9b-4c99-9117-ed93ab036378-crio-socket\") pod \"insights-runtime-extractor-6jl5b\" (UID: \"5a0eab0c-9c9b-4c99-9117-ed93ab036378\") " 
pod="openshift-insights/insights-runtime-extractor-6jl5b" Apr 17 20:53:10.943924 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:10.943923 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbvrr\" (UniqueName: \"kubernetes.io/projected/5a0eab0c-9c9b-4c99-9117-ed93ab036378-kube-api-access-kbvrr\") pod \"insights-runtime-extractor-6jl5b\" (UID: \"5a0eab0c-9c9b-4c99-9117-ed93ab036378\") " pod="openshift-insights/insights-runtime-extractor-6jl5b" Apr 17 20:53:10.944122 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:10.943957 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5a0eab0c-9c9b-4c99-9117-ed93ab036378-data-volume\") pod \"insights-runtime-extractor-6jl5b\" (UID: \"5a0eab0c-9c9b-4c99-9117-ed93ab036378\") " pod="openshift-insights/insights-runtime-extractor-6jl5b" Apr 17 20:53:10.944122 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:10.944021 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5a0eab0c-9c9b-4c99-9117-ed93ab036378-crio-socket\") pod \"insights-runtime-extractor-6jl5b\" (UID: \"5a0eab0c-9c9b-4c99-9117-ed93ab036378\") " pod="openshift-insights/insights-runtime-extractor-6jl5b" Apr 17 20:53:10.944194 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:10.944135 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5a0eab0c-9c9b-4c99-9117-ed93ab036378-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6jl5b\" (UID: \"5a0eab0c-9c9b-4c99-9117-ed93ab036378\") " pod="openshift-insights/insights-runtime-extractor-6jl5b" Apr 17 20:53:10.944194 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:10.944170 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/5a0eab0c-9c9b-4c99-9117-ed93ab036378-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6jl5b\" (UID: \"5a0eab0c-9c9b-4c99-9117-ed93ab036378\") " pod="openshift-insights/insights-runtime-extractor-6jl5b" Apr 17 20:53:10.944353 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:10.944337 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5a0eab0c-9c9b-4c99-9117-ed93ab036378-data-volume\") pod \"insights-runtime-extractor-6jl5b\" (UID: \"5a0eab0c-9c9b-4c99-9117-ed93ab036378\") " pod="openshift-insights/insights-runtime-extractor-6jl5b" Apr 17 20:53:10.944670 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:10.944654 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5a0eab0c-9c9b-4c99-9117-ed93ab036378-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6jl5b\" (UID: \"5a0eab0c-9c9b-4c99-9117-ed93ab036378\") " pod="openshift-insights/insights-runtime-extractor-6jl5b" Apr 17 20:53:10.946465 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:10.946441 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5a0eab0c-9c9b-4c99-9117-ed93ab036378-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6jl5b\" (UID: \"5a0eab0c-9c9b-4c99-9117-ed93ab036378\") " pod="openshift-insights/insights-runtime-extractor-6jl5b" Apr 17 20:53:10.956190 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:10.956169 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbvrr\" (UniqueName: \"kubernetes.io/projected/5a0eab0c-9c9b-4c99-9117-ed93ab036378-kube-api-access-kbvrr\") pod \"insights-runtime-extractor-6jl5b\" (UID: \"5a0eab0c-9c9b-4c99-9117-ed93ab036378\") " pod="openshift-insights/insights-runtime-extractor-6jl5b" Apr 17 20:53:11.060177 ip-10-0-132-12 
kubenswrapper[2575]: I0417 20:53:11.060146 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6jl5b" Apr 17 20:53:11.174332 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:11.174300 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6jl5b"] Apr 17 20:53:11.178026 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:53:11.178001 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a0eab0c_9c9b_4c99_9117_ed93ab036378.slice/crio-3aa30f8e8e3122db04fe8338bf8cd071201fd46e71eb5e201ce1b01ccef9a94e WatchSource:0}: Error finding container 3aa30f8e8e3122db04fe8338bf8cd071201fd46e71eb5e201ce1b01ccef9a94e: Status 404 returned error can't find the container with id 3aa30f8e8e3122db04fe8338bf8cd071201fd46e71eb5e201ce1b01ccef9a94e Apr 17 20:53:11.514422 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:11.514334 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6jl5b" event={"ID":"5a0eab0c-9c9b-4c99-9117-ed93ab036378","Type":"ContainerStarted","Data":"2ed1e78a5d2a849f01cc51af757d2c08ce9088feab4508f405a30b7882eb409a"} Apr 17 20:53:11.514422 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:11.514369 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6jl5b" event={"ID":"5a0eab0c-9c9b-4c99-9117-ed93ab036378","Type":"ContainerStarted","Data":"3aa30f8e8e3122db04fe8338bf8cd071201fd46e71eb5e201ce1b01ccef9a94e"} Apr 17 20:53:11.515513 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:11.515494 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v47zc_d30ac7a2-edef-43e4-a645-fbf9445df632/console-operator/2.log" Apr 17 20:53:12.519879 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:12.519841 2575 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6jl5b" event={"ID":"5a0eab0c-9c9b-4c99-9117-ed93ab036378","Type":"ContainerStarted","Data":"716af841b3fc3dafeebced365a43df48fcf913847cc32cf7ea9e63e9c120c5c3"} Apr 17 20:53:13.523957 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:13.523922 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6jl5b" event={"ID":"5a0eab0c-9c9b-4c99-9117-ed93ab036378","Type":"ContainerStarted","Data":"1b22373765d71c34531dc6ef6f310d1f470545051f13fce10d04064e8c7945a5"} Apr 17 20:53:13.543350 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:13.543284 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-6jl5b" podStartSLOduration=1.512038112 podStartE2EDuration="3.54326977s" podCreationTimestamp="2026-04-17 20:53:10 +0000 UTC" firstStartedPulling="2026-04-17 20:53:11.233525673 +0000 UTC m=+135.700966587" lastFinishedPulling="2026-04-17 20:53:13.264757325 +0000 UTC m=+137.732198245" observedRunningTime="2026-04-17 20:53:13.542245543 +0000 UTC m=+138.009686471" watchObservedRunningTime="2026-04-17 20:53:13.54326977 +0000 UTC m=+138.010710703" Apr 17 20:53:15.693129 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:15.693086 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-v47zc" Apr 17 20:53:15.693129 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:15.693137 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-v47zc" Apr 17 20:53:15.693674 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:15.693577 2575 scope.go:117] "RemoveContainer" containerID="cd851ab049011ecb9c4e5c9a396e70f6d7e3da2092cb20c6ba4e71154b397812" Apr 17 20:53:15.693807 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:53:15.693784 2575 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-v47zc_openshift-console-operator(d30ac7a2-edef-43e4-a645-fbf9445df632)\"" pod="openshift-console-operator/console-operator-9d4b6777b-v47zc" podUID="d30ac7a2-edef-43e4-a645-fbf9445df632"
Apr 17 20:53:17.297292 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:17.297241 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/591063be-61cb-4346-b946-b6ad0d833153-service-ca-bundle\") pod \"router-default-75858d479b-s6j25\" (UID: \"591063be-61cb-4346-b946-b6ad0d833153\") " pod="openshift-ingress/router-default-75858d479b-s6j25"
Apr 17 20:53:17.297847 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:17.297825 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/591063be-61cb-4346-b946-b6ad0d833153-service-ca-bundle\") pod \"router-default-75858d479b-s6j25\" (UID: \"591063be-61cb-4346-b946-b6ad0d833153\") " pod="openshift-ingress/router-default-75858d479b-s6j25"
Apr 17 20:53:17.398371 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:17.398328 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/99cf7a02-6c85-4f4e-b81d-d984d3c34a8c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-ktp5n\" (UID: \"99cf7a02-6c85-4f4e-b81d-d984d3c34a8c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ktp5n"
Apr 17 20:53:17.400715 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:17.400689 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/99cf7a02-6c85-4f4e-b81d-d984d3c34a8c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-ktp5n\" (UID: \"99cf7a02-6c85-4f4e-b81d-d984d3c34a8c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ktp5n"
Apr 17 20:53:17.593585 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:17.593493 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-vq7ww\""
Apr 17 20:53:17.602251 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:17.602215 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-75858d479b-s6j25"
Apr 17 20:53:17.696946 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:17.696917 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-pglw7\""
Apr 17 20:53:17.705082 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:17.705052 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ktp5n"
Apr 17 20:53:17.717541 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:17.717510 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-75858d479b-s6j25"]
Apr 17 20:53:17.721103 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:53:17.721078 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod591063be_61cb_4346_b946_b6ad0d833153.slice/crio-fdc794670902f57ae6a2edaf0fc38a7ba4eaa28e29b6975c09667ee7ad04eb69 WatchSource:0}: Error finding container fdc794670902f57ae6a2edaf0fc38a7ba4eaa28e29b6975c09667ee7ad04eb69: Status 404 returned error can't find the container with id fdc794670902f57ae6a2edaf0fc38a7ba4eaa28e29b6975c09667ee7ad04eb69
Apr 17 20:53:17.831001 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:17.830966 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-ktp5n"]
Apr 17 20:53:17.835730 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:53:17.835703 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99cf7a02_6c85_4f4e_b81d_d984d3c34a8c.slice/crio-55c2f7962ecb78e940430e3a2a52d2a6cac0ff3c9e471d2248687db619cc4fc2 WatchSource:0}: Error finding container 55c2f7962ecb78e940430e3a2a52d2a6cac0ff3c9e471d2248687db619cc4fc2: Status 404 returned error can't find the container with id 55c2f7962ecb78e940430e3a2a52d2a6cac0ff3c9e471d2248687db619cc4fc2
Apr 17 20:53:18.536663 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:18.536629 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ktp5n" event={"ID":"99cf7a02-6c85-4f4e-b81d-d984d3c34a8c","Type":"ContainerStarted","Data":"55c2f7962ecb78e940430e3a2a52d2a6cac0ff3c9e471d2248687db619cc4fc2"}
Apr 17 20:53:18.537953 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:18.537927 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-75858d479b-s6j25" event={"ID":"591063be-61cb-4346-b946-b6ad0d833153","Type":"ContainerStarted","Data":"85a99d30628f295a7382852ecfcdbeb1f3bcc76f6af877212eecbfb1a4ac725f"}
Apr 17 20:53:18.538049 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:18.537961 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-75858d479b-s6j25" event={"ID":"591063be-61cb-4346-b946-b6ad0d833153","Type":"ContainerStarted","Data":"fdc794670902f57ae6a2edaf0fc38a7ba4eaa28e29b6975c09667ee7ad04eb69"}
Apr 17 20:53:18.556028 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:18.555987 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-75858d479b-s6j25" podStartSLOduration=33.555972068 podStartE2EDuration="33.555972068s" podCreationTimestamp="2026-04-17 20:52:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:53:18.554414294 +0000 UTC m=+143.021855226" watchObservedRunningTime="2026-04-17 20:53:18.555972068 +0000 UTC m=+143.023413000"
Apr 17 20:53:18.602383 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:18.602346 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-75858d479b-s6j25"
Apr 17 20:53:18.604905 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:18.604882 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-75858d479b-s6j25"
Apr 17 20:53:19.540581 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:19.540552 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-75858d479b-s6j25"
Apr 17 20:53:19.541908 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:19.541888 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-75858d479b-s6j25"
Apr 17 20:53:20.543527 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:20.543491 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ktp5n" event={"ID":"99cf7a02-6c85-4f4e-b81d-d984d3c34a8c","Type":"ContainerStarted","Data":"23f47a3c381ec30a4bca81d1596bc842b060655c1e08bbd53299cd75ed4b76b1"}
Apr 17 20:53:20.559038 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:20.558995 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ktp5n" podStartSLOduration=33.740128871 podStartE2EDuration="35.558982788s" podCreationTimestamp="2026-04-17 20:52:45 +0000 UTC" firstStartedPulling="2026-04-17 20:53:17.837554927 +0000 UTC m=+142.304995838" lastFinishedPulling="2026-04-17 20:53:19.656408829 +0000 UTC m=+144.123849755" observedRunningTime="2026-04-17 20:53:20.558745015 +0000 UTC m=+145.026185948" watchObservedRunningTime="2026-04-17 20:53:20.558982788 +0000 UTC m=+145.026423720"
Apr 17 20:53:23.243685 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:23.243647 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-wffpx"]
Apr 17 20:53:23.248191 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:23.248170 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-wffpx"
Apr 17 20:53:23.250691 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:23.250669 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 20:53:23.250788 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:23.250670 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 17 20:53:23.251621 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:23.251601 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 17 20:53:23.251721 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:23.251657 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-5j95k\""
Apr 17 20:53:23.254471 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:23.254452 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-wffpx"]
Apr 17 20:53:23.348827 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:23.348792 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/daeab26f-6949-4dda-89cb-d8f7129704e2-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-wffpx\" (UID: \"daeab26f-6949-4dda-89cb-d8f7129704e2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-wffpx"
Apr 17 20:53:23.349028 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:23.348837 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b9mk\" (UniqueName: \"kubernetes.io/projected/daeab26f-6949-4dda-89cb-d8f7129704e2-kube-api-access-2b9mk\") pod \"prometheus-operator-5676c8c784-wffpx\" (UID: \"daeab26f-6949-4dda-89cb-d8f7129704e2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-wffpx"
Apr 17 20:53:23.349028 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:23.348952 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/daeab26f-6949-4dda-89cb-d8f7129704e2-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-wffpx\" (UID: \"daeab26f-6949-4dda-89cb-d8f7129704e2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-wffpx"
Apr 17 20:53:23.349028 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:23.349010 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/daeab26f-6949-4dda-89cb-d8f7129704e2-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-wffpx\" (UID: \"daeab26f-6949-4dda-89cb-d8f7129704e2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-wffpx"
Apr 17 20:53:23.450043 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:23.449989 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/daeab26f-6949-4dda-89cb-d8f7129704e2-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-wffpx\" (UID: \"daeab26f-6949-4dda-89cb-d8f7129704e2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-wffpx"
Apr 17 20:53:23.450043 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:23.450051 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2b9mk\" (UniqueName: \"kubernetes.io/projected/daeab26f-6949-4dda-89cb-d8f7129704e2-kube-api-access-2b9mk\") pod \"prometheus-operator-5676c8c784-wffpx\" (UID: \"daeab26f-6949-4dda-89cb-d8f7129704e2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-wffpx"
Apr 17 20:53:23.450354 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:23.450078 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/daeab26f-6949-4dda-89cb-d8f7129704e2-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-wffpx\" (UID: \"daeab26f-6949-4dda-89cb-d8f7129704e2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-wffpx"
Apr 17 20:53:23.450354 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:23.450102 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/daeab26f-6949-4dda-89cb-d8f7129704e2-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-wffpx\" (UID: \"daeab26f-6949-4dda-89cb-d8f7129704e2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-wffpx"
Apr 17 20:53:23.450809 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:23.450785 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/daeab26f-6949-4dda-89cb-d8f7129704e2-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-wffpx\" (UID: \"daeab26f-6949-4dda-89cb-d8f7129704e2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-wffpx"
Apr 17 20:53:23.452588 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:23.452563 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/daeab26f-6949-4dda-89cb-d8f7129704e2-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-wffpx\" (UID: \"daeab26f-6949-4dda-89cb-d8f7129704e2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-wffpx"
Apr 17 20:53:23.452763 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:23.452743 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/daeab26f-6949-4dda-89cb-d8f7129704e2-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-wffpx\" (UID: \"daeab26f-6949-4dda-89cb-d8f7129704e2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-wffpx"
Apr 17 20:53:23.458287 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:23.458265 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b9mk\" (UniqueName: \"kubernetes.io/projected/daeab26f-6949-4dda-89cb-d8f7129704e2-kube-api-access-2b9mk\") pod \"prometheus-operator-5676c8c784-wffpx\" (UID: \"daeab26f-6949-4dda-89cb-d8f7129704e2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-wffpx"
Apr 17 20:53:23.558825 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:23.558795 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-wffpx"
Apr 17 20:53:23.693756 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:23.693721 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-wffpx"]
Apr 17 20:53:23.697028 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:53:23.697002 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddaeab26f_6949_4dda_89cb_d8f7129704e2.slice/crio-95f5ead981ce4e69f8ff2107a1ace14baada30cbd409c29ca53d4a3fc3fdf8fa WatchSource:0}: Error finding container 95f5ead981ce4e69f8ff2107a1ace14baada30cbd409c29ca53d4a3fc3fdf8fa: Status 404 returned error can't find the container with id 95f5ead981ce4e69f8ff2107a1ace14baada30cbd409c29ca53d4a3fc3fdf8fa
Apr 17 20:53:24.555508 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:24.555463 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-wffpx" event={"ID":"daeab26f-6949-4dda-89cb-d8f7129704e2","Type":"ContainerStarted","Data":"95f5ead981ce4e69f8ff2107a1ace14baada30cbd409c29ca53d4a3fc3fdf8fa"}
Apr 17 20:53:25.559161 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:25.559123 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-wffpx" event={"ID":"daeab26f-6949-4dda-89cb-d8f7129704e2","Type":"ContainerStarted","Data":"62cab8e31676ec2067996004777db8c30b24097293abaf47981c79f14da27a2e"}
Apr 17 20:53:25.559161 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:25.559167 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-wffpx" event={"ID":"daeab26f-6949-4dda-89cb-d8f7129704e2","Type":"ContainerStarted","Data":"442f5a9463fb0c612773175424930801a3a54c45d6c4fad240b70414624f4f68"}
Apr 17 20:53:25.576077 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:25.576030 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-wffpx" podStartSLOduration=1.433597357 podStartE2EDuration="2.576014841s" podCreationTimestamp="2026-04-17 20:53:23 +0000 UTC" firstStartedPulling="2026-04-17 20:53:23.698901062 +0000 UTC m=+148.166341974" lastFinishedPulling="2026-04-17 20:53:24.841318544 +0000 UTC m=+149.308759458" observedRunningTime="2026-04-17 20:53:25.574760071 +0000 UTC m=+150.042201003" watchObservedRunningTime="2026-04-17 20:53:25.576014841 +0000 UTC m=+150.043455774"
Apr 17 20:53:27.599611 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:27.599581 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-qxw7b"]
Apr 17 20:53:27.605138 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:27.605117 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-qxw7b"
Apr 17 20:53:27.608409 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:27.608386 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 20:53:27.608848 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:27.608830 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-hmlbz\""
Apr 17 20:53:27.608971 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:27.608892 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 20:53:27.608971 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:27.608901 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 20:53:27.681729 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:27.681687 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/fc03518e-8a07-44bc-af57-c83515d5fec6-node-exporter-textfile\") pod \"node-exporter-qxw7b\" (UID: \"fc03518e-8a07-44bc-af57-c83515d5fec6\") " pod="openshift-monitoring/node-exporter-qxw7b"
Apr 17 20:53:27.681966 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:27.681740 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fc03518e-8a07-44bc-af57-c83515d5fec6-sys\") pod \"node-exporter-qxw7b\" (UID: \"fc03518e-8a07-44bc-af57-c83515d5fec6\") " pod="openshift-monitoring/node-exporter-qxw7b"
Apr 17 20:53:27.681966 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:27.681792 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/fc03518e-8a07-44bc-af57-c83515d5fec6-node-exporter-accelerators-collector-config\") pod \"node-exporter-qxw7b\" (UID: \"fc03518e-8a07-44bc-af57-c83515d5fec6\") " pod="openshift-monitoring/node-exporter-qxw7b"
Apr 17 20:53:27.681966 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:27.681818 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/fc03518e-8a07-44bc-af57-c83515d5fec6-node-exporter-tls\") pod \"node-exporter-qxw7b\" (UID: \"fc03518e-8a07-44bc-af57-c83515d5fec6\") " pod="openshift-monitoring/node-exporter-qxw7b"
Apr 17 20:53:27.681966 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:27.681932 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/fc03518e-8a07-44bc-af57-c83515d5fec6-node-exporter-wtmp\") pod \"node-exporter-qxw7b\" (UID: \"fc03518e-8a07-44bc-af57-c83515d5fec6\") " pod="openshift-monitoring/node-exporter-qxw7b"
Apr 17 20:53:27.682163 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:27.681981 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fc03518e-8a07-44bc-af57-c83515d5fec6-metrics-client-ca\") pod \"node-exporter-qxw7b\" (UID: \"fc03518e-8a07-44bc-af57-c83515d5fec6\") " pod="openshift-monitoring/node-exporter-qxw7b"
Apr 17 20:53:27.682163 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:27.682016 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fc03518e-8a07-44bc-af57-c83515d5fec6-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qxw7b\" (UID: \"fc03518e-8a07-44bc-af57-c83515d5fec6\") " pod="openshift-monitoring/node-exporter-qxw7b"
Apr 17 20:53:27.682163 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:27.682041 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp2wp\" (UniqueName: \"kubernetes.io/projected/fc03518e-8a07-44bc-af57-c83515d5fec6-kube-api-access-lp2wp\") pod \"node-exporter-qxw7b\" (UID: \"fc03518e-8a07-44bc-af57-c83515d5fec6\") " pod="openshift-monitoring/node-exporter-qxw7b"
Apr 17 20:53:27.682163 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:27.682091 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/fc03518e-8a07-44bc-af57-c83515d5fec6-root\") pod \"node-exporter-qxw7b\" (UID: \"fc03518e-8a07-44bc-af57-c83515d5fec6\") " pod="openshift-monitoring/node-exporter-qxw7b"
Apr 17 20:53:27.782719 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:27.782678 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/fc03518e-8a07-44bc-af57-c83515d5fec6-node-exporter-textfile\") pod \"node-exporter-qxw7b\" (UID: \"fc03518e-8a07-44bc-af57-c83515d5fec6\") " pod="openshift-monitoring/node-exporter-qxw7b"
Apr 17 20:53:27.782929 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:27.782727 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fc03518e-8a07-44bc-af57-c83515d5fec6-sys\") pod \"node-exporter-qxw7b\" (UID: \"fc03518e-8a07-44bc-af57-c83515d5fec6\") " pod="openshift-monitoring/node-exporter-qxw7b"
Apr 17 20:53:27.782929 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:27.782757 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/fc03518e-8a07-44bc-af57-c83515d5fec6-node-exporter-accelerators-collector-config\") pod \"node-exporter-qxw7b\" (UID: \"fc03518e-8a07-44bc-af57-c83515d5fec6\") " pod="openshift-monitoring/node-exporter-qxw7b"
Apr 17 20:53:27.782929 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:27.782780 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/fc03518e-8a07-44bc-af57-c83515d5fec6-node-exporter-tls\") pod \"node-exporter-qxw7b\" (UID: \"fc03518e-8a07-44bc-af57-c83515d5fec6\") " pod="openshift-monitoring/node-exporter-qxw7b"
Apr 17 20:53:27.782929 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:27.782841 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/fc03518e-8a07-44bc-af57-c83515d5fec6-node-exporter-wtmp\") pod \"node-exporter-qxw7b\" (UID: \"fc03518e-8a07-44bc-af57-c83515d5fec6\") " pod="openshift-monitoring/node-exporter-qxw7b"
Apr 17 20:53:27.782929 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:27.782879 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fc03518e-8a07-44bc-af57-c83515d5fec6-metrics-client-ca\") pod \"node-exporter-qxw7b\" (UID: \"fc03518e-8a07-44bc-af57-c83515d5fec6\") " pod="openshift-monitoring/node-exporter-qxw7b"
Apr 17 20:53:27.782929 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:27.782870 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fc03518e-8a07-44bc-af57-c83515d5fec6-sys\") pod \"node-exporter-qxw7b\" (UID: \"fc03518e-8a07-44bc-af57-c83515d5fec6\") " pod="openshift-monitoring/node-exporter-qxw7b"
Apr 17 20:53:27.782929 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:27.782901 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fc03518e-8a07-44bc-af57-c83515d5fec6-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qxw7b\" (UID: \"fc03518e-8a07-44bc-af57-c83515d5fec6\") " pod="openshift-monitoring/node-exporter-qxw7b"
Apr 17 20:53:27.783301 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:27.782949 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lp2wp\" (UniqueName: \"kubernetes.io/projected/fc03518e-8a07-44bc-af57-c83515d5fec6-kube-api-access-lp2wp\") pod \"node-exporter-qxw7b\" (UID: \"fc03518e-8a07-44bc-af57-c83515d5fec6\") " pod="openshift-monitoring/node-exporter-qxw7b"
Apr 17 20:53:27.783301 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:27.783003 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/fc03518e-8a07-44bc-af57-c83515d5fec6-root\") pod \"node-exporter-qxw7b\" (UID: \"fc03518e-8a07-44bc-af57-c83515d5fec6\") " pod="openshift-monitoring/node-exporter-qxw7b"
Apr 17 20:53:27.783301 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:27.783072 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/fc03518e-8a07-44bc-af57-c83515d5fec6-root\") pod \"node-exporter-qxw7b\" (UID: \"fc03518e-8a07-44bc-af57-c83515d5fec6\") " pod="openshift-monitoring/node-exporter-qxw7b"
Apr 17 20:53:27.783301 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:27.783070 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/fc03518e-8a07-44bc-af57-c83515d5fec6-node-exporter-textfile\") pod \"node-exporter-qxw7b\" (UID: \"fc03518e-8a07-44bc-af57-c83515d5fec6\") " pod="openshift-monitoring/node-exporter-qxw7b"
Apr 17 20:53:27.783301 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:27.783184 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/fc03518e-8a07-44bc-af57-c83515d5fec6-node-exporter-wtmp\") pod \"node-exporter-qxw7b\" (UID: \"fc03518e-8a07-44bc-af57-c83515d5fec6\") " pod="openshift-monitoring/node-exporter-qxw7b"
Apr 17 20:53:27.783301 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:53:27.783297 2575 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 17 20:53:27.783600 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:53:27.783355 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc03518e-8a07-44bc-af57-c83515d5fec6-node-exporter-tls podName:fc03518e-8a07-44bc-af57-c83515d5fec6 nodeName:}" failed. No retries permitted until 2026-04-17 20:53:28.283334182 +0000 UTC m=+152.750775099 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/fc03518e-8a07-44bc-af57-c83515d5fec6-node-exporter-tls") pod "node-exporter-qxw7b" (UID: "fc03518e-8a07-44bc-af57-c83515d5fec6") : secret "node-exporter-tls" not found
Apr 17 20:53:27.783796 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:27.783777 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/fc03518e-8a07-44bc-af57-c83515d5fec6-node-exporter-accelerators-collector-config\") pod \"node-exporter-qxw7b\" (UID: \"fc03518e-8a07-44bc-af57-c83515d5fec6\") " pod="openshift-monitoring/node-exporter-qxw7b"
Apr 17 20:53:27.783879 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:27.783855 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fc03518e-8a07-44bc-af57-c83515d5fec6-metrics-client-ca\") pod \"node-exporter-qxw7b\" (UID: \"fc03518e-8a07-44bc-af57-c83515d5fec6\") " pod="openshift-monitoring/node-exporter-qxw7b"
Apr 17 20:53:27.785713 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:27.785692 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fc03518e-8a07-44bc-af57-c83515d5fec6-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-qxw7b\" (UID: \"fc03518e-8a07-44bc-af57-c83515d5fec6\") " pod="openshift-monitoring/node-exporter-qxw7b"
Apr 17 20:53:27.790937 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:27.790920 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp2wp\" (UniqueName: \"kubernetes.io/projected/fc03518e-8a07-44bc-af57-c83515d5fec6-kube-api-access-lp2wp\") pod \"node-exporter-qxw7b\" (UID: \"fc03518e-8a07-44bc-af57-c83515d5fec6\") " pod="openshift-monitoring/node-exporter-qxw7b"
Apr 17 20:53:28.129774 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:28.129744 2575 scope.go:117] "RemoveContainer" containerID="cd851ab049011ecb9c4e5c9a396e70f6d7e3da2092cb20c6ba4e71154b397812"
Apr 17 20:53:28.129937 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:53:28.129912 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-v47zc_openshift-console-operator(d30ac7a2-edef-43e4-a645-fbf9445df632)\"" pod="openshift-console-operator/console-operator-9d4b6777b-v47zc" podUID="d30ac7a2-edef-43e4-a645-fbf9445df632"
Apr 17 20:53:28.287347 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:28.287316 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/fc03518e-8a07-44bc-af57-c83515d5fec6-node-exporter-tls\") pod \"node-exporter-qxw7b\" (UID: \"fc03518e-8a07-44bc-af57-c83515d5fec6\") " pod="openshift-monitoring/node-exporter-qxw7b"
Apr 17 20:53:28.289599 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:28.289571 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/fc03518e-8a07-44bc-af57-c83515d5fec6-node-exporter-tls\") pod \"node-exporter-qxw7b\" (UID: \"fc03518e-8a07-44bc-af57-c83515d5fec6\") " pod="openshift-monitoring/node-exporter-qxw7b"
Apr 17 20:53:28.514860 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:28.514781 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-qxw7b"
Apr 17 20:53:28.523967 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:53:28.523932 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc03518e_8a07_44bc_af57_c83515d5fec6.slice/crio-85b66c1a3bef68afc75e60bf88398d850413a21b9b4595ee2c85de795beeb45d WatchSource:0}: Error finding container 85b66c1a3bef68afc75e60bf88398d850413a21b9b4595ee2c85de795beeb45d: Status 404 returned error can't find the container with id 85b66c1a3bef68afc75e60bf88398d850413a21b9b4595ee2c85de795beeb45d
Apr 17 20:53:28.570690 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:28.570656 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qxw7b" event={"ID":"fc03518e-8a07-44bc-af57-c83515d5fec6","Type":"ContainerStarted","Data":"85b66c1a3bef68afc75e60bf88398d850413a21b9b4595ee2c85de795beeb45d"}
Apr 17 20:53:29.575197 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:29.575158 2575 generic.go:358] "Generic (PLEG): container finished" podID="fc03518e-8a07-44bc-af57-c83515d5fec6" containerID="9ea41fd1c3d6904875e0695c928dbd2c9c2d03bf6a79f60f71b1dce14bb608c4" exitCode=0
Apr 17 20:53:29.575579 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:29.575241 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qxw7b" event={"ID":"fc03518e-8a07-44bc-af57-c83515d5fec6","Type":"ContainerDied","Data":"9ea41fd1c3d6904875e0695c928dbd2c9c2d03bf6a79f60f71b1dce14bb608c4"}
Apr 17 20:53:30.552689 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.552656 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-f9d48c748-tvrz7"]
Apr 17 20:53:30.556649 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.556627 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-f9d48c748-tvrz7"
Apr 17 20:53:30.559344 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.559319 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 17 20:53:30.559464 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.559333 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 17 20:53:30.559531 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.559482 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 17 20:53:30.560208 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.560179 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-jqmdp\""
Apr 17 20:53:30.560651 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.560630 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 17 20:53:30.561675 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.561649 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 17 20:53:30.561776 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.561700 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-2e02ktk6ubhk7\""
Apr 17 20:53:30.569943 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.569916 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-f9d48c748-tvrz7"]
Apr 17 20:53:30.582779 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.582751 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qxw7b" event={"ID":"fc03518e-8a07-44bc-af57-c83515d5fec6","Type":"ContainerStarted","Data":"9d7f0c51a1d658b5b7c7bf0716649dafba4dcc62efa37bf2ca2b41d9e53be7ef"}
Apr 17 20:53:30.583191 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.582787 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-qxw7b" event={"ID":"fc03518e-8a07-44bc-af57-c83515d5fec6","Type":"ContainerStarted","Data":"dd8336ad9c841bda87f25bc095aa7fcee84e5dbb192a5f9560544396736d63f8"}
Apr 17 20:53:30.602494 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.602429 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-qxw7b" podStartSLOduration=2.936883503 podStartE2EDuration="3.60241398s" podCreationTimestamp="2026-04-17 20:53:27 +0000 UTC" firstStartedPulling="2026-04-17 20:53:28.526025283 +0000 UTC m=+152.993466211" lastFinishedPulling="2026-04-17 20:53:29.191555774 +0000 UTC m=+153.658996688" observedRunningTime="2026-04-17 20:53:30.600979003 +0000 UTC m=+155.068419933" watchObservedRunningTime="2026-04-17 20:53:30.60241398 +0000 UTC m=+155.069854913"
Apr 17 20:53:30.603383 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.603357 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lf8c2\" (UniqueName: \"kubernetes.io/projected/2e4efe81-da2d-4b12-9682-50c7aa8b69fd-kube-api-access-lf8c2\") pod \"thanos-querier-f9d48c748-tvrz7\" (UID: \"2e4efe81-da2d-4b12-9682-50c7aa8b69fd\") " pod="openshift-monitoring/thanos-querier-f9d48c748-tvrz7"
Apr 17 20:53:30.603548 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.603395 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2e4efe81-da2d-4b12-9682-50c7aa8b69fd-secret-thanos-querier-kube-rbac-proxy-web\") pod
\"thanos-querier-f9d48c748-tvrz7\" (UID: \"2e4efe81-da2d-4b12-9682-50c7aa8b69fd\") " pod="openshift-monitoring/thanos-querier-f9d48c748-tvrz7" Apr 17 20:53:30.603548 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.603443 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/2e4efe81-da2d-4b12-9682-50c7aa8b69fd-secret-thanos-querier-tls\") pod \"thanos-querier-f9d48c748-tvrz7\" (UID: \"2e4efe81-da2d-4b12-9682-50c7aa8b69fd\") " pod="openshift-monitoring/thanos-querier-f9d48c748-tvrz7" Apr 17 20:53:30.603548 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.603470 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2e4efe81-da2d-4b12-9682-50c7aa8b69fd-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-f9d48c748-tvrz7\" (UID: \"2e4efe81-da2d-4b12-9682-50c7aa8b69fd\") " pod="openshift-monitoring/thanos-querier-f9d48c748-tvrz7" Apr 17 20:53:30.603548 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.603508 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2e4efe81-da2d-4b12-9682-50c7aa8b69fd-metrics-client-ca\") pod \"thanos-querier-f9d48c748-tvrz7\" (UID: \"2e4efe81-da2d-4b12-9682-50c7aa8b69fd\") " pod="openshift-monitoring/thanos-querier-f9d48c748-tvrz7" Apr 17 20:53:30.603548 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.603538 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/2e4efe81-da2d-4b12-9682-50c7aa8b69fd-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-f9d48c748-tvrz7\" (UID: \"2e4efe81-da2d-4b12-9682-50c7aa8b69fd\") " 
pod="openshift-monitoring/thanos-querier-f9d48c748-tvrz7" Apr 17 20:53:30.603792 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.603714 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2e4efe81-da2d-4b12-9682-50c7aa8b69fd-secret-grpc-tls\") pod \"thanos-querier-f9d48c748-tvrz7\" (UID: \"2e4efe81-da2d-4b12-9682-50c7aa8b69fd\") " pod="openshift-monitoring/thanos-querier-f9d48c748-tvrz7" Apr 17 20:53:30.603792 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.603767 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/2e4efe81-da2d-4b12-9682-50c7aa8b69fd-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-f9d48c748-tvrz7\" (UID: \"2e4efe81-da2d-4b12-9682-50c7aa8b69fd\") " pod="openshift-monitoring/thanos-querier-f9d48c748-tvrz7" Apr 17 20:53:30.704430 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.704392 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2e4efe81-da2d-4b12-9682-50c7aa8b69fd-secret-grpc-tls\") pod \"thanos-querier-f9d48c748-tvrz7\" (UID: \"2e4efe81-da2d-4b12-9682-50c7aa8b69fd\") " pod="openshift-monitoring/thanos-querier-f9d48c748-tvrz7" Apr 17 20:53:30.704430 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.704434 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/2e4efe81-da2d-4b12-9682-50c7aa8b69fd-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-f9d48c748-tvrz7\" (UID: \"2e4efe81-da2d-4b12-9682-50c7aa8b69fd\") " pod="openshift-monitoring/thanos-querier-f9d48c748-tvrz7" Apr 17 20:53:30.704681 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.704474 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lf8c2\" (UniqueName: \"kubernetes.io/projected/2e4efe81-da2d-4b12-9682-50c7aa8b69fd-kube-api-access-lf8c2\") pod \"thanos-querier-f9d48c748-tvrz7\" (UID: \"2e4efe81-da2d-4b12-9682-50c7aa8b69fd\") " pod="openshift-monitoring/thanos-querier-f9d48c748-tvrz7" Apr 17 20:53:30.704681 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.704494 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2e4efe81-da2d-4b12-9682-50c7aa8b69fd-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-f9d48c748-tvrz7\" (UID: \"2e4efe81-da2d-4b12-9682-50c7aa8b69fd\") " pod="openshift-monitoring/thanos-querier-f9d48c748-tvrz7" Apr 17 20:53:30.704681 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.704512 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/2e4efe81-da2d-4b12-9682-50c7aa8b69fd-secret-thanos-querier-tls\") pod \"thanos-querier-f9d48c748-tvrz7\" (UID: \"2e4efe81-da2d-4b12-9682-50c7aa8b69fd\") " pod="openshift-monitoring/thanos-querier-f9d48c748-tvrz7" Apr 17 20:53:30.704681 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.704528 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2e4efe81-da2d-4b12-9682-50c7aa8b69fd-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-f9d48c748-tvrz7\" (UID: \"2e4efe81-da2d-4b12-9682-50c7aa8b69fd\") " pod="openshift-monitoring/thanos-querier-f9d48c748-tvrz7" Apr 17 20:53:30.704681 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.704549 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/2e4efe81-da2d-4b12-9682-50c7aa8b69fd-metrics-client-ca\") pod \"thanos-querier-f9d48c748-tvrz7\" (UID: \"2e4efe81-da2d-4b12-9682-50c7aa8b69fd\") " pod="openshift-monitoring/thanos-querier-f9d48c748-tvrz7" Apr 17 20:53:30.704681 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.704581 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/2e4efe81-da2d-4b12-9682-50c7aa8b69fd-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-f9d48c748-tvrz7\" (UID: \"2e4efe81-da2d-4b12-9682-50c7aa8b69fd\") " pod="openshift-monitoring/thanos-querier-f9d48c748-tvrz7" Apr 17 20:53:30.705595 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.705554 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2e4efe81-da2d-4b12-9682-50c7aa8b69fd-metrics-client-ca\") pod \"thanos-querier-f9d48c748-tvrz7\" (UID: \"2e4efe81-da2d-4b12-9682-50c7aa8b69fd\") " pod="openshift-monitoring/thanos-querier-f9d48c748-tvrz7" Apr 17 20:53:30.707289 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.707264 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2e4efe81-da2d-4b12-9682-50c7aa8b69fd-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-f9d48c748-tvrz7\" (UID: \"2e4efe81-da2d-4b12-9682-50c7aa8b69fd\") " pod="openshift-monitoring/thanos-querier-f9d48c748-tvrz7" Apr 17 20:53:30.707411 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.707273 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/2e4efe81-da2d-4b12-9682-50c7aa8b69fd-secret-thanos-querier-tls\") pod \"thanos-querier-f9d48c748-tvrz7\" (UID: \"2e4efe81-da2d-4b12-9682-50c7aa8b69fd\") " 
pod="openshift-monitoring/thanos-querier-f9d48c748-tvrz7" Apr 17 20:53:30.707588 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.707561 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/2e4efe81-da2d-4b12-9682-50c7aa8b69fd-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-f9d48c748-tvrz7\" (UID: \"2e4efe81-da2d-4b12-9682-50c7aa8b69fd\") " pod="openshift-monitoring/thanos-querier-f9d48c748-tvrz7" Apr 17 20:53:30.707588 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.707580 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2e4efe81-da2d-4b12-9682-50c7aa8b69fd-secret-grpc-tls\") pod \"thanos-querier-f9d48c748-tvrz7\" (UID: \"2e4efe81-da2d-4b12-9682-50c7aa8b69fd\") " pod="openshift-monitoring/thanos-querier-f9d48c748-tvrz7" Apr 17 20:53:30.707798 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.707777 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2e4efe81-da2d-4b12-9682-50c7aa8b69fd-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-f9d48c748-tvrz7\" (UID: \"2e4efe81-da2d-4b12-9682-50c7aa8b69fd\") " pod="openshift-monitoring/thanos-querier-f9d48c748-tvrz7" Apr 17 20:53:30.707854 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.707782 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/2e4efe81-da2d-4b12-9682-50c7aa8b69fd-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-f9d48c748-tvrz7\" (UID: \"2e4efe81-da2d-4b12-9682-50c7aa8b69fd\") " pod="openshift-monitoring/thanos-querier-f9d48c748-tvrz7" Apr 17 20:53:30.712715 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.712695 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-lf8c2\" (UniqueName: \"kubernetes.io/projected/2e4efe81-da2d-4b12-9682-50c7aa8b69fd-kube-api-access-lf8c2\") pod \"thanos-querier-f9d48c748-tvrz7\" (UID: \"2e4efe81-da2d-4b12-9682-50c7aa8b69fd\") " pod="openshift-monitoring/thanos-querier-f9d48c748-tvrz7" Apr 17 20:53:30.869674 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.869573 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-f9d48c748-tvrz7" Apr 17 20:53:30.991573 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:30.991539 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-f9d48c748-tvrz7"] Apr 17 20:53:30.994921 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:53:30.994887 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e4efe81_da2d_4b12_9682_50c7aa8b69fd.slice/crio-678a4beb35f2f4f73a4f744e5762620523d89b40dbdc07dc3b539550c089c226 WatchSource:0}: Error finding container 678a4beb35f2f4f73a4f744e5762620523d89b40dbdc07dc3b539550c089c226: Status 404 returned error can't find the container with id 678a4beb35f2f4f73a4f744e5762620523d89b40dbdc07dc3b539550c089c226 Apr 17 20:53:31.587739 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:31.587694 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f9d48c748-tvrz7" event={"ID":"2e4efe81-da2d-4b12-9682-50c7aa8b69fd","Type":"ContainerStarted","Data":"678a4beb35f2f4f73a4f744e5762620523d89b40dbdc07dc3b539550c089c226"} Apr 17 20:53:31.875264 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:31.875162 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-78d578744-sbljb"] Apr 17 20:53:31.878697 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:31.878673 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-78d578744-sbljb" Apr 17 20:53:31.881360 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:31.881258 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 20:53:31.881360 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:31.881328 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 17 20:53:31.882527 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:31.882356 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 17 20:53:31.882527 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:31.882406 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-qj98t\"" Apr 17 20:53:31.882527 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:31.882459 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-hpa4t31coe5u\"" Apr 17 20:53:31.882527 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:31.882406 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 17 20:53:31.886788 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:31.886767 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-78d578744-sbljb"] Apr 17 20:53:31.914042 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:31.914018 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3356598d-23e5-49f0-bc1c-f41ca236c15b-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-78d578744-sbljb\" (UID: 
\"3356598d-23e5-49f0-bc1c-f41ca236c15b\") " pod="openshift-monitoring/metrics-server-78d578744-sbljb" Apr 17 20:53:31.914182 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:31.914058 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/3356598d-23e5-49f0-bc1c-f41ca236c15b-audit-log\") pod \"metrics-server-78d578744-sbljb\" (UID: \"3356598d-23e5-49f0-bc1c-f41ca236c15b\") " pod="openshift-monitoring/metrics-server-78d578744-sbljb" Apr 17 20:53:31.914182 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:31.914146 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnpng\" (UniqueName: \"kubernetes.io/projected/3356598d-23e5-49f0-bc1c-f41ca236c15b-kube-api-access-vnpng\") pod \"metrics-server-78d578744-sbljb\" (UID: \"3356598d-23e5-49f0-bc1c-f41ca236c15b\") " pod="openshift-monitoring/metrics-server-78d578744-sbljb" Apr 17 20:53:31.914182 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:31.914179 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3356598d-23e5-49f0-bc1c-f41ca236c15b-client-ca-bundle\") pod \"metrics-server-78d578744-sbljb\" (UID: \"3356598d-23e5-49f0-bc1c-f41ca236c15b\") " pod="openshift-monitoring/metrics-server-78d578744-sbljb" Apr 17 20:53:31.914324 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:31.914197 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/3356598d-23e5-49f0-bc1c-f41ca236c15b-secret-metrics-server-tls\") pod \"metrics-server-78d578744-sbljb\" (UID: \"3356598d-23e5-49f0-bc1c-f41ca236c15b\") " pod="openshift-monitoring/metrics-server-78d578744-sbljb" Apr 17 20:53:31.914324 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:31.914216 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/3356598d-23e5-49f0-bc1c-f41ca236c15b-secret-metrics-server-client-certs\") pod \"metrics-server-78d578744-sbljb\" (UID: \"3356598d-23e5-49f0-bc1c-f41ca236c15b\") " pod="openshift-monitoring/metrics-server-78d578744-sbljb" Apr 17 20:53:31.914324 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:31.914284 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/3356598d-23e5-49f0-bc1c-f41ca236c15b-metrics-server-audit-profiles\") pod \"metrics-server-78d578744-sbljb\" (UID: \"3356598d-23e5-49f0-bc1c-f41ca236c15b\") " pod="openshift-monitoring/metrics-server-78d578744-sbljb" Apr 17 20:53:31.988352 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:53:31.988313 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-qbn97" podUID="65c9c4b6-99be-4373-933b-d44dfd308d32" Apr 17 20:53:31.997521 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:53:31.997484 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-rxb59" podUID="b8b94abe-b5ba-4b0f-ae1f-63575ffbb062" Apr 17 20:53:32.014800 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:32.014767 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3356598d-23e5-49f0-bc1c-f41ca236c15b-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-78d578744-sbljb\" (UID: \"3356598d-23e5-49f0-bc1c-f41ca236c15b\") " 
pod="openshift-monitoring/metrics-server-78d578744-sbljb" Apr 17 20:53:32.014967 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:32.014815 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/3356598d-23e5-49f0-bc1c-f41ca236c15b-audit-log\") pod \"metrics-server-78d578744-sbljb\" (UID: \"3356598d-23e5-49f0-bc1c-f41ca236c15b\") " pod="openshift-monitoring/metrics-server-78d578744-sbljb" Apr 17 20:53:32.015028 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:32.014955 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vnpng\" (UniqueName: \"kubernetes.io/projected/3356598d-23e5-49f0-bc1c-f41ca236c15b-kube-api-access-vnpng\") pod \"metrics-server-78d578744-sbljb\" (UID: \"3356598d-23e5-49f0-bc1c-f41ca236c15b\") " pod="openshift-monitoring/metrics-server-78d578744-sbljb" Apr 17 20:53:32.015028 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:32.015010 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3356598d-23e5-49f0-bc1c-f41ca236c15b-client-ca-bundle\") pod \"metrics-server-78d578744-sbljb\" (UID: \"3356598d-23e5-49f0-bc1c-f41ca236c15b\") " pod="openshift-monitoring/metrics-server-78d578744-sbljb" Apr 17 20:53:32.015128 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:32.015040 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/3356598d-23e5-49f0-bc1c-f41ca236c15b-secret-metrics-server-tls\") pod \"metrics-server-78d578744-sbljb\" (UID: \"3356598d-23e5-49f0-bc1c-f41ca236c15b\") " pod="openshift-monitoring/metrics-server-78d578744-sbljb" Apr 17 20:53:32.015128 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:32.015098 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: 
\"kubernetes.io/secret/3356598d-23e5-49f0-bc1c-f41ca236c15b-secret-metrics-server-client-certs\") pod \"metrics-server-78d578744-sbljb\" (UID: \"3356598d-23e5-49f0-bc1c-f41ca236c15b\") " pod="openshift-monitoring/metrics-server-78d578744-sbljb" Apr 17 20:53:32.015259 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:32.015134 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/3356598d-23e5-49f0-bc1c-f41ca236c15b-metrics-server-audit-profiles\") pod \"metrics-server-78d578744-sbljb\" (UID: \"3356598d-23e5-49f0-bc1c-f41ca236c15b\") " pod="openshift-monitoring/metrics-server-78d578744-sbljb" Apr 17 20:53:32.015259 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:32.015196 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/3356598d-23e5-49f0-bc1c-f41ca236c15b-audit-log\") pod \"metrics-server-78d578744-sbljb\" (UID: \"3356598d-23e5-49f0-bc1c-f41ca236c15b\") " pod="openshift-monitoring/metrics-server-78d578744-sbljb" Apr 17 20:53:32.015580 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:32.015557 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3356598d-23e5-49f0-bc1c-f41ca236c15b-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-78d578744-sbljb\" (UID: \"3356598d-23e5-49f0-bc1c-f41ca236c15b\") " pod="openshift-monitoring/metrics-server-78d578744-sbljb" Apr 17 20:53:32.016246 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:32.016201 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/3356598d-23e5-49f0-bc1c-f41ca236c15b-metrics-server-audit-profiles\") pod \"metrics-server-78d578744-sbljb\" (UID: \"3356598d-23e5-49f0-bc1c-f41ca236c15b\") " 
pod="openshift-monitoring/metrics-server-78d578744-sbljb" Apr 17 20:53:32.017757 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:32.017736 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/3356598d-23e5-49f0-bc1c-f41ca236c15b-secret-metrics-server-client-certs\") pod \"metrics-server-78d578744-sbljb\" (UID: \"3356598d-23e5-49f0-bc1c-f41ca236c15b\") " pod="openshift-monitoring/metrics-server-78d578744-sbljb" Apr 17 20:53:32.017860 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:32.017835 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/3356598d-23e5-49f0-bc1c-f41ca236c15b-secret-metrics-server-tls\") pod \"metrics-server-78d578744-sbljb\" (UID: \"3356598d-23e5-49f0-bc1c-f41ca236c15b\") " pod="openshift-monitoring/metrics-server-78d578744-sbljb" Apr 17 20:53:32.017860 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:32.017842 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3356598d-23e5-49f0-bc1c-f41ca236c15b-client-ca-bundle\") pod \"metrics-server-78d578744-sbljb\" (UID: \"3356598d-23e5-49f0-bc1c-f41ca236c15b\") " pod="openshift-monitoring/metrics-server-78d578744-sbljb" Apr 17 20:53:32.022625 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:32.022602 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnpng\" (UniqueName: \"kubernetes.io/projected/3356598d-23e5-49f0-bc1c-f41ca236c15b-kube-api-access-vnpng\") pod \"metrics-server-78d578744-sbljb\" (UID: \"3356598d-23e5-49f0-bc1c-f41ca236c15b\") " pod="openshift-monitoring/metrics-server-78d578744-sbljb" Apr 17 20:53:32.191250 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:32.191147 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-78d578744-sbljb" Apr 17 20:53:32.591150 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:32.591090 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qbn97" Apr 17 20:53:32.644602 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:32.644586 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-78d578744-sbljb"] Apr 17 20:53:32.646359 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:53:32.646332 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3356598d_23e5_49f0_bc1c_f41ca236c15b.slice/crio-fe865f76a565f508d4805caeb656b751fff465e59cb37eea35bd95325d673a26 WatchSource:0}: Error finding container fe865f76a565f508d4805caeb656b751fff465e59cb37eea35bd95325d673a26: Status 404 returned error can't find the container with id fe865f76a565f508d4805caeb656b751fff465e59cb37eea35bd95325d673a26 Apr 17 20:53:33.596603 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.596562 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-78d578744-sbljb" event={"ID":"3356598d-23e5-49f0-bc1c-f41ca236c15b","Type":"ContainerStarted","Data":"fe865f76a565f508d4805caeb656b751fff465e59cb37eea35bd95325d673a26"} Apr 17 20:53:33.599014 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.598909 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f9d48c748-tvrz7" event={"ID":"2e4efe81-da2d-4b12-9682-50c7aa8b69fd","Type":"ContainerStarted","Data":"1ad3ebafc34e1c154e877503e662f766864e676965f13dad55327a3acaafdca7"} Apr 17 20:53:33.599014 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.598947 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f9d48c748-tvrz7" 
event={"ID":"2e4efe81-da2d-4b12-9682-50c7aa8b69fd","Type":"ContainerStarted","Data":"0ac352384e6293d73be20c8a1b8b30bcdff99ad86ee093602b30c1a5014eeee1"} Apr 17 20:53:33.599014 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.598962 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f9d48c748-tvrz7" event={"ID":"2e4efe81-da2d-4b12-9682-50c7aa8b69fd","Type":"ContainerStarted","Data":"fe239b9210dc8964d7c78fe8886ca5429aca9b814d4d34d77a12a34e4eb92727"} Apr 17 20:53:33.599014 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.598975 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f9d48c748-tvrz7" event={"ID":"2e4efe81-da2d-4b12-9682-50c7aa8b69fd","Type":"ContainerStarted","Data":"1708cfe8333b641f7bc050ce0d292a78767e9aa83d73e670829cb8c50a11c222"} Apr 17 20:53:33.800100 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.800061 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 20:53:33.804289 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.804266 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.809325 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.806770 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 17 20:53:33.809325 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.806928 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 17 20:53:33.809325 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.807060 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 17 20:53:33.809325 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.807377 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 17 20:53:33.809325 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.807678 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 17 20:53:33.809325 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.807844 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 17 20:53:33.809325 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.808433 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-75aadmgq6r7l8\"" Apr 17 20:53:33.809325 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.808746 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 17 20:53:33.810846 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.810826 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 17 20:53:33.811592 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.811573 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 17 20:53:33.811867 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.811582 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 17 20:53:33.812165 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.811616 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-bzr2f\"" Apr 17 20:53:33.812775 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.812756 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 17 20:53:33.816403 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.815258 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 17 20:53:33.817726 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.817705 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 20:53:33.833772 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.833494 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.833772 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.833562 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/8e4826ba-b676-4a81-a457-d28dd8eee1a3-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.833772 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.833610 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e4826ba-b676-4a81-a457-d28dd8eee1a3-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.833772 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.833632 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwf29\" (UniqueName: \"kubernetes.io/projected/8e4826ba-b676-4a81-a457-d28dd8eee1a3-kube-api-access-nwf29\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.833772 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.833653 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.833772 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.833670 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 
20:53:33.833772 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.833692 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.833772 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.833715 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e4826ba-b676-4a81-a457-d28dd8eee1a3-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.833772 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.833729 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.833772 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.833748 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8e4826ba-b676-4a81-a457-d28dd8eee1a3-config-out\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.834395 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.833796 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-config\") pod \"prometheus-k8s-0\" 
(UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.834395 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.833815 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8e4826ba-b676-4a81-a457-d28dd8eee1a3-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.834395 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.833837 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/8e4826ba-b676-4a81-a457-d28dd8eee1a3-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.834395 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.833877 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8e4826ba-b676-4a81-a457-d28dd8eee1a3-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.834395 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.833898 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-web-config\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.834395 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.833917 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/8e4826ba-b676-4a81-a457-d28dd8eee1a3-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.834395 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.833976 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.834395 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.834009 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.934668 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.934579 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.934668 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.934626 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 
20:53:33.934668 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.934651 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.934912 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.934677 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e4826ba-b676-4a81-a457-d28dd8eee1a3-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.934912 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.934841 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e4826ba-b676-4a81-a457-d28dd8eee1a3-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.934912 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.934886 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nwf29\" (UniqueName: \"kubernetes.io/projected/8e4826ba-b676-4a81-a457-d28dd8eee1a3-kube-api-access-nwf29\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.935009 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.934924 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: 
\"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.935009 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.934953 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.935009 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.934988 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.935126 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.935027 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e4826ba-b676-4a81-a457-d28dd8eee1a3-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.935126 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.935051 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.935126 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.935079 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/8e4826ba-b676-4a81-a457-d28dd8eee1a3-config-out\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.935126 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.935109 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-config\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.935373 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.935134 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8e4826ba-b676-4a81-a457-d28dd8eee1a3-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.935373 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.935160 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/8e4826ba-b676-4a81-a457-d28dd8eee1a3-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.935373 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.935192 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8e4826ba-b676-4a81-a457-d28dd8eee1a3-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.935373 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.935215 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-web-config\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.935373 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.935260 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e4826ba-b676-4a81-a457-d28dd8eee1a3-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.935693 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.935650 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e4826ba-b676-4a81-a457-d28dd8eee1a3-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.935983 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.935958 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e4826ba-b676-4a81-a457-d28dd8eee1a3-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.936640 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.936609 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e4826ba-b676-4a81-a457-d28dd8eee1a3-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.936738 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.936663 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/8e4826ba-b676-4a81-a457-d28dd8eee1a3-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.937620 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.937593 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.938047 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.937757 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.938047 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.937880 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.938893 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.938540 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e4826ba-b676-4a81-a457-d28dd8eee1a3-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.938893 
ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.938640 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.939440 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.939412 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.939546 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.939414 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.940135 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.940108 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8e4826ba-b676-4a81-a457-d28dd8eee1a3-config-out\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.940211 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.940109 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8e4826ba-b676-4a81-a457-d28dd8eee1a3-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.940294 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.940263 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-config\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.940726 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.940706 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-web-config\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.941046 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.941026 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8e4826ba-b676-4a81-a457-d28dd8eee1a3-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.941982 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.941958 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:33.943137 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:33.943118 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwf29\" (UniqueName: \"kubernetes.io/projected/8e4826ba-b676-4a81-a457-d28dd8eee1a3-kube-api-access-nwf29\") pod \"prometheus-k8s-0\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:34.123607 
ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:34.123562 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:53:34.265531 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:34.265498 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 20:53:34.603979 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:34.603941 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f9d48c748-tvrz7" event={"ID":"2e4efe81-da2d-4b12-9682-50c7aa8b69fd","Type":"ContainerStarted","Data":"24026fc659fd7e2e04a39c96f4cfe03db92bfd9482e701968b3ec53ab0721581"} Apr 17 20:53:34.604477 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:34.603986 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-f9d48c748-tvrz7" event={"ID":"2e4efe81-da2d-4b12-9682-50c7aa8b69fd","Type":"ContainerStarted","Data":"e91ca448c871aeca34d0342e586082934c1b7dd81bfd76fb47488458374e6010"} Apr 17 20:53:34.604477 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:34.604093 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-f9d48c748-tvrz7" Apr 17 20:53:34.605289 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:34.605266 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-78d578744-sbljb" event={"ID":"3356598d-23e5-49f0-bc1c-f41ca236c15b","Type":"ContainerStarted","Data":"39fa296f0180b326f2facd0df0f016b7b84b3352b136abd81609d79c89c2e17a"} Apr 17 20:53:34.606284 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:34.606265 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8e4826ba-b676-4a81-a457-d28dd8eee1a3","Type":"ContainerStarted","Data":"e37a753d7d18d0f8011e2db26a3f38b3b2b5a3c5f54b919d9856ed62a4667efd"} Apr 17 20:53:34.623123 ip-10-0-132-12 
kubenswrapper[2575]: I0417 20:53:34.623070 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-f9d48c748-tvrz7" podStartSLOduration=2.182038057 podStartE2EDuration="4.623052324s" podCreationTimestamp="2026-04-17 20:53:30 +0000 UTC" firstStartedPulling="2026-04-17 20:53:30.996753333 +0000 UTC m=+155.464194245" lastFinishedPulling="2026-04-17 20:53:33.437767594 +0000 UTC m=+157.905208512" observedRunningTime="2026-04-17 20:53:34.621581137 +0000 UTC m=+159.089022070" watchObservedRunningTime="2026-04-17 20:53:34.623052324 +0000 UTC m=+159.090493258" Apr 17 20:53:34.638692 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:34.638649 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-78d578744-sbljb" podStartSLOduration=2.281935732 podStartE2EDuration="3.638631611s" podCreationTimestamp="2026-04-17 20:53:31 +0000 UTC" firstStartedPulling="2026-04-17 20:53:32.648188653 +0000 UTC m=+157.115629570" lastFinishedPulling="2026-04-17 20:53:34.004884524 +0000 UTC m=+158.472325449" observedRunningTime="2026-04-17 20:53:34.636147697 +0000 UTC m=+159.103588631" watchObservedRunningTime="2026-04-17 20:53:34.638631611 +0000 UTC m=+159.106072546" Apr 17 20:53:35.610339 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:35.610301 2575 generic.go:358] "Generic (PLEG): container finished" podID="8e4826ba-b676-4a81-a457-d28dd8eee1a3" containerID="817e2f5f513f207f73c186a26441f37d6d632af88178877108fa1be24d993947" exitCode=0 Apr 17 20:53:35.610725 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:35.610394 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8e4826ba-b676-4a81-a457-d28dd8eee1a3","Type":"ContainerDied","Data":"817e2f5f513f207f73c186a26441f37d6d632af88178877108fa1be24d993947"} Apr 17 20:53:36.863098 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:36.863054 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/65c9c4b6-99be-4373-933b-d44dfd308d32-metrics-tls\") pod \"dns-default-qbn97\" (UID: \"65c9c4b6-99be-4373-933b-d44dfd308d32\") " pod="openshift-dns/dns-default-qbn97" Apr 17 20:53:36.863098 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:36.863103 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8b94abe-b5ba-4b0f-ae1f-63575ffbb062-cert\") pod \"ingress-canary-rxb59\" (UID: \"b8b94abe-b5ba-4b0f-ae1f-63575ffbb062\") " pod="openshift-ingress-canary/ingress-canary-rxb59" Apr 17 20:53:36.865836 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:36.865812 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/65c9c4b6-99be-4373-933b-d44dfd308d32-metrics-tls\") pod \"dns-default-qbn97\" (UID: \"65c9c4b6-99be-4373-933b-d44dfd308d32\") " pod="openshift-dns/dns-default-qbn97" Apr 17 20:53:36.865946 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:36.865919 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8b94abe-b5ba-4b0f-ae1f-63575ffbb062-cert\") pod \"ingress-canary-rxb59\" (UID: \"b8b94abe-b5ba-4b0f-ae1f-63575ffbb062\") " pod="openshift-ingress-canary/ingress-canary-rxb59" Apr 17 20:53:37.094658 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:37.094629 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-tswpv\"" Apr 17 20:53:37.102873 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:37.102835 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-qbn97"
Apr 17 20:53:37.231940 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:37.231915 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qbn97"]
Apr 17 20:53:38.110884 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:53:38.110841 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65c9c4b6_99be_4373_933b_d44dfd308d32.slice/crio-084c51788e1be9b6521fbc8ce81c7d589a1feb0953c2cfdbdb4d2b1821335901 WatchSource:0}: Error finding container 084c51788e1be9b6521fbc8ce81c7d589a1feb0953c2cfdbdb4d2b1821335901: Status 404 returned error can't find the container with id 084c51788e1be9b6521fbc8ce81c7d589a1feb0953c2cfdbdb4d2b1821335901
Apr 17 20:53:38.624127 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:38.624038 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8e4826ba-b676-4a81-a457-d28dd8eee1a3","Type":"ContainerStarted","Data":"ea2941bf7394c150565605435b044cc6b988f6afa1f414baddc735efa5b12609"}
Apr 17 20:53:38.624127 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:38.624081 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8e4826ba-b676-4a81-a457-d28dd8eee1a3","Type":"ContainerStarted","Data":"86ec171fb110a6642beb8e5e137c2f2728efa6a47a12e90b7768a7de679904de"}
Apr 17 20:53:38.624127 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:38.624096 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8e4826ba-b676-4a81-a457-d28dd8eee1a3","Type":"ContainerStarted","Data":"4642f33bedec9a8237faafa6d33569bc8723a865554611ae0b071acc47a7dc53"}
Apr 17 20:53:38.624127 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:38.624108 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8e4826ba-b676-4a81-a457-d28dd8eee1a3","Type":"ContainerStarted","Data":"17f635fe5f5ba48834578d8feb12161c8c3250651d26da210137dd4cc609a1ef"}
Apr 17 20:53:38.626495 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:38.626377 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8e4826ba-b676-4a81-a457-d28dd8eee1a3","Type":"ContainerStarted","Data":"b33fce476f30e56ab65104317b760f0c7c0ba0cc65d5ef7e7637ea1a88294929"}
Apr 17 20:53:38.626495 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:38.626459 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8e4826ba-b676-4a81-a457-d28dd8eee1a3","Type":"ContainerStarted","Data":"cad7f693a146ae1032ea50d5060ff09042e062f9c7f82c6084418bafc9110177"}
Apr 17 20:53:38.626495 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:38.626475 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qbn97" event={"ID":"65c9c4b6-99be-4373-933b-d44dfd308d32","Type":"ContainerStarted","Data":"084c51788e1be9b6521fbc8ce81c7d589a1feb0953c2cfdbdb4d2b1821335901"}
Apr 17 20:53:38.654833 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:38.654654 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.7679506090000001 podStartE2EDuration="5.654615818s" podCreationTimestamp="2026-04-17 20:53:33 +0000 UTC" firstStartedPulling="2026-04-17 20:53:34.272681532 +0000 UTC m=+158.740122444" lastFinishedPulling="2026-04-17 20:53:38.159346727 +0000 UTC m=+162.626787653" observedRunningTime="2026-04-17 20:53:38.652814317 +0000 UTC m=+163.120255262" watchObservedRunningTime="2026-04-17 20:53:38.654615818 +0000 UTC m=+163.122056795"
Apr 17 20:53:39.124508 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:39.124472 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:53:39.633681 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:39.633648 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qbn97" event={"ID":"65c9c4b6-99be-4373-933b-d44dfd308d32","Type":"ContainerStarted","Data":"b3ed96a8231cd00d4a583b336ed69d16bee3e7901df38030162cb402c925f6fb"}
Apr 17 20:53:40.616658 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:40.616630 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-f9d48c748-tvrz7"
Apr 17 20:53:40.636286 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:40.636247 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qbn97" event={"ID":"65c9c4b6-99be-4373-933b-d44dfd308d32","Type":"ContainerStarted","Data":"e25e37db1649dc36ea16f7adfc6e0cf9bc668a26a6d5cbc12a16607a68b7b292"}
Apr 17 20:53:40.636415 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:40.636305 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-qbn97"
Apr 17 20:53:40.653091 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:40.653038 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-qbn97" podStartSLOduration=131.294600027 podStartE2EDuration="2m12.653019238s" podCreationTimestamp="2026-04-17 20:51:28 +0000 UTC" firstStartedPulling="2026-04-17 20:53:38.112709597 +0000 UTC m=+162.580150508" lastFinishedPulling="2026-04-17 20:53:39.471128804 +0000 UTC m=+163.938569719" observedRunningTime="2026-04-17 20:53:40.652730823 +0000 UTC m=+165.120171768" watchObservedRunningTime="2026-04-17 20:53:40.653019238 +0000 UTC m=+165.120460172"
Apr 17 20:53:43.129310 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:43.129276 2575 scope.go:117] "RemoveContainer" containerID="cd851ab049011ecb9c4e5c9a396e70f6d7e3da2092cb20c6ba4e71154b397812"
Apr 17 20:53:43.647995 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:43.647968 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v47zc_d30ac7a2-edef-43e4-a645-fbf9445df632/console-operator/2.log"
Apr 17 20:53:43.648149 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:43.648025 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-v47zc" event={"ID":"d30ac7a2-edef-43e4-a645-fbf9445df632","Type":"ContainerStarted","Data":"c408a8174b91f63067e1583a300f4dd72f8da782892f197f7b81d8f287c24ff6"}
Apr 17 20:53:43.648366 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:43.648344 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-v47zc"
Apr 17 20:53:43.664980 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:43.664926 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-v47zc" podStartSLOduration=56.829698125 podStartE2EDuration="58.664912452s" podCreationTimestamp="2026-04-17 20:52:45 +0000 UTC" firstStartedPulling="2026-04-17 20:52:45.832316175 +0000 UTC m=+110.299757087" lastFinishedPulling="2026-04-17 20:52:47.667530491 +0000 UTC m=+112.134971414" observedRunningTime="2026-04-17 20:53:43.664019548 +0000 UTC m=+168.131460481" watchObservedRunningTime="2026-04-17 20:53:43.664912452 +0000 UTC m=+168.132353396"
Apr 17 20:53:43.933928 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:43.933852 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-v47zc"
Apr 17 20:53:44.129687 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:44.129656 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rxb59"
Apr 17 20:53:44.132321 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:44.132300 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-jvfcm\""
Apr 17 20:53:44.140275 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:44.140259 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rxb59"
Apr 17 20:53:44.258422 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:44.258398 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rxb59"]
Apr 17 20:53:44.260935 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:53:44.260908 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8b94abe_b5ba_4b0f_ae1f_63575ffbb062.slice/crio-4cff0e64a3b57f25cbbfce945b611ac44d8ef0fca1836bcd3e2a7f5f466072ea WatchSource:0}: Error finding container 4cff0e64a3b57f25cbbfce945b611ac44d8ef0fca1836bcd3e2a7f5f466072ea: Status 404 returned error can't find the container with id 4cff0e64a3b57f25cbbfce945b611ac44d8ef0fca1836bcd3e2a7f5f466072ea
Apr 17 20:53:44.652767 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:44.652730 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rxb59" event={"ID":"b8b94abe-b5ba-4b0f-ae1f-63575ffbb062","Type":"ContainerStarted","Data":"4cff0e64a3b57f25cbbfce945b611ac44d8ef0fca1836bcd3e2a7f5f466072ea"}
Apr 17 20:53:46.660479 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:46.660443 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rxb59" event={"ID":"b8b94abe-b5ba-4b0f-ae1f-63575ffbb062","Type":"ContainerStarted","Data":"24189e0ba6fb7e9359b048eb4472b6e86ae53d56ddf6b2a81cf3089ded2ccdf7"}
Apr 17 20:53:46.674756 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:46.674710 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-rxb59" podStartSLOduration=137.156104195 podStartE2EDuration="2m18.674694831s" podCreationTimestamp="2026-04-17 20:51:28 +0000 UTC" firstStartedPulling="2026-04-17 20:53:44.262819084 +0000 UTC m=+168.730259995" lastFinishedPulling="2026-04-17 20:53:45.781409717 +0000 UTC m=+170.248850631" observedRunningTime="2026-04-17 20:53:46.673976582 +0000 UTC m=+171.141417515" watchObservedRunningTime="2026-04-17 20:53:46.674694831 +0000 UTC m=+171.142135763"
Apr 17 20:53:50.641953 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:50.641924 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-qbn97"
Apr 17 20:53:52.192315 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:52.192275 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-78d578744-sbljb"
Apr 17 20:53:52.192688 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:53:52.192362 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-78d578744-sbljb"
Apr 17 20:54:00.906988 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:00.906952 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-75858d479b-s6j25_591063be-61cb-4346-b946-b6ad0d833153/router/0.log"
Apr 17 20:54:00.917213 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:00.917181 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-rxb59_b8b94abe-b5ba-4b0f-ae1f-63575ffbb062/serve-healthcheck-canary/0.log"
Apr 17 20:54:12.197622 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:12.197587 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-78d578744-sbljb"
Apr 17 20:54:12.202951 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:12.202921 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-78d578744-sbljb"
Apr 17 20:54:34.124770 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:34.124734 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:34.140392 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:34.140368 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:34.822411 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:34.822381 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:52.147375 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.147339 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 20:54:52.148653 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.148590 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="8e4826ba-b676-4a81-a457-d28dd8eee1a3" containerName="prometheus" containerID="cri-o://cad7f693a146ae1032ea50d5060ff09042e062f9c7f82c6084418bafc9110177" gracePeriod=600
Apr 17 20:54:52.149250 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.148830 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="8e4826ba-b676-4a81-a457-d28dd8eee1a3" containerName="thanos-sidecar" containerID="cri-o://17f635fe5f5ba48834578d8feb12161c8c3250651d26da210137dd4cc609a1ef" gracePeriod=600
Apr 17 20:54:52.149356 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.148879 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="8e4826ba-b676-4a81-a457-d28dd8eee1a3" containerName="kube-rbac-proxy-web" containerID="cri-o://4642f33bedec9a8237faafa6d33569bc8723a865554611ae0b071acc47a7dc53" gracePeriod=600
Apr 17 20:54:52.149356 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.149010 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="8e4826ba-b676-4a81-a457-d28dd8eee1a3" containerName="kube-rbac-proxy-thanos" containerID="cri-o://ea2941bf7394c150565605435b044cc6b988f6afa1f414baddc735efa5b12609" gracePeriod=600
Apr 17 20:54:52.149467 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.149035 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="8e4826ba-b676-4a81-a457-d28dd8eee1a3" containerName="config-reloader" containerID="cri-o://b33fce476f30e56ab65104317b760f0c7c0ba0cc65d5ef7e7637ea1a88294929" gracePeriod=600
Apr 17 20:54:52.149467 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.149095 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="8e4826ba-b676-4a81-a457-d28dd8eee1a3" containerName="kube-rbac-proxy" containerID="cri-o://86ec171fb110a6642beb8e5e137c2f2728efa6a47a12e90b7768a7de679904de" gracePeriod=600
Apr 17 20:54:52.393588 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.393561 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:52.509398 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.509305 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-secret-grpc-tls\") pod \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") "
Apr 17 20:54:52.509398 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.509348 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-secret-kube-rbac-proxy\") pod \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") "
Apr 17 20:54:52.509398 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.509366 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8e4826ba-b676-4a81-a457-d28dd8eee1a3-prometheus-k8s-rulefiles-0\") pod \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") "
Apr 17 20:54:52.509398 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.509399 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-secret-prometheus-k8s-tls\") pod \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") "
Apr 17 20:54:52.509722 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.509432 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e4826ba-b676-4a81-a457-d28dd8eee1a3-configmap-metrics-client-ca\") pod \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") "
Apr 17 20:54:52.509722 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.509454 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e4826ba-b676-4a81-a457-d28dd8eee1a3-configmap-serving-certs-ca-bundle\") pod \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") "
Apr 17 20:54:52.509722 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.509471 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8e4826ba-b676-4a81-a457-d28dd8eee1a3-config-out\") pod \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") "
Apr 17 20:54:52.509722 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.509493 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/8e4826ba-b676-4a81-a457-d28dd8eee1a3-prometheus-k8s-db\") pod \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") "
Apr 17 20:54:52.509722 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.509522 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") "
Apr 17 20:54:52.509722 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.509545 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-thanos-prometheus-http-client-file\") pod \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") "
Apr 17 20:54:52.509722 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.509578 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-secret-metrics-client-certs\") pod \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") "
Apr 17 20:54:52.509722 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.509604 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e4826ba-b676-4a81-a457-d28dd8eee1a3-configmap-kubelet-serving-ca-bundle\") pod \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") "
Apr 17 20:54:52.509722 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.509648 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwf29\" (UniqueName: \"kubernetes.io/projected/8e4826ba-b676-4a81-a457-d28dd8eee1a3-kube-api-access-nwf29\") pod \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") "
Apr 17 20:54:52.509722 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.509680 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") "
Apr 17 20:54:52.509722 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.509713 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-web-config\") pod \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") "
Apr 17 20:54:52.510267 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.509742 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8e4826ba-b676-4a81-a457-d28dd8eee1a3-tls-assets\") pod \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") "
Apr 17 20:54:52.510267 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.509775 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-config\") pod \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") "
Apr 17 20:54:52.510267 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.509833 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e4826ba-b676-4a81-a457-d28dd8eee1a3-prometheus-trusted-ca-bundle\") pod \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\" (UID: \"8e4826ba-b676-4a81-a457-d28dd8eee1a3\") "
Apr 17 20:54:52.510267 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.509850 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e4826ba-b676-4a81-a457-d28dd8eee1a3-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "8e4826ba-b676-4a81-a457-d28dd8eee1a3" (UID: "8e4826ba-b676-4a81-a457-d28dd8eee1a3"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 20:54:52.510267 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.509861 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e4826ba-b676-4a81-a457-d28dd8eee1a3-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "8e4826ba-b676-4a81-a457-d28dd8eee1a3" (UID: "8e4826ba-b676-4a81-a457-d28dd8eee1a3"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 20:54:52.510267 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.510072 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e4826ba-b676-4a81-a457-d28dd8eee1a3-configmap-metrics-client-ca\") on node \"ip-10-0-132-12.ec2.internal\" DevicePath \"\""
Apr 17 20:54:52.510267 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.510091 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e4826ba-b676-4a81-a457-d28dd8eee1a3-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-132-12.ec2.internal\" DevicePath \"\""
Apr 17 20:54:52.510267 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.510114 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e4826ba-b676-4a81-a457-d28dd8eee1a3-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "8e4826ba-b676-4a81-a457-d28dd8eee1a3" (UID: "8e4826ba-b676-4a81-a457-d28dd8eee1a3"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 20:54:52.510685 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.510398 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e4826ba-b676-4a81-a457-d28dd8eee1a3-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "8e4826ba-b676-4a81-a457-d28dd8eee1a3" (UID: "8e4826ba-b676-4a81-a457-d28dd8eee1a3"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 20:54:52.510685 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.510593 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e4826ba-b676-4a81-a457-d28dd8eee1a3-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "8e4826ba-b676-4a81-a457-d28dd8eee1a3" (UID: "8e4826ba-b676-4a81-a457-d28dd8eee1a3"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 20:54:52.512421 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.512389 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e4826ba-b676-4a81-a457-d28dd8eee1a3-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "8e4826ba-b676-4a81-a457-d28dd8eee1a3" (UID: "8e4826ba-b676-4a81-a457-d28dd8eee1a3"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:54:52.512547 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.512444 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "8e4826ba-b676-4a81-a457-d28dd8eee1a3" (UID: "8e4826ba-b676-4a81-a457-d28dd8eee1a3"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 20:54:52.513039 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.513007 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e4826ba-b676-4a81-a457-d28dd8eee1a3-kube-api-access-nwf29" (OuterVolumeSpecName: "kube-api-access-nwf29") pod "8e4826ba-b676-4a81-a457-d28dd8eee1a3" (UID: "8e4826ba-b676-4a81-a457-d28dd8eee1a3"). InnerVolumeSpecName "kube-api-access-nwf29". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 20:54:52.513148 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.513034 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "8e4826ba-b676-4a81-a457-d28dd8eee1a3" (UID: "8e4826ba-b676-4a81-a457-d28dd8eee1a3"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 20:54:52.513356 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.513314 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "8e4826ba-b676-4a81-a457-d28dd8eee1a3" (UID: "8e4826ba-b676-4a81-a457-d28dd8eee1a3"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 20:54:52.513550 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.513529 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e4826ba-b676-4a81-a457-d28dd8eee1a3-config-out" (OuterVolumeSpecName: "config-out") pod "8e4826ba-b676-4a81-a457-d28dd8eee1a3" (UID: "8e4826ba-b676-4a81-a457-d28dd8eee1a3"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 20:54:52.513550 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.513535 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "8e4826ba-b676-4a81-a457-d28dd8eee1a3" (UID: "8e4826ba-b676-4a81-a457-d28dd8eee1a3"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 20:54:52.513738 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.513620 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "8e4826ba-b676-4a81-a457-d28dd8eee1a3" (UID: "8e4826ba-b676-4a81-a457-d28dd8eee1a3"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 20:54:52.514629 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.514608 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "8e4826ba-b676-4a81-a457-d28dd8eee1a3" (UID: "8e4826ba-b676-4a81-a457-d28dd8eee1a3"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 20:54:52.514719 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.514708 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "8e4826ba-b676-4a81-a457-d28dd8eee1a3" (UID: "8e4826ba-b676-4a81-a457-d28dd8eee1a3"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 20:54:52.515304 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.515278 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e4826ba-b676-4a81-a457-d28dd8eee1a3-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "8e4826ba-b676-4a81-a457-d28dd8eee1a3" (UID: "8e4826ba-b676-4a81-a457-d28dd8eee1a3"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 20:54:52.515472 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.515453 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-config" (OuterVolumeSpecName: "config") pod "8e4826ba-b676-4a81-a457-d28dd8eee1a3" (UID: "8e4826ba-b676-4a81-a457-d28dd8eee1a3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 20:54:52.524133 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.524108 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-web-config" (OuterVolumeSpecName: "web-config") pod "8e4826ba-b676-4a81-a457-d28dd8eee1a3" (UID: "8e4826ba-b676-4a81-a457-d28dd8eee1a3"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 20:54:52.610743 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.610709 2575 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-secret-metrics-client-certs\") on node \"ip-10-0-132-12.ec2.internal\" DevicePath \"\""
Apr 17 20:54:52.610743 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.610738 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e4826ba-b676-4a81-a457-d28dd8eee1a3-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-132-12.ec2.internal\" DevicePath \"\""
Apr 17 20:54:52.610743 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.610749 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nwf29\" (UniqueName: \"kubernetes.io/projected/8e4826ba-b676-4a81-a457-d28dd8eee1a3-kube-api-access-nwf29\") on node \"ip-10-0-132-12.ec2.internal\" DevicePath \"\""
Apr 17 20:54:52.610972 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.610759 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-132-12.ec2.internal\" DevicePath \"\""
Apr 17 20:54:52.610972 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.610770 2575 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-web-config\") on node \"ip-10-0-132-12.ec2.internal\" DevicePath \"\""
Apr 17 20:54:52.610972 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.610779 2575 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8e4826ba-b676-4a81-a457-d28dd8eee1a3-tls-assets\") on node \"ip-10-0-132-12.ec2.internal\" DevicePath \"\""
Apr 17 20:54:52.610972 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.610788 2575 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-config\") on node \"ip-10-0-132-12.ec2.internal\" DevicePath \"\""
Apr 17 20:54:52.610972 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.610798 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e4826ba-b676-4a81-a457-d28dd8eee1a3-prometheus-trusted-ca-bundle\") on node \"ip-10-0-132-12.ec2.internal\" DevicePath \"\""
Apr 17 20:54:52.610972 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.610807 2575 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-secret-grpc-tls\") on node \"ip-10-0-132-12.ec2.internal\" DevicePath \"\""
Apr 17 20:54:52.610972 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.610817 2575 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-secret-kube-rbac-proxy\") on node \"ip-10-0-132-12.ec2.internal\" DevicePath \"\""
Apr 17 20:54:52.610972 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.610825 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8e4826ba-b676-4a81-a457-d28dd8eee1a3-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-132-12.ec2.internal\" DevicePath \"\""
Apr 17 20:54:52.610972 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.610834 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-secret-prometheus-k8s-tls\") on node \"ip-10-0-132-12.ec2.internal\" DevicePath \"\""
Apr 17 20:54:52.610972 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.610843 2575 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8e4826ba-b676-4a81-a457-d28dd8eee1a3-config-out\") on node \"ip-10-0-132-12.ec2.internal\" DevicePath \"\""
Apr 17 20:54:52.610972 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.610851 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/8e4826ba-b676-4a81-a457-d28dd8eee1a3-prometheus-k8s-db\") on node \"ip-10-0-132-12.ec2.internal\" DevicePath \"\""
Apr 17 20:54:52.610972 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.610862 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-132-12.ec2.internal\" DevicePath \"\""
Apr 17 20:54:52.610972 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.610870 2575 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8e4826ba-b676-4a81-a457-d28dd8eee1a3-thanos-prometheus-http-client-file\") on node \"ip-10-0-132-12.ec2.internal\" DevicePath \"\""
Apr 17 20:54:52.866281 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.866247 2575 generic.go:358] "Generic (PLEG): container finished" podID="8e4826ba-b676-4a81-a457-d28dd8eee1a3" containerID="ea2941bf7394c150565605435b044cc6b988f6afa1f414baddc735efa5b12609" exitCode=0
Apr 17 20:54:52.866281 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.866278 2575 generic.go:358] "Generic (PLEG): container finished" podID="8e4826ba-b676-4a81-a457-d28dd8eee1a3" containerID="86ec171fb110a6642beb8e5e137c2f2728efa6a47a12e90b7768a7de679904de" exitCode=0
Apr 17 20:54:52.866281 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.866288 2575 generic.go:358] "Generic (PLEG): container finished" podID="8e4826ba-b676-4a81-a457-d28dd8eee1a3" containerID="4642f33bedec9a8237faafa6d33569bc8723a865554611ae0b071acc47a7dc53" exitCode=0
Apr 17 20:54:52.866521 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.866296 2575 generic.go:358] "Generic (PLEG): container finished" podID="8e4826ba-b676-4a81-a457-d28dd8eee1a3" containerID="17f635fe5f5ba48834578d8feb12161c8c3250651d26da210137dd4cc609a1ef" exitCode=0
Apr 17 20:54:52.866521 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.866303 2575 generic.go:358] "Generic (PLEG): container finished" podID="8e4826ba-b676-4a81-a457-d28dd8eee1a3" containerID="b33fce476f30e56ab65104317b760f0c7c0ba0cc65d5ef7e7637ea1a88294929" exitCode=0
Apr 17 20:54:52.866521 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.866310 2575 generic.go:358] "Generic (PLEG): container finished" podID="8e4826ba-b676-4a81-a457-d28dd8eee1a3" containerID="cad7f693a146ae1032ea50d5060ff09042e062f9c7f82c6084418bafc9110177" exitCode=0
Apr 17 20:54:52.866521 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.866312 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8e4826ba-b676-4a81-a457-d28dd8eee1a3","Type":"ContainerDied","Data":"ea2941bf7394c150565605435b044cc6b988f6afa1f414baddc735efa5b12609"}
Apr 17 20:54:52.866521 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.866341 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:52.866521 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.866350 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8e4826ba-b676-4a81-a457-d28dd8eee1a3","Type":"ContainerDied","Data":"86ec171fb110a6642beb8e5e137c2f2728efa6a47a12e90b7768a7de679904de"}
Apr 17 20:54:52.866521 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.866361 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8e4826ba-b676-4a81-a457-d28dd8eee1a3","Type":"ContainerDied","Data":"4642f33bedec9a8237faafa6d33569bc8723a865554611ae0b071acc47a7dc53"}
Apr 17 20:54:52.866521 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.866371 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8e4826ba-b676-4a81-a457-d28dd8eee1a3","Type":"ContainerDied","Data":"17f635fe5f5ba48834578d8feb12161c8c3250651d26da210137dd4cc609a1ef"}
Apr 17 20:54:52.866521 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.866381 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8e4826ba-b676-4a81-a457-d28dd8eee1a3","Type":"ContainerDied","Data":"b33fce476f30e56ab65104317b760f0c7c0ba0cc65d5ef7e7637ea1a88294929"}
Apr 17 20:54:52.866521 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.866392 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8e4826ba-b676-4a81-a457-d28dd8eee1a3","Type":"ContainerDied","Data":"cad7f693a146ae1032ea50d5060ff09042e062f9c7f82c6084418bafc9110177"}
Apr 17 20:54:52.866521 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.866404 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0"
event={"ID":"8e4826ba-b676-4a81-a457-d28dd8eee1a3","Type":"ContainerDied","Data":"e37a753d7d18d0f8011e2db26a3f38b3b2b5a3c5f54b919d9856ed62a4667efd"} Apr 17 20:54:52.866521 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.866412 2575 scope.go:117] "RemoveContainer" containerID="ea2941bf7394c150565605435b044cc6b988f6afa1f414baddc735efa5b12609" Apr 17 20:54:52.874575 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.874554 2575 scope.go:117] "RemoveContainer" containerID="86ec171fb110a6642beb8e5e137c2f2728efa6a47a12e90b7768a7de679904de" Apr 17 20:54:52.881061 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.881048 2575 scope.go:117] "RemoveContainer" containerID="4642f33bedec9a8237faafa6d33569bc8723a865554611ae0b071acc47a7dc53" Apr 17 20:54:52.887085 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.887071 2575 scope.go:117] "RemoveContainer" containerID="17f635fe5f5ba48834578d8feb12161c8c3250651d26da210137dd4cc609a1ef" Apr 17 20:54:52.890675 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.890610 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 20:54:52.894309 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.894241 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 20:54:52.894368 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.894328 2575 scope.go:117] "RemoveContainer" containerID="b33fce476f30e56ab65104317b760f0c7c0ba0cc65d5ef7e7637ea1a88294929" Apr 17 20:54:52.900722 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.900705 2575 scope.go:117] "RemoveContainer" containerID="cad7f693a146ae1032ea50d5060ff09042e062f9c7f82c6084418bafc9110177" Apr 17 20:54:52.907141 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.907122 2575 scope.go:117] "RemoveContainer" containerID="817e2f5f513f207f73c186a26441f37d6d632af88178877108fa1be24d993947" Apr 17 20:54:52.913252 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.913235 2575 scope.go:117] 
"RemoveContainer" containerID="ea2941bf7394c150565605435b044cc6b988f6afa1f414baddc735efa5b12609" Apr 17 20:54:52.913505 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:54:52.913484 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea2941bf7394c150565605435b044cc6b988f6afa1f414baddc735efa5b12609\": container with ID starting with ea2941bf7394c150565605435b044cc6b988f6afa1f414baddc735efa5b12609 not found: ID does not exist" containerID="ea2941bf7394c150565605435b044cc6b988f6afa1f414baddc735efa5b12609" Apr 17 20:54:52.913550 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.913513 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea2941bf7394c150565605435b044cc6b988f6afa1f414baddc735efa5b12609"} err="failed to get container status \"ea2941bf7394c150565605435b044cc6b988f6afa1f414baddc735efa5b12609\": rpc error: code = NotFound desc = could not find container \"ea2941bf7394c150565605435b044cc6b988f6afa1f414baddc735efa5b12609\": container with ID starting with ea2941bf7394c150565605435b044cc6b988f6afa1f414baddc735efa5b12609 not found: ID does not exist" Apr 17 20:54:52.913550 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.913544 2575 scope.go:117] "RemoveContainer" containerID="86ec171fb110a6642beb8e5e137c2f2728efa6a47a12e90b7768a7de679904de" Apr 17 20:54:52.913751 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:54:52.913737 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86ec171fb110a6642beb8e5e137c2f2728efa6a47a12e90b7768a7de679904de\": container with ID starting with 86ec171fb110a6642beb8e5e137c2f2728efa6a47a12e90b7768a7de679904de not found: ID does not exist" containerID="86ec171fb110a6642beb8e5e137c2f2728efa6a47a12e90b7768a7de679904de" Apr 17 20:54:52.913793 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.913761 2575 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86ec171fb110a6642beb8e5e137c2f2728efa6a47a12e90b7768a7de679904de"} err="failed to get container status \"86ec171fb110a6642beb8e5e137c2f2728efa6a47a12e90b7768a7de679904de\": rpc error: code = NotFound desc = could not find container \"86ec171fb110a6642beb8e5e137c2f2728efa6a47a12e90b7768a7de679904de\": container with ID starting with 86ec171fb110a6642beb8e5e137c2f2728efa6a47a12e90b7768a7de679904de not found: ID does not exist" Apr 17 20:54:52.913793 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.913777 2575 scope.go:117] "RemoveContainer" containerID="4642f33bedec9a8237faafa6d33569bc8723a865554611ae0b071acc47a7dc53" Apr 17 20:54:52.914022 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:54:52.914003 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4642f33bedec9a8237faafa6d33569bc8723a865554611ae0b071acc47a7dc53\": container with ID starting with 4642f33bedec9a8237faafa6d33569bc8723a865554611ae0b071acc47a7dc53 not found: ID does not exist" containerID="4642f33bedec9a8237faafa6d33569bc8723a865554611ae0b071acc47a7dc53" Apr 17 20:54:52.914061 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.914030 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4642f33bedec9a8237faafa6d33569bc8723a865554611ae0b071acc47a7dc53"} err="failed to get container status \"4642f33bedec9a8237faafa6d33569bc8723a865554611ae0b071acc47a7dc53\": rpc error: code = NotFound desc = could not find container \"4642f33bedec9a8237faafa6d33569bc8723a865554611ae0b071acc47a7dc53\": container with ID starting with 4642f33bedec9a8237faafa6d33569bc8723a865554611ae0b071acc47a7dc53 not found: ID does not exist" Apr 17 20:54:52.914061 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.914047 2575 scope.go:117] "RemoveContainer" containerID="17f635fe5f5ba48834578d8feb12161c8c3250651d26da210137dd4cc609a1ef" Apr 17 
20:54:52.914309 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:54:52.914288 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17f635fe5f5ba48834578d8feb12161c8c3250651d26da210137dd4cc609a1ef\": container with ID starting with 17f635fe5f5ba48834578d8feb12161c8c3250651d26da210137dd4cc609a1ef not found: ID does not exist" containerID="17f635fe5f5ba48834578d8feb12161c8c3250651d26da210137dd4cc609a1ef" Apr 17 20:54:52.914410 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.914314 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17f635fe5f5ba48834578d8feb12161c8c3250651d26da210137dd4cc609a1ef"} err="failed to get container status \"17f635fe5f5ba48834578d8feb12161c8c3250651d26da210137dd4cc609a1ef\": rpc error: code = NotFound desc = could not find container \"17f635fe5f5ba48834578d8feb12161c8c3250651d26da210137dd4cc609a1ef\": container with ID starting with 17f635fe5f5ba48834578d8feb12161c8c3250651d26da210137dd4cc609a1ef not found: ID does not exist" Apr 17 20:54:52.914410 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.914334 2575 scope.go:117] "RemoveContainer" containerID="b33fce476f30e56ab65104317b760f0c7c0ba0cc65d5ef7e7637ea1a88294929" Apr 17 20:54:52.914561 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:54:52.914539 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b33fce476f30e56ab65104317b760f0c7c0ba0cc65d5ef7e7637ea1a88294929\": container with ID starting with b33fce476f30e56ab65104317b760f0c7c0ba0cc65d5ef7e7637ea1a88294929 not found: ID does not exist" containerID="b33fce476f30e56ab65104317b760f0c7c0ba0cc65d5ef7e7637ea1a88294929" Apr 17 20:54:52.914605 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.914568 2575 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b33fce476f30e56ab65104317b760f0c7c0ba0cc65d5ef7e7637ea1a88294929"} err="failed to get container status \"b33fce476f30e56ab65104317b760f0c7c0ba0cc65d5ef7e7637ea1a88294929\": rpc error: code = NotFound desc = could not find container \"b33fce476f30e56ab65104317b760f0c7c0ba0cc65d5ef7e7637ea1a88294929\": container with ID starting with b33fce476f30e56ab65104317b760f0c7c0ba0cc65d5ef7e7637ea1a88294929 not found: ID does not exist" Apr 17 20:54:52.914605 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.914588 2575 scope.go:117] "RemoveContainer" containerID="cad7f693a146ae1032ea50d5060ff09042e062f9c7f82c6084418bafc9110177" Apr 17 20:54:52.914923 ip-10-0-132-12 kubenswrapper[2575]: E0417 20:54:52.914837 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cad7f693a146ae1032ea50d5060ff09042e062f9c7f82c6084418bafc9110177\": container with ID starting with cad7f693a146ae1032ea50d5060ff09042e062f9c7f82c6084418bafc9110177 not found: ID does not exist" containerID="cad7f693a146ae1032ea50d5060ff09042e062f9c7f82c6084418bafc9110177" Apr 17 20:54:52.914923 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.914863 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cad7f693a146ae1032ea50d5060ff09042e062f9c7f82c6084418bafc9110177"} err="failed to get container status \"cad7f693a146ae1032ea50d5060ff09042e062f9c7f82c6084418bafc9110177\": rpc error: code = NotFound desc = could not find container \"cad7f693a146ae1032ea50d5060ff09042e062f9c7f82c6084418bafc9110177\": container with ID starting with cad7f693a146ae1032ea50d5060ff09042e062f9c7f82c6084418bafc9110177 not found: ID does not exist" Apr 17 20:54:52.914923 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.914883 2575 scope.go:117] "RemoveContainer" containerID="817e2f5f513f207f73c186a26441f37d6d632af88178877108fa1be24d993947" Apr 17 20:54:52.915118 ip-10-0-132-12 
kubenswrapper[2575]: E0417 20:54:52.915100 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"817e2f5f513f207f73c186a26441f37d6d632af88178877108fa1be24d993947\": container with ID starting with 817e2f5f513f207f73c186a26441f37d6d632af88178877108fa1be24d993947 not found: ID does not exist" containerID="817e2f5f513f207f73c186a26441f37d6d632af88178877108fa1be24d993947" Apr 17 20:54:52.915155 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.915121 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"817e2f5f513f207f73c186a26441f37d6d632af88178877108fa1be24d993947"} err="failed to get container status \"817e2f5f513f207f73c186a26441f37d6d632af88178877108fa1be24d993947\": rpc error: code = NotFound desc = could not find container \"817e2f5f513f207f73c186a26441f37d6d632af88178877108fa1be24d993947\": container with ID starting with 817e2f5f513f207f73c186a26441f37d6d632af88178877108fa1be24d993947 not found: ID does not exist" Apr 17 20:54:52.915155 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.915133 2575 scope.go:117] "RemoveContainer" containerID="ea2941bf7394c150565605435b044cc6b988f6afa1f414baddc735efa5b12609" Apr 17 20:54:52.915342 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.915326 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea2941bf7394c150565605435b044cc6b988f6afa1f414baddc735efa5b12609"} err="failed to get container status \"ea2941bf7394c150565605435b044cc6b988f6afa1f414baddc735efa5b12609\": rpc error: code = NotFound desc = could not find container \"ea2941bf7394c150565605435b044cc6b988f6afa1f414baddc735efa5b12609\": container with ID starting with ea2941bf7394c150565605435b044cc6b988f6afa1f414baddc735efa5b12609 not found: ID does not exist" Apr 17 20:54:52.915342 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.915341 2575 scope.go:117] "RemoveContainer" 
containerID="86ec171fb110a6642beb8e5e137c2f2728efa6a47a12e90b7768a7de679904de" Apr 17 20:54:52.915544 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.915527 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86ec171fb110a6642beb8e5e137c2f2728efa6a47a12e90b7768a7de679904de"} err="failed to get container status \"86ec171fb110a6642beb8e5e137c2f2728efa6a47a12e90b7768a7de679904de\": rpc error: code = NotFound desc = could not find container \"86ec171fb110a6642beb8e5e137c2f2728efa6a47a12e90b7768a7de679904de\": container with ID starting with 86ec171fb110a6642beb8e5e137c2f2728efa6a47a12e90b7768a7de679904de not found: ID does not exist" Apr 17 20:54:52.915587 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.915547 2575 scope.go:117] "RemoveContainer" containerID="4642f33bedec9a8237faafa6d33569bc8723a865554611ae0b071acc47a7dc53" Apr 17 20:54:52.915733 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.915718 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4642f33bedec9a8237faafa6d33569bc8723a865554611ae0b071acc47a7dc53"} err="failed to get container status \"4642f33bedec9a8237faafa6d33569bc8723a865554611ae0b071acc47a7dc53\": rpc error: code = NotFound desc = could not find container \"4642f33bedec9a8237faafa6d33569bc8723a865554611ae0b071acc47a7dc53\": container with ID starting with 4642f33bedec9a8237faafa6d33569bc8723a865554611ae0b071acc47a7dc53 not found: ID does not exist" Apr 17 20:54:52.915776 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.915733 2575 scope.go:117] "RemoveContainer" containerID="17f635fe5f5ba48834578d8feb12161c8c3250651d26da210137dd4cc609a1ef" Apr 17 20:54:52.915953 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.915936 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17f635fe5f5ba48834578d8feb12161c8c3250651d26da210137dd4cc609a1ef"} err="failed to get container status 
\"17f635fe5f5ba48834578d8feb12161c8c3250651d26da210137dd4cc609a1ef\": rpc error: code = NotFound desc = could not find container \"17f635fe5f5ba48834578d8feb12161c8c3250651d26da210137dd4cc609a1ef\": container with ID starting with 17f635fe5f5ba48834578d8feb12161c8c3250651d26da210137dd4cc609a1ef not found: ID does not exist" Apr 17 20:54:52.915953 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.915952 2575 scope.go:117] "RemoveContainer" containerID="b33fce476f30e56ab65104317b760f0c7c0ba0cc65d5ef7e7637ea1a88294929" Apr 17 20:54:52.916112 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.916096 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b33fce476f30e56ab65104317b760f0c7c0ba0cc65d5ef7e7637ea1a88294929"} err="failed to get container status \"b33fce476f30e56ab65104317b760f0c7c0ba0cc65d5ef7e7637ea1a88294929\": rpc error: code = NotFound desc = could not find container \"b33fce476f30e56ab65104317b760f0c7c0ba0cc65d5ef7e7637ea1a88294929\": container with ID starting with b33fce476f30e56ab65104317b760f0c7c0ba0cc65d5ef7e7637ea1a88294929 not found: ID does not exist" Apr 17 20:54:52.916165 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.916113 2575 scope.go:117] "RemoveContainer" containerID="cad7f693a146ae1032ea50d5060ff09042e062f9c7f82c6084418bafc9110177" Apr 17 20:54:52.916339 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.916321 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cad7f693a146ae1032ea50d5060ff09042e062f9c7f82c6084418bafc9110177"} err="failed to get container status \"cad7f693a146ae1032ea50d5060ff09042e062f9c7f82c6084418bafc9110177\": rpc error: code = NotFound desc = could not find container \"cad7f693a146ae1032ea50d5060ff09042e062f9c7f82c6084418bafc9110177\": container with ID starting with cad7f693a146ae1032ea50d5060ff09042e062f9c7f82c6084418bafc9110177 not found: ID does not exist" Apr 17 20:54:52.916339 ip-10-0-132-12 
kubenswrapper[2575]: I0417 20:54:52.916337 2575 scope.go:117] "RemoveContainer" containerID="817e2f5f513f207f73c186a26441f37d6d632af88178877108fa1be24d993947" Apr 17 20:54:52.916545 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.916530 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"817e2f5f513f207f73c186a26441f37d6d632af88178877108fa1be24d993947"} err="failed to get container status \"817e2f5f513f207f73c186a26441f37d6d632af88178877108fa1be24d993947\": rpc error: code = NotFound desc = could not find container \"817e2f5f513f207f73c186a26441f37d6d632af88178877108fa1be24d993947\": container with ID starting with 817e2f5f513f207f73c186a26441f37d6d632af88178877108fa1be24d993947 not found: ID does not exist" Apr 17 20:54:52.916583 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.916546 2575 scope.go:117] "RemoveContainer" containerID="ea2941bf7394c150565605435b044cc6b988f6afa1f414baddc735efa5b12609" Apr 17 20:54:52.916751 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.916734 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea2941bf7394c150565605435b044cc6b988f6afa1f414baddc735efa5b12609"} err="failed to get container status \"ea2941bf7394c150565605435b044cc6b988f6afa1f414baddc735efa5b12609\": rpc error: code = NotFound desc = could not find container \"ea2941bf7394c150565605435b044cc6b988f6afa1f414baddc735efa5b12609\": container with ID starting with ea2941bf7394c150565605435b044cc6b988f6afa1f414baddc735efa5b12609 not found: ID does not exist" Apr 17 20:54:52.916787 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.916752 2575 scope.go:117] "RemoveContainer" containerID="86ec171fb110a6642beb8e5e137c2f2728efa6a47a12e90b7768a7de679904de" Apr 17 20:54:52.916971 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.916954 2575 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"86ec171fb110a6642beb8e5e137c2f2728efa6a47a12e90b7768a7de679904de"} err="failed to get container status \"86ec171fb110a6642beb8e5e137c2f2728efa6a47a12e90b7768a7de679904de\": rpc error: code = NotFound desc = could not find container \"86ec171fb110a6642beb8e5e137c2f2728efa6a47a12e90b7768a7de679904de\": container with ID starting with 86ec171fb110a6642beb8e5e137c2f2728efa6a47a12e90b7768a7de679904de not found: ID does not exist" Apr 17 20:54:52.917021 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.916971 2575 scope.go:117] "RemoveContainer" containerID="4642f33bedec9a8237faafa6d33569bc8723a865554611ae0b071acc47a7dc53" Apr 17 20:54:52.917153 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.917138 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4642f33bedec9a8237faafa6d33569bc8723a865554611ae0b071acc47a7dc53"} err="failed to get container status \"4642f33bedec9a8237faafa6d33569bc8723a865554611ae0b071acc47a7dc53\": rpc error: code = NotFound desc = could not find container \"4642f33bedec9a8237faafa6d33569bc8723a865554611ae0b071acc47a7dc53\": container with ID starting with 4642f33bedec9a8237faafa6d33569bc8723a865554611ae0b071acc47a7dc53 not found: ID does not exist" Apr 17 20:54:52.917189 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.917153 2575 scope.go:117] "RemoveContainer" containerID="17f635fe5f5ba48834578d8feb12161c8c3250651d26da210137dd4cc609a1ef" Apr 17 20:54:52.917396 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.917380 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17f635fe5f5ba48834578d8feb12161c8c3250651d26da210137dd4cc609a1ef"} err="failed to get container status \"17f635fe5f5ba48834578d8feb12161c8c3250651d26da210137dd4cc609a1ef\": rpc error: code = NotFound desc = could not find container \"17f635fe5f5ba48834578d8feb12161c8c3250651d26da210137dd4cc609a1ef\": container with ID starting with 
17f635fe5f5ba48834578d8feb12161c8c3250651d26da210137dd4cc609a1ef not found: ID does not exist" Apr 17 20:54:52.917443 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.917397 2575 scope.go:117] "RemoveContainer" containerID="b33fce476f30e56ab65104317b760f0c7c0ba0cc65d5ef7e7637ea1a88294929" Apr 17 20:54:52.917604 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.917590 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b33fce476f30e56ab65104317b760f0c7c0ba0cc65d5ef7e7637ea1a88294929"} err="failed to get container status \"b33fce476f30e56ab65104317b760f0c7c0ba0cc65d5ef7e7637ea1a88294929\": rpc error: code = NotFound desc = could not find container \"b33fce476f30e56ab65104317b760f0c7c0ba0cc65d5ef7e7637ea1a88294929\": container with ID starting with b33fce476f30e56ab65104317b760f0c7c0ba0cc65d5ef7e7637ea1a88294929 not found: ID does not exist" Apr 17 20:54:52.917660 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.917606 2575 scope.go:117] "RemoveContainer" containerID="cad7f693a146ae1032ea50d5060ff09042e062f9c7f82c6084418bafc9110177" Apr 17 20:54:52.917800 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.917784 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cad7f693a146ae1032ea50d5060ff09042e062f9c7f82c6084418bafc9110177"} err="failed to get container status \"cad7f693a146ae1032ea50d5060ff09042e062f9c7f82c6084418bafc9110177\": rpc error: code = NotFound desc = could not find container \"cad7f693a146ae1032ea50d5060ff09042e062f9c7f82c6084418bafc9110177\": container with ID starting with cad7f693a146ae1032ea50d5060ff09042e062f9c7f82c6084418bafc9110177 not found: ID does not exist" Apr 17 20:54:52.917800 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.917798 2575 scope.go:117] "RemoveContainer" containerID="817e2f5f513f207f73c186a26441f37d6d632af88178877108fa1be24d993947" Apr 17 20:54:52.917968 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.917955 2575 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"817e2f5f513f207f73c186a26441f37d6d632af88178877108fa1be24d993947"} err="failed to get container status \"817e2f5f513f207f73c186a26441f37d6d632af88178877108fa1be24d993947\": rpc error: code = NotFound desc = could not find container \"817e2f5f513f207f73c186a26441f37d6d632af88178877108fa1be24d993947\": container with ID starting with 817e2f5f513f207f73c186a26441f37d6d632af88178877108fa1be24d993947 not found: ID does not exist" Apr 17 20:54:52.918022 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.917968 2575 scope.go:117] "RemoveContainer" containerID="ea2941bf7394c150565605435b044cc6b988f6afa1f414baddc735efa5b12609" Apr 17 20:54:52.918174 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.918160 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea2941bf7394c150565605435b044cc6b988f6afa1f414baddc735efa5b12609"} err="failed to get container status \"ea2941bf7394c150565605435b044cc6b988f6afa1f414baddc735efa5b12609\": rpc error: code = NotFound desc = could not find container \"ea2941bf7394c150565605435b044cc6b988f6afa1f414baddc735efa5b12609\": container with ID starting with ea2941bf7394c150565605435b044cc6b988f6afa1f414baddc735efa5b12609 not found: ID does not exist" Apr 17 20:54:52.918174 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.918173 2575 scope.go:117] "RemoveContainer" containerID="86ec171fb110a6642beb8e5e137c2f2728efa6a47a12e90b7768a7de679904de" Apr 17 20:54:52.918381 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.918364 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86ec171fb110a6642beb8e5e137c2f2728efa6a47a12e90b7768a7de679904de"} err="failed to get container status \"86ec171fb110a6642beb8e5e137c2f2728efa6a47a12e90b7768a7de679904de\": rpc error: code = NotFound desc = could not find container 
\"86ec171fb110a6642beb8e5e137c2f2728efa6a47a12e90b7768a7de679904de\": container with ID starting with 86ec171fb110a6642beb8e5e137c2f2728efa6a47a12e90b7768a7de679904de not found: ID does not exist" Apr 17 20:54:52.918381 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.918381 2575 scope.go:117] "RemoveContainer" containerID="4642f33bedec9a8237faafa6d33569bc8723a865554611ae0b071acc47a7dc53" Apr 17 20:54:52.918589 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.918568 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4642f33bedec9a8237faafa6d33569bc8723a865554611ae0b071acc47a7dc53"} err="failed to get container status \"4642f33bedec9a8237faafa6d33569bc8723a865554611ae0b071acc47a7dc53\": rpc error: code = NotFound desc = could not find container \"4642f33bedec9a8237faafa6d33569bc8723a865554611ae0b071acc47a7dc53\": container with ID starting with 4642f33bedec9a8237faafa6d33569bc8723a865554611ae0b071acc47a7dc53 not found: ID does not exist" Apr 17 20:54:52.918635 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.918592 2575 scope.go:117] "RemoveContainer" containerID="17f635fe5f5ba48834578d8feb12161c8c3250651d26da210137dd4cc609a1ef" Apr 17 20:54:52.918765 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.918750 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17f635fe5f5ba48834578d8feb12161c8c3250651d26da210137dd4cc609a1ef"} err="failed to get container status \"17f635fe5f5ba48834578d8feb12161c8c3250651d26da210137dd4cc609a1ef\": rpc error: code = NotFound desc = could not find container \"17f635fe5f5ba48834578d8feb12161c8c3250651d26da210137dd4cc609a1ef\": container with ID starting with 17f635fe5f5ba48834578d8feb12161c8c3250651d26da210137dd4cc609a1ef not found: ID does not exist" Apr 17 20:54:52.918808 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.918764 2575 scope.go:117] "RemoveContainer" 
containerID="b33fce476f30e56ab65104317b760f0c7c0ba0cc65d5ef7e7637ea1a88294929"
Apr 17 20:54:52.918937 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.918921 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b33fce476f30e56ab65104317b760f0c7c0ba0cc65d5ef7e7637ea1a88294929"} err="failed to get container status \"b33fce476f30e56ab65104317b760f0c7c0ba0cc65d5ef7e7637ea1a88294929\": rpc error: code = NotFound desc = could not find container \"b33fce476f30e56ab65104317b760f0c7c0ba0cc65d5ef7e7637ea1a88294929\": container with ID starting with b33fce476f30e56ab65104317b760f0c7c0ba0cc65d5ef7e7637ea1a88294929 not found: ID does not exist"
Apr 17 20:54:52.918979 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.918938 2575 scope.go:117] "RemoveContainer" containerID="cad7f693a146ae1032ea50d5060ff09042e062f9c7f82c6084418bafc9110177"
Apr 17 20:54:52.919118 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.919098 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cad7f693a146ae1032ea50d5060ff09042e062f9c7f82c6084418bafc9110177"} err="failed to get container status \"cad7f693a146ae1032ea50d5060ff09042e062f9c7f82c6084418bafc9110177\": rpc error: code = NotFound desc = could not find container \"cad7f693a146ae1032ea50d5060ff09042e062f9c7f82c6084418bafc9110177\": container with ID starting with cad7f693a146ae1032ea50d5060ff09042e062f9c7f82c6084418bafc9110177 not found: ID does not exist"
Apr 17 20:54:52.919189 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.919119 2575 scope.go:117] "RemoveContainer" containerID="817e2f5f513f207f73c186a26441f37d6d632af88178877108fa1be24d993947"
Apr 17 20:54:52.919347 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.919330 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"817e2f5f513f207f73c186a26441f37d6d632af88178877108fa1be24d993947"} err="failed to get container status \"817e2f5f513f207f73c186a26441f37d6d632af88178877108fa1be24d993947\": rpc error: code = NotFound desc = could not find container \"817e2f5f513f207f73c186a26441f37d6d632af88178877108fa1be24d993947\": container with ID starting with 817e2f5f513f207f73c186a26441f37d6d632af88178877108fa1be24d993947 not found: ID does not exist"
Apr 17 20:54:52.919417 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.919348 2575 scope.go:117] "RemoveContainer" containerID="ea2941bf7394c150565605435b044cc6b988f6afa1f414baddc735efa5b12609"
Apr 17 20:54:52.919557 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.919541 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea2941bf7394c150565605435b044cc6b988f6afa1f414baddc735efa5b12609"} err="failed to get container status \"ea2941bf7394c150565605435b044cc6b988f6afa1f414baddc735efa5b12609\": rpc error: code = NotFound desc = could not find container \"ea2941bf7394c150565605435b044cc6b988f6afa1f414baddc735efa5b12609\": container with ID starting with ea2941bf7394c150565605435b044cc6b988f6afa1f414baddc735efa5b12609 not found: ID does not exist"
Apr 17 20:54:52.919601 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.919558 2575 scope.go:117] "RemoveContainer" containerID="86ec171fb110a6642beb8e5e137c2f2728efa6a47a12e90b7768a7de679904de"
Apr 17 20:54:52.919746 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.919729 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86ec171fb110a6642beb8e5e137c2f2728efa6a47a12e90b7768a7de679904de"} err="failed to get container status \"86ec171fb110a6642beb8e5e137c2f2728efa6a47a12e90b7768a7de679904de\": rpc error: code = NotFound desc = could not find container \"86ec171fb110a6642beb8e5e137c2f2728efa6a47a12e90b7768a7de679904de\": container with ID starting with 86ec171fb110a6642beb8e5e137c2f2728efa6a47a12e90b7768a7de679904de not found: ID does not exist"
Apr 17 20:54:52.919814 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.919747 2575 scope.go:117] "RemoveContainer" containerID="4642f33bedec9a8237faafa6d33569bc8723a865554611ae0b071acc47a7dc53"
Apr 17 20:54:52.919932 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.919919 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4642f33bedec9a8237faafa6d33569bc8723a865554611ae0b071acc47a7dc53"} err="failed to get container status \"4642f33bedec9a8237faafa6d33569bc8723a865554611ae0b071acc47a7dc53\": rpc error: code = NotFound desc = could not find container \"4642f33bedec9a8237faafa6d33569bc8723a865554611ae0b071acc47a7dc53\": container with ID starting with 4642f33bedec9a8237faafa6d33569bc8723a865554611ae0b071acc47a7dc53 not found: ID does not exist"
Apr 17 20:54:52.919932 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.919932 2575 scope.go:117] "RemoveContainer" containerID="17f635fe5f5ba48834578d8feb12161c8c3250651d26da210137dd4cc609a1ef"
Apr 17 20:54:52.920112 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.920094 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17f635fe5f5ba48834578d8feb12161c8c3250651d26da210137dd4cc609a1ef"} err="failed to get container status \"17f635fe5f5ba48834578d8feb12161c8c3250651d26da210137dd4cc609a1ef\": rpc error: code = NotFound desc = could not find container \"17f635fe5f5ba48834578d8feb12161c8c3250651d26da210137dd4cc609a1ef\": container with ID starting with 17f635fe5f5ba48834578d8feb12161c8c3250651d26da210137dd4cc609a1ef not found: ID does not exist"
Apr 17 20:54:52.920176 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.920113 2575 scope.go:117] "RemoveContainer" containerID="b33fce476f30e56ab65104317b760f0c7c0ba0cc65d5ef7e7637ea1a88294929"
Apr 17 20:54:52.920368 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.920346 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b33fce476f30e56ab65104317b760f0c7c0ba0cc65d5ef7e7637ea1a88294929"} err="failed to get container status \"b33fce476f30e56ab65104317b760f0c7c0ba0cc65d5ef7e7637ea1a88294929\": rpc error: code = NotFound desc = could not find container \"b33fce476f30e56ab65104317b760f0c7c0ba0cc65d5ef7e7637ea1a88294929\": container with ID starting with b33fce476f30e56ab65104317b760f0c7c0ba0cc65d5ef7e7637ea1a88294929 not found: ID does not exist"
Apr 17 20:54:52.920418 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.920370 2575 scope.go:117] "RemoveContainer" containerID="cad7f693a146ae1032ea50d5060ff09042e062f9c7f82c6084418bafc9110177"
Apr 17 20:54:52.920560 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.920545 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cad7f693a146ae1032ea50d5060ff09042e062f9c7f82c6084418bafc9110177"} err="failed to get container status \"cad7f693a146ae1032ea50d5060ff09042e062f9c7f82c6084418bafc9110177\": rpc error: code = NotFound desc = could not find container \"cad7f693a146ae1032ea50d5060ff09042e062f9c7f82c6084418bafc9110177\": container with ID starting with cad7f693a146ae1032ea50d5060ff09042e062f9c7f82c6084418bafc9110177 not found: ID does not exist"
Apr 17 20:54:52.920598 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.920560 2575 scope.go:117] "RemoveContainer" containerID="817e2f5f513f207f73c186a26441f37d6d632af88178877108fa1be24d993947"
Apr 17 20:54:52.920756 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.920738 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"817e2f5f513f207f73c186a26441f37d6d632af88178877108fa1be24d993947"} err="failed to get container status \"817e2f5f513f207f73c186a26441f37d6d632af88178877108fa1be24d993947\": rpc error: code = NotFound desc = could not find container \"817e2f5f513f207f73c186a26441f37d6d632af88178877108fa1be24d993947\": container with ID starting with 817e2f5f513f207f73c186a26441f37d6d632af88178877108fa1be24d993947 not found: ID does not exist"
Apr 17 20:54:52.920815 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.920758 2575 scope.go:117] "RemoveContainer" containerID="ea2941bf7394c150565605435b044cc6b988f6afa1f414baddc735efa5b12609"
Apr 17 20:54:52.920944 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.920928 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea2941bf7394c150565605435b044cc6b988f6afa1f414baddc735efa5b12609"} err="failed to get container status \"ea2941bf7394c150565605435b044cc6b988f6afa1f414baddc735efa5b12609\": rpc error: code = NotFound desc = could not find container \"ea2941bf7394c150565605435b044cc6b988f6afa1f414baddc735efa5b12609\": container with ID starting with ea2941bf7394c150565605435b044cc6b988f6afa1f414baddc735efa5b12609 not found: ID does not exist"
Apr 17 20:54:52.920989 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.920944 2575 scope.go:117] "RemoveContainer" containerID="86ec171fb110a6642beb8e5e137c2f2728efa6a47a12e90b7768a7de679904de"
Apr 17 20:54:52.921085 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.921071 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86ec171fb110a6642beb8e5e137c2f2728efa6a47a12e90b7768a7de679904de"} err="failed to get container status \"86ec171fb110a6642beb8e5e137c2f2728efa6a47a12e90b7768a7de679904de\": rpc error: code = NotFound desc = could not find container \"86ec171fb110a6642beb8e5e137c2f2728efa6a47a12e90b7768a7de679904de\": container with ID starting with 86ec171fb110a6642beb8e5e137c2f2728efa6a47a12e90b7768a7de679904de not found: ID does not exist"
Apr 17 20:54:52.921131 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.921086 2575 scope.go:117] "RemoveContainer" containerID="4642f33bedec9a8237faafa6d33569bc8723a865554611ae0b071acc47a7dc53"
Apr 17 20:54:52.921258 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.921244 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4642f33bedec9a8237faafa6d33569bc8723a865554611ae0b071acc47a7dc53"} err="failed to get container status \"4642f33bedec9a8237faafa6d33569bc8723a865554611ae0b071acc47a7dc53\": rpc error: code = NotFound desc = could not find container \"4642f33bedec9a8237faafa6d33569bc8723a865554611ae0b071acc47a7dc53\": container with ID starting with 4642f33bedec9a8237faafa6d33569bc8723a865554611ae0b071acc47a7dc53 not found: ID does not exist"
Apr 17 20:54:52.921307 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.921257 2575 scope.go:117] "RemoveContainer" containerID="17f635fe5f5ba48834578d8feb12161c8c3250651d26da210137dd4cc609a1ef"
Apr 17 20:54:52.921425 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.921409 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17f635fe5f5ba48834578d8feb12161c8c3250651d26da210137dd4cc609a1ef"} err="failed to get container status \"17f635fe5f5ba48834578d8feb12161c8c3250651d26da210137dd4cc609a1ef\": rpc error: code = NotFound desc = could not find container \"17f635fe5f5ba48834578d8feb12161c8c3250651d26da210137dd4cc609a1ef\": container with ID starting with 17f635fe5f5ba48834578d8feb12161c8c3250651d26da210137dd4cc609a1ef not found: ID does not exist"
Apr 17 20:54:52.921628 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.921426 2575 scope.go:117] "RemoveContainer" containerID="b33fce476f30e56ab65104317b760f0c7c0ba0cc65d5ef7e7637ea1a88294929"
Apr 17 20:54:52.921674 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.921652 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b33fce476f30e56ab65104317b760f0c7c0ba0cc65d5ef7e7637ea1a88294929"} err="failed to get container status \"b33fce476f30e56ab65104317b760f0c7c0ba0cc65d5ef7e7637ea1a88294929\": container with ID starting with b33fce476f30e56ab65104317b760f0c7c0ba0cc65d5ef7e7637ea1a88294929 not found: ID does not exist"
Apr 17 20:54:52.921738 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.921673 2575 scope.go:117] "RemoveContainer" containerID="cad7f693a146ae1032ea50d5060ff09042e062f9c7f82c6084418bafc9110177"
Apr 17 20:54:52.921935 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.921914 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cad7f693a146ae1032ea50d5060ff09042e062f9c7f82c6084418bafc9110177"} err="failed to get container status \"cad7f693a146ae1032ea50d5060ff09042e062f9c7f82c6084418bafc9110177\": rpc error: code = NotFound desc = could not find container \"cad7f693a146ae1032ea50d5060ff09042e062f9c7f82c6084418bafc9110177\": container with ID starting with cad7f693a146ae1032ea50d5060ff09042e062f9c7f82c6084418bafc9110177 not found: ID does not exist"
Apr 17 20:54:52.921935 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.921935 2575 scope.go:117] "RemoveContainer" containerID="817e2f5f513f207f73c186a26441f37d6d632af88178877108fa1be24d993947"
Apr 17 20:54:52.922185 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.922159 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"817e2f5f513f207f73c186a26441f37d6d632af88178877108fa1be24d993947"} err="failed to get container status \"817e2f5f513f207f73c186a26441f37d6d632af88178877108fa1be24d993947\": rpc error: code = NotFound desc = could not find container \"817e2f5f513f207f73c186a26441f37d6d632af88178877108fa1be24d993947\": container with ID starting with 817e2f5f513f207f73c186a26441f37d6d632af88178877108fa1be24d993947 not found: ID does not exist"
Apr 17 20:54:52.923715 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.923695 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 20:54:52.924082 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.924066 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e4826ba-b676-4a81-a457-d28dd8eee1a3" containerName="kube-rbac-proxy-thanos"
Apr 17 20:54:52.924168 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.924085 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e4826ba-b676-4a81-a457-d28dd8eee1a3" containerName="kube-rbac-proxy-thanos"
Apr 17 20:54:52.924168 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.924098 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e4826ba-b676-4a81-a457-d28dd8eee1a3" containerName="kube-rbac-proxy-web"
Apr 17 20:54:52.924168 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.924106 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e4826ba-b676-4a81-a457-d28dd8eee1a3" containerName="kube-rbac-proxy-web"
Apr 17 20:54:52.924168 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.924125 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e4826ba-b676-4a81-a457-d28dd8eee1a3" containerName="kube-rbac-proxy"
Apr 17 20:54:52.924168 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.924134 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e4826ba-b676-4a81-a457-d28dd8eee1a3" containerName="kube-rbac-proxy"
Apr 17 20:54:52.924168 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.924144 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e4826ba-b676-4a81-a457-d28dd8eee1a3" containerName="prometheus"
Apr 17 20:54:52.924168 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.924152 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e4826ba-b676-4a81-a457-d28dd8eee1a3" containerName="prometheus"
Apr 17 20:54:52.924168 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.924160 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e4826ba-b676-4a81-a457-d28dd8eee1a3" containerName="thanos-sidecar"
Apr 17 20:54:52.924168 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.924168 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e4826ba-b676-4a81-a457-d28dd8eee1a3" containerName="thanos-sidecar"
Apr 17 20:54:52.924528 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.924181 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e4826ba-b676-4a81-a457-d28dd8eee1a3" containerName="config-reloader"
Apr 17 20:54:52.924528 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.924189 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e4826ba-b676-4a81-a457-d28dd8eee1a3" containerName="config-reloader"
Apr 17 20:54:52.924528 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.924200 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e4826ba-b676-4a81-a457-d28dd8eee1a3" containerName="init-config-reloader"
Apr 17 20:54:52.924528 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.924209 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e4826ba-b676-4a81-a457-d28dd8eee1a3" containerName="init-config-reloader"
Apr 17 20:54:52.924528 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.924285 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e4826ba-b676-4a81-a457-d28dd8eee1a3" containerName="kube-rbac-proxy-web"
Apr 17 20:54:52.924528 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.924297 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e4826ba-b676-4a81-a457-d28dd8eee1a3" containerName="config-reloader"
Apr 17 20:54:52.924528 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.924309 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e4826ba-b676-4a81-a457-d28dd8eee1a3" containerName="thanos-sidecar"
Apr 17 20:54:52.924528 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.924319 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e4826ba-b676-4a81-a457-d28dd8eee1a3" containerName="kube-rbac-proxy"
Apr 17 20:54:52.924528 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.924328 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e4826ba-b676-4a81-a457-d28dd8eee1a3" containerName="kube-rbac-proxy-thanos"
Apr 17 20:54:52.924528 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.924338 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e4826ba-b676-4a81-a457-d28dd8eee1a3" containerName="prometheus"
Apr 17 20:54:52.929698 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.929680 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:52.932974 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.932953 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 17 20:54:52.933127 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.932994 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 17 20:54:52.933127 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.933026 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-bzr2f\""
Apr 17 20:54:52.933534 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.933519 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 17 20:54:52.933613 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.933598 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 17 20:54:52.933662 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.933599 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 17 20:54:52.933815 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.933787 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 17 20:54:52.933884 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.933849 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-75aadmgq6r7l8\""
Apr 17 20:54:52.933938 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.933925 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 17 20:54:52.933990 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.933941 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 17 20:54:52.934038 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.934004 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 17 20:54:52.934357 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.934343 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 17 20:54:52.936599 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.936580 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 17 20:54:52.939082 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.939064 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 17 20:54:52.946115 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:52.946098 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 20:54:53.014094 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.014062 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3b1d1ae5-4f52-46c2-ae87-f337988640e3-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.014094 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.014104 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/3b1d1ae5-4f52-46c2-ae87-f337988640e3-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.014390 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.014133 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3b1d1ae5-4f52-46c2-ae87-f337988640e3-config-out\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.014390 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.014159 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3b1d1ae5-4f52-46c2-ae87-f337988640e3-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.014390 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.014178 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/3b1d1ae5-4f52-46c2-ae87-f337988640e3-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.014390 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.014199 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3b1d1ae5-4f52-46c2-ae87-f337988640e3-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.014390 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.014219 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3b1d1ae5-4f52-46c2-ae87-f337988640e3-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.014390 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.014275 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3b1d1ae5-4f52-46c2-ae87-f337988640e3-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.014390 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.014310 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3b1d1ae5-4f52-46c2-ae87-f337988640e3-web-config\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.014390 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.014331 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b1d1ae5-4f52-46c2-ae87-f337988640e3-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.014390 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.014377 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/3b1d1ae5-4f52-46c2-ae87-f337988640e3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.014701 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.014411 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3b1d1ae5-4f52-46c2-ae87-f337988640e3-config\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.014701 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.014465 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3b1d1ae5-4f52-46c2-ae87-f337988640e3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.014701 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.014480 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3b1d1ae5-4f52-46c2-ae87-f337988640e3-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.014701 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.014506 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/3b1d1ae5-4f52-46c2-ae87-f337988640e3-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.014701 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.014557 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b1d1ae5-4f52-46c2-ae87-f337988640e3-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.014701 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.014581 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h9lt\" (UniqueName: \"kubernetes.io/projected/3b1d1ae5-4f52-46c2-ae87-f337988640e3-kube-api-access-9h9lt\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.014701 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.014606 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b1d1ae5-4f52-46c2-ae87-f337988640e3-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.115750 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.115719 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3b1d1ae5-4f52-46c2-ae87-f337988640e3-web-config\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.115750 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.115753 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b1d1ae5-4f52-46c2-ae87-f337988640e3-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.115937 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.115772 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/3b1d1ae5-4f52-46c2-ae87-f337988640e3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.115937 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.115790 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3b1d1ae5-4f52-46c2-ae87-f337988640e3-config\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.117385 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.116360 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3b1d1ae5-4f52-46c2-ae87-f337988640e3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.117385 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.116435 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3b1d1ae5-4f52-46c2-ae87-f337988640e3-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.117385 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.116497 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/3b1d1ae5-4f52-46c2-ae87-f337988640e3-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.117385 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.116547 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b1d1ae5-4f52-46c2-ae87-f337988640e3-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.117385 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.117104 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b1d1ae5-4f52-46c2-ae87-f337988640e3-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.119319 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.119292 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/3b1d1ae5-4f52-46c2-ae87-f337988640e3-config\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.119505 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.119296 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/3b1d1ae5-4f52-46c2-ae87-f337988640e3-web-config\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.119505 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.116586 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9h9lt\" (UniqueName: \"kubernetes.io/projected/3b1d1ae5-4f52-46c2-ae87-f337988640e3-kube-api-access-9h9lt\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.119505 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.119457 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b1d1ae5-4f52-46c2-ae87-f337988640e3-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.119754 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.119514 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3b1d1ae5-4f52-46c2-ae87-f337988640e3-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.119754 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.119548 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/3b1d1ae5-4f52-46c2-ae87-f337988640e3-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.119754 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.119587 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3b1d1ae5-4f52-46c2-ae87-f337988640e3-config-out\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.119754 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.119619 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3b1d1ae5-4f52-46c2-ae87-f337988640e3-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.119754 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.119646 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/3b1d1ae5-4f52-46c2-ae87-f337988640e3-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.119754 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.119680 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3b1d1ae5-4f52-46c2-ae87-f337988640e3-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.119754 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.119713 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3b1d1ae5-4f52-46c2-ae87-f337988640e3-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.120291 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.119763 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3b1d1ae5-4f52-46c2-ae87-f337988640e3-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.120291 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.119860 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b1d1ae5-4f52-46c2-ae87-f337988640e3-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.120291 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.119870 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/3b1d1ae5-4f52-46c2-ae87-f337988640e3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.120291 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.120147 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/3b1d1ae5-4f52-46c2-ae87-f337988640e3-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.120530 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.120293 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/3b1d1ae5-4f52-46c2-ae87-f337988640e3-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.120637 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.120614 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/3b1d1ae5-4f52-46c2-ae87-f337988640e3-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.120881 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.120847 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/3b1d1ae5-4f52-46c2-ae87-f337988640e3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.120944 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.120847 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b1d1ae5-4f52-46c2-ae87-f337988640e3-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.124144 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.122241 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/3b1d1ae5-4f52-46c2-ae87-f337988640e3-config-out\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 20:54:53.124144 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.122952 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3b1d1ae5-4f52-46c2-ae87-f337988640e3-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") "
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:54:53.124144 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.123179 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/3b1d1ae5-4f52-46c2-ae87-f337988640e3-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:54:53.124144 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.123326 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/3b1d1ae5-4f52-46c2-ae87-f337988640e3-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:54:53.124144 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.123451 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/3b1d1ae5-4f52-46c2-ae87-f337988640e3-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:54:53.124144 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.123568 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/3b1d1ae5-4f52-46c2-ae87-f337988640e3-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:54:53.124144 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.123777 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/3b1d1ae5-4f52-46c2-ae87-f337988640e3-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: 
\"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:54:53.126877 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.126857 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h9lt\" (UniqueName: \"kubernetes.io/projected/3b1d1ae5-4f52-46c2-ae87-f337988640e3-kube-api-access-9h9lt\") pod \"prometheus-k8s-0\" (UID: \"3b1d1ae5-4f52-46c2-ae87-f337988640e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:54:53.239714 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.239677 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:54:53.363888 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.363858 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 20:54:53.366917 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:54:53.366891 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b1d1ae5_4f52_46c2_ae87_f337988640e3.slice/crio-8c2160267c70e27c851748d4f0d771fd7b348924e1c9d23491f0d0406198414c WatchSource:0}: Error finding container 8c2160267c70e27c851748d4f0d771fd7b348924e1c9d23491f0d0406198414c: Status 404 returned error can't find the container with id 8c2160267c70e27c851748d4f0d771fd7b348924e1c9d23491f0d0406198414c Apr 17 20:54:53.872202 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.872170 2575 generic.go:358] "Generic (PLEG): container finished" podID="3b1d1ae5-4f52-46c2-ae87-f337988640e3" containerID="c075719c6002f74abf4a8da8f3c29233a2099425b4392305483746bd24c10db4" exitCode=0 Apr 17 20:54:53.872377 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.872219 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"3b1d1ae5-4f52-46c2-ae87-f337988640e3","Type":"ContainerDied","Data":"c075719c6002f74abf4a8da8f3c29233a2099425b4392305483746bd24c10db4"} Apr 17 20:54:53.872377 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:53.872260 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3b1d1ae5-4f52-46c2-ae87-f337988640e3","Type":"ContainerStarted","Data":"8c2160267c70e27c851748d4f0d771fd7b348924e1c9d23491f0d0406198414c"} Apr 17 20:54:54.134610 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:54.134580 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e4826ba-b676-4a81-a457-d28dd8eee1a3" path="/var/lib/kubelet/pods/8e4826ba-b676-4a81-a457-d28dd8eee1a3/volumes" Apr 17 20:54:54.879679 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:54.879643 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3b1d1ae5-4f52-46c2-ae87-f337988640e3","Type":"ContainerStarted","Data":"cdbdfaedf057561e3ad558e8675959dbda05b493c50e6d0ba069a7e661d234cc"} Apr 17 20:54:54.879679 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:54.879679 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3b1d1ae5-4f52-46c2-ae87-f337988640e3","Type":"ContainerStarted","Data":"9f683d75f660e280cd4c2bfb3354d34f13a2973143b87d1b1251a92b55c8c365"} Apr 17 20:54:54.880118 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:54.879690 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3b1d1ae5-4f52-46c2-ae87-f337988640e3","Type":"ContainerStarted","Data":"f9586c375ed81854f182143afa6166a8a9e2fa3f59e7e875a6c9d7fa87fca794"} Apr 17 20:54:54.880118 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:54.879701 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"3b1d1ae5-4f52-46c2-ae87-f337988640e3","Type":"ContainerStarted","Data":"e483713d5873f800f7a33b5d69a53dd2cc96b688fd7641ecc580d420b0eebd9e"} Apr 17 20:54:54.880118 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:54.879709 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3b1d1ae5-4f52-46c2-ae87-f337988640e3","Type":"ContainerStarted","Data":"866a3993c90419c1be88819ce30d53fd0c094f963f9b39a054ba1551a89acff4"} Apr 17 20:54:54.880118 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:54.879718 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"3b1d1ae5-4f52-46c2-ae87-f337988640e3","Type":"ContainerStarted","Data":"1d7b56b9cd8e9efa6ef78e286b8a881a47fdb1dae1a5a4e5f52c0f2907dbccab"} Apr 17 20:54:54.907192 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:54.907124 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.907103779 podStartE2EDuration="2.907103779s" podCreationTimestamp="2026-04-17 20:54:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 20:54:54.905143607 +0000 UTC m=+239.372584577" watchObservedRunningTime="2026-04-17 20:54:54.907103779 +0000 UTC m=+239.374544714" Apr 17 20:54:58.240408 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:54:58.240374 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:55:53.240906 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:55:53.240857 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:55:53.255946 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:55:53.255922 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:55:54.071970 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:55:54.071944 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 20:55:55.975773 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:55:55.975743 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v47zc_d30ac7a2-edef-43e4-a645-fbf9445df632/console-operator/2.log" Apr 17 20:55:55.976272 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:55:55.975969 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v47zc_d30ac7a2-edef-43e4-a645-fbf9445df632/console-operator/2.log" Apr 17 20:55:55.983529 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:55:55.983504 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8fq5_e7fec351-05e2-48e2-8266-8f2093ebb3fe/ovn-acl-logging/0.log" Apr 17 20:55:55.984028 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:55:55.983995 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8fq5_e7fec351-05e2-48e2-8266-8f2093ebb3fe/ovn-acl-logging/0.log" Apr 17 20:55:55.988073 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:55:55.988057 2575 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 20:56:54.598218 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:56:54.598186 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-z7q6j"] Apr 17 20:56:54.601398 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:56:54.601376 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-z7q6j" Apr 17 20:56:54.603901 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:56:54.603879 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 20:56:54.606615 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:56:54.606595 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-z7q6j"] Apr 17 20:56:54.721110 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:56:54.721070 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c6a655f1-43b7-4fec-a1cc-63ab69f29b85-kubelet-config\") pod \"global-pull-secret-syncer-z7q6j\" (UID: \"c6a655f1-43b7-4fec-a1cc-63ab69f29b85\") " pod="kube-system/global-pull-secret-syncer-z7q6j" Apr 17 20:56:54.721320 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:56:54.721186 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c6a655f1-43b7-4fec-a1cc-63ab69f29b85-dbus\") pod \"global-pull-secret-syncer-z7q6j\" (UID: \"c6a655f1-43b7-4fec-a1cc-63ab69f29b85\") " pod="kube-system/global-pull-secret-syncer-z7q6j" Apr 17 20:56:54.721320 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:56:54.721245 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c6a655f1-43b7-4fec-a1cc-63ab69f29b85-original-pull-secret\") pod \"global-pull-secret-syncer-z7q6j\" (UID: \"c6a655f1-43b7-4fec-a1cc-63ab69f29b85\") " pod="kube-system/global-pull-secret-syncer-z7q6j" Apr 17 20:56:54.821996 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:56:54.821958 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/c6a655f1-43b7-4fec-a1cc-63ab69f29b85-original-pull-secret\") pod \"global-pull-secret-syncer-z7q6j\" (UID: \"c6a655f1-43b7-4fec-a1cc-63ab69f29b85\") " pod="kube-system/global-pull-secret-syncer-z7q6j" Apr 17 20:56:54.822160 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:56:54.822010 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c6a655f1-43b7-4fec-a1cc-63ab69f29b85-kubelet-config\") pod \"global-pull-secret-syncer-z7q6j\" (UID: \"c6a655f1-43b7-4fec-a1cc-63ab69f29b85\") " pod="kube-system/global-pull-secret-syncer-z7q6j" Apr 17 20:56:54.822160 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:56:54.822054 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c6a655f1-43b7-4fec-a1cc-63ab69f29b85-dbus\") pod \"global-pull-secret-syncer-z7q6j\" (UID: \"c6a655f1-43b7-4fec-a1cc-63ab69f29b85\") " pod="kube-system/global-pull-secret-syncer-z7q6j" Apr 17 20:56:54.822255 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:56:54.822161 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c6a655f1-43b7-4fec-a1cc-63ab69f29b85-kubelet-config\") pod \"global-pull-secret-syncer-z7q6j\" (UID: \"c6a655f1-43b7-4fec-a1cc-63ab69f29b85\") " pod="kube-system/global-pull-secret-syncer-z7q6j" Apr 17 20:56:54.822300 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:56:54.822284 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c6a655f1-43b7-4fec-a1cc-63ab69f29b85-dbus\") pod \"global-pull-secret-syncer-z7q6j\" (UID: \"c6a655f1-43b7-4fec-a1cc-63ab69f29b85\") " pod="kube-system/global-pull-secret-syncer-z7q6j" Apr 17 20:56:54.824357 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:56:54.824336 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c6a655f1-43b7-4fec-a1cc-63ab69f29b85-original-pull-secret\") pod \"global-pull-secret-syncer-z7q6j\" (UID: \"c6a655f1-43b7-4fec-a1cc-63ab69f29b85\") " pod="kube-system/global-pull-secret-syncer-z7q6j" Apr 17 20:56:54.911674 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:56:54.911584 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-z7q6j" Apr 17 20:56:55.025447 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:56:55.025323 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-z7q6j"] Apr 17 20:56:55.028104 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:56:55.028075 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6a655f1_43b7_4fec_a1cc_63ab69f29b85.slice/crio-822fa66c7785decebbfdde5c2012e2ccccee327d88f36499f35b81e6b4c5498e WatchSource:0}: Error finding container 822fa66c7785decebbfdde5c2012e2ccccee327d88f36499f35b81e6b4c5498e: Status 404 returned error can't find the container with id 822fa66c7785decebbfdde5c2012e2ccccee327d88f36499f35b81e6b4c5498e Apr 17 20:56:55.029770 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:56:55.029754 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 20:56:55.226841 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:56:55.226759 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-z7q6j" event={"ID":"c6a655f1-43b7-4fec-a1cc-63ab69f29b85","Type":"ContainerStarted","Data":"822fa66c7785decebbfdde5c2012e2ccccee327d88f36499f35b81e6b4c5498e"} Apr 17 20:56:59.244647 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:56:59.244614 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-z7q6j" 
event={"ID":"c6a655f1-43b7-4fec-a1cc-63ab69f29b85","Type":"ContainerStarted","Data":"c42827b1762a24eb6e556eb2089357c50074e3744af5d5528d7d87fadc853c2a"} Apr 17 20:56:59.263056 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:56:59.263008 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-z7q6j" podStartSLOduration=1.67743357 podStartE2EDuration="5.262992554s" podCreationTimestamp="2026-04-17 20:56:54 +0000 UTC" firstStartedPulling="2026-04-17 20:56:55.0298788 +0000 UTC m=+359.497319712" lastFinishedPulling="2026-04-17 20:56:58.615437775 +0000 UTC m=+363.082878696" observedRunningTime="2026-04-17 20:56:59.261915436 +0000 UTC m=+363.729356371" watchObservedRunningTime="2026-04-17 20:56:59.262992554 +0000 UTC m=+363.730433487" Apr 17 20:58:04.762101 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:04.762020 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-7bd8bcccff-m5r5x"] Apr 17 20:58:04.764745 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:04.764728 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7bd8bcccff-m5r5x" Apr 17 20:58:04.767207 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:04.767182 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 17 20:58:04.767433 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:04.767419 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 17 20:58:04.768246 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:04.768217 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 17 20:58:04.768314 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:04.768298 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 17 20:58:04.768367 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:04.768323 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 17 20:58:04.768427 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:04.768381 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-rkdwc\"" Apr 17 20:58:04.773303 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:04.773281 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7bd8bcccff-m5r5x"] Apr 17 20:58:04.801206 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:04.801179 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/145770d6-42bc-4dfd-8af6-3775582eb974-metrics-cert\") pod \"lws-controller-manager-7bd8bcccff-m5r5x\" (UID: \"145770d6-42bc-4dfd-8af6-3775582eb974\") " 
pod="openshift-lws-operator/lws-controller-manager-7bd8bcccff-m5r5x" Apr 17 20:58:04.801206 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:04.801206 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcz8b\" (UniqueName: \"kubernetes.io/projected/145770d6-42bc-4dfd-8af6-3775582eb974-kube-api-access-tcz8b\") pod \"lws-controller-manager-7bd8bcccff-m5r5x\" (UID: \"145770d6-42bc-4dfd-8af6-3775582eb974\") " pod="openshift-lws-operator/lws-controller-manager-7bd8bcccff-m5r5x" Apr 17 20:58:04.801407 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:04.801291 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/145770d6-42bc-4dfd-8af6-3775582eb974-manager-config\") pod \"lws-controller-manager-7bd8bcccff-m5r5x\" (UID: \"145770d6-42bc-4dfd-8af6-3775582eb974\") " pod="openshift-lws-operator/lws-controller-manager-7bd8bcccff-m5r5x" Apr 17 20:58:04.801407 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:04.801306 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/145770d6-42bc-4dfd-8af6-3775582eb974-cert\") pod \"lws-controller-manager-7bd8bcccff-m5r5x\" (UID: \"145770d6-42bc-4dfd-8af6-3775582eb974\") " pod="openshift-lws-operator/lws-controller-manager-7bd8bcccff-m5r5x" Apr 17 20:58:04.902586 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:04.902550 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/145770d6-42bc-4dfd-8af6-3775582eb974-metrics-cert\") pod \"lws-controller-manager-7bd8bcccff-m5r5x\" (UID: \"145770d6-42bc-4dfd-8af6-3775582eb974\") " pod="openshift-lws-operator/lws-controller-manager-7bd8bcccff-m5r5x" Apr 17 20:58:04.902586 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:04.902590 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tcz8b\" (UniqueName: \"kubernetes.io/projected/145770d6-42bc-4dfd-8af6-3775582eb974-kube-api-access-tcz8b\") pod \"lws-controller-manager-7bd8bcccff-m5r5x\" (UID: \"145770d6-42bc-4dfd-8af6-3775582eb974\") " pod="openshift-lws-operator/lws-controller-manager-7bd8bcccff-m5r5x" Apr 17 20:58:04.902926 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:04.902649 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/145770d6-42bc-4dfd-8af6-3775582eb974-manager-config\") pod \"lws-controller-manager-7bd8bcccff-m5r5x\" (UID: \"145770d6-42bc-4dfd-8af6-3775582eb974\") " pod="openshift-lws-operator/lws-controller-manager-7bd8bcccff-m5r5x" Apr 17 20:58:04.902926 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:04.902671 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/145770d6-42bc-4dfd-8af6-3775582eb974-cert\") pod \"lws-controller-manager-7bd8bcccff-m5r5x\" (UID: \"145770d6-42bc-4dfd-8af6-3775582eb974\") " pod="openshift-lws-operator/lws-controller-manager-7bd8bcccff-m5r5x" Apr 17 20:58:04.903265 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:04.903218 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/145770d6-42bc-4dfd-8af6-3775582eb974-manager-config\") pod \"lws-controller-manager-7bd8bcccff-m5r5x\" (UID: \"145770d6-42bc-4dfd-8af6-3775582eb974\") " pod="openshift-lws-operator/lws-controller-manager-7bd8bcccff-m5r5x" Apr 17 20:58:04.905135 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:04.905104 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/145770d6-42bc-4dfd-8af6-3775582eb974-metrics-cert\") pod \"lws-controller-manager-7bd8bcccff-m5r5x\" (UID: \"145770d6-42bc-4dfd-8af6-3775582eb974\") " 
pod="openshift-lws-operator/lws-controller-manager-7bd8bcccff-m5r5x" Apr 17 20:58:04.905253 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:04.905104 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/145770d6-42bc-4dfd-8af6-3775582eb974-cert\") pod \"lws-controller-manager-7bd8bcccff-m5r5x\" (UID: \"145770d6-42bc-4dfd-8af6-3775582eb974\") " pod="openshift-lws-operator/lws-controller-manager-7bd8bcccff-m5r5x" Apr 17 20:58:04.910612 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:04.910591 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcz8b\" (UniqueName: \"kubernetes.io/projected/145770d6-42bc-4dfd-8af6-3775582eb974-kube-api-access-tcz8b\") pod \"lws-controller-manager-7bd8bcccff-m5r5x\" (UID: \"145770d6-42bc-4dfd-8af6-3775582eb974\") " pod="openshift-lws-operator/lws-controller-manager-7bd8bcccff-m5r5x" Apr 17 20:58:05.075232 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:05.075192 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7bd8bcccff-m5r5x" Apr 17 20:58:05.195558 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:05.195534 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7bd8bcccff-m5r5x"] Apr 17 20:58:05.197951 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:58:05.197921 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod145770d6_42bc_4dfd_8af6_3775582eb974.slice/crio-310947a1cea8cbb348266f03888ed53eed214935aca8b11b3ad604948177cf1f WatchSource:0}: Error finding container 310947a1cea8cbb348266f03888ed53eed214935aca8b11b3ad604948177cf1f: Status 404 returned error can't find the container with id 310947a1cea8cbb348266f03888ed53eed214935aca8b11b3ad604948177cf1f Apr 17 20:58:05.427377 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:05.427290 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7bd8bcccff-m5r5x" event={"ID":"145770d6-42bc-4dfd-8af6-3775582eb974","Type":"ContainerStarted","Data":"310947a1cea8cbb348266f03888ed53eed214935aca8b11b3ad604948177cf1f"} Apr 17 20:58:09.441034 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:09.440977 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7bd8bcccff-m5r5x" event={"ID":"145770d6-42bc-4dfd-8af6-3775582eb974","Type":"ContainerStarted","Data":"3e7c8d9103ccd910cb8bdbcf1cdb2c2f6c40916669bdeede6266e0395951ff0c"} Apr 17 20:58:09.441662 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:09.441617 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-7bd8bcccff-m5r5x" Apr 17 20:58:09.462276 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:09.462158 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-7bd8bcccff-m5r5x" podStartSLOduration=1.466000005 podStartE2EDuration="5.462141246s" podCreationTimestamp="2026-04-17 20:58:04 +0000 UTC" firstStartedPulling="2026-04-17 20:58:05.199849623 +0000 UTC m=+429.667290534" lastFinishedPulling="2026-04-17 20:58:09.195990852 +0000 UTC m=+433.663431775" observedRunningTime="2026-04-17 20:58:09.46072131 +0000 UTC m=+433.928162244" watchObservedRunningTime="2026-04-17 20:58:09.462141246 +0000 UTC m=+433.929582179" Apr 17 20:58:09.752361 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:09.752286 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-kbgxq"] Apr 17 20:58:09.755526 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:09.755507 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-kbgxq" Apr 17 20:58:09.758042 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:09.758018 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 17 20:58:09.759140 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:09.759123 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 17 20:58:09.759255 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:09.759189 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 17 20:58:09.759322 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:09.759254 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-jpcvb\"" Apr 17 20:58:09.759422 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:09.759407 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 17 20:58:09.766826 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:09.766805 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-kbgxq"] Apr 17 20:58:09.839796 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:09.839762 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ec6b4346-066b-4f55-a073-e6797324b990-webhook-cert\") pod \"opendatahub-operator-controller-manager-5f74b9c8f9-kbgxq\" (UID: \"ec6b4346-066b-4f55-a073-e6797324b990\") " pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-kbgxq" Apr 17 20:58:09.839796 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:09.839803 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmmg8\" (UniqueName: \"kubernetes.io/projected/ec6b4346-066b-4f55-a073-e6797324b990-kube-api-access-bmmg8\") pod \"opendatahub-operator-controller-manager-5f74b9c8f9-kbgxq\" (UID: \"ec6b4346-066b-4f55-a073-e6797324b990\") " pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-kbgxq" Apr 17 20:58:09.839989 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:09.839821 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ec6b4346-066b-4f55-a073-e6797324b990-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5f74b9c8f9-kbgxq\" (UID: \"ec6b4346-066b-4f55-a073-e6797324b990\") " pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-kbgxq" Apr 17 20:58:09.940217 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:09.940188 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/ec6b4346-066b-4f55-a073-e6797324b990-webhook-cert\") pod \"opendatahub-operator-controller-manager-5f74b9c8f9-kbgxq\" (UID: \"ec6b4346-066b-4f55-a073-e6797324b990\") " pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-kbgxq" Apr 17 20:58:09.940361 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:09.940252 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bmmg8\" (UniqueName: \"kubernetes.io/projected/ec6b4346-066b-4f55-a073-e6797324b990-kube-api-access-bmmg8\") pod \"opendatahub-operator-controller-manager-5f74b9c8f9-kbgxq\" (UID: \"ec6b4346-066b-4f55-a073-e6797324b990\") " pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-kbgxq" Apr 17 20:58:09.940361 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:09.940287 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ec6b4346-066b-4f55-a073-e6797324b990-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5f74b9c8f9-kbgxq\" (UID: \"ec6b4346-066b-4f55-a073-e6797324b990\") " pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-kbgxq" Apr 17 20:58:09.942671 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:09.942650 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ec6b4346-066b-4f55-a073-e6797324b990-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5f74b9c8f9-kbgxq\" (UID: \"ec6b4346-066b-4f55-a073-e6797324b990\") " pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-kbgxq" Apr 17 20:58:09.942771 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:09.942670 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ec6b4346-066b-4f55-a073-e6797324b990-webhook-cert\") pod \"opendatahub-operator-controller-manager-5f74b9c8f9-kbgxq\" (UID: 
\"ec6b4346-066b-4f55-a073-e6797324b990\") " pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-kbgxq" Apr 17 20:58:09.951715 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:09.951692 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmmg8\" (UniqueName: \"kubernetes.io/projected/ec6b4346-066b-4f55-a073-e6797324b990-kube-api-access-bmmg8\") pod \"opendatahub-operator-controller-manager-5f74b9c8f9-kbgxq\" (UID: \"ec6b4346-066b-4f55-a073-e6797324b990\") " pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-kbgxq" Apr 17 20:58:10.067002 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:10.066968 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-kbgxq" Apr 17 20:58:10.195129 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:10.194128 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-kbgxq"] Apr 17 20:58:10.448037 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:10.447956 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-kbgxq" event={"ID":"ec6b4346-066b-4f55-a073-e6797324b990","Type":"ContainerStarted","Data":"aa30a6dbf7952d7051291c067905d99eb9f5371268447a577e9fc689ef13d734"} Apr 17 20:58:13.461059 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:13.461015 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-kbgxq" event={"ID":"ec6b4346-066b-4f55-a073-e6797324b990","Type":"ContainerStarted","Data":"10fcb11144a242fd6a63e79d6f3de95ca594df3355c124d47ba07c95819f9823"} Apr 17 20:58:13.461542 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:13.461140 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-kbgxq" Apr 
17 20:58:13.484674 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:13.484634 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-kbgxq" podStartSLOduration=1.970337242 podStartE2EDuration="4.484618067s" podCreationTimestamp="2026-04-17 20:58:09 +0000 UTC" firstStartedPulling="2026-04-17 20:58:10.200866607 +0000 UTC m=+434.668307523" lastFinishedPulling="2026-04-17 20:58:12.715147426 +0000 UTC m=+437.182588348" observedRunningTime="2026-04-17 20:58:13.48444163 +0000 UTC m=+437.951882563" watchObservedRunningTime="2026-04-17 20:58:13.484618067 +0000 UTC m=+437.952059001" Apr 17 20:58:21.455281 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:21.455252 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-7bd8bcccff-m5r5x" Apr 17 20:58:24.466814 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:24.466785 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-5f74b9c8f9-kbgxq" Apr 17 20:58:27.265916 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:27.265882 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-666889b9b6-xn2mq"] Apr 17 20:58:27.269402 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:27.269380 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-666889b9b6-xn2mq" Apr 17 20:58:27.272749 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:27.272724 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 17 20:58:27.272882 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:27.272724 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 17 20:58:27.272882 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:27.272727 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-skxwg\"" Apr 17 20:58:27.278768 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:27.278727 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-666889b9b6-xn2mq"] Apr 17 20:58:27.385212 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:27.385176 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/430f1f78-633c-4f42-8a49-ab7ede4ff22e-tls-certs\") pod \"kube-auth-proxy-666889b9b6-xn2mq\" (UID: \"430f1f78-633c-4f42-8a49-ab7ede4ff22e\") " pod="openshift-ingress/kube-auth-proxy-666889b9b6-xn2mq" Apr 17 20:58:27.385412 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:27.385295 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/430f1f78-633c-4f42-8a49-ab7ede4ff22e-tmp\") pod \"kube-auth-proxy-666889b9b6-xn2mq\" (UID: \"430f1f78-633c-4f42-8a49-ab7ede4ff22e\") " pod="openshift-ingress/kube-auth-proxy-666889b9b6-xn2mq" Apr 17 20:58:27.385412 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:27.385323 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvjs6\" (UniqueName: 
\"kubernetes.io/projected/430f1f78-633c-4f42-8a49-ab7ede4ff22e-kube-api-access-gvjs6\") pod \"kube-auth-proxy-666889b9b6-xn2mq\" (UID: \"430f1f78-633c-4f42-8a49-ab7ede4ff22e\") " pod="openshift-ingress/kube-auth-proxy-666889b9b6-xn2mq" Apr 17 20:58:27.485722 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:27.485684 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/430f1f78-633c-4f42-8a49-ab7ede4ff22e-tls-certs\") pod \"kube-auth-proxy-666889b9b6-xn2mq\" (UID: \"430f1f78-633c-4f42-8a49-ab7ede4ff22e\") " pod="openshift-ingress/kube-auth-proxy-666889b9b6-xn2mq" Apr 17 20:58:27.485866 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:27.485749 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/430f1f78-633c-4f42-8a49-ab7ede4ff22e-tmp\") pod \"kube-auth-proxy-666889b9b6-xn2mq\" (UID: \"430f1f78-633c-4f42-8a49-ab7ede4ff22e\") " pod="openshift-ingress/kube-auth-proxy-666889b9b6-xn2mq" Apr 17 20:58:27.485866 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:27.485772 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvjs6\" (UniqueName: \"kubernetes.io/projected/430f1f78-633c-4f42-8a49-ab7ede4ff22e-kube-api-access-gvjs6\") pod \"kube-auth-proxy-666889b9b6-xn2mq\" (UID: \"430f1f78-633c-4f42-8a49-ab7ede4ff22e\") " pod="openshift-ingress/kube-auth-proxy-666889b9b6-xn2mq" Apr 17 20:58:27.488041 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:27.488015 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/430f1f78-633c-4f42-8a49-ab7ede4ff22e-tmp\") pod \"kube-auth-proxy-666889b9b6-xn2mq\" (UID: \"430f1f78-633c-4f42-8a49-ab7ede4ff22e\") " pod="openshift-ingress/kube-auth-proxy-666889b9b6-xn2mq" Apr 17 20:58:27.488134 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:27.488117 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/430f1f78-633c-4f42-8a49-ab7ede4ff22e-tls-certs\") pod \"kube-auth-proxy-666889b9b6-xn2mq\" (UID: \"430f1f78-633c-4f42-8a49-ab7ede4ff22e\") " pod="openshift-ingress/kube-auth-proxy-666889b9b6-xn2mq" Apr 17 20:58:27.494017 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:27.493998 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvjs6\" (UniqueName: \"kubernetes.io/projected/430f1f78-633c-4f42-8a49-ab7ede4ff22e-kube-api-access-gvjs6\") pod \"kube-auth-proxy-666889b9b6-xn2mq\" (UID: \"430f1f78-633c-4f42-8a49-ab7ede4ff22e\") " pod="openshift-ingress/kube-auth-proxy-666889b9b6-xn2mq" Apr 17 20:58:27.580083 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:27.580051 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-666889b9b6-xn2mq" Apr 17 20:58:27.903236 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:27.903089 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-666889b9b6-xn2mq"] Apr 17 20:58:27.905902 ip-10-0-132-12 kubenswrapper[2575]: W0417 20:58:27.905874 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod430f1f78_633c_4f42_8a49_ab7ede4ff22e.slice/crio-b5676cc4f60294e5069b469404a2f251e6ebdb7bad200383bda38346895e318e WatchSource:0}: Error finding container b5676cc4f60294e5069b469404a2f251e6ebdb7bad200383bda38346895e318e: Status 404 returned error can't find the container with id b5676cc4f60294e5069b469404a2f251e6ebdb7bad200383bda38346895e318e Apr 17 20:58:28.508967 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:28.508899 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-666889b9b6-xn2mq" 
event={"ID":"430f1f78-633c-4f42-8a49-ab7ede4ff22e","Type":"ContainerStarted","Data":"b5676cc4f60294e5069b469404a2f251e6ebdb7bad200383bda38346895e318e"} Apr 17 20:58:31.520302 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:31.520261 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-666889b9b6-xn2mq" event={"ID":"430f1f78-633c-4f42-8a49-ab7ede4ff22e","Type":"ContainerStarted","Data":"740dbe42ba306e5e1ca30a64a4344850c54294327553210bfe35a436759418bf"} Apr 17 20:58:31.538255 ip-10-0-132-12 kubenswrapper[2575]: I0417 20:58:31.538195 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-666889b9b6-xn2mq" podStartSLOduration=1.9261840220000002 podStartE2EDuration="4.538180827s" podCreationTimestamp="2026-04-17 20:58:27 +0000 UTC" firstStartedPulling="2026-04-17 20:58:27.907590065 +0000 UTC m=+452.375030976" lastFinishedPulling="2026-04-17 20:58:30.519586864 +0000 UTC m=+454.987027781" observedRunningTime="2026-04-17 20:58:31.535894636 +0000 UTC m=+456.003335570" watchObservedRunningTime="2026-04-17 20:58:31.538180827 +0000 UTC m=+456.005621762" Apr 17 21:00:08.128929 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:08.128863 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qvsrd"] Apr 17 21:00:08.132006 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:08.131984 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qvsrd" Apr 17 21:00:08.135092 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:08.135068 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 21:00:08.135202 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:08.135099 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 17 21:00:08.135520 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:08.135502 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 17 21:00:08.135520 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:08.135514 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-jn4tk\"" Apr 17 21:00:08.135673 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:08.135519 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 21:00:08.140768 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:08.140745 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qvsrd"] Apr 17 21:00:08.311303 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:08.311269 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fcaa16a2-4e8c-469c-9a5d-f005550a09c4-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-qvsrd\" (UID: \"fcaa16a2-4e8c-469c-9a5d-f005550a09c4\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qvsrd" Apr 17 21:00:08.311303 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:08.311316 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-rr4vh\" (UniqueName: \"kubernetes.io/projected/fcaa16a2-4e8c-469c-9a5d-f005550a09c4-kube-api-access-rr4vh\") pod \"kuadrant-console-plugin-6cb54b5c86-qvsrd\" (UID: \"fcaa16a2-4e8c-469c-9a5d-f005550a09c4\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qvsrd" Apr 17 21:00:08.311574 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:08.311389 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fcaa16a2-4e8c-469c-9a5d-f005550a09c4-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-qvsrd\" (UID: \"fcaa16a2-4e8c-469c-9a5d-f005550a09c4\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qvsrd" Apr 17 21:00:08.412861 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:08.412774 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fcaa16a2-4e8c-469c-9a5d-f005550a09c4-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-qvsrd\" (UID: \"fcaa16a2-4e8c-469c-9a5d-f005550a09c4\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qvsrd" Apr 17 21:00:08.413018 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:08.412863 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fcaa16a2-4e8c-469c-9a5d-f005550a09c4-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-qvsrd\" (UID: \"fcaa16a2-4e8c-469c-9a5d-f005550a09c4\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qvsrd" Apr 17 21:00:08.413018 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:08.412882 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rr4vh\" (UniqueName: \"kubernetes.io/projected/fcaa16a2-4e8c-469c-9a5d-f005550a09c4-kube-api-access-rr4vh\") pod \"kuadrant-console-plugin-6cb54b5c86-qvsrd\" (UID: \"fcaa16a2-4e8c-469c-9a5d-f005550a09c4\") " 
pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qvsrd" Apr 17 21:00:08.413137 ip-10-0-132-12 kubenswrapper[2575]: E0417 21:00:08.413015 2575 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 17 21:00:08.413137 ip-10-0-132-12 kubenswrapper[2575]: E0417 21:00:08.413104 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcaa16a2-4e8c-469c-9a5d-f005550a09c4-plugin-serving-cert podName:fcaa16a2-4e8c-469c-9a5d-f005550a09c4 nodeName:}" failed. No retries permitted until 2026-04-17 21:00:08.913081241 +0000 UTC m=+553.380522169 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/fcaa16a2-4e8c-469c-9a5d-f005550a09c4-plugin-serving-cert") pod "kuadrant-console-plugin-6cb54b5c86-qvsrd" (UID: "fcaa16a2-4e8c-469c-9a5d-f005550a09c4") : secret "plugin-serving-cert" not found Apr 17 21:00:08.413408 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:08.413390 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fcaa16a2-4e8c-469c-9a5d-f005550a09c4-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-qvsrd\" (UID: \"fcaa16a2-4e8c-469c-9a5d-f005550a09c4\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qvsrd" Apr 17 21:00:08.428350 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:08.428324 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr4vh\" (UniqueName: \"kubernetes.io/projected/fcaa16a2-4e8c-469c-9a5d-f005550a09c4-kube-api-access-rr4vh\") pod \"kuadrant-console-plugin-6cb54b5c86-qvsrd\" (UID: \"fcaa16a2-4e8c-469c-9a5d-f005550a09c4\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qvsrd" Apr 17 21:00:08.917890 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:08.917858 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fcaa16a2-4e8c-469c-9a5d-f005550a09c4-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-qvsrd\" (UID: \"fcaa16a2-4e8c-469c-9a5d-f005550a09c4\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qvsrd" Apr 17 21:00:08.920332 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:08.920304 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fcaa16a2-4e8c-469c-9a5d-f005550a09c4-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-qvsrd\" (UID: \"fcaa16a2-4e8c-469c-9a5d-f005550a09c4\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qvsrd" Apr 17 21:00:09.055923 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:09.055882 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qvsrd" Apr 17 21:00:09.178498 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:09.178419 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qvsrd"] Apr 17 21:00:09.181181 ip-10-0-132-12 kubenswrapper[2575]: W0417 21:00:09.181153 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcaa16a2_4e8c_469c_9a5d_f005550a09c4.slice/crio-af88179e2236ed360feb9d5ec553f23bad16a5dfff29d3eb095194678db8efcd WatchSource:0}: Error finding container af88179e2236ed360feb9d5ec553f23bad16a5dfff29d3eb095194678db8efcd: Status 404 returned error can't find the container with id af88179e2236ed360feb9d5ec553f23bad16a5dfff29d3eb095194678db8efcd Apr 17 21:00:09.836385 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:09.836355 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qvsrd" 
event={"ID":"fcaa16a2-4e8c-469c-9a5d-f005550a09c4","Type":"ContainerStarted","Data":"af88179e2236ed360feb9d5ec553f23bad16a5dfff29d3eb095194678db8efcd"} Apr 17 21:00:32.922663 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:32.922579 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qvsrd" event={"ID":"fcaa16a2-4e8c-469c-9a5d-f005550a09c4","Type":"ContainerStarted","Data":"26df481f94000b430312c03bbede92c9be60c338e10e44aaf312f11111e9e3ab"} Apr 17 21:00:32.941899 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:32.941849 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qvsrd" podStartSLOduration=1.467027486 podStartE2EDuration="24.941835341s" podCreationTimestamp="2026-04-17 21:00:08 +0000 UTC" firstStartedPulling="2026-04-17 21:00:09.182933051 +0000 UTC m=+553.650373965" lastFinishedPulling="2026-04-17 21:00:32.657740909 +0000 UTC m=+577.125181820" observedRunningTime="2026-04-17 21:00:32.941349908 +0000 UTC m=+577.408790852" watchObservedRunningTime="2026-04-17 21:00:32.941835341 +0000 UTC m=+577.409276274" Apr 17 21:00:51.131564 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:51.131505 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-gstmq"] Apr 17 21:00:51.180165 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:51.180123 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-gstmq"] Apr 17 21:00:51.180368 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:51.180289 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-gstmq" Apr 17 21:00:51.182631 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:51.182608 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 17 21:00:51.225588 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:51.225548 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-gstmq"] Apr 17 21:00:51.296747 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:51.296717 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/5e1ba7a4-14b6-425f-975e-6bcad9e7bd62-config-file\") pod \"limitador-limitador-7d549b5b-gstmq\" (UID: \"5e1ba7a4-14b6-425f-975e-6bcad9e7bd62\") " pod="kuadrant-system/limitador-limitador-7d549b5b-gstmq" Apr 17 21:00:51.296927 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:51.296757 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ndfr\" (UniqueName: \"kubernetes.io/projected/5e1ba7a4-14b6-425f-975e-6bcad9e7bd62-kube-api-access-4ndfr\") pod \"limitador-limitador-7d549b5b-gstmq\" (UID: \"5e1ba7a4-14b6-425f-975e-6bcad9e7bd62\") " pod="kuadrant-system/limitador-limitador-7d549b5b-gstmq" Apr 17 21:00:51.397582 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:51.397491 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/5e1ba7a4-14b6-425f-975e-6bcad9e7bd62-config-file\") pod \"limitador-limitador-7d549b5b-gstmq\" (UID: \"5e1ba7a4-14b6-425f-975e-6bcad9e7bd62\") " pod="kuadrant-system/limitador-limitador-7d549b5b-gstmq" Apr 17 21:00:51.397582 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:51.397548 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4ndfr\" 
(UniqueName: \"kubernetes.io/projected/5e1ba7a4-14b6-425f-975e-6bcad9e7bd62-kube-api-access-4ndfr\") pod \"limitador-limitador-7d549b5b-gstmq\" (UID: \"5e1ba7a4-14b6-425f-975e-6bcad9e7bd62\") " pod="kuadrant-system/limitador-limitador-7d549b5b-gstmq" Apr 17 21:00:51.398144 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:51.398123 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/5e1ba7a4-14b6-425f-975e-6bcad9e7bd62-config-file\") pod \"limitador-limitador-7d549b5b-gstmq\" (UID: \"5e1ba7a4-14b6-425f-975e-6bcad9e7bd62\") " pod="kuadrant-system/limitador-limitador-7d549b5b-gstmq" Apr 17 21:00:51.405690 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:51.405657 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ndfr\" (UniqueName: \"kubernetes.io/projected/5e1ba7a4-14b6-425f-975e-6bcad9e7bd62-kube-api-access-4ndfr\") pod \"limitador-limitador-7d549b5b-gstmq\" (UID: \"5e1ba7a4-14b6-425f-975e-6bcad9e7bd62\") " pod="kuadrant-system/limitador-limitador-7d549b5b-gstmq" Apr 17 21:00:51.489812 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:51.489777 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-gstmq" Apr 17 21:00:51.609399 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:51.609366 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-gstmq"] Apr 17 21:00:51.612992 ip-10-0-132-12 kubenswrapper[2575]: W0417 21:00:51.612963 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e1ba7a4_14b6_425f_975e_6bcad9e7bd62.slice/crio-7ba28d8dd135ccff722fdb949ed44775df85faabb1c646d3eab6408407d75c5d WatchSource:0}: Error finding container 7ba28d8dd135ccff722fdb949ed44775df85faabb1c646d3eab6408407d75c5d: Status 404 returned error can't find the container with id 7ba28d8dd135ccff722fdb949ed44775df85faabb1c646d3eab6408407d75c5d Apr 17 21:00:51.986038 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:51.986000 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-gstmq" event={"ID":"5e1ba7a4-14b6-425f-975e-6bcad9e7bd62","Type":"ContainerStarted","Data":"7ba28d8dd135ccff722fdb949ed44775df85faabb1c646d3eab6408407d75c5d"} Apr 17 21:00:52.141293 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:52.141260 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-w62h9"] Apr 17 21:00:52.145812 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:52.145796 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-w62h9" Apr 17 21:00:52.148337 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:52.148312 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-nx2nz\"" Apr 17 21:00:52.149329 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:52.149306 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-w62h9"] Apr 17 21:00:52.204381 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:52.204352 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5qtz\" (UniqueName: \"kubernetes.io/projected/173b49a2-b211-435a-b7cf-a26da7b4842b-kube-api-access-k5qtz\") pod \"authorino-7498df8756-w62h9\" (UID: \"173b49a2-b211-435a-b7cf-a26da7b4842b\") " pod="kuadrant-system/authorino-7498df8756-w62h9" Apr 17 21:00:52.305122 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:52.305092 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5qtz\" (UniqueName: \"kubernetes.io/projected/173b49a2-b211-435a-b7cf-a26da7b4842b-kube-api-access-k5qtz\") pod \"authorino-7498df8756-w62h9\" (UID: \"173b49a2-b211-435a-b7cf-a26da7b4842b\") " pod="kuadrant-system/authorino-7498df8756-w62h9" Apr 17 21:00:52.313309 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:52.313260 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5qtz\" (UniqueName: \"kubernetes.io/projected/173b49a2-b211-435a-b7cf-a26da7b4842b-kube-api-access-k5qtz\") pod \"authorino-7498df8756-w62h9\" (UID: \"173b49a2-b211-435a-b7cf-a26da7b4842b\") " pod="kuadrant-system/authorino-7498df8756-w62h9" Apr 17 21:00:52.457602 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:52.457559 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-w62h9" Apr 17 21:00:52.613762 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:52.613732 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-w62h9"] Apr 17 21:00:52.617461 ip-10-0-132-12 kubenswrapper[2575]: W0417 21:00:52.617410 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod173b49a2_b211_435a_b7cf_a26da7b4842b.slice/crio-24f59bcdf6e619b36192a479b922bb35efbc35f7cc11dc8221cdf086d8802f40 WatchSource:0}: Error finding container 24f59bcdf6e619b36192a479b922bb35efbc35f7cc11dc8221cdf086d8802f40: Status 404 returned error can't find the container with id 24f59bcdf6e619b36192a479b922bb35efbc35f7cc11dc8221cdf086d8802f40 Apr 17 21:00:52.990980 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:52.990900 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-w62h9" event={"ID":"173b49a2-b211-435a-b7cf-a26da7b4842b","Type":"ContainerStarted","Data":"24f59bcdf6e619b36192a479b922bb35efbc35f7cc11dc8221cdf086d8802f40"} Apr 17 21:00:56.328927 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:56.328889 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v47zc_d30ac7a2-edef-43e4-a645-fbf9445df632/console-operator/2.log" Apr 17 21:00:56.329301 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:56.328974 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v47zc_d30ac7a2-edef-43e4-a645-fbf9445df632/console-operator/2.log" Apr 17 21:00:56.334929 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:56.334908 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8fq5_e7fec351-05e2-48e2-8266-8f2093ebb3fe/ovn-acl-logging/0.log" Apr 17 21:00:56.335016 ip-10-0-132-12 kubenswrapper[2575]: 
I0417 21:00:56.334915 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8fq5_e7fec351-05e2-48e2-8266-8f2093ebb3fe/ovn-acl-logging/0.log" Apr 17 21:00:57.007644 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:57.007609 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-w62h9" event={"ID":"173b49a2-b211-435a-b7cf-a26da7b4842b","Type":"ContainerStarted","Data":"c7f38c381db78596ed10e92e36b6df62ca868ab99f3ba87b180e410df2de273f"} Apr 17 21:00:57.008831 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:57.008807 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-gstmq" event={"ID":"5e1ba7a4-14b6-425f-975e-6bcad9e7bd62","Type":"ContainerStarted","Data":"7a4f5276217f30240b715217ebc2137569166896917415fd9bd8882ed092a773"} Apr 17 21:00:57.008963 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:57.008954 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-gstmq" Apr 17 21:00:57.022262 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:57.022203 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-w62h9" podStartSLOduration=1.313442941 podStartE2EDuration="5.022188693s" podCreationTimestamp="2026-04-17 21:00:52 +0000 UTC" firstStartedPulling="2026-04-17 21:00:52.619703547 +0000 UTC m=+597.087144464" lastFinishedPulling="2026-04-17 21:00:56.328449302 +0000 UTC m=+600.795890216" observedRunningTime="2026-04-17 21:00:57.020868443 +0000 UTC m=+601.488309377" watchObservedRunningTime="2026-04-17 21:00:57.022188693 +0000 UTC m=+601.489629627" Apr 17 21:00:57.034911 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:00:57.034827 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-gstmq" podStartSLOduration=1.3286778639999999 
podStartE2EDuration="6.034811156s" podCreationTimestamp="2026-04-17 21:00:51 +0000 UTC" firstStartedPulling="2026-04-17 21:00:51.614642024 +0000 UTC m=+596.082082938" lastFinishedPulling="2026-04-17 21:00:56.320775305 +0000 UTC m=+600.788216230" observedRunningTime="2026-04-17 21:00:57.034682804 +0000 UTC m=+601.502123739" watchObservedRunningTime="2026-04-17 21:00:57.034811156 +0000 UTC m=+601.502252091" Apr 17 21:01:06.722939 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:06.722862 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-gstmq"] Apr 17 21:01:06.723361 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:06.723132 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-gstmq" podUID="5e1ba7a4-14b6-425f-975e-6bcad9e7bd62" containerName="limitador" containerID="cri-o://7a4f5276217f30240b715217ebc2137569166896917415fd9bd8882ed092a773" gracePeriod=30 Apr 17 21:01:06.723762 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:06.723726 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-gstmq" Apr 17 21:01:07.260267 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:07.260244 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-gstmq" Apr 17 21:01:07.341482 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:07.341407 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ndfr\" (UniqueName: \"kubernetes.io/projected/5e1ba7a4-14b6-425f-975e-6bcad9e7bd62-kube-api-access-4ndfr\") pod \"5e1ba7a4-14b6-425f-975e-6bcad9e7bd62\" (UID: \"5e1ba7a4-14b6-425f-975e-6bcad9e7bd62\") " Apr 17 21:01:07.341482 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:07.341453 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/5e1ba7a4-14b6-425f-975e-6bcad9e7bd62-config-file\") pod \"5e1ba7a4-14b6-425f-975e-6bcad9e7bd62\" (UID: \"5e1ba7a4-14b6-425f-975e-6bcad9e7bd62\") " Apr 17 21:01:07.341839 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:07.341814 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e1ba7a4-14b6-425f-975e-6bcad9e7bd62-config-file" (OuterVolumeSpecName: "config-file") pod "5e1ba7a4-14b6-425f-975e-6bcad9e7bd62" (UID: "5e1ba7a4-14b6-425f-975e-6bcad9e7bd62"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 21:01:07.343532 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:07.343510 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e1ba7a4-14b6-425f-975e-6bcad9e7bd62-kube-api-access-4ndfr" (OuterVolumeSpecName: "kube-api-access-4ndfr") pod "5e1ba7a4-14b6-425f-975e-6bcad9e7bd62" (UID: "5e1ba7a4-14b6-425f-975e-6bcad9e7bd62"). InnerVolumeSpecName "kube-api-access-4ndfr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:01:07.442732 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:07.442702 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4ndfr\" (UniqueName: \"kubernetes.io/projected/5e1ba7a4-14b6-425f-975e-6bcad9e7bd62-kube-api-access-4ndfr\") on node \"ip-10-0-132-12.ec2.internal\" DevicePath \"\"" Apr 17 21:01:07.442732 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:07.442729 2575 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/5e1ba7a4-14b6-425f-975e-6bcad9e7bd62-config-file\") on node \"ip-10-0-132-12.ec2.internal\" DevicePath \"\"" Apr 17 21:01:08.048773 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:08.048740 2575 generic.go:358] "Generic (PLEG): container finished" podID="5e1ba7a4-14b6-425f-975e-6bcad9e7bd62" containerID="7a4f5276217f30240b715217ebc2137569166896917415fd9bd8882ed092a773" exitCode=0 Apr 17 21:01:08.049190 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:08.048801 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-gstmq" Apr 17 21:01:08.049190 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:08.048826 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-gstmq" event={"ID":"5e1ba7a4-14b6-425f-975e-6bcad9e7bd62","Type":"ContainerDied","Data":"7a4f5276217f30240b715217ebc2137569166896917415fd9bd8882ed092a773"} Apr 17 21:01:08.049190 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:08.048864 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-gstmq" event={"ID":"5e1ba7a4-14b6-425f-975e-6bcad9e7bd62","Type":"ContainerDied","Data":"7ba28d8dd135ccff722fdb949ed44775df85faabb1c646d3eab6408407d75c5d"} Apr 17 21:01:08.049190 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:08.048884 2575 scope.go:117] "RemoveContainer" containerID="7a4f5276217f30240b715217ebc2137569166896917415fd9bd8882ed092a773" Apr 17 21:01:08.057269 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:08.057250 2575 scope.go:117] "RemoveContainer" containerID="7a4f5276217f30240b715217ebc2137569166896917415fd9bd8882ed092a773" Apr 17 21:01:08.057506 ip-10-0-132-12 kubenswrapper[2575]: E0417 21:01:08.057486 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a4f5276217f30240b715217ebc2137569166896917415fd9bd8882ed092a773\": container with ID starting with 7a4f5276217f30240b715217ebc2137569166896917415fd9bd8882ed092a773 not found: ID does not exist" containerID="7a4f5276217f30240b715217ebc2137569166896917415fd9bd8882ed092a773" Apr 17 21:01:08.057553 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:08.057516 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a4f5276217f30240b715217ebc2137569166896917415fd9bd8882ed092a773"} err="failed to get container status \"7a4f5276217f30240b715217ebc2137569166896917415fd9bd8882ed092a773\": 
rpc error: code = NotFound desc = could not find container \"7a4f5276217f30240b715217ebc2137569166896917415fd9bd8882ed092a773\": container with ID starting with 7a4f5276217f30240b715217ebc2137569166896917415fd9bd8882ed092a773 not found: ID does not exist" Apr 17 21:01:08.068671 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:08.068648 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-gstmq"] Apr 17 21:01:08.072065 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:08.072042 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-gstmq"] Apr 17 21:01:08.134041 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:08.134007 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e1ba7a4-14b6-425f-975e-6bcad9e7bd62" path="/var/lib/kubelet/pods/5e1ba7a4-14b6-425f-975e-6bcad9e7bd62/volumes" Apr 17 21:01:12.205597 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:12.205565 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-qst6l"] Apr 17 21:01:12.208385 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:12.206416 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e1ba7a4-14b6-425f-975e-6bcad9e7bd62" containerName="limitador" Apr 17 21:01:12.208385 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:12.206450 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e1ba7a4-14b6-425f-975e-6bcad9e7bd62" containerName="limitador" Apr 17 21:01:12.208385 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:12.206742 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="5e1ba7a4-14b6-425f-975e-6bcad9e7bd62" containerName="limitador" Apr 17 21:01:12.211631 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:12.211385 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-qst6l" Apr 17 21:01:12.216102 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:12.216073 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 17 21:01:12.216541 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:12.216327 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-6gctr\"" Apr 17 21:01:12.217852 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:12.217828 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-qst6l"] Apr 17 21:01:12.280694 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:12.280667 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwsrk\" (UniqueName: \"kubernetes.io/projected/fdc13cd2-31a3-4958-91a4-eba892565141-kube-api-access-bwsrk\") pod \"postgres-868db5846d-qst6l\" (UID: \"fdc13cd2-31a3-4958-91a4-eba892565141\") " pod="opendatahub/postgres-868db5846d-qst6l" Apr 17 21:01:12.280848 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:12.280710 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/fdc13cd2-31a3-4958-91a4-eba892565141-data\") pod \"postgres-868db5846d-qst6l\" (UID: \"fdc13cd2-31a3-4958-91a4-eba892565141\") " pod="opendatahub/postgres-868db5846d-qst6l" Apr 17 21:01:12.381867 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:12.381838 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bwsrk\" (UniqueName: \"kubernetes.io/projected/fdc13cd2-31a3-4958-91a4-eba892565141-kube-api-access-bwsrk\") pod \"postgres-868db5846d-qst6l\" (UID: \"fdc13cd2-31a3-4958-91a4-eba892565141\") " pod="opendatahub/postgres-868db5846d-qst6l" Apr 17 21:01:12.382027 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:12.381880 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/fdc13cd2-31a3-4958-91a4-eba892565141-data\") pod \"postgres-868db5846d-qst6l\" (UID: \"fdc13cd2-31a3-4958-91a4-eba892565141\") " pod="opendatahub/postgres-868db5846d-qst6l" Apr 17 21:01:12.382194 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:12.382176 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/fdc13cd2-31a3-4958-91a4-eba892565141-data\") pod \"postgres-868db5846d-qst6l\" (UID: \"fdc13cd2-31a3-4958-91a4-eba892565141\") " pod="opendatahub/postgres-868db5846d-qst6l" Apr 17 21:01:12.389714 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:12.389688 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwsrk\" (UniqueName: \"kubernetes.io/projected/fdc13cd2-31a3-4958-91a4-eba892565141-kube-api-access-bwsrk\") pod \"postgres-868db5846d-qst6l\" (UID: \"fdc13cd2-31a3-4958-91a4-eba892565141\") " pod="opendatahub/postgres-868db5846d-qst6l" Apr 17 21:01:12.525569 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:12.525542 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-qst6l" Apr 17 21:01:12.645957 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:12.645928 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-qst6l"] Apr 17 21:01:12.648377 ip-10-0-132-12 kubenswrapper[2575]: W0417 21:01:12.648348 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdc13cd2_31a3_4958_91a4_eba892565141.slice/crio-c4b5e23d7b73d6361c50496f6da948f6172f01512a574e5663d0cb0e147cf1e8 WatchSource:0}: Error finding container c4b5e23d7b73d6361c50496f6da948f6172f01512a574e5663d0cb0e147cf1e8: Status 404 returned error can't find the container with id c4b5e23d7b73d6361c50496f6da948f6172f01512a574e5663d0cb0e147cf1e8 Apr 17 21:01:13.068655 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:13.068614 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-qst6l" event={"ID":"fdc13cd2-31a3-4958-91a4-eba892565141","Type":"ContainerStarted","Data":"c4b5e23d7b73d6361c50496f6da948f6172f01512a574e5663d0cb0e147cf1e8"} Apr 17 21:01:20.099092 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:20.099057 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-qst6l" event={"ID":"fdc13cd2-31a3-4958-91a4-eba892565141","Type":"ContainerStarted","Data":"1fb8b46ce6717595fe233879763d5210c34f14c538e236366e7707c94382336f"} Apr 17 21:01:20.099572 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:20.099120 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-qst6l" Apr 17 21:01:20.115596 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:20.115547 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-qst6l" podStartSLOduration=1.259641115 podStartE2EDuration="8.115532802s" podCreationTimestamp="2026-04-17 21:01:12 +0000 UTC" 
firstStartedPulling="2026-04-17 21:01:12.649716194 +0000 UTC m=+617.117157105" lastFinishedPulling="2026-04-17 21:01:19.505607864 +0000 UTC m=+623.973048792" observedRunningTime="2026-04-17 21:01:20.113215545 +0000 UTC m=+624.580656478" watchObservedRunningTime="2026-04-17 21:01:20.115532802 +0000 UTC m=+624.582973734" Apr 17 21:01:26.133802 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:26.133777 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-qst6l" Apr 17 21:01:27.034710 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:27.034665 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-768854978d-kj4f9"] Apr 17 21:01:27.042145 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:27.042124 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-768854978d-kj4f9" Apr 17 21:01:27.044710 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:27.044686 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 17 21:01:27.046200 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:27.046165 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-768854978d-kj4f9"] Apr 17 21:01:27.114202 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:27.114177 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhsb2\" (UniqueName: \"kubernetes.io/projected/3b0fcb37-2ac7-4402-97ef-147e39cf5f37-kube-api-access-hhsb2\") pod \"authorino-768854978d-kj4f9\" (UID: \"3b0fcb37-2ac7-4402-97ef-147e39cf5f37\") " pod="kuadrant-system/authorino-768854978d-kj4f9" Apr 17 21:01:27.114368 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:27.114251 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: 
\"kubernetes.io/secret/3b0fcb37-2ac7-4402-97ef-147e39cf5f37-tls-cert\") pod \"authorino-768854978d-kj4f9\" (UID: \"3b0fcb37-2ac7-4402-97ef-147e39cf5f37\") " pod="kuadrant-system/authorino-768854978d-kj4f9" Apr 17 21:01:27.215133 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:27.215095 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hhsb2\" (UniqueName: \"kubernetes.io/projected/3b0fcb37-2ac7-4402-97ef-147e39cf5f37-kube-api-access-hhsb2\") pod \"authorino-768854978d-kj4f9\" (UID: \"3b0fcb37-2ac7-4402-97ef-147e39cf5f37\") " pod="kuadrant-system/authorino-768854978d-kj4f9" Apr 17 21:01:27.215609 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:27.215157 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/3b0fcb37-2ac7-4402-97ef-147e39cf5f37-tls-cert\") pod \"authorino-768854978d-kj4f9\" (UID: \"3b0fcb37-2ac7-4402-97ef-147e39cf5f37\") " pod="kuadrant-system/authorino-768854978d-kj4f9" Apr 17 21:01:27.217560 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:27.217537 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/3b0fcb37-2ac7-4402-97ef-147e39cf5f37-tls-cert\") pod \"authorino-768854978d-kj4f9\" (UID: \"3b0fcb37-2ac7-4402-97ef-147e39cf5f37\") " pod="kuadrant-system/authorino-768854978d-kj4f9" Apr 17 21:01:27.222568 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:27.222544 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhsb2\" (UniqueName: \"kubernetes.io/projected/3b0fcb37-2ac7-4402-97ef-147e39cf5f37-kube-api-access-hhsb2\") pod \"authorino-768854978d-kj4f9\" (UID: \"3b0fcb37-2ac7-4402-97ef-147e39cf5f37\") " pod="kuadrant-system/authorino-768854978d-kj4f9" Apr 17 21:01:27.352273 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:27.352184 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-768854978d-kj4f9" Apr 17 21:01:27.466466 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:27.466435 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-768854978d-kj4f9"] Apr 17 21:01:27.468369 ip-10-0-132-12 kubenswrapper[2575]: W0417 21:01:27.468341 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b0fcb37_2ac7_4402_97ef_147e39cf5f37.slice/crio-42c6c9814f61e2a3da794b36176eadec2adaea4dd747e1a3751e482aee1c3e59 WatchSource:0}: Error finding container 42c6c9814f61e2a3da794b36176eadec2adaea4dd747e1a3751e482aee1c3e59: Status 404 returned error can't find the container with id 42c6c9814f61e2a3da794b36176eadec2adaea4dd747e1a3751e482aee1c3e59 Apr 17 21:01:28.126292 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:28.126258 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-768854978d-kj4f9" event={"ID":"3b0fcb37-2ac7-4402-97ef-147e39cf5f37","Type":"ContainerStarted","Data":"d328b1b0905925ebdc923b8087b22cdb1a8caab101a1bfebece5d70dfdbb0c93"} Apr 17 21:01:28.126292 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:28.126295 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-768854978d-kj4f9" event={"ID":"3b0fcb37-2ac7-4402-97ef-147e39cf5f37","Type":"ContainerStarted","Data":"42c6c9814f61e2a3da794b36176eadec2adaea4dd747e1a3751e482aee1c3e59"} Apr 17 21:01:28.139901 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:28.139854 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-768854978d-kj4f9" podStartSLOduration=0.712299743 podStartE2EDuration="1.139838363s" podCreationTimestamp="2026-04-17 21:01:27 +0000 UTC" firstStartedPulling="2026-04-17 21:01:27.469443964 +0000 UTC m=+631.936884878" lastFinishedPulling="2026-04-17 21:01:27.896982586 +0000 UTC m=+632.364423498" 
observedRunningTime="2026-04-17 21:01:28.138960971 +0000 UTC m=+632.606401915" watchObservedRunningTime="2026-04-17 21:01:28.139838363 +0000 UTC m=+632.607279296" Apr 17 21:01:28.162742 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:28.162669 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-w62h9"] Apr 17 21:01:28.162924 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:28.162900 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-w62h9" podUID="173b49a2-b211-435a-b7cf-a26da7b4842b" containerName="authorino" containerID="cri-o://c7f38c381db78596ed10e92e36b6df62ca868ab99f3ba87b180e410df2de273f" gracePeriod=30 Apr 17 21:01:28.398430 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:28.398408 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-w62h9" Apr 17 21:01:28.526460 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:28.526439 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5qtz\" (UniqueName: \"kubernetes.io/projected/173b49a2-b211-435a-b7cf-a26da7b4842b-kube-api-access-k5qtz\") pod \"173b49a2-b211-435a-b7cf-a26da7b4842b\" (UID: \"173b49a2-b211-435a-b7cf-a26da7b4842b\") " Apr 17 21:01:28.528478 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:28.528443 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/173b49a2-b211-435a-b7cf-a26da7b4842b-kube-api-access-k5qtz" (OuterVolumeSpecName: "kube-api-access-k5qtz") pod "173b49a2-b211-435a-b7cf-a26da7b4842b" (UID: "173b49a2-b211-435a-b7cf-a26da7b4842b"). InnerVolumeSpecName "kube-api-access-k5qtz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:01:28.627408 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:28.627372 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k5qtz\" (UniqueName: \"kubernetes.io/projected/173b49a2-b211-435a-b7cf-a26da7b4842b-kube-api-access-k5qtz\") on node \"ip-10-0-132-12.ec2.internal\" DevicePath \"\"" Apr 17 21:01:29.131051 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:29.131022 2575 generic.go:358] "Generic (PLEG): container finished" podID="173b49a2-b211-435a-b7cf-a26da7b4842b" containerID="c7f38c381db78596ed10e92e36b6df62ca868ab99f3ba87b180e410df2de273f" exitCode=0 Apr 17 21:01:29.131255 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:29.131065 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-w62h9" Apr 17 21:01:29.131255 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:29.131101 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-w62h9" event={"ID":"173b49a2-b211-435a-b7cf-a26da7b4842b","Type":"ContainerDied","Data":"c7f38c381db78596ed10e92e36b6df62ca868ab99f3ba87b180e410df2de273f"} Apr 17 21:01:29.131255 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:29.131138 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-w62h9" event={"ID":"173b49a2-b211-435a-b7cf-a26da7b4842b","Type":"ContainerDied","Data":"24f59bcdf6e619b36192a479b922bb35efbc35f7cc11dc8221cdf086d8802f40"} Apr 17 21:01:29.131255 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:29.131159 2575 scope.go:117] "RemoveContainer" containerID="c7f38c381db78596ed10e92e36b6df62ca868ab99f3ba87b180e410df2de273f" Apr 17 21:01:29.139022 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:29.139004 2575 scope.go:117] "RemoveContainer" containerID="c7f38c381db78596ed10e92e36b6df62ca868ab99f3ba87b180e410df2de273f" Apr 17 21:01:29.139265 ip-10-0-132-12 kubenswrapper[2575]: E0417 
21:01:29.139246 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7f38c381db78596ed10e92e36b6df62ca868ab99f3ba87b180e410df2de273f\": container with ID starting with c7f38c381db78596ed10e92e36b6df62ca868ab99f3ba87b180e410df2de273f not found: ID does not exist" containerID="c7f38c381db78596ed10e92e36b6df62ca868ab99f3ba87b180e410df2de273f" Apr 17 21:01:29.139336 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:29.139279 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7f38c381db78596ed10e92e36b6df62ca868ab99f3ba87b180e410df2de273f"} err="failed to get container status \"c7f38c381db78596ed10e92e36b6df62ca868ab99f3ba87b180e410df2de273f\": rpc error: code = NotFound desc = could not find container \"c7f38c381db78596ed10e92e36b6df62ca868ab99f3ba87b180e410df2de273f\": container with ID starting with c7f38c381db78596ed10e92e36b6df62ca868ab99f3ba87b180e410df2de273f not found: ID does not exist" Apr 17 21:01:29.155200 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:29.155176 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-w62h9"] Apr 17 21:01:29.159063 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:29.159043 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-w62h9"] Apr 17 21:01:30.133679 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:01:30.133647 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="173b49a2-b211-435a-b7cf-a26da7b4842b" path="/var/lib/kubelet/pods/173b49a2-b211-435a-b7cf-a26da7b4842b/volumes" Apr 17 21:02:14.245755 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:14.245720 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-776d654dc-4x847"] Apr 17 21:02:14.246281 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:14.246238 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="173b49a2-b211-435a-b7cf-a26da7b4842b" containerName="authorino" Apr 17 21:02:14.246281 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:14.246260 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="173b49a2-b211-435a-b7cf-a26da7b4842b" containerName="authorino" Apr 17 21:02:14.246399 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:14.246362 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="173b49a2-b211-435a-b7cf-a26da7b4842b" containerName="authorino" Apr 17 21:02:14.249339 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:14.249320 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-776d654dc-4x847" Apr 17 21:02:14.251927 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:14.251905 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-xjrkf\"" Apr 17 21:02:14.252031 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:14.251977 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 17 21:02:14.252955 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:14.252942 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 17 21:02:14.263474 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:14.259610 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-776d654dc-4x847"] Apr 17 21:02:14.311782 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:14.311755 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6ddt\" (UniqueName: \"kubernetes.io/projected/1c56f8bb-3f1e-4e6e-b250-7e27e18b6e2f-kube-api-access-b6ddt\") pod \"maas-api-776d654dc-4x847\" (UID: \"1c56f8bb-3f1e-4e6e-b250-7e27e18b6e2f\") " pod="opendatahub/maas-api-776d654dc-4x847" Apr 17 21:02:14.311908 ip-10-0-132-12 kubenswrapper[2575]: I0417 
21:02:14.311788 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/1c56f8bb-3f1e-4e6e-b250-7e27e18b6e2f-maas-api-tls\") pod \"maas-api-776d654dc-4x847\" (UID: \"1c56f8bb-3f1e-4e6e-b250-7e27e18b6e2f\") " pod="opendatahub/maas-api-776d654dc-4x847" Apr 17 21:02:14.412882 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:14.412856 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b6ddt\" (UniqueName: \"kubernetes.io/projected/1c56f8bb-3f1e-4e6e-b250-7e27e18b6e2f-kube-api-access-b6ddt\") pod \"maas-api-776d654dc-4x847\" (UID: \"1c56f8bb-3f1e-4e6e-b250-7e27e18b6e2f\") " pod="opendatahub/maas-api-776d654dc-4x847" Apr 17 21:02:14.413024 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:14.412889 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/1c56f8bb-3f1e-4e6e-b250-7e27e18b6e2f-maas-api-tls\") pod \"maas-api-776d654dc-4x847\" (UID: \"1c56f8bb-3f1e-4e6e-b250-7e27e18b6e2f\") " pod="opendatahub/maas-api-776d654dc-4x847" Apr 17 21:02:14.415152 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:14.415135 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/1c56f8bb-3f1e-4e6e-b250-7e27e18b6e2f-maas-api-tls\") pod \"maas-api-776d654dc-4x847\" (UID: \"1c56f8bb-3f1e-4e6e-b250-7e27e18b6e2f\") " pod="opendatahub/maas-api-776d654dc-4x847" Apr 17 21:02:14.421398 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:14.421370 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6ddt\" (UniqueName: \"kubernetes.io/projected/1c56f8bb-3f1e-4e6e-b250-7e27e18b6e2f-kube-api-access-b6ddt\") pod \"maas-api-776d654dc-4x847\" (UID: \"1c56f8bb-3f1e-4e6e-b250-7e27e18b6e2f\") " pod="opendatahub/maas-api-776d654dc-4x847" Apr 17 21:02:14.565069 ip-10-0-132-12 
kubenswrapper[2575]: I0417 21:02:14.565027 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-776d654dc-4x847" Apr 17 21:02:14.684599 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:14.684576 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-776d654dc-4x847"] Apr 17 21:02:14.686885 ip-10-0-132-12 kubenswrapper[2575]: W0417 21:02:14.686853 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c56f8bb_3f1e_4e6e_b250_7e27e18b6e2f.slice/crio-7dbfcb7ad18848586c83bba1aa3f4b7b4dc7c9419707f721fe890bcd165d835f WatchSource:0}: Error finding container 7dbfcb7ad18848586c83bba1aa3f4b7b4dc7c9419707f721fe890bcd165d835f: Status 404 returned error can't find the container with id 7dbfcb7ad18848586c83bba1aa3f4b7b4dc7c9419707f721fe890bcd165d835f Apr 17 21:02:14.688031 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:14.688016 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 21:02:15.282365 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:15.282326 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-776d654dc-4x847" event={"ID":"1c56f8bb-3f1e-4e6e-b250-7e27e18b6e2f","Type":"ContainerStarted","Data":"7dbfcb7ad18848586c83bba1aa3f4b7b4dc7c9419707f721fe890bcd165d835f"} Apr 17 21:02:18.295851 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:18.295815 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-776d654dc-4x847" event={"ID":"1c56f8bb-3f1e-4e6e-b250-7e27e18b6e2f","Type":"ContainerStarted","Data":"74a0b60d0bf66243900b4beb384f635acfa80faaba40cfa84839a1b7d389b5ae"} Apr 17 21:02:18.296259 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:18.295929 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-776d654dc-4x847" Apr 17 21:02:18.311884 ip-10-0-132-12 
kubenswrapper[2575]: I0417 21:02:18.311823 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-776d654dc-4x847" podStartSLOduration=1.718549407 podStartE2EDuration="4.311809871s" podCreationTimestamp="2026-04-17 21:02:14 +0000 UTC" firstStartedPulling="2026-04-17 21:02:14.6881727 +0000 UTC m=+679.155613617" lastFinishedPulling="2026-04-17 21:02:17.281433166 +0000 UTC m=+681.748874081" observedRunningTime="2026-04-17 21:02:18.310555769 +0000 UTC m=+682.777996702" watchObservedRunningTime="2026-04-17 21:02:18.311809871 +0000 UTC m=+682.779250804" Apr 17 21:02:24.305520 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:24.305491 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-776d654dc-4x847" Apr 17 21:02:37.672309 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:37.672272 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z"] Apr 17 21:02:37.674678 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:37.674663 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z" Apr 17 21:02:37.677234 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:37.677198 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 17 21:02:37.678554 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:37.678524 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-lj54k\"" Apr 17 21:02:37.678656 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:37.678564 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 17 21:02:37.678656 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:37.678574 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 17 21:02:37.683311 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:37.683291 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z"] Apr 17 21:02:37.715494 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:37.715465 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d010eb83-ce9d-4164-9329-fa7b1f55a7a3-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z\" (UID: \"d010eb83-ce9d-4164-9329-fa7b1f55a7a3\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z" Apr 17 21:02:37.715631 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:37.715496 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8dzc\" (UniqueName: \"kubernetes.io/projected/d010eb83-ce9d-4164-9329-fa7b1f55a7a3-kube-api-access-g8dzc\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z\" (UID: \"d010eb83-ce9d-4164-9329-fa7b1f55a7a3\") " 
pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z" Apr 17 21:02:37.715631 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:37.715517 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d010eb83-ce9d-4164-9329-fa7b1f55a7a3-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z\" (UID: \"d010eb83-ce9d-4164-9329-fa7b1f55a7a3\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z" Apr 17 21:02:37.715631 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:37.715543 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d010eb83-ce9d-4164-9329-fa7b1f55a7a3-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z\" (UID: \"d010eb83-ce9d-4164-9329-fa7b1f55a7a3\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z" Apr 17 21:02:37.715631 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:37.715577 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d010eb83-ce9d-4164-9329-fa7b1f55a7a3-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z\" (UID: \"d010eb83-ce9d-4164-9329-fa7b1f55a7a3\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z" Apr 17 21:02:37.715631 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:37.715600 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d010eb83-ce9d-4164-9329-fa7b1f55a7a3-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z\" (UID: \"d010eb83-ce9d-4164-9329-fa7b1f55a7a3\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z" Apr 17 21:02:37.816589 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:37.816554 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d010eb83-ce9d-4164-9329-fa7b1f55a7a3-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z\" (UID: \"d010eb83-ce9d-4164-9329-fa7b1f55a7a3\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z" Apr 17 21:02:37.816589 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:37.816591 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g8dzc\" (UniqueName: \"kubernetes.io/projected/d010eb83-ce9d-4164-9329-fa7b1f55a7a3-kube-api-access-g8dzc\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z\" (UID: \"d010eb83-ce9d-4164-9329-fa7b1f55a7a3\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z" Apr 17 21:02:37.816795 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:37.816614 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d010eb83-ce9d-4164-9329-fa7b1f55a7a3-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z\" (UID: \"d010eb83-ce9d-4164-9329-fa7b1f55a7a3\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z" Apr 17 21:02:37.816795 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:37.816641 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d010eb83-ce9d-4164-9329-fa7b1f55a7a3-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z\" (UID: \"d010eb83-ce9d-4164-9329-fa7b1f55a7a3\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z" Apr 17 21:02:37.816795 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:37.816672 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d010eb83-ce9d-4164-9329-fa7b1f55a7a3-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z\" (UID: \"d010eb83-ce9d-4164-9329-fa7b1f55a7a3\") " 
pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z" Apr 17 21:02:37.816795 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:37.816698 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d010eb83-ce9d-4164-9329-fa7b1f55a7a3-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z\" (UID: \"d010eb83-ce9d-4164-9329-fa7b1f55a7a3\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z" Apr 17 21:02:37.817085 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:37.817058 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d010eb83-ce9d-4164-9329-fa7b1f55a7a3-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z\" (UID: \"d010eb83-ce9d-4164-9329-fa7b1f55a7a3\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z" Apr 17 21:02:37.817085 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:37.817075 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d010eb83-ce9d-4164-9329-fa7b1f55a7a3-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z\" (UID: \"d010eb83-ce9d-4164-9329-fa7b1f55a7a3\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z" Apr 17 21:02:37.817244 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:37.817108 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d010eb83-ce9d-4164-9329-fa7b1f55a7a3-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z\" (UID: \"d010eb83-ce9d-4164-9329-fa7b1f55a7a3\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z" Apr 17 21:02:37.818934 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:37.818907 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/d010eb83-ce9d-4164-9329-fa7b1f55a7a3-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z\" (UID: \"d010eb83-ce9d-4164-9329-fa7b1f55a7a3\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z" Apr 17 21:02:37.819060 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:37.819024 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d010eb83-ce9d-4164-9329-fa7b1f55a7a3-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z\" (UID: \"d010eb83-ce9d-4164-9329-fa7b1f55a7a3\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z" Apr 17 21:02:37.828944 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:37.828906 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8dzc\" (UniqueName: \"kubernetes.io/projected/d010eb83-ce9d-4164-9329-fa7b1f55a7a3-kube-api-access-g8dzc\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z\" (UID: \"d010eb83-ce9d-4164-9329-fa7b1f55a7a3\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z" Apr 17 21:02:37.985274 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:37.985177 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z" Apr 17 21:02:38.105682 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:38.105652 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z"] Apr 17 21:02:38.107373 ip-10-0-132-12 kubenswrapper[2575]: W0417 21:02:38.107346 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd010eb83_ce9d_4164_9329_fa7b1f55a7a3.slice/crio-dcf0ff92a2f9e0863eddbab073bf2c16409c0c6b8cab4af54d9bd8f5aa2b0844 WatchSource:0}: Error finding container dcf0ff92a2f9e0863eddbab073bf2c16409c0c6b8cab4af54d9bd8f5aa2b0844: Status 404 returned error can't find the container with id dcf0ff92a2f9e0863eddbab073bf2c16409c0c6b8cab4af54d9bd8f5aa2b0844 Apr 17 21:02:38.362917 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:38.362883 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z" event={"ID":"d010eb83-ce9d-4164-9329-fa7b1f55a7a3","Type":"ContainerStarted","Data":"dcf0ff92a2f9e0863eddbab073bf2c16409c0c6b8cab4af54d9bd8f5aa2b0844"} Apr 17 21:02:43.382900 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:43.382797 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z" event={"ID":"d010eb83-ce9d-4164-9329-fa7b1f55a7a3","Type":"ContainerStarted","Data":"c4cc0cd3abf2551701b6a02a9ab2327d1ba704e3c02db8fde04042e4c08f9960"} Apr 17 21:02:49.403898 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:49.403859 2575 generic.go:358] "Generic (PLEG): container finished" podID="d010eb83-ce9d-4164-9329-fa7b1f55a7a3" containerID="c4cc0cd3abf2551701b6a02a9ab2327d1ba704e3c02db8fde04042e4c08f9960" exitCode=0 Apr 17 21:02:49.404349 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:49.403933 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z" 
event={"ID":"d010eb83-ce9d-4164-9329-fa7b1f55a7a3","Type":"ContainerDied","Data":"c4cc0cd3abf2551701b6a02a9ab2327d1ba704e3c02db8fde04042e4c08f9960"} Apr 17 21:02:51.413811 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:51.413778 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z" event={"ID":"d010eb83-ce9d-4164-9329-fa7b1f55a7a3","Type":"ContainerStarted","Data":"9859c8815ff97941f28774c876b2d736e0ffdfd857240ab48d2677ee307ab6cb"} Apr 17 21:02:51.414181 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:51.413997 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z" Apr 17 21:02:51.430557 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:02:51.430510 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z" podStartSLOduration=2.124085748 podStartE2EDuration="14.430496152s" podCreationTimestamp="2026-04-17 21:02:37 +0000 UTC" firstStartedPulling="2026-04-17 21:02:38.109068228 +0000 UTC m=+702.576509143" lastFinishedPulling="2026-04-17 21:02:50.415478629 +0000 UTC m=+714.882919547" observedRunningTime="2026-04-17 21:02:51.429140568 +0000 UTC m=+715.896581500" watchObservedRunningTime="2026-04-17 21:02:51.430496152 +0000 UTC m=+715.897937084" Apr 17 21:03:02.429731 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:02.429701 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z" Apr 17 21:03:21.072358 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:21.072322 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt"] Apr 17 21:03:21.105025 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:21.104989 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt"] Apr 17 21:03:21.105174 
ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:21.105134 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt" Apr 17 21:03:21.107708 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:21.107687 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 17 21:03:21.202775 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:21.202744 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/50327345-d8b4-472c-b163-ba985abc4451-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt\" (UID: \"50327345-d8b4-472c-b163-ba985abc4451\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt" Apr 17 21:03:21.202775 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:21.202775 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqc9h\" (UniqueName: \"kubernetes.io/projected/50327345-d8b4-472c-b163-ba985abc4451-kube-api-access-nqc9h\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt\" (UID: \"50327345-d8b4-472c-b163-ba985abc4451\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt" Apr 17 21:03:21.202990 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:21.202863 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/50327345-d8b4-472c-b163-ba985abc4451-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt\" (UID: \"50327345-d8b4-472c-b163-ba985abc4451\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt" Apr 17 21:03:21.202990 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:21.202893 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/50327345-d8b4-472c-b163-ba985abc4451-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt\" (UID: \"50327345-d8b4-472c-b163-ba985abc4451\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt" Apr 17 21:03:21.202990 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:21.202910 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/50327345-d8b4-472c-b163-ba985abc4451-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt\" (UID: \"50327345-d8b4-472c-b163-ba985abc4451\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt" Apr 17 21:03:21.202990 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:21.202932 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/50327345-d8b4-472c-b163-ba985abc4451-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt\" (UID: \"50327345-d8b4-472c-b163-ba985abc4451\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt" Apr 17 21:03:21.304029 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:21.303992 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/50327345-d8b4-472c-b163-ba985abc4451-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt\" (UID: \"50327345-d8b4-472c-b163-ba985abc4451\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt" Apr 17 21:03:21.304029 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:21.304031 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/50327345-d8b4-472c-b163-ba985abc4451-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt\" (UID: \"50327345-d8b4-472c-b163-ba985abc4451\") " 
pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt" Apr 17 21:03:21.304318 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:21.304049 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/50327345-d8b4-472c-b163-ba985abc4451-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt\" (UID: \"50327345-d8b4-472c-b163-ba985abc4451\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt" Apr 17 21:03:21.304318 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:21.304077 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/50327345-d8b4-472c-b163-ba985abc4451-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt\" (UID: \"50327345-d8b4-472c-b163-ba985abc4451\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt" Apr 17 21:03:21.304318 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:21.304108 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/50327345-d8b4-472c-b163-ba985abc4451-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt\" (UID: \"50327345-d8b4-472c-b163-ba985abc4451\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt" Apr 17 21:03:21.304318 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:21.304123 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nqc9h\" (UniqueName: \"kubernetes.io/projected/50327345-d8b4-472c-b163-ba985abc4451-kube-api-access-nqc9h\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt\" (UID: \"50327345-d8b4-472c-b163-ba985abc4451\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt" Apr 17 21:03:21.304598 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:21.304571 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/50327345-d8b4-472c-b163-ba985abc4451-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt\" (UID: \"50327345-d8b4-472c-b163-ba985abc4451\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt" Apr 17 21:03:21.304674 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:21.304656 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/50327345-d8b4-472c-b163-ba985abc4451-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt\" (UID: \"50327345-d8b4-472c-b163-ba985abc4451\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt" Apr 17 21:03:21.304760 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:21.304740 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/50327345-d8b4-472c-b163-ba985abc4451-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt\" (UID: \"50327345-d8b4-472c-b163-ba985abc4451\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt" Apr 17 21:03:21.306499 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:21.306469 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/50327345-d8b4-472c-b163-ba985abc4451-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt\" (UID: \"50327345-d8b4-472c-b163-ba985abc4451\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt" Apr 17 21:03:21.306722 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:21.306704 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/50327345-d8b4-472c-b163-ba985abc4451-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt\" (UID: \"50327345-d8b4-472c-b163-ba985abc4451\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt" Apr 
17 21:03:21.312278 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:21.312257 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqc9h\" (UniqueName: \"kubernetes.io/projected/50327345-d8b4-472c-b163-ba985abc4451-kube-api-access-nqc9h\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt\" (UID: \"50327345-d8b4-472c-b163-ba985abc4451\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt" Apr 17 21:03:21.415136 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:21.415048 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt" Apr 17 21:03:21.536764 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:21.536739 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt"] Apr 17 21:03:21.539278 ip-10-0-132-12 kubenswrapper[2575]: W0417 21:03:21.539252 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50327345_d8b4_472c_b163_ba985abc4451.slice/crio-c893ed98bdfa7acae915ad5c9f4e200fa8b25e83de5358d80a292bcef06645ec WatchSource:0}: Error finding container c893ed98bdfa7acae915ad5c9f4e200fa8b25e83de5358d80a292bcef06645ec: Status 404 returned error can't find the container with id c893ed98bdfa7acae915ad5c9f4e200fa8b25e83de5358d80a292bcef06645ec Apr 17 21:03:22.525616 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:22.525577 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt" event={"ID":"50327345-d8b4-472c-b163-ba985abc4451","Type":"ContainerStarted","Data":"9237aa6c33384ac1bb87d6fd2aaae278f0f890772d25fcd0906664b98e5aebb8"} Apr 17 21:03:22.525616 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:22.525615 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt" 
event={"ID":"50327345-d8b4-472c-b163-ba985abc4451","Type":"ContainerStarted","Data":"c893ed98bdfa7acae915ad5c9f4e200fa8b25e83de5358d80a292bcef06645ec"} Apr 17 21:03:27.544974 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:27.544937 2575 generic.go:358] "Generic (PLEG): container finished" podID="50327345-d8b4-472c-b163-ba985abc4451" containerID="9237aa6c33384ac1bb87d6fd2aaae278f0f890772d25fcd0906664b98e5aebb8" exitCode=0 Apr 17 21:03:27.545500 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:27.545012 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt" event={"ID":"50327345-d8b4-472c-b163-ba985abc4451","Type":"ContainerDied","Data":"9237aa6c33384ac1bb87d6fd2aaae278f0f890772d25fcd0906664b98e5aebb8"} Apr 17 21:03:28.550382 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:28.550347 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt" event={"ID":"50327345-d8b4-472c-b163-ba985abc4451","Type":"ContainerStarted","Data":"814493b8b2e7fd256ed44e5d050220fde3e537dcaf7d2a2673b3e179bbc6e2a0"} Apr 17 21:03:28.550793 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:28.550549 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt" Apr 17 21:03:28.568624 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:28.568576 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt" podStartSLOduration=7.375382046 podStartE2EDuration="7.568561141s" podCreationTimestamp="2026-04-17 21:03:21 +0000 UTC" firstStartedPulling="2026-04-17 21:03:27.545771862 +0000 UTC m=+752.013212772" lastFinishedPulling="2026-04-17 21:03:27.73895095 +0000 UTC m=+752.206391867" observedRunningTime="2026-04-17 21:03:28.56790643 +0000 UTC m=+753.035347363" watchObservedRunningTime="2026-04-17 21:03:28.568561141 +0000 UTC m=+753.036002073" Apr 
17 21:03:39.566405 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:39.566371 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt" Apr 17 21:03:55.751733 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:55.751697 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-768854978d-kj4f9"] Apr 17 21:03:55.752272 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:55.751950 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-768854978d-kj4f9" podUID="3b0fcb37-2ac7-4402-97ef-147e39cf5f37" containerName="authorino" containerID="cri-o://d328b1b0905925ebdc923b8087b22cdb1a8caab101a1bfebece5d70dfdbb0c93" gracePeriod=30 Apr 17 21:03:55.997591 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:55.997567 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-768854978d-kj4f9" Apr 17 21:03:56.112783 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:56.112755 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/3b0fcb37-2ac7-4402-97ef-147e39cf5f37-tls-cert\") pod \"3b0fcb37-2ac7-4402-97ef-147e39cf5f37\" (UID: \"3b0fcb37-2ac7-4402-97ef-147e39cf5f37\") " Apr 17 21:03:56.112938 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:56.112805 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhsb2\" (UniqueName: \"kubernetes.io/projected/3b0fcb37-2ac7-4402-97ef-147e39cf5f37-kube-api-access-hhsb2\") pod \"3b0fcb37-2ac7-4402-97ef-147e39cf5f37\" (UID: \"3b0fcb37-2ac7-4402-97ef-147e39cf5f37\") " Apr 17 21:03:56.114702 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:56.114674 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b0fcb37-2ac7-4402-97ef-147e39cf5f37-kube-api-access-hhsb2" (OuterVolumeSpecName: 
"kube-api-access-hhsb2") pod "3b0fcb37-2ac7-4402-97ef-147e39cf5f37" (UID: "3b0fcb37-2ac7-4402-97ef-147e39cf5f37"). InnerVolumeSpecName "kube-api-access-hhsb2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:03:56.122212 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:56.122184 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b0fcb37-2ac7-4402-97ef-147e39cf5f37-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "3b0fcb37-2ac7-4402-97ef-147e39cf5f37" (UID: "3b0fcb37-2ac7-4402-97ef-147e39cf5f37"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 21:03:56.214015 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:56.213980 2575 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/3b0fcb37-2ac7-4402-97ef-147e39cf5f37-tls-cert\") on node \"ip-10-0-132-12.ec2.internal\" DevicePath \"\"" Apr 17 21:03:56.214015 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:56.214012 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hhsb2\" (UniqueName: \"kubernetes.io/projected/3b0fcb37-2ac7-4402-97ef-147e39cf5f37-kube-api-access-hhsb2\") on node \"ip-10-0-132-12.ec2.internal\" DevicePath \"\"" Apr 17 21:03:56.646516 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:56.646480 2575 generic.go:358] "Generic (PLEG): container finished" podID="3b0fcb37-2ac7-4402-97ef-147e39cf5f37" containerID="d328b1b0905925ebdc923b8087b22cdb1a8caab101a1bfebece5d70dfdbb0c93" exitCode=0 Apr 17 21:03:56.646705 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:56.646530 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-768854978d-kj4f9" Apr 17 21:03:56.646705 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:56.646564 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-768854978d-kj4f9" event={"ID":"3b0fcb37-2ac7-4402-97ef-147e39cf5f37","Type":"ContainerDied","Data":"d328b1b0905925ebdc923b8087b22cdb1a8caab101a1bfebece5d70dfdbb0c93"} Apr 17 21:03:56.646705 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:56.646603 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-768854978d-kj4f9" event={"ID":"3b0fcb37-2ac7-4402-97ef-147e39cf5f37","Type":"ContainerDied","Data":"42c6c9814f61e2a3da794b36176eadec2adaea4dd747e1a3751e482aee1c3e59"} Apr 17 21:03:56.646705 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:56.646622 2575 scope.go:117] "RemoveContainer" containerID="d328b1b0905925ebdc923b8087b22cdb1a8caab101a1bfebece5d70dfdbb0c93" Apr 17 21:03:56.655897 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:56.655878 2575 scope.go:117] "RemoveContainer" containerID="d328b1b0905925ebdc923b8087b22cdb1a8caab101a1bfebece5d70dfdbb0c93" Apr 17 21:03:56.656140 ip-10-0-132-12 kubenswrapper[2575]: E0417 21:03:56.656122 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d328b1b0905925ebdc923b8087b22cdb1a8caab101a1bfebece5d70dfdbb0c93\": container with ID starting with d328b1b0905925ebdc923b8087b22cdb1a8caab101a1bfebece5d70dfdbb0c93 not found: ID does not exist" containerID="d328b1b0905925ebdc923b8087b22cdb1a8caab101a1bfebece5d70dfdbb0c93" Apr 17 21:03:56.656197 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:56.656148 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d328b1b0905925ebdc923b8087b22cdb1a8caab101a1bfebece5d70dfdbb0c93"} err="failed to get container status \"d328b1b0905925ebdc923b8087b22cdb1a8caab101a1bfebece5d70dfdbb0c93\": rpc error: code = 
NotFound desc = could not find container \"d328b1b0905925ebdc923b8087b22cdb1a8caab101a1bfebece5d70dfdbb0c93\": container with ID starting with d328b1b0905925ebdc923b8087b22cdb1a8caab101a1bfebece5d70dfdbb0c93 not found: ID does not exist"
Apr 17 21:03:56.666113 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:56.666081 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-768854978d-kj4f9"]
Apr 17 21:03:56.667630 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:56.667610 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-768854978d-kj4f9"]
Apr 17 21:03:58.133793 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:03:58.133758 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b0fcb37-2ac7-4402-97ef-147e39cf5f37" path="/var/lib/kubelet/pods/3b0fcb37-2ac7-4402-97ef-147e39cf5f37/volumes"
Apr 17 21:05:56.353658 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:05:56.353627 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v47zc_d30ac7a2-edef-43e4-a645-fbf9445df632/console-operator/2.log"
Apr 17 21:05:56.355417 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:05:56.355391 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v47zc_d30ac7a2-edef-43e4-a645-fbf9445df632/console-operator/2.log"
Apr 17 21:05:56.358848 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:05:56.358831 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8fq5_e7fec351-05e2-48e2-8266-8f2093ebb3fe/ovn-acl-logging/0.log"
Apr 17 21:05:56.363990 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:05:56.363972 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8fq5_e7fec351-05e2-48e2-8266-8f2093ebb3fe/ovn-acl-logging/0.log"
Apr 17 21:06:11.804073 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:11.804039 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-776d654dc-4x847_1c56f8bb-3f1e-4e6e-b250-7e27e18b6e2f/maas-api/0.log"
Apr 17 21:06:12.236494 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:12.236411 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5f74b9c8f9-kbgxq_ec6b4346-066b-4f55-a073-e6797324b990/manager/0.log"
Apr 17 21:06:12.451691 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:12.451659 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-qst6l_fdc13cd2-31a3-4958-91a4-eba892565141/postgres/0.log"
Apr 17 21:06:13.938273 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:13.938246 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-qvsrd_fcaa16a2-4e8c-469c-9a5d-f005550a09c4/kuadrant-console-plugin/0.log"
Apr 17 21:06:15.006303 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:15.006278 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-666889b9b6-xn2mq_430f1f78-633c-4f42-8a49-ab7ede4ff22e/kube-auth-proxy/0.log"
Apr 17 21:06:15.217644 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:15.217615 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-75858d479b-s6j25_591063be-61cb-4346-b946-b6ad0d833153/router/0.log"
Apr 17 21:06:15.540034 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:15.540006 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z_d010eb83-ce9d-4164-9329-fa7b1f55a7a3/storage-initializer/0.log"
Apr 17 21:06:15.562125 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:15.562100 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-pjf4z_d010eb83-ce9d-4164-9329-fa7b1f55a7a3/main/0.log"
Apr 17 21:06:15.997443 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:15.997364 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt_50327345-d8b4-472c-b163-ba985abc4451/storage-initializer/0.log"
Apr 17 21:06:16.003427 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:16.003406 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-4l9rt_50327345-d8b4-472c-b163-ba985abc4451/main/0.log"
Apr 17 21:06:19.733603 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:19.733559 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mxv2s/must-gather-ssclh"]
Apr 17 21:06:19.733988 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:19.733938 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3b0fcb37-2ac7-4402-97ef-147e39cf5f37" containerName="authorino"
Apr 17 21:06:19.733988 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:19.733948 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b0fcb37-2ac7-4402-97ef-147e39cf5f37" containerName="authorino"
Apr 17 21:06:19.734070 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:19.734018 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="3b0fcb37-2ac7-4402-97ef-147e39cf5f37" containerName="authorino"
Apr 17 21:06:19.736842 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:19.736825 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mxv2s/must-gather-ssclh"
Apr 17 21:06:19.739331 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:19.739305 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mxv2s\"/\"kube-root-ca.crt\""
Apr 17 21:06:19.740243 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:19.740192 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-mxv2s\"/\"default-dockercfg-kt5z7\""
Apr 17 21:06:19.740243 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:19.740204 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mxv2s\"/\"openshift-service-ca.crt\""
Apr 17 21:06:19.751388 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:19.751366 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mxv2s/must-gather-ssclh"]
Apr 17 21:06:19.856887 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:19.856852 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/208c4650-8a40-411e-9b95-6f22252638fc-must-gather-output\") pod \"must-gather-ssclh\" (UID: \"208c4650-8a40-411e-9b95-6f22252638fc\") " pod="openshift-must-gather-mxv2s/must-gather-ssclh"
Apr 17 21:06:19.856887 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:19.856885 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp4v7\" (UniqueName: \"kubernetes.io/projected/208c4650-8a40-411e-9b95-6f22252638fc-kube-api-access-sp4v7\") pod \"must-gather-ssclh\" (UID: \"208c4650-8a40-411e-9b95-6f22252638fc\") " pod="openshift-must-gather-mxv2s/must-gather-ssclh"
Apr 17 21:06:19.958208 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:19.958170 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/208c4650-8a40-411e-9b95-6f22252638fc-must-gather-output\") pod \"must-gather-ssclh\" (UID: \"208c4650-8a40-411e-9b95-6f22252638fc\") " pod="openshift-must-gather-mxv2s/must-gather-ssclh"
Apr 17 21:06:19.958208 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:19.958208 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sp4v7\" (UniqueName: \"kubernetes.io/projected/208c4650-8a40-411e-9b95-6f22252638fc-kube-api-access-sp4v7\") pod \"must-gather-ssclh\" (UID: \"208c4650-8a40-411e-9b95-6f22252638fc\") " pod="openshift-must-gather-mxv2s/must-gather-ssclh"
Apr 17 21:06:19.958528 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:19.958509 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/208c4650-8a40-411e-9b95-6f22252638fc-must-gather-output\") pod \"must-gather-ssclh\" (UID: \"208c4650-8a40-411e-9b95-6f22252638fc\") " pod="openshift-must-gather-mxv2s/must-gather-ssclh"
Apr 17 21:06:19.966072 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:19.966046 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp4v7\" (UniqueName: \"kubernetes.io/projected/208c4650-8a40-411e-9b95-6f22252638fc-kube-api-access-sp4v7\") pod \"must-gather-ssclh\" (UID: \"208c4650-8a40-411e-9b95-6f22252638fc\") " pod="openshift-must-gather-mxv2s/must-gather-ssclh"
Apr 17 21:06:20.046065 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:20.046031 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mxv2s/must-gather-ssclh"
Apr 17 21:06:20.164505 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:20.164480 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mxv2s/must-gather-ssclh"]
Apr 17 21:06:20.168107 ip-10-0-132-12 kubenswrapper[2575]: W0417 21:06:20.168079 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod208c4650_8a40_411e_9b95_6f22252638fc.slice/crio-6e40b51932bf7bd779cd3da4ad688a2a5287e4bf9dd6741525e244a2415f7287 WatchSource:0}: Error finding container 6e40b51932bf7bd779cd3da4ad688a2a5287e4bf9dd6741525e244a2415f7287: Status 404 returned error can't find the container with id 6e40b51932bf7bd779cd3da4ad688a2a5287e4bf9dd6741525e244a2415f7287
Apr 17 21:06:21.142943 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:21.142904 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mxv2s/must-gather-ssclh" event={"ID":"208c4650-8a40-411e-9b95-6f22252638fc","Type":"ContainerStarted","Data":"2e839e1388519e289a5fa99831b8b8e57ba2ebb0773a38b126bb296bd40f1220"}
Apr 17 21:06:21.143351 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:21.142953 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mxv2s/must-gather-ssclh" event={"ID":"208c4650-8a40-411e-9b95-6f22252638fc","Type":"ContainerStarted","Data":"6e40b51932bf7bd779cd3da4ad688a2a5287e4bf9dd6741525e244a2415f7287"}
Apr 17 21:06:22.149940 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:22.149900 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mxv2s/must-gather-ssclh" event={"ID":"208c4650-8a40-411e-9b95-6f22252638fc","Type":"ContainerStarted","Data":"8edbd8cf8fc1ef3b38ca8f2e9c917b8d5a00f531a8caa0b1f43517a2424b7726"}
Apr 17 21:06:22.168669 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:22.168623 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mxv2s/must-gather-ssclh" podStartSLOduration=2.340879639 podStartE2EDuration="3.168608789s" podCreationTimestamp="2026-04-17 21:06:19 +0000 UTC" firstStartedPulling="2026-04-17 21:06:20.169885186 +0000 UTC m=+924.637326096" lastFinishedPulling="2026-04-17 21:06:20.997614334 +0000 UTC m=+925.465055246" observedRunningTime="2026-04-17 21:06:22.166501877 +0000 UTC m=+926.633942810" watchObservedRunningTime="2026-04-17 21:06:22.168608789 +0000 UTC m=+926.636049722"
Apr 17 21:06:22.646244 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:22.646198 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-z7q6j_c6a655f1-43b7-4fec-a1cc-63ab69f29b85/global-pull-secret-syncer/0.log"
Apr 17 21:06:22.690458 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:22.690422 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-c796r_743a363d-0753-4e3b-9c99-a494c15dcf32/konnectivity-agent/0.log"
Apr 17 21:06:22.808412 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:22.808376 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-132-12.ec2.internal_dbe6d7617e05cfcc664fd92a25ec45f3/haproxy/0.log"
Apr 17 21:06:26.670383 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:26.670343 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-qvsrd_fcaa16a2-4e8c-469c-9a5d-f005550a09c4/kuadrant-console-plugin/0.log"
Apr 17 21:06:28.212319 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:28.212282 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-ktp5n_99cf7a02-6c85-4f4e-b81d-d984d3c34a8c/cluster-monitoring-operator/0.log"
Apr 17 21:06:28.312805 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:28.312651 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-78d578744-sbljb_3356598d-23e5-49f0-bc1c-f41ca236c15b/metrics-server/0.log"
Apr 17 21:06:28.546388 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:28.546361 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qxw7b_fc03518e-8a07-44bc-af57-c83515d5fec6/node-exporter/0.log"
Apr 17 21:06:28.575533 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:28.575452 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qxw7b_fc03518e-8a07-44bc-af57-c83515d5fec6/kube-rbac-proxy/0.log"
Apr 17 21:06:28.604248 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:28.603737 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-qxw7b_fc03518e-8a07-44bc-af57-c83515d5fec6/init-textfile/0.log"
Apr 17 21:06:28.710875 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:28.710812 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_3b1d1ae5-4f52-46c2-ae87-f337988640e3/prometheus/0.log"
Apr 17 21:06:28.734720 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:28.734689 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_3b1d1ae5-4f52-46c2-ae87-f337988640e3/config-reloader/0.log"
Apr 17 21:06:28.754787 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:28.754722 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_3b1d1ae5-4f52-46c2-ae87-f337988640e3/thanos-sidecar/0.log"
Apr 17 21:06:28.778392 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:28.778367 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_3b1d1ae5-4f52-46c2-ae87-f337988640e3/kube-rbac-proxy-web/0.log"
Apr 17 21:06:28.802653 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:28.802613 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_3b1d1ae5-4f52-46c2-ae87-f337988640e3/kube-rbac-proxy/0.log"
Apr 17 21:06:28.825134 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:28.825103 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_3b1d1ae5-4f52-46c2-ae87-f337988640e3/kube-rbac-proxy-thanos/0.log"
Apr 17 21:06:28.848152 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:28.848068 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_3b1d1ae5-4f52-46c2-ae87-f337988640e3/init-config-reloader/0.log"
Apr 17 21:06:28.894353 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:28.894312 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-wffpx_daeab26f-6949-4dda-89cb-d8f7129704e2/prometheus-operator/0.log"
Apr 17 21:06:28.914545 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:28.914512 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-wffpx_daeab26f-6949-4dda-89cb-d8f7129704e2/kube-rbac-proxy/0.log"
Apr 17 21:06:29.053531 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:29.053502 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-f9d48c748-tvrz7_2e4efe81-da2d-4b12-9682-50c7aa8b69fd/thanos-query/0.log"
Apr 17 21:06:29.074441 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:29.074411 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-f9d48c748-tvrz7_2e4efe81-da2d-4b12-9682-50c7aa8b69fd/kube-rbac-proxy-web/0.log"
Apr 17 21:06:29.102453 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:29.102379 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-f9d48c748-tvrz7_2e4efe81-da2d-4b12-9682-50c7aa8b69fd/kube-rbac-proxy/0.log"
Apr 17 21:06:29.126944 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:29.126907 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-f9d48c748-tvrz7_2e4efe81-da2d-4b12-9682-50c7aa8b69fd/prom-label-proxy/0.log"
Apr 17 21:06:29.148797 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:29.148774 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-f9d48c748-tvrz7_2e4efe81-da2d-4b12-9682-50c7aa8b69fd/kube-rbac-proxy-rules/0.log"
Apr 17 21:06:29.173892 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:29.173864 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-f9d48c748-tvrz7_2e4efe81-da2d-4b12-9682-50c7aa8b69fd/kube-rbac-proxy-metrics/0.log"
Apr 17 21:06:30.887695 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:30.887667 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v47zc_d30ac7a2-edef-43e4-a645-fbf9445df632/console-operator/2.log"
Apr 17 21:06:30.891934 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:30.891908 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-v47zc_d30ac7a2-edef-43e4-a645-fbf9445df632/console-operator/3.log"
Apr 17 21:06:31.417384 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:31.417347 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mxv2s/perf-node-gather-daemonset-5zsxx"]
Apr 17 21:06:31.425031 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:31.424995 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-5zsxx"
Apr 17 21:06:31.429021 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:31.428995 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mxv2s/perf-node-gather-daemonset-5zsxx"]
Apr 17 21:06:31.578796 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:31.578758 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/864b3e26-0614-4863-ad90-5ae9ab76c76f-lib-modules\") pod \"perf-node-gather-daemonset-5zsxx\" (UID: \"864b3e26-0614-4863-ad90-5ae9ab76c76f\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-5zsxx"
Apr 17 21:06:31.578966 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:31.578880 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/864b3e26-0614-4863-ad90-5ae9ab76c76f-sys\") pod \"perf-node-gather-daemonset-5zsxx\" (UID: \"864b3e26-0614-4863-ad90-5ae9ab76c76f\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-5zsxx"
Apr 17 21:06:31.578966 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:31.578907 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6px7b\" (UniqueName: \"kubernetes.io/projected/864b3e26-0614-4863-ad90-5ae9ab76c76f-kube-api-access-6px7b\") pod \"perf-node-gather-daemonset-5zsxx\" (UID: \"864b3e26-0614-4863-ad90-5ae9ab76c76f\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-5zsxx"
Apr 17 21:06:31.579061 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:31.578974 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/864b3e26-0614-4863-ad90-5ae9ab76c76f-podres\") pod \"perf-node-gather-daemonset-5zsxx\" (UID: \"864b3e26-0614-4863-ad90-5ae9ab76c76f\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-5zsxx"
Apr 17 21:06:31.579061 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:31.579030 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/864b3e26-0614-4863-ad90-5ae9ab76c76f-proc\") pod \"perf-node-gather-daemonset-5zsxx\" (UID: \"864b3e26-0614-4863-ad90-5ae9ab76c76f\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-5zsxx"
Apr 17 21:06:31.680319 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:31.680213 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/864b3e26-0614-4863-ad90-5ae9ab76c76f-sys\") pod \"perf-node-gather-daemonset-5zsxx\" (UID: \"864b3e26-0614-4863-ad90-5ae9ab76c76f\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-5zsxx"
Apr 17 21:06:31.680319 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:31.680290 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6px7b\" (UniqueName: \"kubernetes.io/projected/864b3e26-0614-4863-ad90-5ae9ab76c76f-kube-api-access-6px7b\") pod \"perf-node-gather-daemonset-5zsxx\" (UID: \"864b3e26-0614-4863-ad90-5ae9ab76c76f\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-5zsxx"
Apr 17 21:06:31.680527 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:31.680342 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/864b3e26-0614-4863-ad90-5ae9ab76c76f-podres\") pod \"perf-node-gather-daemonset-5zsxx\" (UID: \"864b3e26-0614-4863-ad90-5ae9ab76c76f\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-5zsxx"
Apr 17 21:06:31.680527 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:31.680395 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/864b3e26-0614-4863-ad90-5ae9ab76c76f-proc\") pod \"perf-node-gather-daemonset-5zsxx\" (UID: \"864b3e26-0614-4863-ad90-5ae9ab76c76f\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-5zsxx"
Apr 17 21:06:31.680527 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:31.680448 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/864b3e26-0614-4863-ad90-5ae9ab76c76f-lib-modules\") pod \"perf-node-gather-daemonset-5zsxx\" (UID: \"864b3e26-0614-4863-ad90-5ae9ab76c76f\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-5zsxx"
Apr 17 21:06:31.680723 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:31.680699 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/864b3e26-0614-4863-ad90-5ae9ab76c76f-lib-modules\") pod \"perf-node-gather-daemonset-5zsxx\" (UID: \"864b3e26-0614-4863-ad90-5ae9ab76c76f\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-5zsxx"
Apr 17 21:06:31.680970 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:31.680939 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/864b3e26-0614-4863-ad90-5ae9ab76c76f-sys\") pod \"perf-node-gather-daemonset-5zsxx\" (UID: \"864b3e26-0614-4863-ad90-5ae9ab76c76f\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-5zsxx"
Apr 17 21:06:31.681389 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:31.681361 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/864b3e26-0614-4863-ad90-5ae9ab76c76f-podres\") pod \"perf-node-gather-daemonset-5zsxx\" (UID: \"864b3e26-0614-4863-ad90-5ae9ab76c76f\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-5zsxx"
Apr 17 21:06:31.681389 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:31.681377 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/864b3e26-0614-4863-ad90-5ae9ab76c76f-proc\") pod \"perf-node-gather-daemonset-5zsxx\" (UID: \"864b3e26-0614-4863-ad90-5ae9ab76c76f\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-5zsxx"
Apr 17 21:06:31.689355 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:31.689330 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6px7b\" (UniqueName: \"kubernetes.io/projected/864b3e26-0614-4863-ad90-5ae9ab76c76f-kube-api-access-6px7b\") pod \"perf-node-gather-daemonset-5zsxx\" (UID: \"864b3e26-0614-4863-ad90-5ae9ab76c76f\") " pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-5zsxx"
Apr 17 21:06:31.745466 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:31.745429 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-5zsxx"
Apr 17 21:06:31.914934 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:31.914911 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mxv2s/perf-node-gather-daemonset-5zsxx"]
Apr 17 21:06:31.917478 ip-10-0-132-12 kubenswrapper[2575]: W0417 21:06:31.917444 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod864b3e26_0614_4863_ad90_5ae9ab76c76f.slice/crio-298941f56423c97ae2bcae7713649b06c7d71a8a4c8a516de10550457a456ec7 WatchSource:0}: Error finding container 298941f56423c97ae2bcae7713649b06c7d71a8a4c8a516de10550457a456ec7: Status 404 returned error can't find the container with id 298941f56423c97ae2bcae7713649b06c7d71a8a4c8a516de10550457a456ec7
Apr 17 21:06:31.927845 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:31.927823 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-svg9p_4a34696e-71c8-4053-bb98-047cc6c77df2/volume-data-source-validator/0.log"
Apr 17 21:06:32.203338 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:32.203235 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-5zsxx" event={"ID":"864b3e26-0614-4863-ad90-5ae9ab76c76f","Type":"ContainerStarted","Data":"ad18ee6c8c072cf938bf7bd2e12f15bdfc3fe55ce581d0f791082af6c5826351"}
Apr 17 21:06:32.203338 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:32.203283 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-5zsxx" event={"ID":"864b3e26-0614-4863-ad90-5ae9ab76c76f","Type":"ContainerStarted","Data":"298941f56423c97ae2bcae7713649b06c7d71a8a4c8a516de10550457a456ec7"}
Apr 17 21:06:32.204013 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:32.203986 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-5zsxx"
Apr 17 21:06:32.220626 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:32.220571 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-5zsxx" podStartSLOduration=1.220553464 podStartE2EDuration="1.220553464s" podCreationTimestamp="2026-04-17 21:06:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:06:32.218647261 +0000 UTC m=+936.686088194" watchObservedRunningTime="2026-04-17 21:06:32.220553464 +0000 UTC m=+936.687994398"
Apr 17 21:06:32.799002 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:32.798973 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qbn97_65c9c4b6-99be-4373-933b-d44dfd308d32/dns/0.log"
Apr 17 21:06:32.827526 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:32.827499 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-qbn97_65c9c4b6-99be-4373-933b-d44dfd308d32/kube-rbac-proxy/0.log"
Apr 17 21:06:32.894379 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:32.894354 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7rtzd_2d2c4aed-da8f-4e54-8d9e-75c4dc6a6236/dns-node-resolver/0.log"
Apr 17 21:06:33.428120 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:33.428087 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-ddpb6_5f57b5b6-848e-48b0-b1ca-e8a8c29c446c/node-ca/0.log"
Apr 17 21:06:34.368269 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:34.368179 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-666889b9b6-xn2mq_430f1f78-633c-4f42-8a49-ab7ede4ff22e/kube-auth-proxy/0.log"
Apr 17 21:06:34.415120 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:34.415094 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-75858d479b-s6j25_591063be-61cb-4346-b946-b6ad0d833153/router/0.log"
Apr 17 21:06:34.920713 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:34.920683 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-rxb59_b8b94abe-b5ba-4b0f-ae1f-63575ffbb062/serve-healthcheck-canary/0.log"
Apr 17 21:06:35.439941 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:35.439911 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6jl5b_5a0eab0c-9c9b-4c99-9117-ed93ab036378/kube-rbac-proxy/0.log"
Apr 17 21:06:35.461718 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:35.461690 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6jl5b_5a0eab0c-9c9b-4c99-9117-ed93ab036378/exporter/0.log"
Apr 17 21:06:35.484689 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:35.484666 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6jl5b_5a0eab0c-9c9b-4c99-9117-ed93ab036378/extractor/0.log"
Apr 17 21:06:37.357411 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:37.357381 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-776d654dc-4x847_1c56f8bb-3f1e-4e6e-b250-7e27e18b6e2f/maas-api/0.log"
Apr 17 21:06:37.458916 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:37.458884 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5f74b9c8f9-kbgxq_ec6b4346-066b-4f55-a073-e6797324b990/manager/0.log"
Apr 17 21:06:37.515405 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:37.515374 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-qst6l_fdc13cd2-31a3-4958-91a4-eba892565141/postgres/0.log"
Apr 17 21:06:38.585628 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:38.585602 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-7bd8bcccff-m5r5x_145770d6-42bc-4dfd-8af6-3775582eb974/manager/0.log"
Apr 17 21:06:39.221216 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:39.221192 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-mxv2s/perf-node-gather-daemonset-5zsxx"
Apr 17 21:06:44.255438 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:44.255390 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-47m56_6560b30d-aca7-45a9-b8b5-3fb4711c4650/kube-multus/0.log"
Apr 17 21:06:44.602317 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:44.602203 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tgvzd_38d8ca4c-2571-491a-bb52-21191288881d/kube-multus-additional-cni-plugins/0.log"
Apr 17 21:06:44.621335 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:44.621310 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tgvzd_38d8ca4c-2571-491a-bb52-21191288881d/egress-router-binary-copy/0.log"
Apr 17 21:06:44.641448 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:44.641424 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tgvzd_38d8ca4c-2571-491a-bb52-21191288881d/cni-plugins/0.log"
Apr 17 21:06:44.661511 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:44.661482 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tgvzd_38d8ca4c-2571-491a-bb52-21191288881d/bond-cni-plugin/0.log"
Apr 17 21:06:44.679864 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:44.679837 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tgvzd_38d8ca4c-2571-491a-bb52-21191288881d/routeoverride-cni/0.log"
Apr 17 21:06:44.699383 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:44.699351 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tgvzd_38d8ca4c-2571-491a-bb52-21191288881d/whereabouts-cni-bincopy/0.log"
Apr 17 21:06:44.719490 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:44.719465 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tgvzd_38d8ca4c-2571-491a-bb52-21191288881d/whereabouts-cni/0.log"
Apr 17 21:06:44.846767 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:44.846742 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-gn7zb_03974a46-a9e1-4161-8f82-8e72fdfcb759/network-metrics-daemon/0.log"
Apr 17 21:06:44.868650 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:44.868585 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-gn7zb_03974a46-a9e1-4161-8f82-8e72fdfcb759/kube-rbac-proxy/0.log"
Apr 17 21:06:45.954001 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:45.953970 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8fq5_e7fec351-05e2-48e2-8266-8f2093ebb3fe/ovn-controller/0.log"
Apr 17 21:06:46.035792 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:46.035770 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8fq5_e7fec351-05e2-48e2-8266-8f2093ebb3fe/ovn-acl-logging/0.log"
Apr 17 21:06:46.040238 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:46.040202 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8fq5_e7fec351-05e2-48e2-8266-8f2093ebb3fe/ovn-acl-logging/1.log"
Apr 17 21:06:46.064243 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:46.064203 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8fq5_e7fec351-05e2-48e2-8266-8f2093ebb3fe/kube-rbac-proxy-node/0.log"
Apr 17 21:06:46.085189 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:46.085166 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8fq5_e7fec351-05e2-48e2-8266-8f2093ebb3fe/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 21:06:46.104660 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:46.104641 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8fq5_e7fec351-05e2-48e2-8266-8f2093ebb3fe/northd/0.log"
Apr 17 21:06:46.123946 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:46.123926 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8fq5_e7fec351-05e2-48e2-8266-8f2093ebb3fe/nbdb/0.log"
Apr 17 21:06:46.143318 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:46.143296 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8fq5_e7fec351-05e2-48e2-8266-8f2093ebb3fe/sbdb/0.log"
Apr 17 21:06:46.254046 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:46.253952 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8fq5_e7fec351-05e2-48e2-8266-8f2093ebb3fe/ovnkube-controller/0.log"
Apr 17 21:06:47.574368 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:47.574336 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-4lcdx_8b78fb10-ec75-4579-95cd-a89556a6bc0f/check-endpoints/0.log"
Apr 17 21:06:47.645095 ip-10-0-132-12 kubenswrapper[2575]: I0417 21:06:47.645071 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-zgp64_5c537771-95e6-4644-8ce6-c3997543ce01/network-check-target-container/0.log"