Apr 20 19:07:54.963374 ip-10-0-134-63 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 19:07:55.424163 ip-10-0-134-63 kubenswrapper[2571]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 19:07:55.424163 ip-10-0-134-63 kubenswrapper[2571]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 19:07:55.424163 ip-10-0-134-63 kubenswrapper[2571]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 19:07:55.424163 ip-10-0-134-63 kubenswrapper[2571]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 19:07:55.424163 ip-10-0-134-63 kubenswrapper[2571]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
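The deprecation warnings above all point at the same remediation: move these flags into the file passed via --config. A minimal sketch of that migration, using field names from the kubelet.config.k8s.io/v1beta1 KubeletConfiguration API (the concrete values below are illustrative assumptions, not taken from this node):

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved (example reservations)
systemReserved:
  cpu: 500m
  memory: 1Gi
# --minimum-container-ttl-duration is superseded by eviction settings
evictionHard:
  memory.available: 100Mi
```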
Apr 20 19:07:55.425729 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.425639    2571 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 19:07:55.431027 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431011    2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 19:07:55.431027 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431027    2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 19:07:55.431093 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431031    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 19:07:55.431093 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431035    2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 19:07:55.431093 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431038    2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 19:07:55.431093 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431040    2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 19:07:55.431093 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431043    2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 19:07:55.431093 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431046    2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 19:07:55.431093 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431061    2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 19:07:55.431093 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431064    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 19:07:55.431093 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431068    2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 19:07:55.431093 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431070    2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 19:07:55.431093 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431073    2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 19:07:55.431093 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431076    2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 19:07:55.431093 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431078    2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 19:07:55.431093 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431082    2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 19:07:55.431093 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431084    2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 19:07:55.431093 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431087    2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 19:07:55.431093 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431090    2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 19:07:55.431093 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431092    2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 19:07:55.431093 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431095    2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 19:07:55.431093 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431098    2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 19:07:55.431093 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431100    2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 19:07:55.431606 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431115    2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 19:07:55.431606 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431118    2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 19:07:55.431606 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431121    2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 19:07:55.431606 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431123    2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 19:07:55.431606 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431127    2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 19:07:55.431606 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431131    2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 19:07:55.431606 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431134    2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 19:07:55.431606 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431137    2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 19:07:55.431606 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431140    2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 19:07:55.431606 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431142    2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 19:07:55.431606 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431145    2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 19:07:55.431606 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431148    2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 19:07:55.431606 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431151    2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 19:07:55.431606 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431154    2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 19:07:55.431606 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431157    2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 19:07:55.431606 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431161    2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 19:07:55.431606 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431165    2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 19:07:55.431606 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431168    2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 19:07:55.431606 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431171    2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 19:07:55.432102 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431174    2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 19:07:55.432102 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431176    2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 19:07:55.432102 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431179    2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 19:07:55.432102 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431181    2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 19:07:55.432102 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431184    2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 19:07:55.432102 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431186    2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 19:07:55.432102 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431190    2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 19:07:55.432102 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431192    2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 19:07:55.432102 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431195    2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 19:07:55.432102 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431197    2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 19:07:55.432102 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431200    2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 19:07:55.432102 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431202    2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 19:07:55.432102 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431205    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 19:07:55.432102 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431207    2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 19:07:55.432102 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431210    2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 19:07:55.432102 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431213    2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 19:07:55.432102 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431219    2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 19:07:55.432102 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431222    2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 19:07:55.432102 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431224    2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 19:07:55.432581 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431226    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 19:07:55.432581 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431229    2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 19:07:55.432581 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431231    2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 19:07:55.432581 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431234    2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 19:07:55.432581 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431236    2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 19:07:55.432581 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431239    2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 19:07:55.432581 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431241    2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 19:07:55.432581 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431244    2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 19:07:55.432581 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431246    2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 19:07:55.432581 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431249    2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 19:07:55.432581 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431251    2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 19:07:55.432581 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431254    2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 19:07:55.432581 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431256    2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 19:07:55.432581 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431259    2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 19:07:55.432581 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431262    2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 19:07:55.432581 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431264    2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 19:07:55.432581 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431267    2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 19:07:55.432581 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431269    2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 19:07:55.432581 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431272    2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 19:07:55.432581 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431275    2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 19:07:55.433057 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431278    2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 19:07:55.433057 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431281    2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 19:07:55.433057 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431283    2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 19:07:55.433057 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431286    2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 19:07:55.433057 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431289    2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 19:07:55.433057 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431689    2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 19:07:55.433057 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431694    2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 19:07:55.433057 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431698    2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 19:07:55.433057 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431701    2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 19:07:55.433057 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431704    2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 19:07:55.433057 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431707    2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 19:07:55.433057 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431710    2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 19:07:55.433057 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431712    2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 19:07:55.433057 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431715    2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 19:07:55.433057 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431717    2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 19:07:55.433057 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431720    2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 19:07:55.433057 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431723    2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 19:07:55.433057 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431725    2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 19:07:55.433057 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431728    2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 19:07:55.433057 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431731    2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 19:07:55.433552 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431733    2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 19:07:55.433552 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431736    2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 19:07:55.433552 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431738    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 19:07:55.433552 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431741    2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 19:07:55.433552 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431743    2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 19:07:55.433552 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431746    2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 19:07:55.433552 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431749    2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 19:07:55.433552 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431751    2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 19:07:55.433552 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431754    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 19:07:55.433552 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431757    2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 19:07:55.433552 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431759    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 19:07:55.433552 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431762    2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 19:07:55.433552 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431764    2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 19:07:55.433552 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431767    2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 19:07:55.433552 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431770    2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 19:07:55.433552 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431772    2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 19:07:55.433552 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431775    2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 19:07:55.433552 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431777    2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 19:07:55.433552 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431780    2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 19:07:55.434034 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431783    2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 19:07:55.434034 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431786    2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 19:07:55.434034 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431789    2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 19:07:55.434034 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431792    2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 19:07:55.434034 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431794    2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 19:07:55.434034 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431797    2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 19:07:55.434034 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431799    2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 19:07:55.434034 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431802    2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 19:07:55.434034 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431804    2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 19:07:55.434034 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431807    2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 19:07:55.434034 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431809    2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 19:07:55.434034 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431812    2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 19:07:55.434034 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431814    2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 19:07:55.434034 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431817    2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 19:07:55.434034 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431820    2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 19:07:55.434034 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431824    2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 19:07:55.434034 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431827    2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 19:07:55.434034 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431830    2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 19:07:55.434034 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431832    2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 19:07:55.434034 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431835    2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 19:07:55.434577 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431837    2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 19:07:55.434577 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431840    2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 19:07:55.434577 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431842    2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 19:07:55.434577 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431845    2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 19:07:55.434577 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431848    2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 19:07:55.434577 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431850    2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 19:07:55.434577 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431853    2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 19:07:55.434577 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431855    2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 19:07:55.434577 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431858    2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 19:07:55.434577 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431860    2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 19:07:55.434577 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431863    2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 19:07:55.434577 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431865    2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 19:07:55.434577 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431868    2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 19:07:55.434577 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431870    2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 19:07:55.434577 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431873    2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 19:07:55.434577 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431875    2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 19:07:55.434577 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431877    2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 19:07:55.434577 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431880    2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 19:07:55.434577 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431882    2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 19:07:55.434577 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431885    2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 19:07:55.435060 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431887    2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 19:07:55.435060 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431889    2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 19:07:55.435060 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431893    2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 19:07:55.435060 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431896    2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 19:07:55.435060 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431900    2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 19:07:55.435060 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431903    2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 19:07:55.435060 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431906    2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 19:07:55.435060 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431909    2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 19:07:55.435060 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431912    2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 19:07:55.435060 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431915    2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 19:07:55.435060 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431917    2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 19:07:55.435060 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.431920    2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 19:07:55.435060 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433274    2571 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 19:07:55.435060 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433286    2571 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 19:07:55.435060 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433292    2571 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 19:07:55.435060 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433297    2571 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 19:07:55.435060 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433301    2571 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 19:07:55.435060 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433305    2571 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 19:07:55.435060 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433309    2571 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 19:07:55.435060 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433314    2571 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 19:07:55.435060 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433317    2571 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 19:07:55.435589 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433320    2571 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 19:07:55.435589 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433324    2571 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 19:07:55.435589 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433328    2571 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 19:07:55.435589 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433331    2571 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 19:07:55.435589 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433334    2571 flags.go:64] FLAG: --cgroup-root=""
Apr 20 19:07:55.435589 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433337    2571 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 19:07:55.435589 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433340    2571 flags.go:64] FLAG: --client-ca-file=""
Apr 20 19:07:55.435589 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433343    2571 flags.go:64] FLAG: --cloud-config=""
Apr 20 19:07:55.435589 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433345    2571 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 19:07:55.435589 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433349    2571 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 19:07:55.435589 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433353    2571 flags.go:64] FLAG: --cluster-domain=""
Apr 20 19:07:55.435589 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433356    2571 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 19:07:55.435589 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433359    2571 flags.go:64] FLAG: --config-dir=""
Apr 20 19:07:55.435589 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433362    2571 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 19:07:55.435589 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433366    2571 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 19:07:55.435589 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433370    2571 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 19:07:55.435589 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433373    2571 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 19:07:55.435589 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433376    2571 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 19:07:55.435589 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433379    2571 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 19:07:55.435589 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433382    2571 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 19:07:55.435589 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433386    2571 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 19:07:55.435589 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433389    2571 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 19:07:55.435589 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433392    2571 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 19:07:55.435589 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433395    2571 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 19:07:55.435589 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433401    2571 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 19:07:55.436299 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433404    2571 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 19:07:55.436299 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433407    2571 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 19:07:55.436299 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433410    2571 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 19:07:55.436299 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433413    2571 flags.go:64] FLAG: --enable-server="true"
Apr 20 19:07:55.436299 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433416    2571 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 19:07:55.436299 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433420    2571 flags.go:64] FLAG: --event-burst="100"
Apr 20 19:07:55.436299 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433424    2571 flags.go:64] FLAG: --event-qps="50"
Apr 20 19:07:55.436299 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433426    2571 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 19:07:55.436299 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433429    2571 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 19:07:55.436299 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433432    2571 flags.go:64] FLAG: --eviction-hard=""
Apr 20 19:07:55.436299 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433436    2571 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 19:07:55.436299 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433439    2571 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 19:07:55.436299 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433442    2571 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 19:07:55.436299 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433445    2571 flags.go:64] FLAG: --eviction-soft=""
Apr 20 19:07:55.436299 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433448    2571 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 19:07:55.436299 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433451    2571 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 19:07:55.436299 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433454    2571 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 19:07:55.436299 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433457    2571 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 19:07:55.436299 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433460    2571 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 20 19:07:55.436299 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433463    2571 flags.go:64] FLAG: --fail-swap-on="true"
Apr 20 19:07:55.436299 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433465    2571 flags.go:64] FLAG: --feature-gates=""
Apr 20 19:07:55.436299 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433469    2571 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 20 19:07:55.436299 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433472    2571 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 20 19:07:55.436299 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433475    2571 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 20 19:07:55.436299 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433478    2571 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 20 19:07:55.436906 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433482 2571 flags.go:64] FLAG: --healthz-port="10248" Apr 20 19:07:55.436906 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433485 2571 flags.go:64] FLAG: --help="false" Apr 20 19:07:55.436906 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433488 2571 flags.go:64] FLAG: --hostname-override="ip-10-0-134-63.ec2.internal" Apr 20 19:07:55.436906 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433491 2571 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 19:07:55.436906 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433494 2571 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 19:07:55.436906 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433497 2571 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 19:07:55.436906 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433501 2571 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 19:07:55.436906 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433504 2571 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 19:07:55.436906 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433507 2571 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 19:07:55.436906 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433510 2571 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 19:07:55.436906 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433513 2571 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 19:07:55.436906 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433516 2571 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 19:07:55.436906 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433519 2571 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 19:07:55.436906 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433522 
2571 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 19:07:55.436906 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433525 2571 flags.go:64] FLAG: --kube-reserved="" Apr 20 19:07:55.436906 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433528 2571 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 19:07:55.436906 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433531 2571 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 19:07:55.436906 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433535 2571 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 19:07:55.436906 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433538 2571 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 19:07:55.436906 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433541 2571 flags.go:64] FLAG: --lock-file="" Apr 20 19:07:55.436906 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433543 2571 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 19:07:55.436906 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433548 2571 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 19:07:55.436906 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433551 2571 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 19:07:55.436906 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433556 2571 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 19:07:55.437495 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433559 2571 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 19:07:55.437495 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433562 2571 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 19:07:55.437495 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433565 2571 flags.go:64] FLAG: --logging-format="text" Apr 20 19:07:55.437495 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433568 2571 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 19:07:55.437495 ip-10-0-134-63 
kubenswrapper[2571]: I0420 19:07:55.433571 2571 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 19:07:55.437495 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433574 2571 flags.go:64] FLAG: --manifest-url="" Apr 20 19:07:55.437495 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433577 2571 flags.go:64] FLAG: --manifest-url-header="" Apr 20 19:07:55.437495 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433581 2571 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 19:07:55.437495 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433585 2571 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 19:07:55.437495 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433589 2571 flags.go:64] FLAG: --max-pods="110" Apr 20 19:07:55.437495 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433592 2571 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 19:07:55.437495 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433595 2571 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 19:07:55.437495 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433598 2571 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 19:07:55.437495 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433602 2571 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 19:07:55.437495 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433605 2571 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 19:07:55.437495 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433607 2571 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 19:07:55.437495 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433611 2571 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 19:07:55.437495 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433618 2571 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 19:07:55.437495 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433622 2571 flags.go:64] 
FLAG: --node-status-update-frequency="10s" Apr 20 19:07:55.437495 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433625 2571 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 19:07:55.437495 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433628 2571 flags.go:64] FLAG: --pod-cidr="" Apr 20 19:07:55.437495 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433631 2571 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 19:07:55.437495 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433637 2571 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 19:07:55.438056 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433640 2571 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 19:07:55.438056 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433643 2571 flags.go:64] FLAG: --pods-per-core="0" Apr 20 19:07:55.438056 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433646 2571 flags.go:64] FLAG: --port="10250" Apr 20 19:07:55.438056 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433649 2571 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 19:07:55.438056 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433652 2571 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-04696cce423324bb4" Apr 20 19:07:55.438056 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433656 2571 flags.go:64] FLAG: --qos-reserved="" Apr 20 19:07:55.438056 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433659 2571 flags.go:64] FLAG: --read-only-port="10255" Apr 20 19:07:55.438056 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433662 2571 flags.go:64] FLAG: --register-node="true" Apr 20 19:07:55.438056 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433664 2571 flags.go:64] FLAG: --register-schedulable="true" Apr 20 19:07:55.438056 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433667 2571 flags.go:64] FLAG: --register-with-taints="" Apr 20 
19:07:55.438056 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433671 2571 flags.go:64] FLAG: --registry-burst="10" Apr 20 19:07:55.438056 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433674 2571 flags.go:64] FLAG: --registry-qps="5" Apr 20 19:07:55.438056 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433677 2571 flags.go:64] FLAG: --reserved-cpus="" Apr 20 19:07:55.438056 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433679 2571 flags.go:64] FLAG: --reserved-memory="" Apr 20 19:07:55.438056 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433683 2571 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 19:07:55.438056 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433686 2571 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 19:07:55.438056 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433689 2571 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 19:07:55.438056 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433692 2571 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 19:07:55.438056 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433695 2571 flags.go:64] FLAG: --runonce="false" Apr 20 19:07:55.438056 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433697 2571 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 19:07:55.438056 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433700 2571 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 19:07:55.438056 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433704 2571 flags.go:64] FLAG: --seccomp-default="false" Apr 20 19:07:55.438056 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433707 2571 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 19:07:55.438056 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433710 2571 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 19:07:55.438056 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433713 2571 flags.go:64] FLAG: 
--storage-driver-db="cadvisor" Apr 20 19:07:55.438056 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433716 2571 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 19:07:55.438689 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433719 2571 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 19:07:55.438689 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433722 2571 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 19:07:55.438689 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433725 2571 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 19:07:55.438689 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433728 2571 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 19:07:55.438689 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433731 2571 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 19:07:55.438689 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433735 2571 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 19:07:55.438689 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433738 2571 flags.go:64] FLAG: --system-cgroups="" Apr 20 19:07:55.438689 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433740 2571 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 19:07:55.438689 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433746 2571 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 19:07:55.438689 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433749 2571 flags.go:64] FLAG: --tls-cert-file="" Apr 20 19:07:55.438689 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433752 2571 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 19:07:55.438689 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433756 2571 flags.go:64] FLAG: --tls-min-version="" Apr 20 19:07:55.438689 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433759 2571 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 19:07:55.438689 ip-10-0-134-63 kubenswrapper[2571]: I0420 
19:07:55.433762 2571 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 19:07:55.438689 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433765 2571 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 19:07:55.438689 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433768 2571 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 19:07:55.438689 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433771 2571 flags.go:64] FLAG: --v="2" Apr 20 19:07:55.438689 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433779 2571 flags.go:64] FLAG: --version="false" Apr 20 19:07:55.438689 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433786 2571 flags.go:64] FLAG: --vmodule="" Apr 20 19:07:55.438689 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433790 2571 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 19:07:55.438689 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.433794 2571 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 19:07:55.438689 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.433889 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 19:07:55.438689 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.433892 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 19:07:55.438689 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.433896 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 19:07:55.439273 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.433899 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 19:07:55.439273 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.433902 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 19:07:55.439273 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.433905 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 19:07:55.439273 ip-10-0-134-63 
kubenswrapper[2571]: W0420 19:07:55.433908 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 19:07:55.439273 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.433910 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 19:07:55.439273 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.433914 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 19:07:55.439273 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.433918 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 19:07:55.439273 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.433921 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 19:07:55.439273 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.433924 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 19:07:55.439273 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.433931 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 19:07:55.439273 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.433933 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 19:07:55.439273 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.433936 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 19:07:55.439273 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.433939 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 19:07:55.439273 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.433944 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 19:07:55.439273 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.433947 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 19:07:55.439273 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.433950 2571 feature_gate.go:328] unrecognized feature gate: 
BootImageSkewEnforcement Apr 20 19:07:55.439273 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.433952 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 19:07:55.439273 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.433955 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 19:07:55.439273 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.433958 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 19:07:55.439273 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.433961 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 19:07:55.439846 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.433964 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 19:07:55.439846 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.433966 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 19:07:55.439846 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.433969 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 19:07:55.439846 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.433972 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 19:07:55.439846 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.433975 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 19:07:55.439846 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.433979 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 19:07:55.439846 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.433981 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 19:07:55.439846 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.433984 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 19:07:55.439846 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.433987 2571 
feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 19:07:55.439846 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.433989 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 19:07:55.439846 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.433992 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 19:07:55.439846 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.433994 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 19:07:55.439846 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.433997 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 19:07:55.439846 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434000 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 19:07:55.439846 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434002 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 19:07:55.439846 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434005 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 19:07:55.439846 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434007 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 19:07:55.439846 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434011 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 19:07:55.439846 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434013 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 19:07:55.439846 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434016 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 19:07:55.440357 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434018 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 
19:07:55.440357 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434021 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 19:07:55.440357 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434024 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 19:07:55.440357 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434027 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 19:07:55.440357 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434029 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 19:07:55.440357 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434033 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 19:07:55.440357 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434035 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 19:07:55.440357 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434038 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 19:07:55.440357 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434041 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 19:07:55.440357 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434044 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 19:07:55.440357 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434046 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 19:07:55.440357 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434049 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 19:07:55.440357 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434052 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 19:07:55.440357 ip-10-0-134-63 kubenswrapper[2571]: W0420 
19:07:55.434054 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 19:07:55.440357 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434057 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 19:07:55.440357 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434060 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 19:07:55.440357 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434062 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 19:07:55.440357 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434066 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 19:07:55.440357 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434069 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 19:07:55.440918 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434071 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 19:07:55.440918 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434074 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 19:07:55.440918 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434076 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 19:07:55.440918 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434079 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 20 19:07:55.440918 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434081 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 19:07:55.440918 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434084 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 19:07:55.440918 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434120 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 19:07:55.440918 ip-10-0-134-63 kubenswrapper[2571]: W0420 
19:07:55.434135 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 19:07:55.440918 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434139 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 19:07:55.440918 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434142 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 19:07:55.440918 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434148 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 19:07:55.440918 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434153 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 19:07:55.440918 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434159 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 19:07:55.440918 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434163 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 19:07:55.440918 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434166 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 19:07:55.440918 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434169 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 19:07:55.440918 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434172 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 19:07:55.440918 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434175 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 19:07:55.440918 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434178 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 19:07:55.440918 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434181 2571 feature_gate.go:328] unrecognized feature gate: 
NetworkDiagnosticsConfig Apr 20 19:07:55.441701 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434183 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 19:07:55.441701 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434186 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 19:07:55.441701 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434189 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 19:07:55.441701 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.434192 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 19:07:55.441701 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.434198 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 19:07:55.443276 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.443251 2571 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 20 19:07:55.443276 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.443275 2571 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 20 19:07:55.443377 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443326 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 19:07:55.443377 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443332 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 19:07:55.443377 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443335 2571 
feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 19:07:55.443377 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443338 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 19:07:55.443377 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443341 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 19:07:55.443377 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443344 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 19:07:55.443377 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443346 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 19:07:55.443377 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443349 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 19:07:55.443377 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443351 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 19:07:55.443377 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443354 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 19:07:55.443377 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443357 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 19:07:55.443377 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443359 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 19:07:55.443377 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443362 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 19:07:55.443377 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443364 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 19:07:55.443377 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443367 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 19:07:55.443377 
ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443369 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 19:07:55.443377 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443372 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 19:07:55.443377 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443375 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 19:07:55.443377 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443377 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 19:07:55.443377 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443380 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 19:07:55.443901 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443383 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 19:07:55.443901 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443386 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 19:07:55.443901 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443388 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 19:07:55.443901 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443391 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 19:07:55.443901 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443396 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 19:07:55.443901 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443400 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 19:07:55.443901 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443403 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 19:07:55.443901 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443407 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 19:07:55.443901 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443409 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 19:07:55.443901 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443412 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 19:07:55.443901 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443415 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 19:07:55.443901 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443418 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 19:07:55.443901 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443420 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 19:07:55.443901 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443423 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 19:07:55.443901 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443426 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 19:07:55.443901 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443428 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 19:07:55.443901 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443431 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 19:07:55.443901 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443434 2571 feature_gate.go:328] unrecognized feature gate: 
CPMSMachineNamePrefix Apr 20 19:07:55.443901 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443436 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 19:07:55.443901 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443438 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 19:07:55.444403 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443441 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 19:07:55.444403 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443445 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 19:07:55.444403 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443448 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 19:07:55.444403 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443451 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 19:07:55.444403 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443453 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 19:07:55.444403 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443456 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 19:07:55.444403 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443458 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 19:07:55.444403 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443461 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 20 19:07:55.444403 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443463 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 19:07:55.444403 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443466 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 19:07:55.444403 ip-10-0-134-63 kubenswrapper[2571]: W0420 
19:07:55.443468 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 19:07:55.444403 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443471 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 19:07:55.444403 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443474 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 19:07:55.444403 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443477 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 19:07:55.444403 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443480 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 19:07:55.444403 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443483 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 19:07:55.444403 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443486 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 19:07:55.444403 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443488 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 19:07:55.444403 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443491 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 19:07:55.444887 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443494 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 19:07:55.444887 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443496 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 19:07:55.444887 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443499 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 19:07:55.444887 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443502 2571 feature_gate.go:328] 
unrecognized feature gate: SignatureStores Apr 20 19:07:55.444887 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443505 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 19:07:55.444887 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443507 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 19:07:55.444887 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443510 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 19:07:55.444887 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443512 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 19:07:55.444887 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443515 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 19:07:55.444887 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443518 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 19:07:55.444887 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443520 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 19:07:55.444887 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443522 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 19:07:55.444887 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443525 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 19:07:55.444887 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443528 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 19:07:55.444887 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443530 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 19:07:55.444887 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443533 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 19:07:55.444887 ip-10-0-134-63 
kubenswrapper[2571]: W0420 19:07:55.443536 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 19:07:55.444887 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443539 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 19:07:55.444887 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443542 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 19:07:55.444887 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443544 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 19:07:55.445423 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443547 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 19:07:55.445423 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443549 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 19:07:55.445423 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443552 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 19:07:55.445423 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443555 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 19:07:55.445423 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443557 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 19:07:55.445423 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443560 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 19:07:55.445423 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443563 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 19:07:55.445423 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.443569 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true 
RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 19:07:55.445423 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443665 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 19:07:55.445423 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443669 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 19:07:55.445423 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443672 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 19:07:55.445423 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443675 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 19:07:55.445423 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443678 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 19:07:55.445423 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443681 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 19:07:55.445423 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443684 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 19:07:55.445805 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443688 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 19:07:55.445805 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443691 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 19:07:55.445805 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443695 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 19:07:55.445805 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443697 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 19:07:55.445805 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443700 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 19:07:55.445805 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443703 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 19:07:55.445805 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443705 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 19:07:55.445805 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443708 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 19:07:55.445805 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443710 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 19:07:55.445805 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443713 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 19:07:55.445805 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443715 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 19:07:55.445805 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443718 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 19:07:55.445805 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443720 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 19:07:55.445805 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443723 2571 feature_gate.go:328] unrecognized feature 
gate: PinnedImages Apr 20 19:07:55.445805 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443725 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 19:07:55.445805 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443728 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 19:07:55.445805 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443731 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 19:07:55.445805 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443733 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 19:07:55.445805 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443736 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 19:07:55.446391 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443738 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 19:07:55.446391 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443741 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 19:07:55.446391 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443743 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 20 19:07:55.446391 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443746 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 19:07:55.446391 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443749 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 19:07:55.446391 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443752 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 19:07:55.446391 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443755 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 19:07:55.446391 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443757 2571 feature_gate.go:328] 
unrecognized feature gate: AlibabaPlatform Apr 20 19:07:55.446391 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443760 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 19:07:55.446391 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443762 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 19:07:55.446391 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443765 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 19:07:55.446391 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443767 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 19:07:55.446391 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443770 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 19:07:55.446391 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443773 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 19:07:55.446391 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443775 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 19:07:55.446391 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443778 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 19:07:55.446391 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443780 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 19:07:55.446391 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443783 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 19:07:55.446391 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443785 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 19:07:55.446391 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443788 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 19:07:55.446920 ip-10-0-134-63 
kubenswrapper[2571]: W0420 19:07:55.443790 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 19:07:55.446920 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443793 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 19:07:55.446920 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443795 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 19:07:55.446920 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443798 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 19:07:55.446920 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443800 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 19:07:55.446920 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443803 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 19:07:55.446920 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443805 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 19:07:55.446920 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443807 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 19:07:55.446920 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443810 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 19:07:55.446920 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443815 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 19:07:55.446920 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443818 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 19:07:55.446920 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443821 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 19:07:55.446920 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443823 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 19:07:55.446920 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443826 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 19:07:55.446920 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443828 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 19:07:55.446920 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443831 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 19:07:55.446920 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443834 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 19:07:55.446920 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443837 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 19:07:55.446920 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443840 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 19:07:55.447416 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443842 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 19:07:55.447416 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443845 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 19:07:55.447416 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443847 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 19:07:55.447416 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443849 2571 feature_gate.go:328] 
unrecognized feature gate: MetricsCollectionProfiles Apr 20 19:07:55.447416 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443852 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 19:07:55.447416 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443855 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 19:07:55.447416 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443857 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 19:07:55.447416 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443859 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 19:07:55.447416 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443862 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 19:07:55.447416 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443864 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 19:07:55.447416 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443867 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 19:07:55.447416 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443869 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 19:07:55.447416 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443872 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 19:07:55.447416 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443874 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 19:07:55.447416 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443877 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 19:07:55.447416 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443879 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 19:07:55.447416 ip-10-0-134-63 kubenswrapper[2571]: W0420 
19:07:55.443882 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 19:07:55.447416 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443885 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 19:07:55.447416 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443887 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 19:07:55.447416 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443890 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 19:07:55.447911 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:55.443892 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 19:07:55.447911 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.443897 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 19:07:55.447911 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.444601 2571 server.go:962] "Client rotation is on, will bootstrap in background" Apr 20 19:07:55.447911 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.446705 2571 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 20 19:07:55.447911 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.447589 2571 server.go:1019] "Starting client certificate rotation" Apr 20 19:07:55.447911 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.447691 2571 certificate_manager.go:422] "Certificate rotation is enabled" 
logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 19:07:55.447911 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.447737 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 19:07:55.473952 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.473926 2571 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 19:07:55.479167 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.479144 2571 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 19:07:55.492718 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.492703 2571 log.go:25] "Validated CRI v1 runtime API" Apr 20 19:07:55.499560 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.499539 2571 log.go:25] "Validated CRI v1 image API" Apr 20 19:07:55.501546 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.501529 2571 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 20 19:07:55.503674 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.503658 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 19:07:55.507629 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.507610 2571 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 9b964b84-850a-4a95-88e7-439b3cc1f702:/dev/nvme0n1p3 d3c69712-3772-4b6e-9989-acafaf51b29a:/dev/nvme0n1p4] Apr 20 19:07:55.507688 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.507627 2571 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 
minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 20 19:07:55.514232 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.514122 2571 manager.go:217] Machine: {Timestamp:2026-04-20 19:07:55.512188617 +0000 UTC m=+0.425684660 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3112250 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec20b291add3d85771306bae0a731caa SystemUUID:ec20b291-add3-d857-7130-6bae0a731caa BootID:7fdbe3fd-23c4-4b66-8d5b-90c7bd74f106 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:7f:e1:0a:4b:15 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:7f:e1:0a:4b:15 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:1e:d2:53:ec:e6:9c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] 
UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 20 19:07:55.514863 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.514853 2571 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 20 19:07:55.514953 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.514938 2571 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 20 19:07:55.518393 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.518364 2571 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 20 19:07:55.518559 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.518396 2571 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ip-10-0-134-63.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 20 19:07:55.518649 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.518570 2571 topology_manager.go:138] "Creating topology manager with none policy" Apr 20 19:07:55.518649 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.518582 2571 container_manager_linux.go:306] "Creating device plugin manager" Apr 20 19:07:55.518649 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.518600 2571 
manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 19:07:55.519702 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.519689 2571 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 19:07:55.520541 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.520528 2571 state_mem.go:36] "Initialized new in-memory state store" Apr 20 19:07:55.520670 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.520660 2571 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 20 19:07:55.523178 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.523167 2571 kubelet.go:491] "Attempting to sync node with API server" Apr 20 19:07:55.523241 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.523184 2571 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 20 19:07:55.523241 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.523199 2571 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 20 19:07:55.523241 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.523213 2571 kubelet.go:397] "Adding apiserver pod source" Apr 20 19:07:55.523241 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.523226 2571 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 20 19:07:55.524272 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.524259 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 19:07:55.524334 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.524281 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 19:07:55.527204 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.527184 2571 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 19:07:55.528657 ip-10-0-134-63 
kubenswrapper[2571]: I0420 19:07:55.528643 2571 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 19:07:55.530880 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.530635 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 19:07:55.530880 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.530652 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 19:07:55.530880 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.530658 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 19:07:55.530880 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.530663 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 19:07:55.530880 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.530669 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 19:07:55.530880 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.530675 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 19:07:55.530880 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.530681 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 20 19:07:55.530880 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.530686 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 19:07:55.530880 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.530694 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 19:07:55.530880 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.530700 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 19:07:55.530880 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.530708 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 19:07:55.530880 
ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.530716 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 19:07:55.531600 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.531588 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 19:07:55.531600 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.531600 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 19:07:55.535133 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.535120 2571 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 19:07:55.535211 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.535153 2571 server.go:1295] "Started kubelet" Apr 20 19:07:55.535312 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.535260 2571 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 19:07:55.535480 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.535435 2571 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 19:07:55.535565 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.535495 2571 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 19:07:55.535967 ip-10-0-134-63 systemd[1]: Started Kubernetes Kubelet. 
Apr 20 19:07:55.536482 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.536442 2571 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 19:07:55.536555 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:55.536537 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 20 19:07:55.536610 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.536577 2571 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-63.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 20 19:07:55.536740 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:55.536598 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-63.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 20 19:07:55.538412 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.538396 2571 server.go:317] "Adding debug handlers to kubelet server" Apr 20 19:07:55.542422 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:55.541590 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-63.ec2.internal.18a82638226b2082 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-63.ec2.internal,UID:ip-10-0-134-63.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-134-63.ec2.internal,},FirstTimestamp:2026-04-20 19:07:55.535130754 +0000 UTC m=+0.448626797,LastTimestamp:2026-04-20 19:07:55.535130754 +0000 UTC m=+0.448626797,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-63.ec2.internal,}" Apr 20 19:07:55.542686 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.542665 2571 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 19:07:55.542808 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.542671 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 20 19:07:55.543359 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.543282 2571 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 19:07:55.543511 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.543480 2571 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 20 19:07:55.543511 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.543503 2571 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 19:07:55.543652 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:55.543603 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-63.ec2.internal\" not found" Apr 20 19:07:55.543652 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.543628 2571 reconstruct.go:97] "Volume reconstruction finished" Apr 20 19:07:55.543652 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.543636 2571 reconciler.go:26] "Reconciler: start to sync state" Apr 20 19:07:55.543808 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.543766 2571 factory.go:55] Registering systemd factory 
Apr 20 19:07:55.543808 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.543805 2571 factory.go:223] Registration of the systemd container factory successfully Apr 20 19:07:55.544024 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.544004 2571 factory.go:153] Registering CRI-O factory Apr 20 19:07:55.544024 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.544022 2571 factory.go:223] Registration of the crio container factory successfully Apr 20 19:07:55.544178 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.544073 2571 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 20 19:07:55.544178 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.544091 2571 factory.go:103] Registering Raw factory Apr 20 19:07:55.544774 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.544758 2571 manager.go:1196] Started watching for new ooms in manager Apr 20 19:07:55.545076 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:55.545019 2571 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 20 19:07:55.545195 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.545187 2571 manager.go:319] Starting recovery of all containers Apr 20 19:07:55.547967 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.547943 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-cpq49" Apr 20 19:07:55.549264 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:55.549237 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 20 19:07:55.549385 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:55.549364 2571 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-134-63.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 20 19:07:55.556055 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.555896 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-cpq49" Apr 20 19:07:55.556247 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.556229 2571 manager.go:324] Recovery completed Apr 20 19:07:55.560418 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.560405 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 19:07:55.563061 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.563046 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-63.ec2.internal" 
event="NodeHasSufficientMemory" Apr 20 19:07:55.563144 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.563074 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-63.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 19:07:55.563144 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.563085 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-63.ec2.internal" event="NodeHasSufficientPID" Apr 20 19:07:55.563582 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.563567 2571 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 20 19:07:55.563582 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.563582 2571 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 20 19:07:55.563650 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.563596 2571 state_mem.go:36] "Initialized new in-memory state store" Apr 20 19:07:55.564925 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:55.564861 2571 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-63.ec2.internal.18a8263824154f82 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-63.ec2.internal,UID:ip-10-0-134-63.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-134-63.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-134-63.ec2.internal,},FirstTimestamp:2026-04-20 19:07:55.563061122 +0000 UTC m=+0.476557165,LastTimestamp:2026-04-20 19:07:55.563061122 +0000 UTC m=+0.476557165,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-63.ec2.internal,}" Apr 20 19:07:55.566234 ip-10-0-134-63 
kubenswrapper[2571]: I0420 19:07:55.566221 2571 policy_none.go:49] "None policy: Start" Apr 20 19:07:55.566314 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.566239 2571 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 20 19:07:55.566314 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.566254 2571 state_mem.go:35] "Initializing new in-memory state store" Apr 20 19:07:55.615669 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.615650 2571 manager.go:341] "Starting Device Plugin manager" Apr 20 19:07:55.618589 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:55.615684 2571 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 20 19:07:55.618589 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.615698 2571 server.go:85] "Starting device plugin registration server" Apr 20 19:07:55.618589 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.615963 2571 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 20 19:07:55.618589 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.615976 2571 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 20 19:07:55.618589 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.616072 2571 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 20 19:07:55.618589 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.616188 2571 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 20 19:07:55.618589 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.616199 2571 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 20 19:07:55.618589 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:55.616672 2571 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 20 19:07:55.618589 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:55.616705 2571 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-63.ec2.internal\" not found" Apr 20 19:07:55.685192 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.685102 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 20 19:07:55.686326 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.686308 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 20 19:07:55.686426 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.686348 2571 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 20 19:07:55.686426 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.686379 2571 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 20 19:07:55.686426 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.686389 2571 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 19:07:55.686568 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:55.686430 2571 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 19:07:55.690318 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.690296 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 19:07:55.716771 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.716747 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 19:07:55.717825 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.717809 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-63.ec2.internal" event="NodeHasSufficientMemory" Apr 20 19:07:55.717904 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.717840 2571 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-63.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 19:07:55.717904 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.717849 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-63.ec2.internal" event="NodeHasSufficientPID" Apr 20 19:07:55.717904 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.717871 2571 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-63.ec2.internal" Apr 20 19:07:55.725789 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.725774 2571 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-63.ec2.internal" Apr 20 19:07:55.725838 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:55.725797 2571 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-63.ec2.internal\": node \"ip-10-0-134-63.ec2.internal\" not found" Apr 20 19:07:55.747421 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:55.747401 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-63.ec2.internal\" not found" Apr 20 19:07:55.786885 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.786854 2571 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-63.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-134-63.ec2.internal"] Apr 20 19:07:55.786957 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.786928 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 19:07:55.788158 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.788143 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-63.ec2.internal" event="NodeHasSufficientMemory" Apr 20 19:07:55.788212 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.788171 2571 kubelet_node_status.go:736] "Recording event message for 
node" node="ip-10-0-134-63.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 19:07:55.788212 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.788181 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-63.ec2.internal" event="NodeHasSufficientPID" Apr 20 19:07:55.789535 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.789523 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 19:07:55.789708 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.789694 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-63.ec2.internal" Apr 20 19:07:55.789744 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.789736 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 19:07:55.790451 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.790436 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-63.ec2.internal" event="NodeHasSufficientMemory" Apr 20 19:07:55.790505 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.790463 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-63.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 19:07:55.790505 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.790473 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-63.ec2.internal" event="NodeHasSufficientPID" Apr 20 19:07:55.790584 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.790441 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-63.ec2.internal" event="NodeHasSufficientMemory" Apr 20 19:07:55.790584 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.790537 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-63.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 19:07:55.790584 
ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.790547 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-63.ec2.internal" event="NodeHasSufficientPID" Apr 20 19:07:55.791723 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.791700 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-63.ec2.internal" Apr 20 19:07:55.791765 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.791755 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 19:07:55.792962 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.792932 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-63.ec2.internal" event="NodeHasSufficientMemory" Apr 20 19:07:55.793032 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.792973 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-63.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 19:07:55.793032 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.792982 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-63.ec2.internal" event="NodeHasSufficientPID" Apr 20 19:07:55.807758 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:55.807733 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-63.ec2.internal\" not found" node="ip-10-0-134-63.ec2.internal" Apr 20 19:07:55.811131 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:55.811096 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-63.ec2.internal\" not found" node="ip-10-0-134-63.ec2.internal" Apr 20 19:07:55.844821 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.844795 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/7a01cbe8dec40e6d4400b545d6083218-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-63.ec2.internal\" (UID: \"7a01cbe8dec40e6d4400b545d6083218\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-63.ec2.internal" Apr 20 19:07:55.844924 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.844826 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7a01cbe8dec40e6d4400b545d6083218-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-63.ec2.internal\" (UID: \"7a01cbe8dec40e6d4400b545d6083218\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-63.ec2.internal" Apr 20 19:07:55.844924 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.844887 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/500ebaba3a201fbd9b46f7798d1de76f-config\") pod \"kube-apiserver-proxy-ip-10-0-134-63.ec2.internal\" (UID: \"500ebaba3a201fbd9b46f7798d1de76f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-63.ec2.internal" Apr 20 19:07:55.848199 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:55.848180 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-63.ec2.internal\" not found" Apr 20 19:07:55.945478 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.945383 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/500ebaba3a201fbd9b46f7798d1de76f-config\") pod \"kube-apiserver-proxy-ip-10-0-134-63.ec2.internal\" (UID: \"500ebaba3a201fbd9b46f7798d1de76f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-63.ec2.internal" Apr 20 19:07:55.945478 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.945437 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7a01cbe8dec40e6d4400b545d6083218-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-63.ec2.internal\" (UID: \"7a01cbe8dec40e6d4400b545d6083218\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-63.ec2.internal" Apr 20 19:07:55.945478 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.945467 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7a01cbe8dec40e6d4400b545d6083218-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-63.ec2.internal\" (UID: \"7a01cbe8dec40e6d4400b545d6083218\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-63.ec2.internal" Apr 20 19:07:55.945478 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.945385 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/500ebaba3a201fbd9b46f7798d1de76f-config\") pod \"kube-apiserver-proxy-ip-10-0-134-63.ec2.internal\" (UID: \"500ebaba3a201fbd9b46f7798d1de76f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-63.ec2.internal" Apr 20 19:07:55.945725 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.945500 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7a01cbe8dec40e6d4400b545d6083218-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-63.ec2.internal\" (UID: \"7a01cbe8dec40e6d4400b545d6083218\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-63.ec2.internal" Apr 20 19:07:55.945725 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:55.945554 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7a01cbe8dec40e6d4400b545d6083218-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-63.ec2.internal\" (UID: \"7a01cbe8dec40e6d4400b545d6083218\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-63.ec2.internal" Apr 20 19:07:55.948453 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:55.948436 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-63.ec2.internal\" not found" Apr 20 19:07:56.049256 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:56.049223 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-63.ec2.internal\" not found" Apr 20 19:07:56.110459 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:56.110421 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-63.ec2.internal" Apr 20 19:07:56.114103 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:56.114088 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-63.ec2.internal" Apr 20 19:07:56.149969 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:56.149940 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-63.ec2.internal\" not found" Apr 20 19:07:56.250598 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:56.250521 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-63.ec2.internal\" not found" Apr 20 19:07:56.351093 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:56.351061 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-63.ec2.internal\" not found" Apr 20 19:07:56.447582 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:56.447548 2571 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 20 19:07:56.448252 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:56.447715 2571 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 19:07:56.451750 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:56.451734 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-63.ec2.internal\" not found"
Apr 20 19:07:56.531428 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:56.531356 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 19:07:56.543479 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:56.543455 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-63.ec2.internal"
Apr 20 19:07:56.543603 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:56.543494 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 20 19:07:56.555264 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:56.555243 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 19:07:56.558514 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:56.558403 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 19:02:55 +0000 UTC" deadline="2027-11-25 21:58:04.48902498 +0000 UTC"
Apr 20 19:07:56.558514 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:56.558438 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14018h50m7.930590166s"
Apr 20 19:07:56.560657 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:56.560623 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's
hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 19:07:56.562665 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:56.562651 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-63.ec2.internal"
Apr 20 19:07:56.571892 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:56.571871 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 19:07:56.576118 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:56.576087 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 19:07:56.576336 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:56.576318 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-8dqrq"
Apr 20 19:07:56.583895 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:56.583875 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-8dqrq"
Apr 20 19:07:56.591444 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:56.591421 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod500ebaba3a201fbd9b46f7798d1de76f.slice/crio-f6c3c5e6353a5ecc0165e4fd568f21444b92f96f429df51b12a2da3b110421c2 WatchSource:0}: Error finding container f6c3c5e6353a5ecc0165e4fd568f21444b92f96f429df51b12a2da3b110421c2: Status 404 returned error can't find the container with id f6c3c5e6353a5ecc0165e4fd568f21444b92f96f429df51b12a2da3b110421c2
Apr 20 19:07:56.591854 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:56.591835 2571 manager.go:1169] Failed to process watch event {EventType:0
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a01cbe8dec40e6d4400b545d6083218.slice/crio-b411704d9c48b1352303a3b4f5ffac5655373333d5a40f7549b61d9a9ae7a5c0 WatchSource:0}: Error finding container b411704d9c48b1352303a3b4f5ffac5655373333d5a40f7549b61d9a9ae7a5c0: Status 404 returned error can't find the container with id b411704d9c48b1352303a3b4f5ffac5655373333d5a40f7549b61d9a9ae7a5c0
Apr 20 19:07:56.595802 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:56.595787 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 19:07:56.689942 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:56.689888 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-63.ec2.internal" event={"ID":"7a01cbe8dec40e6d4400b545d6083218","Type":"ContainerStarted","Data":"b411704d9c48b1352303a3b4f5ffac5655373333d5a40f7549b61d9a9ae7a5c0"}
Apr 20 19:07:56.690742 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:56.690721 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-63.ec2.internal" event={"ID":"500ebaba3a201fbd9b46f7798d1de76f","Type":"ContainerStarted","Data":"f6c3c5e6353a5ecc0165e4fd568f21444b92f96f429df51b12a2da3b110421c2"}
Apr 20 19:07:56.831314 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:56.831236 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 19:07:57.525228 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.525194 2571 apiserver.go:52] "Watching apiserver"
Apr 20 19:07:57.536619 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.536592 2571 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 20 19:07:57.537101 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.537069 2571 kubelet.go:2537] "SyncLoop ADD" source="api"
pods=["kube-system/konnectivity-agent-s26tp","kube-system/kube-apiserver-proxy-ip-10-0-134-63.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2sc4t","openshift-cluster-node-tuning-operator/tuned-gf7wk","openshift-dns/node-resolver-k97mn","openshift-image-registry/node-ca-jq7q4","openshift-network-operator/iptables-alerter-4d89g","openshift-ovn-kubernetes/ovnkube-node-j8mkl","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-63.ec2.internal","openshift-multus/multus-additional-cni-plugins-kj7t2","openshift-multus/multus-pzns4","openshift-multus/network-metrics-daemon-ff5pq","openshift-network-diagnostics/network-check-target-2qjll"]
Apr 20 19:07:57.539601 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.539156 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4d89g"
Apr 20 19:07:57.540412 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.540390 2571 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2sc4t"
Apr 20 19:07:57.541945 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.541833 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 20 19:07:57.541945 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.541848 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 20 19:07:57.542094 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.541947 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-rwvq8\""
Apr 20 19:07:57.543223 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.542705 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 20 19:07:57.543223 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.542782 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 20 19:07:57.543223 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.542925 2571 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-gf7wk"
Apr 20 19:07:57.543480 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.543354 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 20 19:07:57.543664 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.543600 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-44k7p\""
Apr 20 19:07:57.543664 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.543644 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 20 19:07:57.544746 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.544725 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-s26tp"
Apr 20 19:07:57.545732 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.545713 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-crld4\""
Apr 20 19:07:57.546007 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.545953 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 20 19:07:57.546232 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.546162 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 20 19:07:57.546462 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.546439 2571 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kj7t2"
Apr 20 19:07:57.546966 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.546947 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 20 19:07:57.548483 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.548460 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pzns4"
Apr 20 19:07:57.549187 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.548952 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jq7q4"
Apr 20 19:07:57.550778 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.550757 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-j2hm6\""
Apr 20 19:07:57.550867 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.550825 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 20 19:07:57.550926 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.550901 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 20 19:07:57.551271 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.551244 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 20 19:07:57.551962 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.551370 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-jzjqs\""
Apr 20 19:07:57.551962 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.551478 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 20 19:07:57.551962 ip-10-0-134-63
kubenswrapper[2571]: I0420 19:07:57.551548 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 20 19:07:57.551962 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.551623 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 20 19:07:57.551962 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.551645 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 20 19:07:57.551962 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.551920 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 20 19:07:57.552520 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.552503 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-4thbx\""
Apr 20 19:07:57.552692 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.552673 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 20 19:07:57.553606 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.553585 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 20 19:07:57.553755 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.553725 2571 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff5pq"
Apr 20 19:07:57.553834 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.553780 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a841918c-2c8e-484e-a15b-de63708e31b4-host\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk"
Apr 20 19:07:57.553834 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.553827 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a841918c-2c8e-484e-a15b-de63708e31b4-tmp\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk"
Apr 20 19:07:57.553937 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:57.553825 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-ff5pq" podUID="cc31ab16-2946-4d9a-baee-c02a00b73aae"
Apr 20 19:07:57.553937 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.553860 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6792f0ef-d066-4526-9980-ddeabc8b23cb-sys-fs\") pod \"aws-ebs-csi-driver-node-2sc4t\" (UID: \"6792f0ef-d066-4526-9980-ddeabc8b23cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2sc4t"
Apr 20 19:07:57.553937 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.553905 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfdwh\" (UniqueName: \"kubernetes.io/projected/6afa7914-d2d0-4077-b293-73873dd1cb3e-kube-api-access-vfdwh\") pod \"node-ca-jq7q4\" (UID: \"6afa7914-d2d0-4077-b293-73873dd1cb3e\") " pod="openshift-image-registry/node-ca-jq7q4"
Apr 20 19:07:57.554077 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.553949 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2fb0b1f7-00e8-4e8a-bab0-49d08606cf30-iptables-alerter-script\") pod \"iptables-alerter-4d89g\" (UID: \"2fb0b1f7-00e8-4e8a-bab0-49d08606cf30\") " pod="openshift-network-operator/iptables-alerter-4d89g"
Apr 20 19:07:57.554077 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.553966 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-wgvsw\""
Apr 20 19:07:57.554077 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.553982 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6792f0ef-d066-4526-9980-ddeabc8b23cb-socket-dir\") pod \"aws-ebs-csi-driver-node-2sc4t\" (UID: \"6792f0ef-d066-4526-9980-ddeabc8b23cb\") "
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2sc4t"
Apr 20 19:07:57.554077 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.554014 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6792f0ef-d066-4526-9980-ddeabc8b23cb-registration-dir\") pod \"aws-ebs-csi-driver-node-2sc4t\" (UID: \"6792f0ef-d066-4526-9980-ddeabc8b23cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2sc4t"
Apr 20 19:07:57.554077 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.554045 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6792f0ef-d066-4526-9980-ddeabc8b23cb-device-dir\") pod \"aws-ebs-csi-driver-node-2sc4t\" (UID: \"6792f0ef-d066-4526-9980-ddeabc8b23cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2sc4t"
Apr 20 19:07:57.554077 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.554077 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7kqk\" (UniqueName: \"kubernetes.io/projected/a841918c-2c8e-484e-a15b-de63708e31b4-kube-api-access-z7kqk\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk"
Apr 20 19:07:57.554374 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.554131 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6afa7914-d2d0-4077-b293-73873dd1cb3e-serviceca\") pod \"node-ca-jq7q4\" (UID: \"6afa7914-d2d0-4077-b293-73873dd1cb3e\") " pod="openshift-image-registry/node-ca-jq7q4"
Apr 20 19:07:57.554374 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.554166 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName:
\"kubernetes.io/host-path/a841918c-2c8e-484e-a15b-de63708e31b4-sys\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk"
Apr 20 19:07:57.554374 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.554239 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a841918c-2c8e-484e-a15b-de63708e31b4-var-lib-kubelet\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk"
Apr 20 19:07:57.554374 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.554315 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a841918c-2c8e-484e-a15b-de63708e31b4-etc-tuned\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk"
Apr 20 19:07:57.554557 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.554385 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8r9q\" (UniqueName: \"kubernetes.io/projected/2fb0b1f7-00e8-4e8a-bab0-49d08606cf30-kube-api-access-l8r9q\") pod \"iptables-alerter-4d89g\" (UID: \"2fb0b1f7-00e8-4e8a-bab0-49d08606cf30\") " pod="openshift-network-operator/iptables-alerter-4d89g"
Apr 20 19:07:57.554557 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.554455 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6792f0ef-d066-4526-9980-ddeabc8b23cb-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2sc4t\" (UID: \"6792f0ef-d066-4526-9980-ddeabc8b23cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2sc4t"
Apr 20 19:07:57.554557 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.554494
2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6792f0ef-d066-4526-9980-ddeabc8b23cb-etc-selinux\") pod \"aws-ebs-csi-driver-node-2sc4t\" (UID: \"6792f0ef-d066-4526-9980-ddeabc8b23cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2sc4t"
Apr 20 19:07:57.554557 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.554535 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2fb0b1f7-00e8-4e8a-bab0-49d08606cf30-host-slash\") pod \"iptables-alerter-4d89g\" (UID: \"2fb0b1f7-00e8-4e8a-bab0-49d08606cf30\") " pod="openshift-network-operator/iptables-alerter-4d89g"
Apr 20 19:07:57.554703 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.554616 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d482d\" (UniqueName: \"kubernetes.io/projected/6792f0ef-d066-4526-9980-ddeabc8b23cb-kube-api-access-d482d\") pod \"aws-ebs-csi-driver-node-2sc4t\" (UID: \"6792f0ef-d066-4526-9980-ddeabc8b23cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2sc4t"
Apr 20 19:07:57.554703 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.554656 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b3d91ffb-3895-4e12-a9f2-4d614bd77c3e-agent-certs\") pod \"konnectivity-agent-s26tp\" (UID: \"b3d91ffb-3895-4e12-a9f2-4d614bd77c3e\") " pod="kube-system/konnectivity-agent-s26tp"
Apr 20 19:07:57.554703 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.554683 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a841918c-2c8e-484e-a15b-de63708e31b4-etc-sysctl-d\") pod \"tuned-gf7wk\" (UID:
\"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk"
Apr 20 19:07:57.554839 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.554731 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a841918c-2c8e-484e-a15b-de63708e31b4-etc-sysctl-conf\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk"
Apr 20 19:07:57.554839 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.554775 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b3d91ffb-3895-4e12-a9f2-4d614bd77c3e-konnectivity-ca\") pod \"konnectivity-agent-s26tp\" (UID: \"b3d91ffb-3895-4e12-a9f2-4d614bd77c3e\") " pod="kube-system/konnectivity-agent-s26tp"
Apr 20 19:07:57.554839 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.554820 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a841918c-2c8e-484e-a15b-de63708e31b4-etc-modprobe-d\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk"
Apr 20 19:07:57.554984 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.554849 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a841918c-2c8e-484e-a15b-de63708e31b4-lib-modules\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk"
Apr 20 19:07:57.554984 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.554878 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName:
\"kubernetes.io/host-path/a841918c-2c8e-484e-a15b-de63708e31b4-etc-sysconfig\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk"
Apr 20 19:07:57.555224 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.555202 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a841918c-2c8e-484e-a15b-de63708e31b4-etc-kubernetes\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk"
Apr 20 19:07:57.555313 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.555240 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a841918c-2c8e-484e-a15b-de63708e31b4-etc-systemd\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk"
Apr 20 19:07:57.555313 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.555263 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a841918c-2c8e-484e-a15b-de63708e31b4-run\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk"
Apr 20 19:07:57.555313 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.555284 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6afa7914-d2d0-4077-b293-73873dd1cb3e-host\") pod \"node-ca-jq7q4\" (UID: \"6afa7914-d2d0-4077-b293-73873dd1cb3e\") " pod="openshift-image-registry/node-ca-jq7q4"
Apr 20 19:07:57.556039 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.556004 2571 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2qjll"
Apr 20 19:07:57.556185 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:57.556133 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2qjll" podUID="838f1a66-5c9b-4d0f-90b0-35a81df852d0"
Apr 20 19:07:57.557458 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.557392 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-k97mn"
Apr 20 19:07:57.557458 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.557412 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl"
Apr 20 19:07:57.560825 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.560578 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 20 19:07:57.560825 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.560576 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 20 19:07:57.560825 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.560583 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 20 19:07:57.560825 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.560739 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-6lv2k\""
Apr 20 19:07:57.561912 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.561854 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap"
reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 20 19:07:57.561912 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.561901 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-45mgx\""
Apr 20 19:07:57.562229 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.562086 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 20 19:07:57.562409 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.562393 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 20 19:07:57.562754 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.562551 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 20 19:07:57.562928 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.562912 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 20 19:07:57.584571 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.584543 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 19:02:56 +0000 UTC" deadline="2027-10-27 05:40:38.669715929 +0000 UTC"
Apr 20 19:07:57.584682 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.584581 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13306h32m41.085148068s"
Apr 20 19:07:57.644696 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.644665 2571 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 20 19:07:57.656012 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.655952 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\"
(UniqueName: \"kubernetes.io/empty-dir/a841918c-2c8e-484e-a15b-de63708e31b4-etc-tuned\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk"
Apr 20 19:07:57.656012 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.655999 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6792f0ef-d066-4526-9980-ddeabc8b23cb-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2sc4t\" (UID: \"6792f0ef-d066-4526-9980-ddeabc8b23cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2sc4t"
Apr 20 19:07:57.656289 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.656047 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-host-run-netns\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4"
Apr 20 19:07:57.656289 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.656073 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b0d252f8-fb8c-486c-aaa7-1197a96b6cfd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kj7t2\" (UID: \"b0d252f8-fb8c-486c-aaa7-1197a96b6cfd\") " pod="openshift-multus/multus-additional-cni-plugins-kj7t2"
Apr 20 19:07:57.656289 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.656102 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d482d\" (UniqueName: \"kubernetes.io/projected/6792f0ef-d066-4526-9980-ddeabc8b23cb-kube-api-access-d482d\") pod \"aws-ebs-csi-driver-node-2sc4t\" (UID: \"6792f0ef-d066-4526-9980-ddeabc8b23cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2sc4t"
Apr 20 19:07:57.656289 ip-10-0-134-63 kubenswrapper[2571]: I0420
19:07:57.656140 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-multus-cni-dir\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4" Apr 20 19:07:57.656289 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.656157 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-host-run-k8s-cni-cncf-io\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4" Apr 20 19:07:57.656289 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.656172 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-host-run-ovn-kubernetes\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" Apr 20 19:07:57.656289 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.656208 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" Apr 20 19:07:57.656289 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.656232 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6d80f226-e162-44b2-8b76-4d5f89d97859-env-overrides\") pod \"ovnkube-node-j8mkl\" (UID: 
\"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" Apr 20 19:07:57.656289 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.656256 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b0d252f8-fb8c-486c-aaa7-1197a96b6cfd-os-release\") pod \"multus-additional-cni-plugins-kj7t2\" (UID: \"b0d252f8-fb8c-486c-aaa7-1197a96b6cfd\") " pod="openshift-multus/multus-additional-cni-plugins-kj7t2" Apr 20 19:07:57.656289 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.656281 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a841918c-2c8e-484e-a15b-de63708e31b4-etc-sysctl-conf\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk" Apr 20 19:07:57.656792 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.656304 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-host-var-lib-kubelet\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4" Apr 20 19:07:57.656792 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.656347 2571 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 19:07:57.656792 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.656369 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-run-ovn\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" Apr 20 19:07:57.656792 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.656386 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6792f0ef-d066-4526-9980-ddeabc8b23cb-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2sc4t\" (UID: \"6792f0ef-d066-4526-9980-ddeabc8b23cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2sc4t" Apr 20 19:07:57.656792 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.656392 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b0d252f8-fb8c-486c-aaa7-1197a96b6cfd-system-cni-dir\") pod \"multus-additional-cni-plugins-kj7t2\" (UID: \"b0d252f8-fb8c-486c-aaa7-1197a96b6cfd\") " pod="openshift-multus/multus-additional-cni-plugins-kj7t2" Apr 20 19:07:57.656792 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.656437 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b3d91ffb-3895-4e12-a9f2-4d614bd77c3e-konnectivity-ca\") pod \"konnectivity-agent-s26tp\" (UID: \"b3d91ffb-3895-4e12-a9f2-4d614bd77c3e\") " pod="kube-system/konnectivity-agent-s26tp" Apr 20 19:07:57.656792 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.656461 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" 
(UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-node-log\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" Apr 20 19:07:57.656792 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.656505 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-host-cni-bin\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" Apr 20 19:07:57.656792 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.656552 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-host-cni-netd\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" Apr 20 19:07:57.656792 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.656578 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b0d252f8-fb8c-486c-aaa7-1197a96b6cfd-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kj7t2\" (UID: \"b0d252f8-fb8c-486c-aaa7-1197a96b6cfd\") " pod="openshift-multus/multus-additional-cni-plugins-kj7t2" Apr 20 19:07:57.656792 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.656604 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a841918c-2c8e-484e-a15b-de63708e31b4-etc-sysconfig\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk" Apr 20 19:07:57.656792 ip-10-0-134-63 kubenswrapper[2571]: I0420 
19:07:57.656639 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a841918c-2c8e-484e-a15b-de63708e31b4-run\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk" Apr 20 19:07:57.656792 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.656662 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a841918c-2c8e-484e-a15b-de63708e31b4-host\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk" Apr 20 19:07:57.656792 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.656686 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-host-var-lib-cni-multus\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4" Apr 20 19:07:57.656792 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.656730 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-host-slash\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" Apr 20 19:07:57.657482 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.656880 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a841918c-2c8e-484e-a15b-de63708e31b4-etc-sysconfig\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk" Apr 20 19:07:57.657482 ip-10-0-134-63 kubenswrapper[2571]: I0420 
19:07:57.656892 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a841918c-2c8e-484e-a15b-de63708e31b4-etc-sysctl-conf\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk" Apr 20 19:07:57.657482 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.656927 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a841918c-2c8e-484e-a15b-de63708e31b4-run\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk" Apr 20 19:07:57.657482 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.656955 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a841918c-2c8e-484e-a15b-de63708e31b4-host\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk" Apr 20 19:07:57.657482 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.656959 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cc31ab16-2946-4d9a-baee-c02a00b73aae-metrics-certs\") pod \"network-metrics-daemon-ff5pq\" (UID: \"cc31ab16-2946-4d9a-baee-c02a00b73aae\") " pod="openshift-multus/network-metrics-daemon-ff5pq" Apr 20 19:07:57.657482 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.656991 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-hostroot\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4" Apr 20 19:07:57.657482 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.657053 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-etc-kubernetes\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4" Apr 20 19:07:57.657482 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.657089 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-host-kubelet\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" Apr 20 19:07:57.657482 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.657139 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-run-openvswitch\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" Apr 20 19:07:57.657482 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.657177 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b0d252f8-fb8c-486c-aaa7-1197a96b6cfd-cni-binary-copy\") pod \"multus-additional-cni-plugins-kj7t2\" (UID: \"b0d252f8-fb8c-486c-aaa7-1197a96b6cfd\") " pod="openshift-multus/multus-additional-cni-plugins-kj7t2" Apr 20 19:07:57.657482 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.657203 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/79e9d9f2-39b0-4a1d-9e82-98ceca85b745-hosts-file\") pod \"node-resolver-k97mn\" (UID: \"79e9d9f2-39b0-4a1d-9e82-98ceca85b745\") " 
pod="openshift-dns/node-resolver-k97mn" Apr 20 19:07:57.657482 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.657248 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/79e9d9f2-39b0-4a1d-9e82-98ceca85b745-tmp-dir\") pod \"node-resolver-k97mn\" (UID: \"79e9d9f2-39b0-4a1d-9e82-98ceca85b745\") " pod="openshift-dns/node-resolver-k97mn" Apr 20 19:07:57.657482 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.657307 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2fb0b1f7-00e8-4e8a-bab0-49d08606cf30-iptables-alerter-script\") pod \"iptables-alerter-4d89g\" (UID: \"2fb0b1f7-00e8-4e8a-bab0-49d08606cf30\") " pod="openshift-network-operator/iptables-alerter-4d89g" Apr 20 19:07:57.657482 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.657331 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b3d91ffb-3895-4e12-a9f2-4d614bd77c3e-konnectivity-ca\") pod \"konnectivity-agent-s26tp\" (UID: \"b3d91ffb-3895-4e12-a9f2-4d614bd77c3e\") " pod="kube-system/konnectivity-agent-s26tp" Apr 20 19:07:57.657482 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.657336 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6792f0ef-d066-4526-9980-ddeabc8b23cb-registration-dir\") pod \"aws-ebs-csi-driver-node-2sc4t\" (UID: \"6792f0ef-d066-4526-9980-ddeabc8b23cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2sc4t" Apr 20 19:07:57.657482 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.657424 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6afa7914-d2d0-4077-b293-73873dd1cb3e-serviceca\") pod \"node-ca-jq7q4\" (UID: 
\"6afa7914-d2d0-4077-b293-73873dd1cb3e\") " pod="openshift-image-registry/node-ca-jq7q4" Apr 20 19:07:57.657482 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.657459 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-multus-socket-dir-parent\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4" Apr 20 19:07:57.658265 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.657488 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcm6n\" (UniqueName: \"kubernetes.io/projected/838f1a66-5c9b-4d0f-90b0-35a81df852d0-kube-api-access-vcm6n\") pod \"network-check-target-2qjll\" (UID: \"838f1a66-5c9b-4d0f-90b0-35a81df852d0\") " pod="openshift-network-diagnostics/network-check-target-2qjll" Apr 20 19:07:57.658265 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.657533 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8r9q\" (UniqueName: \"kubernetes.io/projected/2fb0b1f7-00e8-4e8a-bab0-49d08606cf30-kube-api-access-l8r9q\") pod \"iptables-alerter-4d89g\" (UID: \"2fb0b1f7-00e8-4e8a-bab0-49d08606cf30\") " pod="openshift-network-operator/iptables-alerter-4d89g" Apr 20 19:07:57.658265 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.657558 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6792f0ef-d066-4526-9980-ddeabc8b23cb-etc-selinux\") pod \"aws-ebs-csi-driver-node-2sc4t\" (UID: \"6792f0ef-d066-4526-9980-ddeabc8b23cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2sc4t" Apr 20 19:07:57.658265 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.657586 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-systemd-units\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" Apr 20 19:07:57.658265 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.657701 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-run-systemd\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" Apr 20 19:07:57.658265 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.657730 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b0d252f8-fb8c-486c-aaa7-1197a96b6cfd-cnibin\") pod \"multus-additional-cni-plugins-kj7t2\" (UID: \"b0d252f8-fb8c-486c-aaa7-1197a96b6cfd\") " pod="openshift-multus/multus-additional-cni-plugins-kj7t2" Apr 20 19:07:57.658265 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.657772 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2fb0b1f7-00e8-4e8a-bab0-49d08606cf30-host-slash\") pod \"iptables-alerter-4d89g\" (UID: \"2fb0b1f7-00e8-4e8a-bab0-49d08606cf30\") " pod="openshift-network-operator/iptables-alerter-4d89g" Apr 20 19:07:57.658265 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.657799 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-log-socket\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" Apr 20 19:07:57.658265 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.657827 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b0d252f8-fb8c-486c-aaa7-1197a96b6cfd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kj7t2\" (UID: \"b0d252f8-fb8c-486c-aaa7-1197a96b6cfd\") " pod="openshift-multus/multus-additional-cni-plugins-kj7t2" Apr 20 19:07:57.658265 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.657857 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b3d91ffb-3895-4e12-a9f2-4d614bd77c3e-agent-certs\") pod \"konnectivity-agent-s26tp\" (UID: \"b3d91ffb-3895-4e12-a9f2-4d614bd77c3e\") " pod="kube-system/konnectivity-agent-s26tp" Apr 20 19:07:57.658265 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.657882 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a841918c-2c8e-484e-a15b-de63708e31b4-etc-sysctl-d\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk" Apr 20 19:07:57.658265 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.657889 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2fb0b1f7-00e8-4e8a-bab0-49d08606cf30-iptables-alerter-script\") pod \"iptables-alerter-4d89g\" (UID: \"2fb0b1f7-00e8-4e8a-bab0-49d08606cf30\") " pod="openshift-network-operator/iptables-alerter-4d89g" Apr 20 19:07:57.658265 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.657912 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-host-run-netns\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" Apr 20 19:07:57.658265 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.657973 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6d80f226-e162-44b2-8b76-4d5f89d97859-ovnkube-config\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" Apr 20 19:07:57.658265 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.657993 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6792f0ef-d066-4526-9980-ddeabc8b23cb-registration-dir\") pod \"aws-ebs-csi-driver-node-2sc4t\" (UID: \"6792f0ef-d066-4526-9980-ddeabc8b23cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2sc4t" Apr 20 19:07:57.658265 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.658001 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6d80f226-e162-44b2-8b76-4d5f89d97859-ovnkube-script-lib\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" Apr 20 19:07:57.658971 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.658052 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fst4\" (UniqueName: \"kubernetes.io/projected/b0d252f8-fb8c-486c-aaa7-1197a96b6cfd-kube-api-access-2fst4\") pod \"multus-additional-cni-plugins-kj7t2\" (UID: \"b0d252f8-fb8c-486c-aaa7-1197a96b6cfd\") " pod="openshift-multus/multus-additional-cni-plugins-kj7t2" Apr 20 19:07:57.658971 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.658070 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/2fb0b1f7-00e8-4e8a-bab0-49d08606cf30-host-slash\") pod \"iptables-alerter-4d89g\" (UID: \"2fb0b1f7-00e8-4e8a-bab0-49d08606cf30\") " pod="openshift-network-operator/iptables-alerter-4d89g" Apr 20 19:07:57.658971 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.658085 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a841918c-2c8e-484e-a15b-de63708e31b4-etc-modprobe-d\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk" Apr 20 19:07:57.658971 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.658128 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a841918c-2c8e-484e-a15b-de63708e31b4-lib-modules\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk" Apr 20 19:07:57.658971 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.658160 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-host-run-multus-certs\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4" Apr 20 19:07:57.658971 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.658210 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4csjl\" (UniqueName: \"kubernetes.io/projected/79e9d9f2-39b0-4a1d-9e82-98ceca85b745-kube-api-access-4csjl\") pod \"node-resolver-k97mn\" (UID: \"79e9d9f2-39b0-4a1d-9e82-98ceca85b745\") " pod="openshift-dns/node-resolver-k97mn" Apr 20 19:07:57.658971 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.658248 2571 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a841918c-2c8e-484e-a15b-de63708e31b4-etc-kubernetes\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk" Apr 20 19:07:57.658971 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.658276 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a841918c-2c8e-484e-a15b-de63708e31b4-etc-systemd\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk" Apr 20 19:07:57.658971 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.658299 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6afa7914-d2d0-4077-b293-73873dd1cb3e-host\") pod \"node-ca-jq7q4\" (UID: \"6afa7914-d2d0-4077-b293-73873dd1cb3e\") " pod="openshift-image-registry/node-ca-jq7q4" Apr 20 19:07:57.658971 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.658331 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-multus-conf-dir\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4" Apr 20 19:07:57.658971 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.658377 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/125940c7-e0e5-43b5-a864-11cb9ced899b-multus-daemon-config\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4" Apr 20 19:07:57.658971 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.658403 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6d80f226-e162-44b2-8b76-4d5f89d97859-ovn-node-metrics-cert\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" Apr 20 19:07:57.658971 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.658430 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2czx\" (UniqueName: \"kubernetes.io/projected/6d80f226-e162-44b2-8b76-4d5f89d97859-kube-api-access-c2czx\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" Apr 20 19:07:57.658971 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.658459 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a841918c-2c8e-484e-a15b-de63708e31b4-tmp\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk" Apr 20 19:07:57.658971 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.658506 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6afa7914-d2d0-4077-b293-73873dd1cb3e-serviceca\") pod \"node-ca-jq7q4\" (UID: \"6afa7914-d2d0-4077-b293-73873dd1cb3e\") " pod="openshift-image-registry/node-ca-jq7q4" Apr 20 19:07:57.658971 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.658515 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-cnibin\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4" Apr 20 19:07:57.658971 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.658549 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-os-release\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4"
Apr 20 19:07:57.659837 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.658575 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rzxf\" (UniqueName: \"kubernetes.io/projected/cc31ab16-2946-4d9a-baee-c02a00b73aae-kube-api-access-5rzxf\") pod \"network-metrics-daemon-ff5pq\" (UID: \"cc31ab16-2946-4d9a-baee-c02a00b73aae\") " pod="openshift-multus/network-metrics-daemon-ff5pq"
Apr 20 19:07:57.659837 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.658612 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6792f0ef-d066-4526-9980-ddeabc8b23cb-sys-fs\") pod \"aws-ebs-csi-driver-node-2sc4t\" (UID: \"6792f0ef-d066-4526-9980-ddeabc8b23cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2sc4t"
Apr 20 19:07:57.659837 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.658663 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vfdwh\" (UniqueName: \"kubernetes.io/projected/6afa7914-d2d0-4077-b293-73873dd1cb3e-kube-api-access-vfdwh\") pod \"node-ca-jq7q4\" (UID: \"6afa7914-d2d0-4077-b293-73873dd1cb3e\") " pod="openshift-image-registry/node-ca-jq7q4"
Apr 20 19:07:57.659837 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.658698 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-system-cni-dir\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4"
Apr 20 19:07:57.659837 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.658725 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-etc-openvswitch\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl"
Apr 20 19:07:57.659837 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.658872 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6792f0ef-d066-4526-9980-ddeabc8b23cb-socket-dir\") pod \"aws-ebs-csi-driver-node-2sc4t\" (UID: \"6792f0ef-d066-4526-9980-ddeabc8b23cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2sc4t"
Apr 20 19:07:57.659837 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.659435 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6792f0ef-d066-4526-9980-ddeabc8b23cb-device-dir\") pod \"aws-ebs-csi-driver-node-2sc4t\" (UID: \"6792f0ef-d066-4526-9980-ddeabc8b23cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2sc4t"
Apr 20 19:07:57.659837 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.659472 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z7kqk\" (UniqueName: \"kubernetes.io/projected/a841918c-2c8e-484e-a15b-de63708e31b4-kube-api-access-z7kqk\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk"
Apr 20 19:07:57.659837 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.659500 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/125940c7-e0e5-43b5-a864-11cb9ced899b-cni-binary-copy\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4"
Apr 20 19:07:57.659837 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.659534 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-host-var-lib-cni-bin\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4"
Apr 20 19:07:57.659837 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.659561 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zxdk\" (UniqueName: \"kubernetes.io/projected/125940c7-e0e5-43b5-a864-11cb9ced899b-kube-api-access-8zxdk\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4"
Apr 20 19:07:57.659837 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.659584 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-var-lib-openvswitch\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl"
Apr 20 19:07:57.659837 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.659611 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a841918c-2c8e-484e-a15b-de63708e31b4-sys\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk"
Apr 20 19:07:57.659837 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.659635 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a841918c-2c8e-484e-a15b-de63708e31b4-var-lib-kubelet\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk"
Apr 20 19:07:57.659837 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.659743 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a841918c-2c8e-484e-a15b-de63708e31b4-var-lib-kubelet\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk"
Apr 20 19:07:57.660736 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.660185 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6792f0ef-d066-4526-9980-ddeabc8b23cb-etc-selinux\") pod \"aws-ebs-csi-driver-node-2sc4t\" (UID: \"6792f0ef-d066-4526-9980-ddeabc8b23cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2sc4t"
Apr 20 19:07:57.660736 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.660262 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6792f0ef-d066-4526-9980-ddeabc8b23cb-device-dir\") pod \"aws-ebs-csi-driver-node-2sc4t\" (UID: \"6792f0ef-d066-4526-9980-ddeabc8b23cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2sc4t"
Apr 20 19:07:57.660736 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.660272 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a841918c-2c8e-484e-a15b-de63708e31b4-sys\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk"
Apr 20 19:07:57.660736 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.660312 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6792f0ef-d066-4526-9980-ddeabc8b23cb-sys-fs\") pod \"aws-ebs-csi-driver-node-2sc4t\" (UID: \"6792f0ef-d066-4526-9980-ddeabc8b23cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2sc4t"
Apr 20 19:07:57.660736 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.660541 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6792f0ef-d066-4526-9980-ddeabc8b23cb-socket-dir\") pod \"aws-ebs-csi-driver-node-2sc4t\" (UID: \"6792f0ef-d066-4526-9980-ddeabc8b23cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2sc4t"
Apr 20 19:07:57.660736 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.660651 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a841918c-2c8e-484e-a15b-de63708e31b4-etc-sysctl-d\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk"
Apr 20 19:07:57.660736 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.660670 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a841918c-2c8e-484e-a15b-de63708e31b4-etc-modprobe-d\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk"
Apr 20 19:07:57.660736 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.660723 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a841918c-2c8e-484e-a15b-de63708e31b4-etc-kubernetes\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk"
Apr 20 19:07:57.660736 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.660739 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a841918c-2c8e-484e-a15b-de63708e31b4-lib-modules\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk"
Apr 20 19:07:57.660736 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.660768 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a841918c-2c8e-484e-a15b-de63708e31b4-etc-systemd\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk"
Apr 20 19:07:57.660736 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.660779 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6afa7914-d2d0-4077-b293-73873dd1cb3e-host\") pod \"node-ca-jq7q4\" (UID: \"6afa7914-d2d0-4077-b293-73873dd1cb3e\") " pod="openshift-image-registry/node-ca-jq7q4"
Apr 20 19:07:57.662691 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.662667 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b3d91ffb-3895-4e12-a9f2-4d614bd77c3e-agent-certs\") pod \"konnectivity-agent-s26tp\" (UID: \"b3d91ffb-3895-4e12-a9f2-4d614bd77c3e\") " pod="kube-system/konnectivity-agent-s26tp"
Apr 20 19:07:57.665166 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.663051 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a841918c-2c8e-484e-a15b-de63708e31b4-tmp\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk"
Apr 20 19:07:57.665166 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.663069 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a841918c-2c8e-484e-a15b-de63708e31b4-etc-tuned\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk"
Apr 20 19:07:57.672978 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.672957 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8r9q\" (UniqueName: \"kubernetes.io/projected/2fb0b1f7-00e8-4e8a-bab0-49d08606cf30-kube-api-access-l8r9q\") pod \"iptables-alerter-4d89g\" (UID: \"2fb0b1f7-00e8-4e8a-bab0-49d08606cf30\") " pod="openshift-network-operator/iptables-alerter-4d89g"
Apr 20 19:07:57.673539 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.673517 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d482d\" (UniqueName: \"kubernetes.io/projected/6792f0ef-d066-4526-9980-ddeabc8b23cb-kube-api-access-d482d\") pod \"aws-ebs-csi-driver-node-2sc4t\" (UID: \"6792f0ef-d066-4526-9980-ddeabc8b23cb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2sc4t"
Apr 20 19:07:57.676753 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.676730 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfdwh\" (UniqueName: \"kubernetes.io/projected/6afa7914-d2d0-4077-b293-73873dd1cb3e-kube-api-access-vfdwh\") pod \"node-ca-jq7q4\" (UID: \"6afa7914-d2d0-4077-b293-73873dd1cb3e\") " pod="openshift-image-registry/node-ca-jq7q4"
Apr 20 19:07:57.679312 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.679282 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7kqk\" (UniqueName: \"kubernetes.io/projected/a841918c-2c8e-484e-a15b-de63708e31b4-kube-api-access-z7kqk\") pod \"tuned-gf7wk\" (UID: \"a841918c-2c8e-484e-a15b-de63708e31b4\") " pod="openshift-cluster-node-tuning-operator/tuned-gf7wk"
Apr 20 19:07:57.760793 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.760760 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-system-cni-dir\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4"
Apr 20 19:07:57.760793 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.760800 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-etc-openvswitch\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl"
Apr 20 19:07:57.761034 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.760824 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/125940c7-e0e5-43b5-a864-11cb9ced899b-cni-binary-copy\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4"
Apr 20 19:07:57.761034 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.760846 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-host-var-lib-cni-bin\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4"
Apr 20 19:07:57.761034 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.760866 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8zxdk\" (UniqueName: \"kubernetes.io/projected/125940c7-e0e5-43b5-a864-11cb9ced899b-kube-api-access-8zxdk\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4"
Apr 20 19:07:57.761034 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.760881 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-var-lib-openvswitch\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl"
Apr 20 19:07:57.761034 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.760882 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-etc-openvswitch\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl"
Apr 20 19:07:57.761034 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.760901 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-host-run-netns\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4"
Apr 20 19:07:57.761034 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.760917 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-host-var-lib-cni-bin\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4"
Apr 20 19:07:57.761034 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.760947 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-host-run-netns\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4"
Apr 20 19:07:57.761034 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.760945 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b0d252f8-fb8c-486c-aaa7-1197a96b6cfd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kj7t2\" (UID: \"b0d252f8-fb8c-486c-aaa7-1197a96b6cfd\") " pod="openshift-multus/multus-additional-cni-plugins-kj7t2"
Apr 20 19:07:57.761034 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.760884 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-system-cni-dir\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4"
Apr 20 19:07:57.761034 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.760991 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-multus-cni-dir\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4"
Apr 20 19:07:57.761034 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761010 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-host-run-k8s-cni-cncf-io\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4"
Apr 20 19:07:57.761034 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761026 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-host-run-ovn-kubernetes\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl"
Apr 20 19:07:57.761034 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761041 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl"
Apr 20 19:07:57.761679 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761068 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6d80f226-e162-44b2-8b76-4d5f89d97859-env-overrides\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl"
Apr 20 19:07:57.761679 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761086 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b0d252f8-fb8c-486c-aaa7-1197a96b6cfd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kj7t2\" (UID: \"b0d252f8-fb8c-486c-aaa7-1197a96b6cfd\") " pod="openshift-multus/multus-additional-cni-plugins-kj7t2"
Apr 20 19:07:57.761679 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761089 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b0d252f8-fb8c-486c-aaa7-1197a96b6cfd-os-release\") pod \"multus-additional-cni-plugins-kj7t2\" (UID: \"b0d252f8-fb8c-486c-aaa7-1197a96b6cfd\") " pod="openshift-multus/multus-additional-cni-plugins-kj7t2"
Apr 20 19:07:57.761679 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761143 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-host-var-lib-kubelet\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4"
Apr 20 19:07:57.761679 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761163 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b0d252f8-fb8c-486c-aaa7-1197a96b6cfd-os-release\") pod \"multus-additional-cni-plugins-kj7t2\" (UID: \"b0d252f8-fb8c-486c-aaa7-1197a96b6cfd\") " pod="openshift-multus/multus-additional-cni-plugins-kj7t2"
Apr 20 19:07:57.761679 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761167 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-run-ovn\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl"
Apr 20 19:07:57.761679 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761194 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b0d252f8-fb8c-486c-aaa7-1197a96b6cfd-system-cni-dir\") pod \"multus-additional-cni-plugins-kj7t2\" (UID: \"b0d252f8-fb8c-486c-aaa7-1197a96b6cfd\") " pod="openshift-multus/multus-additional-cni-plugins-kj7t2"
Apr 20 19:07:57.761679 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761392 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b0d252f8-fb8c-486c-aaa7-1197a96b6cfd-system-cni-dir\") pod \"multus-additional-cni-plugins-kj7t2\" (UID: \"b0d252f8-fb8c-486c-aaa7-1197a96b6cfd\") " pod="openshift-multus/multus-additional-cni-plugins-kj7t2"
Apr 20 19:07:57.761679 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761408 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-host-run-ovn-kubernetes\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl"
Apr 20 19:07:57.761679 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761435 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-multus-cni-dir\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4"
Apr 20 19:07:57.761679 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761443 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-node-log\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl"
Apr 20 19:07:57.761679 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761457 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-host-var-lib-kubelet\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4"
Apr 20 19:07:57.761679 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761473 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-host-cni-bin\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl"
Apr 20 19:07:57.761679 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761481 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-run-ovn\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl"
Apr 20 19:07:57.761679 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761483 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl"
Apr 20 19:07:57.761679 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761511 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-host-cni-bin\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl"
Apr 20 19:07:57.761679 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761517 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-var-lib-openvswitch\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl"
Apr 20 19:07:57.762455 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761524 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-node-log\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl"
Apr 20 19:07:57.762455 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761525 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/125940c7-e0e5-43b5-a864-11cb9ced899b-cni-binary-copy\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4"
Apr 20 19:07:57.762455 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761531 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-host-run-k8s-cni-cncf-io\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4"
Apr 20 19:07:57.762455 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761545 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-host-cni-netd\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl"
Apr 20 19:07:57.762455 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761570 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b0d252f8-fb8c-486c-aaa7-1197a96b6cfd-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kj7t2\" (UID: \"b0d252f8-fb8c-486c-aaa7-1197a96b6cfd\") " pod="openshift-multus/multus-additional-cni-plugins-kj7t2"
Apr 20 19:07:57.762455 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761580 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-host-cni-netd\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl"
Apr 20 19:07:57.762455 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761614 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-host-var-lib-cni-multus\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4"
Apr 20 19:07:57.762455 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761637 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6d80f226-e162-44b2-8b76-4d5f89d97859-env-overrides\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl"
Apr 20 19:07:57.762455 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761650 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-host-var-lib-cni-multus\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4"
Apr 20 19:07:57.762455 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761692 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-host-slash\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl"
Apr 20 19:07:57.762455 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761741 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cc31ab16-2946-4d9a-baee-c02a00b73aae-metrics-certs\") pod \"network-metrics-daemon-ff5pq\" (UID: \"cc31ab16-2946-4d9a-baee-c02a00b73aae\") " pod="openshift-multus/network-metrics-daemon-ff5pq"
Apr 20 19:07:57.762455 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761771 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-hostroot\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4"
Apr 20 19:07:57.762455 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761796 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-etc-kubernetes\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4"
Apr 20 19:07:57.762455 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761822 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-host-kubelet\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl"
Apr 20 19:07:57.762455 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761837 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-hostroot\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4"
Apr 20 19:07:57.762455 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761846 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-run-openvswitch\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl"
Apr 20 19:07:57.762455 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:57.761882 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:07:57.762455 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761889 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-run-openvswitch\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl"
Apr 20 19:07:57.763262 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761934 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-etc-kubernetes\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4"
Apr 20 19:07:57.763262 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:57.761952 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc31ab16-2946-4d9a-baee-c02a00b73aae-metrics-certs podName:cc31ab16-2946-4d9a-baee-c02a00b73aae nodeName:}" failed. No retries permitted until 2026-04-20 19:07:58.261928549 +0000 UTC m=+3.175424600 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cc31ab16-2946-4d9a-baee-c02a00b73aae-metrics-certs") pod "network-metrics-daemon-ff5pq" (UID: "cc31ab16-2946-4d9a-baee-c02a00b73aae") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:07:57.763262 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761976 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-host-kubelet\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl"
Apr 20 19:07:57.763262 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761978 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-host-slash\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl"
Apr 20 19:07:57.763262 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.761882 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b0d252f8-fb8c-486c-aaa7-1197a96b6cfd-cni-binary-copy\") pod \"multus-additional-cni-plugins-kj7t2\" (UID: \"b0d252f8-fb8c-486c-aaa7-1197a96b6cfd\") " pod="openshift-multus/multus-additional-cni-plugins-kj7t2"
Apr 20 19:07:57.763262 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762018 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/79e9d9f2-39b0-4a1d-9e82-98ceca85b745-hosts-file\") pod \"node-resolver-k97mn\" (UID: \"79e9d9f2-39b0-4a1d-9e82-98ceca85b745\") " pod="openshift-dns/node-resolver-k97mn"
Apr 20 19:07:57.763262 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762044 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/79e9d9f2-39b0-4a1d-9e82-98ceca85b745-tmp-dir\") pod \"node-resolver-k97mn\" (UID: \"79e9d9f2-39b0-4a1d-9e82-98ceca85b745\") " pod="openshift-dns/node-resolver-k97mn"
Apr 20 19:07:57.763262 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762073 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-multus-socket-dir-parent\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4"
Apr 20 19:07:57.763262 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762100 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vcm6n\" (UniqueName: \"kubernetes.io/projected/838f1a66-5c9b-4d0f-90b0-35a81df852d0-kube-api-access-vcm6n\") pod \"network-check-target-2qjll\" (UID: \"838f1a66-5c9b-4d0f-90b0-35a81df852d0\") " pod="openshift-network-diagnostics/network-check-target-2qjll"
Apr 20 19:07:57.763262 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762150 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-systemd-units\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl"
Apr 20 19:07:57.763262 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762172 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-run-systemd\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl"
Apr 20 19:07:57.763262 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762185 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-multus-socket-dir-parent\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4"
Apr 20 19:07:57.763262 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762197 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b0d252f8-fb8c-486c-aaa7-1197a96b6cfd-cnibin\") pod \"multus-additional-cni-plugins-kj7t2\" (UID: \"b0d252f8-fb8c-486c-aaa7-1197a96b6cfd\") " pod="openshift-multus/multus-additional-cni-plugins-kj7t2"
Apr 20 19:07:57.763262 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762202 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/79e9d9f2-39b0-4a1d-9e82-98ceca85b745-hosts-file\") pod \"node-resolver-k97mn\" (UID: \"79e9d9f2-39b0-4a1d-9e82-98ceca85b745\") " pod="openshift-dns/node-resolver-k97mn"
Apr 20 19:07:57.763262 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762236 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-log-socket\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl"
Apr 20 19:07:57.763262 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762246 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-run-systemd\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl"
Apr 20 19:07:57.763262 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762246 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b0d252f8-fb8c-486c-aaa7-1197a96b6cfd-cnibin\") pod \"multus-additional-cni-plugins-kj7t2\" (UID: \"b0d252f8-fb8c-486c-aaa7-1197a96b6cfd\") " pod="openshift-multus/multus-additional-cni-plugins-kj7t2"
Apr 20 19:07:57.763856 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762236 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-systemd-units\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl"
Apr 20 19:07:57.763856 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762267 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b0d252f8-fb8c-486c-aaa7-1197a96b6cfd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kj7t2\"
(UID: \"b0d252f8-fb8c-486c-aaa7-1197a96b6cfd\") " pod="openshift-multus/multus-additional-cni-plugins-kj7t2" Apr 20 19:07:57.763856 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762276 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-log-socket\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" Apr 20 19:07:57.763856 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762298 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-host-run-netns\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" Apr 20 19:07:57.763856 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762318 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b0d252f8-fb8c-486c-aaa7-1197a96b6cfd-cni-binary-copy\") pod \"multus-additional-cni-plugins-kj7t2\" (UID: \"b0d252f8-fb8c-486c-aaa7-1197a96b6cfd\") " pod="openshift-multus/multus-additional-cni-plugins-kj7t2" Apr 20 19:07:57.763856 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762323 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6d80f226-e162-44b2-8b76-4d5f89d97859-ovnkube-config\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" Apr 20 19:07:57.763856 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762347 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/6d80f226-e162-44b2-8b76-4d5f89d97859-ovnkube-script-lib\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" Apr 20 19:07:57.763856 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762359 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6d80f226-e162-44b2-8b76-4d5f89d97859-host-run-netns\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" Apr 20 19:07:57.763856 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762371 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fst4\" (UniqueName: \"kubernetes.io/projected/b0d252f8-fb8c-486c-aaa7-1197a96b6cfd-kube-api-access-2fst4\") pod \"multus-additional-cni-plugins-kj7t2\" (UID: \"b0d252f8-fb8c-486c-aaa7-1197a96b6cfd\") " pod="openshift-multus/multus-additional-cni-plugins-kj7t2" Apr 20 19:07:57.763856 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762382 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/79e9d9f2-39b0-4a1d-9e82-98ceca85b745-tmp-dir\") pod \"node-resolver-k97mn\" (UID: \"79e9d9f2-39b0-4a1d-9e82-98ceca85b745\") " pod="openshift-dns/node-resolver-k97mn" Apr 20 19:07:57.763856 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762399 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-host-run-multus-certs\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4" Apr 20 19:07:57.763856 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762425 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-4csjl\" (UniqueName: \"kubernetes.io/projected/79e9d9f2-39b0-4a1d-9e82-98ceca85b745-kube-api-access-4csjl\") pod \"node-resolver-k97mn\" (UID: \"79e9d9f2-39b0-4a1d-9e82-98ceca85b745\") " pod="openshift-dns/node-resolver-k97mn" Apr 20 19:07:57.763856 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762481 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-multus-conf-dir\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4" Apr 20 19:07:57.763856 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762508 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/125940c7-e0e5-43b5-a864-11cb9ced899b-multus-daemon-config\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4" Apr 20 19:07:57.763856 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762512 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-host-run-multus-certs\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4" Apr 20 19:07:57.763856 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762532 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6d80f226-e162-44b2-8b76-4d5f89d97859-ovn-node-metrics-cert\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" Apr 20 19:07:57.763856 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762571 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-c2czx\" (UniqueName: \"kubernetes.io/projected/6d80f226-e162-44b2-8b76-4d5f89d97859-kube-api-access-c2czx\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" Apr 20 19:07:57.764407 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762607 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-cnibin\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4" Apr 20 19:07:57.764407 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762641 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b0d252f8-fb8c-486c-aaa7-1197a96b6cfd-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kj7t2\" (UID: \"b0d252f8-fb8c-486c-aaa7-1197a96b6cfd\") " pod="openshift-multus/multus-additional-cni-plugins-kj7t2" Apr 20 19:07:57.764407 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762653 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-os-release\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4" Apr 20 19:07:57.764407 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762690 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5rzxf\" (UniqueName: \"kubernetes.io/projected/cc31ab16-2946-4d9a-baee-c02a00b73aae-kube-api-access-5rzxf\") pod \"network-metrics-daemon-ff5pq\" (UID: \"cc31ab16-2946-4d9a-baee-c02a00b73aae\") " pod="openshift-multus/network-metrics-daemon-ff5pq" Apr 20 19:07:57.764407 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762707 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-multus-conf-dir\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4" Apr 20 19:07:57.764407 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762751 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-os-release\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4" Apr 20 19:07:57.764407 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762795 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b0d252f8-fb8c-486c-aaa7-1197a96b6cfd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kj7t2\" (UID: \"b0d252f8-fb8c-486c-aaa7-1197a96b6cfd\") " pod="openshift-multus/multus-additional-cni-plugins-kj7t2" Apr 20 19:07:57.764407 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762801 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/125940c7-e0e5-43b5-a864-11cb9ced899b-cnibin\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4" Apr 20 19:07:57.764407 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762801 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6d80f226-e162-44b2-8b76-4d5f89d97859-ovnkube-config\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" Apr 20 19:07:57.764407 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.762936 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" 
(UniqueName: \"kubernetes.io/configmap/6d80f226-e162-44b2-8b76-4d5f89d97859-ovnkube-script-lib\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" Apr 20 19:07:57.764407 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.763223 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/125940c7-e0e5-43b5-a864-11cb9ced899b-multus-daemon-config\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4" Apr 20 19:07:57.764926 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.764908 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6d80f226-e162-44b2-8b76-4d5f89d97859-ovn-node-metrics-cert\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" Apr 20 19:07:57.769099 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:57.769077 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 19:07:57.769099 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:57.769115 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 19:07:57.769281 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:57.769130 2571 projected.go:194] Error preparing data for projected volume kube-api-access-vcm6n for pod openshift-network-diagnostics/network-check-target-2qjll: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:07:57.769281 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:57.769202 
2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/838f1a66-5c9b-4d0f-90b0-35a81df852d0-kube-api-access-vcm6n podName:838f1a66-5c9b-4d0f-90b0-35a81df852d0 nodeName:}" failed. No retries permitted until 2026-04-20 19:07:58.269183384 +0000 UTC m=+3.182679435 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-vcm6n" (UniqueName: "kubernetes.io/projected/838f1a66-5c9b-4d0f-90b0-35a81df852d0-kube-api-access-vcm6n") pod "network-check-target-2qjll" (UID: "838f1a66-5c9b-4d0f-90b0-35a81df852d0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:07:57.771449 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.771408 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zxdk\" (UniqueName: \"kubernetes.io/projected/125940c7-e0e5-43b5-a864-11cb9ced899b-kube-api-access-8zxdk\") pod \"multus-pzns4\" (UID: \"125940c7-e0e5-43b5-a864-11cb9ced899b\") " pod="openshift-multus/multus-pzns4" Apr 20 19:07:57.771619 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.771598 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fst4\" (UniqueName: \"kubernetes.io/projected/b0d252f8-fb8c-486c-aaa7-1197a96b6cfd-kube-api-access-2fst4\") pod \"multus-additional-cni-plugins-kj7t2\" (UID: \"b0d252f8-fb8c-486c-aaa7-1197a96b6cfd\") " pod="openshift-multus/multus-additional-cni-plugins-kj7t2" Apr 20 19:07:57.771929 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.771909 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rzxf\" (UniqueName: \"kubernetes.io/projected/cc31ab16-2946-4d9a-baee-c02a00b73aae-kube-api-access-5rzxf\") pod \"network-metrics-daemon-ff5pq\" (UID: \"cc31ab16-2946-4d9a-baee-c02a00b73aae\") " pod="openshift-multus/network-metrics-daemon-ff5pq" Apr 20 19:07:57.772153 
ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.772130 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2czx\" (UniqueName: \"kubernetes.io/projected/6d80f226-e162-44b2-8b76-4d5f89d97859-kube-api-access-c2czx\") pod \"ovnkube-node-j8mkl\" (UID: \"6d80f226-e162-44b2-8b76-4d5f89d97859\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" Apr 20 19:07:57.772249 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.772206 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4csjl\" (UniqueName: \"kubernetes.io/projected/79e9d9f2-39b0-4a1d-9e82-98ceca85b745-kube-api-access-4csjl\") pod \"node-resolver-k97mn\" (UID: \"79e9d9f2-39b0-4a1d-9e82-98ceca85b745\") " pod="openshift-dns/node-resolver-k97mn" Apr 20 19:07:57.855873 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.855770 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4d89g" Apr 20 19:07:57.863620 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.863599 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2sc4t" Apr 20 19:07:57.872333 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.872311 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-gf7wk" Apr 20 19:07:57.878356 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.878334 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-s26tp" Apr 20 19:07:57.885977 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.885948 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kj7t2" Apr 20 19:07:57.893631 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.893605 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-pzns4" Apr 20 19:07:57.901182 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.901162 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jq7q4" Apr 20 19:07:57.907688 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.907671 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-k97mn" Apr 20 19:07:57.913237 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.913218 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" Apr 20 19:07:57.951495 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:57.951463 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 19:07:58.266570 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:58.266532 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cc31ab16-2946-4d9a-baee-c02a00b73aae-metrics-certs\") pod \"network-metrics-daemon-ff5pq\" (UID: \"cc31ab16-2946-4d9a-baee-c02a00b73aae\") " pod="openshift-multus/network-metrics-daemon-ff5pq" Apr 20 19:07:58.266721 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:58.266694 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:07:58.266784 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:58.266769 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc31ab16-2946-4d9a-baee-c02a00b73aae-metrics-certs podName:cc31ab16-2946-4d9a-baee-c02a00b73aae nodeName:}" failed. No retries permitted until 2026-04-20 19:07:59.266745001 +0000 UTC m=+4.180241052 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cc31ab16-2946-4d9a-baee-c02a00b73aae-metrics-certs") pod "network-metrics-daemon-ff5pq" (UID: "cc31ab16-2946-4d9a-baee-c02a00b73aae") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:07:58.367444 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:58.367408 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vcm6n\" (UniqueName: \"kubernetes.io/projected/838f1a66-5c9b-4d0f-90b0-35a81df852d0-kube-api-access-vcm6n\") pod \"network-check-target-2qjll\" (UID: \"838f1a66-5c9b-4d0f-90b0-35a81df852d0\") " pod="openshift-network-diagnostics/network-check-target-2qjll" Apr 20 19:07:58.367630 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:58.367565 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 19:07:58.367630 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:58.367585 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 19:07:58.367630 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:58.367600 2571 projected.go:194] Error preparing data for projected volume kube-api-access-vcm6n for pod openshift-network-diagnostics/network-check-target-2qjll: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:07:58.367773 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:58.367666 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/838f1a66-5c9b-4d0f-90b0-35a81df852d0-kube-api-access-vcm6n podName:838f1a66-5c9b-4d0f-90b0-35a81df852d0 nodeName:}" failed. 
No retries permitted until 2026-04-20 19:07:59.367645565 +0000 UTC m=+4.281141595 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-vcm6n" (UniqueName: "kubernetes.io/projected/838f1a66-5c9b-4d0f-90b0-35a81df852d0-kube-api-access-vcm6n") pod "network-check-target-2qjll" (UID: "838f1a66-5c9b-4d0f-90b0-35a81df852d0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:07:58.427889 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:58.427680 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod125940c7_e0e5_43b5_a864_11cb9ced899b.slice/crio-4a7c0d7574c0431e6062daeb16ad30116aec38f03dcf21e51db6da44967d70b6 WatchSource:0}: Error finding container 4a7c0d7574c0431e6062daeb16ad30116aec38f03dcf21e51db6da44967d70b6: Status 404 returned error can't find the container with id 4a7c0d7574c0431e6062daeb16ad30116aec38f03dcf21e51db6da44967d70b6 Apr 20 19:07:58.428746 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:58.428722 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0d252f8_fb8c_486c_aaa7_1197a96b6cfd.slice/crio-6393cc37d4c953315cbbb692ded7db0fa15d697f2d66f56d7a36a7889d6b43ca WatchSource:0}: Error finding container 6393cc37d4c953315cbbb692ded7db0fa15d697f2d66f56d7a36a7889d6b43ca: Status 404 returned error can't find the container with id 6393cc37d4c953315cbbb692ded7db0fa15d697f2d66f56d7a36a7889d6b43ca Apr 20 19:07:58.429409 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:58.429386 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda841918c_2c8e_484e_a15b_de63708e31b4.slice/crio-fd6ee438fa03e51482a640e968ad1bc3c93b3f79ad3b8dea9950ca407a285e8c WatchSource:0}: Error finding container 
fd6ee438fa03e51482a640e968ad1bc3c93b3f79ad3b8dea9950ca407a285e8c: Status 404 returned error can't find the container with id fd6ee438fa03e51482a640e968ad1bc3c93b3f79ad3b8dea9950ca407a285e8c Apr 20 19:07:58.432315 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:58.432291 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3d91ffb_3895_4e12_a9f2_4d614bd77c3e.slice/crio-9697546cd7a6bb454e82753d73dc85b7687a410a2affe694d5ff411b11eec3a4 WatchSource:0}: Error finding container 9697546cd7a6bb454e82753d73dc85b7687a410a2affe694d5ff411b11eec3a4: Status 404 returned error can't find the container with id 9697546cd7a6bb454e82753d73dc85b7687a410a2affe694d5ff411b11eec3a4 Apr 20 19:07:58.433737 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:58.433720 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fb0b1f7_00e8_4e8a_bab0_49d08606cf30.slice/crio-f8163d6f3e33e3ae894beca3d7741b306ebf4f35d58e7f6c6d3f3ba650b0a781 WatchSource:0}: Error finding container f8163d6f3e33e3ae894beca3d7741b306ebf4f35d58e7f6c6d3f3ba650b0a781: Status 404 returned error can't find the container with id f8163d6f3e33e3ae894beca3d7741b306ebf4f35d58e7f6c6d3f3ba650b0a781 Apr 20 19:07:58.435079 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:58.435049 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6afa7914_d2d0_4077_b293_73873dd1cb3e.slice/crio-9cc7e1acc443b6c932255c5e2e02c369268973f2a88e1b710ce8bee477e20222 WatchSource:0}: Error finding container 9cc7e1acc443b6c932255c5e2e02c369268973f2a88e1b710ce8bee477e20222: Status 404 returned error can't find the container with id 9cc7e1acc443b6c932255c5e2e02c369268973f2a88e1b710ce8bee477e20222 Apr 20 19:07:58.436007 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:58.435973 2571 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79e9d9f2_39b0_4a1d_9e82_98ceca85b745.slice/crio-1e5441f65239da0917f0fc11ffdc30b186cad80a3f1cafc612a82f1b06782167 WatchSource:0}: Error finding container 1e5441f65239da0917f0fc11ffdc30b186cad80a3f1cafc612a82f1b06782167: Status 404 returned error can't find the container with id 1e5441f65239da0917f0fc11ffdc30b186cad80a3f1cafc612a82f1b06782167 Apr 20 19:07:58.437044 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:58.437023 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6792f0ef_d066_4526_9980_ddeabc8b23cb.slice/crio-0c7a19a2d6fcc6bbc03c4cfea2758f7d8cf55d3ae40c05dc1dab3d1db6f851e7 WatchSource:0}: Error finding container 0c7a19a2d6fcc6bbc03c4cfea2758f7d8cf55d3ae40c05dc1dab3d1db6f851e7: Status 404 returned error can't find the container with id 0c7a19a2d6fcc6bbc03c4cfea2758f7d8cf55d3ae40c05dc1dab3d1db6f851e7 Apr 20 19:07:58.438284 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:07:58.438229 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d80f226_e162_44b2_8b76_4d5f89d97859.slice/crio-2e14e272dea5a6b79228a8c4fc4e240d82c3e919e1d1daff14a1943c45adbe3d WatchSource:0}: Error finding container 2e14e272dea5a6b79228a8c4fc4e240d82c3e919e1d1daff14a1943c45adbe3d: Status 404 returned error can't find the container with id 2e14e272dea5a6b79228a8c4fc4e240d82c3e919e1d1daff14a1943c45adbe3d Apr 20 19:07:58.585341 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:58.585305 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 19:02:56 +0000 UTC" deadline="2027-12-20 05:01:58.45613165 +0000 UTC" Apr 20 19:07:58.585341 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:58.585331 2571 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="14601h53m59.870802044s" Apr 20 19:07:58.695362 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:58.695330 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-63.ec2.internal" event={"ID":"500ebaba3a201fbd9b46f7798d1de76f","Type":"ContainerStarted","Data":"9d09559f876af6e29b8ea2540071f52a2935aa60acd996cd2f8caf820c2e42e8"} Apr 20 19:07:58.696574 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:58.696546 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" event={"ID":"6d80f226-e162-44b2-8b76-4d5f89d97859","Type":"ContainerStarted","Data":"2e14e272dea5a6b79228a8c4fc4e240d82c3e919e1d1daff14a1943c45adbe3d"} Apr 20 19:07:58.697660 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:58.697627 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2sc4t" event={"ID":"6792f0ef-d066-4526-9980-ddeabc8b23cb","Type":"ContainerStarted","Data":"0c7a19a2d6fcc6bbc03c4cfea2758f7d8cf55d3ae40c05dc1dab3d1db6f851e7"} Apr 20 19:07:58.698831 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:58.698809 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-s26tp" event={"ID":"b3d91ffb-3895-4e12-a9f2-4d614bd77c3e","Type":"ContainerStarted","Data":"9697546cd7a6bb454e82753d73dc85b7687a410a2affe694d5ff411b11eec3a4"} Apr 20 19:07:58.699825 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:58.699805 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kj7t2" event={"ID":"b0d252f8-fb8c-486c-aaa7-1197a96b6cfd","Type":"ContainerStarted","Data":"6393cc37d4c953315cbbb692ded7db0fa15d697f2d66f56d7a36a7889d6b43ca"} Apr 20 19:07:58.700749 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:58.700723 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-k97mn" 
event={"ID":"79e9d9f2-39b0-4a1d-9e82-98ceca85b745","Type":"ContainerStarted","Data":"1e5441f65239da0917f0fc11ffdc30b186cad80a3f1cafc612a82f1b06782167"} Apr 20 19:07:58.701658 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:58.701641 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jq7q4" event={"ID":"6afa7914-d2d0-4077-b293-73873dd1cb3e","Type":"ContainerStarted","Data":"9cc7e1acc443b6c932255c5e2e02c369268973f2a88e1b710ce8bee477e20222"} Apr 20 19:07:58.702508 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:58.702489 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4d89g" event={"ID":"2fb0b1f7-00e8-4e8a-bab0-49d08606cf30","Type":"ContainerStarted","Data":"f8163d6f3e33e3ae894beca3d7741b306ebf4f35d58e7f6c6d3f3ba650b0a781"} Apr 20 19:07:58.703922 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:58.703903 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-gf7wk" event={"ID":"a841918c-2c8e-484e-a15b-de63708e31b4","Type":"ContainerStarted","Data":"fd6ee438fa03e51482a640e968ad1bc3c93b3f79ad3b8dea9950ca407a285e8c"} Apr 20 19:07:58.704947 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:58.704929 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pzns4" event={"ID":"125940c7-e0e5-43b5-a864-11cb9ced899b","Type":"ContainerStarted","Data":"4a7c0d7574c0431e6062daeb16ad30116aec38f03dcf21e51db6da44967d70b6"} Apr 20 19:07:58.708423 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:58.708386 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-63.ec2.internal" podStartSLOduration=2.708375826 podStartE2EDuration="2.708375826s" podCreationTimestamp="2026-04-20 19:07:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:07:58.707958462 +0000 
UTC m=+3.621454524" watchObservedRunningTime="2026-04-20 19:07:58.708375826 +0000 UTC m=+3.621871878" Apr 20 19:07:59.275306 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:59.274718 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cc31ab16-2946-4d9a-baee-c02a00b73aae-metrics-certs\") pod \"network-metrics-daemon-ff5pq\" (UID: \"cc31ab16-2946-4d9a-baee-c02a00b73aae\") " pod="openshift-multus/network-metrics-daemon-ff5pq" Apr 20 19:07:59.275306 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:59.274866 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:07:59.275306 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:59.274934 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc31ab16-2946-4d9a-baee-c02a00b73aae-metrics-certs podName:cc31ab16-2946-4d9a-baee-c02a00b73aae nodeName:}" failed. No retries permitted until 2026-04-20 19:08:01.274912093 +0000 UTC m=+6.188408130 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cc31ab16-2946-4d9a-baee-c02a00b73aae-metrics-certs") pod "network-metrics-daemon-ff5pq" (UID: "cc31ab16-2946-4d9a-baee-c02a00b73aae") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:07:59.375997 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:59.375916 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vcm6n\" (UniqueName: \"kubernetes.io/projected/838f1a66-5c9b-4d0f-90b0-35a81df852d0-kube-api-access-vcm6n\") pod \"network-check-target-2qjll\" (UID: \"838f1a66-5c9b-4d0f-90b0-35a81df852d0\") " pod="openshift-network-diagnostics/network-check-target-2qjll" Apr 20 19:07:59.376226 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:59.376197 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 19:07:59.376226 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:59.376219 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 19:07:59.376342 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:59.376231 2571 projected.go:194] Error preparing data for projected volume kube-api-access-vcm6n for pod openshift-network-diagnostics/network-check-target-2qjll: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:07:59.376342 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:59.376290 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/838f1a66-5c9b-4d0f-90b0-35a81df852d0-kube-api-access-vcm6n podName:838f1a66-5c9b-4d0f-90b0-35a81df852d0 nodeName:}" failed. 
No retries permitted until 2026-04-20 19:08:01.376271371 +0000 UTC m=+6.289767412 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-vcm6n" (UniqueName: "kubernetes.io/projected/838f1a66-5c9b-4d0f-90b0-35a81df852d0-kube-api-access-vcm6n") pod "network-check-target-2qjll" (UID: "838f1a66-5c9b-4d0f-90b0-35a81df852d0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:07:59.690248 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:59.689551 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2qjll" Apr 20 19:07:59.690248 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:59.689676 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2qjll" podUID="838f1a66-5c9b-4d0f-90b0-35a81df852d0" Apr 20 19:07:59.690248 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:59.690046 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff5pq" Apr 20 19:07:59.690248 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:07:59.690165 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff5pq" podUID="cc31ab16-2946-4d9a-baee-c02a00b73aae" Apr 20 19:07:59.721897 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:59.721188 2571 generic.go:358] "Generic (PLEG): container finished" podID="7a01cbe8dec40e6d4400b545d6083218" containerID="4292b8e1168e7b7fe67ab95a6051723d2c902598d0ff601d819697e6117b0628" exitCode=0 Apr 20 19:07:59.721897 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:07:59.721320 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-63.ec2.internal" event={"ID":"7a01cbe8dec40e6d4400b545d6083218","Type":"ContainerDied","Data":"4292b8e1168e7b7fe67ab95a6051723d2c902598d0ff601d819697e6117b0628"} Apr 20 19:08:00.733797 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:00.733152 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-63.ec2.internal" event={"ID":"7a01cbe8dec40e6d4400b545d6083218","Type":"ContainerStarted","Data":"909956ce794569ddfc582209b95948687917c46ad851b4272afa0cdf787530eb"} Apr 20 19:08:01.293493 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:01.293444 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cc31ab16-2946-4d9a-baee-c02a00b73aae-metrics-certs\") pod \"network-metrics-daemon-ff5pq\" (UID: \"cc31ab16-2946-4d9a-baee-c02a00b73aae\") " pod="openshift-multus/network-metrics-daemon-ff5pq" Apr 20 19:08:01.293663 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:01.293642 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:08:01.293741 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:01.293709 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc31ab16-2946-4d9a-baee-c02a00b73aae-metrics-certs 
podName:cc31ab16-2946-4d9a-baee-c02a00b73aae nodeName:}" failed. No retries permitted until 2026-04-20 19:08:05.293689562 +0000 UTC m=+10.207185620 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cc31ab16-2946-4d9a-baee-c02a00b73aae-metrics-certs") pod "network-metrics-daemon-ff5pq" (UID: "cc31ab16-2946-4d9a-baee-c02a00b73aae") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:08:01.394653 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:01.394615 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vcm6n\" (UniqueName: \"kubernetes.io/projected/838f1a66-5c9b-4d0f-90b0-35a81df852d0-kube-api-access-vcm6n\") pod \"network-check-target-2qjll\" (UID: \"838f1a66-5c9b-4d0f-90b0-35a81df852d0\") " pod="openshift-network-diagnostics/network-check-target-2qjll" Apr 20 19:08:01.394829 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:01.394809 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 19:08:01.394906 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:01.394844 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 19:08:01.394906 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:01.394858 2571 projected.go:194] Error preparing data for projected volume kube-api-access-vcm6n for pod openshift-network-diagnostics/network-check-target-2qjll: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:08:01.395004 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:01.394928 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/838f1a66-5c9b-4d0f-90b0-35a81df852d0-kube-api-access-vcm6n podName:838f1a66-5c9b-4d0f-90b0-35a81df852d0 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:05.39490861 +0000 UTC m=+10.308404657 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-vcm6n" (UniqueName: "kubernetes.io/projected/838f1a66-5c9b-4d0f-90b0-35a81df852d0-kube-api-access-vcm6n") pod "network-check-target-2qjll" (UID: "838f1a66-5c9b-4d0f-90b0-35a81df852d0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:08:01.686683 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:01.686649 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff5pq" Apr 20 19:08:01.686863 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:01.686799 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff5pq" podUID="cc31ab16-2946-4d9a-baee-c02a00b73aae" Apr 20 19:08:01.687358 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:01.687309 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2qjll" Apr 20 19:08:01.687491 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:01.687421 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2qjll" podUID="838f1a66-5c9b-4d0f-90b0-35a81df852d0" Apr 20 19:08:03.686807 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:03.686773 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff5pq" Apr 20 19:08:03.687302 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:03.686939 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff5pq" podUID="cc31ab16-2946-4d9a-baee-c02a00b73aae" Apr 20 19:08:03.687572 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:03.687545 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2qjll" Apr 20 19:08:03.687664 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:03.687650 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2qjll" podUID="838f1a66-5c9b-4d0f-90b0-35a81df852d0" Apr 20 19:08:05.329322 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:05.329184 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cc31ab16-2946-4d9a-baee-c02a00b73aae-metrics-certs\") pod \"network-metrics-daemon-ff5pq\" (UID: \"cc31ab16-2946-4d9a-baee-c02a00b73aae\") " pod="openshift-multus/network-metrics-daemon-ff5pq" Apr 20 19:08:05.329756 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:05.329333 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:08:05.329756 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:05.329406 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc31ab16-2946-4d9a-baee-c02a00b73aae-metrics-certs podName:cc31ab16-2946-4d9a-baee-c02a00b73aae nodeName:}" failed. No retries permitted until 2026-04-20 19:08:13.329386681 +0000 UTC m=+18.242882723 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cc31ab16-2946-4d9a-baee-c02a00b73aae-metrics-certs") pod "network-metrics-daemon-ff5pq" (UID: "cc31ab16-2946-4d9a-baee-c02a00b73aae") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:08:05.430498 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:05.430407 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vcm6n\" (UniqueName: \"kubernetes.io/projected/838f1a66-5c9b-4d0f-90b0-35a81df852d0-kube-api-access-vcm6n\") pod \"network-check-target-2qjll\" (UID: \"838f1a66-5c9b-4d0f-90b0-35a81df852d0\") " pod="openshift-network-diagnostics/network-check-target-2qjll" Apr 20 19:08:05.430677 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:05.430586 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 19:08:05.430677 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:05.430610 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 19:08:05.430677 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:05.430623 2571 projected.go:194] Error preparing data for projected volume kube-api-access-vcm6n for pod openshift-network-diagnostics/network-check-target-2qjll: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:08:05.430824 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:05.430684 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/838f1a66-5c9b-4d0f-90b0-35a81df852d0-kube-api-access-vcm6n podName:838f1a66-5c9b-4d0f-90b0-35a81df852d0 nodeName:}" failed. 
No retries permitted until 2026-04-20 19:08:13.430662537 +0000 UTC m=+18.344158726 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-vcm6n" (UniqueName: "kubernetes.io/projected/838f1a66-5c9b-4d0f-90b0-35a81df852d0-kube-api-access-vcm6n") pod "network-check-target-2qjll" (UID: "838f1a66-5c9b-4d0f-90b0-35a81df852d0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:08:05.687924 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:05.687436 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2qjll" Apr 20 19:08:05.687924 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:05.687557 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2qjll" podUID="838f1a66-5c9b-4d0f-90b0-35a81df852d0" Apr 20 19:08:05.687924 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:05.687653 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff5pq" Apr 20 19:08:05.687924 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:05.687785 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff5pq" podUID="cc31ab16-2946-4d9a-baee-c02a00b73aae" Apr 20 19:08:07.687054 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:07.687016 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2qjll" Apr 20 19:08:07.687543 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:07.687030 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff5pq" Apr 20 19:08:07.687543 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:07.687149 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2qjll" podUID="838f1a66-5c9b-4d0f-90b0-35a81df852d0" Apr 20 19:08:07.687543 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:07.687231 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff5pq" podUID="cc31ab16-2946-4d9a-baee-c02a00b73aae" Apr 20 19:08:09.687431 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:09.687394 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2qjll" Apr 20 19:08:09.687899 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:09.687394 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff5pq" Apr 20 19:08:09.687899 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:09.687509 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2qjll" podUID="838f1a66-5c9b-4d0f-90b0-35a81df852d0" Apr 20 19:08:09.687899 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:09.687574 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff5pq" podUID="cc31ab16-2946-4d9a-baee-c02a00b73aae" Apr 20 19:08:11.687526 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:11.687490 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff5pq" Apr 20 19:08:11.687996 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:11.687539 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2qjll" Apr 20 19:08:11.687996 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:11.687623 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ff5pq" podUID="cc31ab16-2946-4d9a-baee-c02a00b73aae" Apr 20 19:08:11.687996 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:11.687729 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2qjll" podUID="838f1a66-5c9b-4d0f-90b0-35a81df852d0" Apr 20 19:08:12.087360 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:12.087261 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-63.ec2.internal" podStartSLOduration=16.087223041 podStartE2EDuration="16.087223041s" podCreationTimestamp="2026-04-20 19:07:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:08:00.748642884 +0000 UTC m=+5.662138934" watchObservedRunningTime="2026-04-20 19:08:12.087223041 +0000 UTC m=+17.000719072" Apr 20 19:08:12.088224 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:12.088203 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-7dbl5"] Apr 20 19:08:12.134935 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:12.134896 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7dbl5" Apr 20 19:08:12.135127 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:12.134991 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-7dbl5" podUID="67f46846-12c3-4c76-a68a-b367e050cf51" Apr 20 19:08:12.177738 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:12.177699 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/67f46846-12c3-4c76-a68a-b367e050cf51-original-pull-secret\") pod \"global-pull-secret-syncer-7dbl5\" (UID: \"67f46846-12c3-4c76-a68a-b367e050cf51\") " pod="kube-system/global-pull-secret-syncer-7dbl5" Apr 20 19:08:12.177892 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:12.177844 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/67f46846-12c3-4c76-a68a-b367e050cf51-kubelet-config\") pod \"global-pull-secret-syncer-7dbl5\" (UID: \"67f46846-12c3-4c76-a68a-b367e050cf51\") " pod="kube-system/global-pull-secret-syncer-7dbl5" Apr 20 19:08:12.177892 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:12.177884 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/67f46846-12c3-4c76-a68a-b367e050cf51-dbus\") pod \"global-pull-secret-syncer-7dbl5\" (UID: \"67f46846-12c3-4c76-a68a-b367e050cf51\") " pod="kube-system/global-pull-secret-syncer-7dbl5" Apr 20 19:08:12.279151 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:12.279097 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/67f46846-12c3-4c76-a68a-b367e050cf51-kubelet-config\") pod \"global-pull-secret-syncer-7dbl5\" (UID: \"67f46846-12c3-4c76-a68a-b367e050cf51\") " pod="kube-system/global-pull-secret-syncer-7dbl5" Apr 20 19:08:12.279349 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:12.279166 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/67f46846-12c3-4c76-a68a-b367e050cf51-dbus\") pod \"global-pull-secret-syncer-7dbl5\" (UID: \"67f46846-12c3-4c76-a68a-b367e050cf51\") " pod="kube-system/global-pull-secret-syncer-7dbl5" Apr 20 19:08:12.279349 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:12.279213 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/67f46846-12c3-4c76-a68a-b367e050cf51-kubelet-config\") pod \"global-pull-secret-syncer-7dbl5\" (UID: \"67f46846-12c3-4c76-a68a-b367e050cf51\") " pod="kube-system/global-pull-secret-syncer-7dbl5" Apr 20 19:08:12.279349 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:12.279221 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/67f46846-12c3-4c76-a68a-b367e050cf51-original-pull-secret\") pod \"global-pull-secret-syncer-7dbl5\" (UID: \"67f46846-12c3-4c76-a68a-b367e050cf51\") " pod="kube-system/global-pull-secret-syncer-7dbl5" Apr 20 19:08:12.279349 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:12.279309 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 19:08:12.279605 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:12.279372 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/67f46846-12c3-4c76-a68a-b367e050cf51-dbus\") pod \"global-pull-secret-syncer-7dbl5\" (UID: \"67f46846-12c3-4c76-a68a-b367e050cf51\") " pod="kube-system/global-pull-secret-syncer-7dbl5" Apr 20 19:08:12.279605 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:12.279377 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67f46846-12c3-4c76-a68a-b367e050cf51-original-pull-secret podName:67f46846-12c3-4c76-a68a-b367e050cf51 nodeName:}" failed. 
No retries permitted until 2026-04-20 19:08:12.779358113 +0000 UTC m=+17.692854168 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/67f46846-12c3-4c76-a68a-b367e050cf51-original-pull-secret") pod "global-pull-secret-syncer-7dbl5" (UID: "67f46846-12c3-4c76-a68a-b367e050cf51") : object "kube-system"/"original-pull-secret" not registered Apr 20 19:08:12.782404 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:12.782361 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/67f46846-12c3-4c76-a68a-b367e050cf51-original-pull-secret\") pod \"global-pull-secret-syncer-7dbl5\" (UID: \"67f46846-12c3-4c76-a68a-b367e050cf51\") " pod="kube-system/global-pull-secret-syncer-7dbl5" Apr 20 19:08:12.782834 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:12.782530 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 19:08:12.782834 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:12.782619 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67f46846-12c3-4c76-a68a-b367e050cf51-original-pull-secret podName:67f46846-12c3-4c76-a68a-b367e050cf51 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:13.782597473 +0000 UTC m=+18.696093507 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/67f46846-12c3-4c76-a68a-b367e050cf51-original-pull-secret") pod "global-pull-secret-syncer-7dbl5" (UID: "67f46846-12c3-4c76-a68a-b367e050cf51") : object "kube-system"/"original-pull-secret" not registered Apr 20 19:08:13.387784 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:13.387748 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cc31ab16-2946-4d9a-baee-c02a00b73aae-metrics-certs\") pod \"network-metrics-daemon-ff5pq\" (UID: \"cc31ab16-2946-4d9a-baee-c02a00b73aae\") " pod="openshift-multus/network-metrics-daemon-ff5pq" Apr 20 19:08:13.387937 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:13.387918 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:08:13.388017 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:13.388003 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc31ab16-2946-4d9a-baee-c02a00b73aae-metrics-certs podName:cc31ab16-2946-4d9a-baee-c02a00b73aae nodeName:}" failed. No retries permitted until 2026-04-20 19:08:29.387981426 +0000 UTC m=+34.301477479 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cc31ab16-2946-4d9a-baee-c02a00b73aae-metrics-certs") pod "network-metrics-daemon-ff5pq" (UID: "cc31ab16-2946-4d9a-baee-c02a00b73aae") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:08:13.489047 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:13.489008 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vcm6n\" (UniqueName: \"kubernetes.io/projected/838f1a66-5c9b-4d0f-90b0-35a81df852d0-kube-api-access-vcm6n\") pod \"network-check-target-2qjll\" (UID: \"838f1a66-5c9b-4d0f-90b0-35a81df852d0\") " pod="openshift-network-diagnostics/network-check-target-2qjll"
Apr 20 19:08:13.489324 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:13.489186 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 19:08:13.489324 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:13.489206 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 19:08:13.489324 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:13.489216 2571 projected.go:194] Error preparing data for projected volume kube-api-access-vcm6n for pod openshift-network-diagnostics/network-check-target-2qjll: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:08:13.489324 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:13.489268 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/838f1a66-5c9b-4d0f-90b0-35a81df852d0-kube-api-access-vcm6n podName:838f1a66-5c9b-4d0f-90b0-35a81df852d0 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:29.489254897 +0000 UTC m=+34.402750927 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-vcm6n" (UniqueName: "kubernetes.io/projected/838f1a66-5c9b-4d0f-90b0-35a81df852d0-kube-api-access-vcm6n") pod "network-check-target-2qjll" (UID: "838f1a66-5c9b-4d0f-90b0-35a81df852d0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:08:13.687671 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:13.687637 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7dbl5"
Apr 20 19:08:13.687671 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:13.687659 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff5pq"
Apr 20 19:08:13.687883 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:13.687681 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2qjll"
Apr 20 19:08:13.687883 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:13.687770 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7dbl5" podUID="67f46846-12c3-4c76-a68a-b367e050cf51"
Apr 20 19:08:13.687883 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:13.687865 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff5pq" podUID="cc31ab16-2946-4d9a-baee-c02a00b73aae"
Apr 20 19:08:13.688027 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:13.687957 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2qjll" podUID="838f1a66-5c9b-4d0f-90b0-35a81df852d0"
Apr 20 19:08:13.790351 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:13.790317 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/67f46846-12c3-4c76-a68a-b367e050cf51-original-pull-secret\") pod \"global-pull-secret-syncer-7dbl5\" (UID: \"67f46846-12c3-4c76-a68a-b367e050cf51\") " pod="kube-system/global-pull-secret-syncer-7dbl5"
Apr 20 19:08:13.790765 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:13.790466 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 19:08:13.790765 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:13.790529 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67f46846-12c3-4c76-a68a-b367e050cf51-original-pull-secret podName:67f46846-12c3-4c76-a68a-b367e050cf51 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:15.790515513 +0000 UTC m=+20.704011543 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/67f46846-12c3-4c76-a68a-b367e050cf51-original-pull-secret") pod "global-pull-secret-syncer-7dbl5" (UID: "67f46846-12c3-4c76-a68a-b367e050cf51") : object "kube-system"/"original-pull-secret" not registered
Apr 20 19:08:15.688609 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:15.687859 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2qjll"
Apr 20 19:08:15.688609 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:15.687985 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2qjll" podUID="838f1a66-5c9b-4d0f-90b0-35a81df852d0"
Apr 20 19:08:15.688609 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:15.688432 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff5pq"
Apr 20 19:08:15.688609 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:15.688559 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff5pq" podUID="cc31ab16-2946-4d9a-baee-c02a00b73aae"
Apr 20 19:08:15.689305 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:15.689176 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7dbl5"
Apr 20 19:08:15.689305 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:15.689270 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7dbl5" podUID="67f46846-12c3-4c76-a68a-b367e050cf51"
Apr 20 19:08:15.763845 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:15.763771 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-gf7wk" event={"ID":"a841918c-2c8e-484e-a15b-de63708e31b4","Type":"ContainerStarted","Data":"89dc5d89e17b11c44cbc4c6d7008d57967a31a98cab9689cfca32158b49c7bd2"}
Apr 20 19:08:15.806309 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:15.806277 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/67f46846-12c3-4c76-a68a-b367e050cf51-original-pull-secret\") pod \"global-pull-secret-syncer-7dbl5\" (UID: \"67f46846-12c3-4c76-a68a-b367e050cf51\") " pod="kube-system/global-pull-secret-syncer-7dbl5"
Apr 20 19:08:15.806433 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:15.806416 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 19:08:15.806495 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:15.806478 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67f46846-12c3-4c76-a68a-b367e050cf51-original-pull-secret podName:67f46846-12c3-4c76-a68a-b367e050cf51 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:19.806459147 +0000 UTC m=+24.719955198 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/67f46846-12c3-4c76-a68a-b367e050cf51-original-pull-secret") pod "global-pull-secret-syncer-7dbl5" (UID: "67f46846-12c3-4c76-a68a-b367e050cf51") : object "kube-system"/"original-pull-secret" not registered
Apr 20 19:08:16.767191 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:16.766925 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pzns4" event={"ID":"125940c7-e0e5-43b5-a864-11cb9ced899b","Type":"ContainerStarted","Data":"4b3c914548aca17961fa84994004fd4c05659d3e442c42450c49f10af2eadcfb"}
Apr 20 19:08:16.769993 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:16.769969 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8mkl_6d80f226-e162-44b2-8b76-4d5f89d97859/ovn-acl-logging/0.log"
Apr 20 19:08:16.770672 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:16.770448 2571 generic.go:358] "Generic (PLEG): container finished" podID="6d80f226-e162-44b2-8b76-4d5f89d97859" containerID="5cb7d678dd9e94a22b788d786b19bb4014be1db45f6b0264a1b52d5d3c3c12d6" exitCode=1
Apr 20 19:08:16.770672 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:16.770541 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" event={"ID":"6d80f226-e162-44b2-8b76-4d5f89d97859","Type":"ContainerStarted","Data":"faaed0351b163f90c928aaf7dbac563af95441420a407caac140c2c1de04e779"}
Apr 20 19:08:16.770672 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:16.770578 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" event={"ID":"6d80f226-e162-44b2-8b76-4d5f89d97859","Type":"ContainerStarted","Data":"ab136251ea8cd12226164d377319071e883e54f389855f1ceb7c28f05a1fc2d3"}
Apr 20 19:08:16.770672 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:16.770596 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" event={"ID":"6d80f226-e162-44b2-8b76-4d5f89d97859","Type":"ContainerStarted","Data":"237bbf3b2b877a6867312eab5bcb4418d3e6ef04bac73540871162cc6bf1c119"}
Apr 20 19:08:16.770672 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:16.770607 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" event={"ID":"6d80f226-e162-44b2-8b76-4d5f89d97859","Type":"ContainerStarted","Data":"0e95477e387e60524c49cbb09066368ed68cab75bd98c8c77492f26cffe81984"}
Apr 20 19:08:16.770672 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:16.770620 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" event={"ID":"6d80f226-e162-44b2-8b76-4d5f89d97859","Type":"ContainerDied","Data":"5cb7d678dd9e94a22b788d786b19bb4014be1db45f6b0264a1b52d5d3c3c12d6"}
Apr 20 19:08:16.770672 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:16.770633 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" event={"ID":"6d80f226-e162-44b2-8b76-4d5f89d97859","Type":"ContainerStarted","Data":"4f195c70c6b5fb11080e8e2a0a538911861531f2365f7f87f8ebca508fb46df4"}
Apr 20 19:08:16.772044 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:16.772017 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2sc4t" event={"ID":"6792f0ef-d066-4526-9980-ddeabc8b23cb","Type":"ContainerStarted","Data":"c853d8e4dc4480d302a69061459f412911a06e129e49e3d6a58c7848ab7b9060"}
Apr 20 19:08:16.773526 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:16.773483 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-s26tp" event={"ID":"b3d91ffb-3895-4e12-a9f2-4d614bd77c3e","Type":"ContainerStarted","Data":"8be1a762f184e86c1696f86a00e96322779dfa13e42e2e9d0adc327babfaf4be"}
Apr 20 19:08:16.775081 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:16.775061 2571 generic.go:358] "Generic (PLEG): container finished" podID="b0d252f8-fb8c-486c-aaa7-1197a96b6cfd" containerID="3b93e55fb344186058df716a48ff04b4eaae343f2330b7939aa8c6d156e8f4ea" exitCode=0
Apr 20 19:08:16.775202 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:16.775141 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kj7t2" event={"ID":"b0d252f8-fb8c-486c-aaa7-1197a96b6cfd","Type":"ContainerDied","Data":"3b93e55fb344186058df716a48ff04b4eaae343f2330b7939aa8c6d156e8f4ea"}
Apr 20 19:08:16.776926 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:16.776905 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-k97mn" event={"ID":"79e9d9f2-39b0-4a1d-9e82-98ceca85b745","Type":"ContainerStarted","Data":"f065c66201c182b76012295cbeb55199c662cb6a9bb0ce438e97bcae2afc58d0"}
Apr 20 19:08:16.778687 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:16.778661 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jq7q4" event={"ID":"6afa7914-d2d0-4077-b293-73873dd1cb3e","Type":"ContainerStarted","Data":"72ccc785365ecbc5b39f986eb9b134eed3c8bfab320ea2f95f003c0ee1e26e46"}
Apr 20 19:08:16.782843 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:16.782807 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-pzns4" podStartSLOduration=4.608111732 podStartE2EDuration="21.782797136s" podCreationTimestamp="2026-04-20 19:07:55 +0000 UTC" firstStartedPulling="2026-04-20 19:07:58.430299272 +0000 UTC m=+3.343795309" lastFinishedPulling="2026-04-20 19:08:15.604984683 +0000 UTC m=+20.518480713" observedRunningTime="2026-04-20 19:08:16.782476858 +0000 UTC m=+21.695972911" watchObservedRunningTime="2026-04-20 19:08:16.782797136 +0000 UTC m=+21.696293188"
Apr 20 19:08:16.796990 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:16.796949 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-gf7wk" podStartSLOduration=4.6723681070000005 podStartE2EDuration="21.796938415s" podCreationTimestamp="2026-04-20 19:07:55 +0000 UTC" firstStartedPulling="2026-04-20 19:07:58.431320768 +0000 UTC m=+3.344816815" lastFinishedPulling="2026-04-20 19:08:15.555891084 +0000 UTC m=+20.469387123" observedRunningTime="2026-04-20 19:08:16.796732708 +0000 UTC m=+21.710228759" watchObservedRunningTime="2026-04-20 19:08:16.796938415 +0000 UTC m=+21.710434466"
Apr 20 19:08:16.838652 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:16.838605 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-s26tp" podStartSLOduration=4.716709743 podStartE2EDuration="21.838589066s" podCreationTimestamp="2026-04-20 19:07:55 +0000 UTC" firstStartedPulling="2026-04-20 19:07:58.433904153 +0000 UTC m=+3.347400182" lastFinishedPulling="2026-04-20 19:08:15.555783465 +0000 UTC m=+20.469279505" observedRunningTime="2026-04-20 19:08:16.813204515 +0000 UTC m=+21.726700566" watchObservedRunningTime="2026-04-20 19:08:16.838589066 +0000 UTC m=+21.752085096"
Apr 20 19:08:16.855213 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:16.855170 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jq7q4" podStartSLOduration=4.73710308 podStartE2EDuration="21.855157227s" podCreationTimestamp="2026-04-20 19:07:55 +0000 UTC" firstStartedPulling="2026-04-20 19:07:58.437841634 +0000 UTC m=+3.351337668" lastFinishedPulling="2026-04-20 19:08:15.555895771 +0000 UTC m=+20.469391815" observedRunningTime="2026-04-20 19:08:16.854985204 +0000 UTC m=+21.768481257" watchObservedRunningTime="2026-04-20 19:08:16.855157227 +0000 UTC m=+21.768653278"
Apr 20 19:08:16.875726 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:16.875686 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-k97mn" podStartSLOduration=3.724754056 podStartE2EDuration="20.875673393s" podCreationTimestamp="2026-04-20 19:07:56 +0000 UTC" firstStartedPulling="2026-04-20 19:07:58.438632768 +0000 UTC m=+3.352128798" lastFinishedPulling="2026-04-20 19:08:15.589552091 +0000 UTC m=+20.503048135" observedRunningTime="2026-04-20 19:08:16.875514391 +0000 UTC m=+21.789010481" watchObservedRunningTime="2026-04-20 19:08:16.875673393 +0000 UTC m=+21.789169445"
Apr 20 19:08:17.301097 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:17.301074 2571 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 20 19:08:17.627939 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:17.627787 2571 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T19:08:17.301091837Z","UUID":"e02a3d62-01e7-4889-bfdf-6f634b78fd50","Handler":null,"Name":"","Endpoint":""}
Apr 20 19:08:17.629733 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:17.629708 2571 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 20 19:08:17.629875 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:17.629741 2571 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 20 19:08:17.687054 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:17.686963 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2qjll"
Apr 20 19:08:17.687319 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:17.686963 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7dbl5"
Apr 20 19:08:17.687319 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:17.687093 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2qjll" podUID="838f1a66-5c9b-4d0f-90b0-35a81df852d0"
Apr 20 19:08:17.687319 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:17.686963 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff5pq"
Apr 20 19:08:17.687319 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:17.687173 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7dbl5" podUID="67f46846-12c3-4c76-a68a-b367e050cf51"
Apr 20 19:08:17.687319 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:17.687245 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff5pq" podUID="cc31ab16-2946-4d9a-baee-c02a00b73aae"
Apr 20 19:08:17.781671 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:17.781639 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4d89g" event={"ID":"2fb0b1f7-00e8-4e8a-bab0-49d08606cf30","Type":"ContainerStarted","Data":"ad34ea3584c1520f87715bdf446ab9b136c677582037f2ba23d77d3bd1b45c62"}
Apr 20 19:08:17.783363 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:17.783304 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2sc4t" event={"ID":"6792f0ef-d066-4526-9980-ddeabc8b23cb","Type":"ContainerStarted","Data":"52c144683587100a5d09122ed96c326b360b578a380b3209cbe6fc683d443b82"}
Apr 20 19:08:18.788306 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:18.788023 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8mkl_6d80f226-e162-44b2-8b76-4d5f89d97859/ovn-acl-logging/0.log"
Apr 20 19:08:18.788709 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:18.788609 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" event={"ID":"6d80f226-e162-44b2-8b76-4d5f89d97859","Type":"ContainerStarted","Data":"6eb2396c19fce0ca871c15c9b589c87f14e01802172664f9e3cd2ee80d950804"}
Apr 20 19:08:18.977875 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:18.977845 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-s26tp"
Apr 20 19:08:18.978576 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:18.978551 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-s26tp"
Apr 20 19:08:18.993888 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:18.993839 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-4d89g" podStartSLOduration=6.841157589 podStartE2EDuration="23.993822488s" podCreationTimestamp="2026-04-20 19:07:55 +0000 UTC" firstStartedPulling="2026-04-20 19:07:58.435467104 +0000 UTC m=+3.348963147" lastFinishedPulling="2026-04-20 19:08:15.588132001 +0000 UTC m=+20.501628046" observedRunningTime="2026-04-20 19:08:17.80395735 +0000 UTC m=+22.717453404" watchObservedRunningTime="2026-04-20 19:08:18.993822488 +0000 UTC m=+23.907318540"
Apr 20 19:08:19.687024 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:19.686989 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff5pq"
Apr 20 19:08:19.687227 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:19.686993 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7dbl5"
Apr 20 19:08:19.687227 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:19.687155 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff5pq" podUID="cc31ab16-2946-4d9a-baee-c02a00b73aae"
Apr 20 19:08:19.687227 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:19.686993 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2qjll"
Apr 20 19:08:19.687227 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:19.687215 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7dbl5" podUID="67f46846-12c3-4c76-a68a-b367e050cf51"
Apr 20 19:08:19.687466 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:19.687290 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2qjll" podUID="838f1a66-5c9b-4d0f-90b0-35a81df852d0"
Apr 20 19:08:19.792733 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:19.792698 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2sc4t" event={"ID":"6792f0ef-d066-4526-9980-ddeabc8b23cb","Type":"ContainerStarted","Data":"cdbb5c88f5d4b1fe6f39ad926f68631b74a34b0e5e501d46c80edf7545c68d4c"}
Apr 20 19:08:19.793383 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:19.793028 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-s26tp"
Apr 20 19:08:19.793444 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:19.793403 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-s26tp"
Apr 20 19:08:19.812917 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:19.812873 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2sc4t" podStartSLOduration=4.427857811 podStartE2EDuration="24.812860287s" podCreationTimestamp="2026-04-20 19:07:55 +0000 UTC" firstStartedPulling="2026-04-20 19:07:58.438919275 +0000 UTC m=+3.352415320" lastFinishedPulling="2026-04-20 19:08:18.823921766 +0000 UTC m=+23.737417796" observedRunningTime="2026-04-20 19:08:19.812338159 +0000 UTC m=+24.725834211" watchObservedRunningTime="2026-04-20 19:08:19.812860287 +0000 UTC m=+24.726356336"
Apr 20 19:08:19.839494 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:19.839471 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/67f46846-12c3-4c76-a68a-b367e050cf51-original-pull-secret\") pod \"global-pull-secret-syncer-7dbl5\" (UID: \"67f46846-12c3-4c76-a68a-b367e050cf51\") " pod="kube-system/global-pull-secret-syncer-7dbl5"
Apr 20 19:08:19.839641 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:19.839615 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 19:08:19.839699 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:19.839688 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67f46846-12c3-4c76-a68a-b367e050cf51-original-pull-secret podName:67f46846-12c3-4c76-a68a-b367e050cf51 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:27.839674068 +0000 UTC m=+32.753170098 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/67f46846-12c3-4c76-a68a-b367e050cf51-original-pull-secret") pod "global-pull-secret-syncer-7dbl5" (UID: "67f46846-12c3-4c76-a68a-b367e050cf51") : object "kube-system"/"original-pull-secret" not registered
Apr 20 19:08:21.687432 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:21.687254 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff5pq"
Apr 20 19:08:21.687945 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:21.687250 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7dbl5"
Apr 20 19:08:21.687945 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:21.687522 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff5pq" podUID="cc31ab16-2946-4d9a-baee-c02a00b73aae"
Apr 20 19:08:21.687945 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:21.687270 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2qjll"
Apr 20 19:08:21.687945 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:21.687579 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7dbl5" podUID="67f46846-12c3-4c76-a68a-b367e050cf51"
Apr 20 19:08:21.687945 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:21.687680 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2qjll" podUID="838f1a66-5c9b-4d0f-90b0-35a81df852d0"
Apr 20 19:08:21.799222 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:21.799196 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8mkl_6d80f226-e162-44b2-8b76-4d5f89d97859/ovn-acl-logging/0.log"
Apr 20 19:08:21.799571 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:21.799547 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" event={"ID":"6d80f226-e162-44b2-8b76-4d5f89d97859","Type":"ContainerStarted","Data":"119878c8d1cf8dcfcedd6f53d786f6db0edbeefead6012948d07ecc1734f0418"}
Apr 20 19:08:21.799939 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:21.799915 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl"
Apr 20 19:08:21.800163 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:21.800143 2571 scope.go:117] "RemoveContainer" containerID="5cb7d678dd9e94a22b788d786b19bb4014be1db45f6b0264a1b52d5d3c3c12d6"
Apr 20 19:08:21.801375 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:21.801354 2571 generic.go:358] "Generic (PLEG): container finished" podID="b0d252f8-fb8c-486c-aaa7-1197a96b6cfd" containerID="fa5bc9dce32f4ec972c3aed35c0313e365eb17ad1472456969920f89f6bb23d6" exitCode=0
Apr 20 19:08:21.801476 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:21.801382 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kj7t2" event={"ID":"b0d252f8-fb8c-486c-aaa7-1197a96b6cfd","Type":"ContainerDied","Data":"fa5bc9dce32f4ec972c3aed35c0313e365eb17ad1472456969920f89f6bb23d6"}
Apr 20 19:08:21.817862 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:21.817814 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl"
Apr 20 19:08:22.807902 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:22.807838 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8mkl_6d80f226-e162-44b2-8b76-4d5f89d97859/ovn-acl-logging/0.log"
Apr 20 19:08:22.808375 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:22.808293 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" event={"ID":"6d80f226-e162-44b2-8b76-4d5f89d97859","Type":"ContainerStarted","Data":"1c857f7a1ebf175210f04653e51c8024fbfc7f9fef66ec8d93b5e513fef7f2d2"}
Apr 20 19:08:22.808726 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:22.808685 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl"
Apr 20 19:08:22.808839 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:22.808744 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl"
Apr 20 19:08:22.810734 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:22.810713 2571 generic.go:358] "Generic (PLEG): container finished" podID="b0d252f8-fb8c-486c-aaa7-1197a96b6cfd" containerID="831e0fc89b4943e9d332fdb19094d3ffbe802d38a035e6fe3c7d7c81bdf5d789" exitCode=0
Apr 20 19:08:22.810843 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:22.810757 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kj7t2" event={"ID":"b0d252f8-fb8c-486c-aaa7-1197a96b6cfd","Type":"ContainerDied","Data":"831e0fc89b4943e9d332fdb19094d3ffbe802d38a035e6fe3c7d7c81bdf5d789"}
Apr 20 19:08:22.824549 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:22.824521 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl"
Apr 20 19:08:22.838828 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:22.838785 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" podStartSLOduration=9.639233492 podStartE2EDuration="26.838772468s" podCreationTimestamp="2026-04-20 19:07:56 +0000 UTC" firstStartedPulling="2026-04-20 19:07:58.439701303 +0000 UTC m=+3.353197347" lastFinishedPulling="2026-04-20 19:08:15.639240291 +0000 UTC m=+20.552736323" observedRunningTime="2026-04-20 19:08:22.838239517 +0000 UTC m=+27.751735569" watchObservedRunningTime="2026-04-20 19:08:22.838772468 +0000 UTC m=+27.752268520"
Apr 20 19:08:23.072506 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:23.072413 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ff5pq"]
Apr 20 19:08:23.072676 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:23.072570 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff5pq"
Apr 20 19:08:23.072729 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:23.072695 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff5pq" podUID="cc31ab16-2946-4d9a-baee-c02a00b73aae"
Apr 20 19:08:23.073155 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:23.073132 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7dbl5"]
Apr 20 19:08:23.073253 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:23.073242 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7dbl5"
Apr 20 19:08:23.073365 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:23.073344 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7dbl5" podUID="67f46846-12c3-4c76-a68a-b367e050cf51"
Apr 20 19:08:23.073772 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:23.073752 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-2qjll"]
Apr 20 19:08:23.073860 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:23.073849 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2qjll"
Apr 20 19:08:23.073962 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:23.073944 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2qjll" podUID="838f1a66-5c9b-4d0f-90b0-35a81df852d0"
Apr 20 19:08:23.814532 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:23.814497 2571 generic.go:358] "Generic (PLEG): container finished" podID="b0d252f8-fb8c-486c-aaa7-1197a96b6cfd" containerID="e26a2b46edbd1c85fbc0f2702f470429c396004145166fe48ad1a523d839d718" exitCode=0
Apr 20 19:08:23.814905 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:23.814582 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kj7t2" event={"ID":"b0d252f8-fb8c-486c-aaa7-1197a96b6cfd","Type":"ContainerDied","Data":"e26a2b46edbd1c85fbc0f2702f470429c396004145166fe48ad1a523d839d718"}
Apr 20 19:08:24.686893 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:24.686826 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2qjll"
Apr 20 19:08:24.686893 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:24.686826 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7dbl5"
Apr 20 19:08:24.687041 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:24.686969 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff5pq"
Apr 20 19:08:24.687041 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:24.686980 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7dbl5" podUID="67f46846-12c3-4c76-a68a-b367e050cf51"
Apr 20 19:08:24.687187 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:24.687052 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff5pq" podUID="cc31ab16-2946-4d9a-baee-c02a00b73aae"
Apr 20 19:08:24.687187 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:24.687152 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2qjll" podUID="838f1a66-5c9b-4d0f-90b0-35a81df852d0"
Apr 20 19:08:26.687457 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:26.687263 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7dbl5"
Apr 20 19:08:26.688167 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:26.687263 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff5pq"
Apr 20 19:08:26.688167 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:26.687263 2571 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2qjll" Apr 20 19:08:26.688167 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:26.687545 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7dbl5" podUID="67f46846-12c3-4c76-a68a-b367e050cf51" Apr 20 19:08:26.688167 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:26.687609 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff5pq" podUID="cc31ab16-2946-4d9a-baee-c02a00b73aae" Apr 20 19:08:26.688167 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:26.687693 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2qjll" podUID="838f1a66-5c9b-4d0f-90b0-35a81df852d0" Apr 20 19:08:27.901731 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:27.901691 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/67f46846-12c3-4c76-a68a-b367e050cf51-original-pull-secret\") pod \"global-pull-secret-syncer-7dbl5\" (UID: \"67f46846-12c3-4c76-a68a-b367e050cf51\") " pod="kube-system/global-pull-secret-syncer-7dbl5" Apr 20 19:08:27.902254 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:27.901858 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 19:08:27.902254 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:27.901926 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/67f46846-12c3-4c76-a68a-b367e050cf51-original-pull-secret podName:67f46846-12c3-4c76-a68a-b367e050cf51 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:43.901910262 +0000 UTC m=+48.815406291 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/67f46846-12c3-4c76-a68a-b367e050cf51-original-pull-secret") pod "global-pull-secret-syncer-7dbl5" (UID: "67f46846-12c3-4c76-a68a-b367e050cf51") : object "kube-system"/"original-pull-secret" not registered Apr 20 19:08:28.687567 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:28.687534 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2qjll" Apr 20 19:08:28.687790 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:28.687534 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff5pq" Apr 20 19:08:28.687790 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:28.687676 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2qjll" podUID="838f1a66-5c9b-4d0f-90b0-35a81df852d0" Apr 20 19:08:28.687790 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:28.687534 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7dbl5" Apr 20 19:08:28.687790 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:28.687737 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ff5pq" podUID="cc31ab16-2946-4d9a-baee-c02a00b73aae" Apr 20 19:08:28.687990 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:28.687799 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-7dbl5" podUID="67f46846-12c3-4c76-a68a-b367e050cf51" Apr 20 19:08:28.861182 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:28.861153 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-63.ec2.internal" event="NodeReady" Apr 20 19:08:28.861361 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:28.861313 2571 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 19:08:28.896715 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:28.896678 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-9d6cb88f9-f5dkx"] Apr 20 19:08:28.908056 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:28.908022 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-4hs99"] Apr 20 19:08:28.908600 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:28.908194 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx" Apr 20 19:08:28.911696 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:28.911550 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 19:08:28.911696 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:28.911552 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 19:08:28.912841 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:28.911818 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 19:08:28.913448 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:28.913290 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-w68rf\"" Apr 20 19:08:28.923667 
ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:28.923635 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-4hs99"] Apr 20 19:08:28.923784 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:28.923682 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-9d6cb88f9-f5dkx"] Apr 20 19:08:28.923784 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:28.923707 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-7kxxk"] Apr 20 19:08:28.923888 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:28.923797 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-4hs99" Apr 20 19:08:28.926587 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:28.926568 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 20 19:08:28.926723 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:28.926695 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 20 19:08:28.926824 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:28.926803 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-8k5pc\"" Apr 20 19:08:28.927994 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:28.927975 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 19:08:28.933204 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:28.933182 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-mwrz2"] Apr 20 19:08:28.933351 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:28.933334 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-7kxxk" Apr 20 19:08:28.935905 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:28.935885 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 19:08:28.936028 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:28.936006 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 19:08:28.936091 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:28.936068 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ggf2q\"" Apr 20 19:08:28.941408 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:28.941316 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mwrz2"] Apr 20 19:08:28.941408 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:28.941342 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7kxxk"] Apr 20 19:08:28.941521 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:28.941492 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mwrz2" Apr 20 19:08:28.943997 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:28.943978 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 19:08:28.944101 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:28.944006 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 19:08:28.944101 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:28.944015 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-gtxhr\"" Apr 20 19:08:28.944350 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:28.944333 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 19:08:29.009598 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.009566 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-bound-sa-token\") pod \"image-registry-9d6cb88f9-f5dkx\" (UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx" Apr 20 19:08:29.009773 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.009614 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65md9\" (UniqueName: \"kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-kube-api-access-65md9\") pod \"image-registry-9d6cb88f9-f5dkx\" (UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx" Apr 20 19:08:29.009773 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.009662 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-registry-certificates\") pod \"image-registry-9d6cb88f9-f5dkx\" (UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx" Apr 20 19:08:29.009773 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.009706 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-ca-trust-extracted\") pod \"image-registry-9d6cb88f9-f5dkx\" (UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx" Apr 20 19:08:29.009773 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.009732 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-installation-pull-secrets\") pod \"image-registry-9d6cb88f9-f5dkx\" (UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx" Apr 20 19:08:29.009945 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.009776 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-image-registry-private-configuration\") pod \"image-registry-9d6cb88f9-f5dkx\" (UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx" Apr 20 19:08:29.009945 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.009804 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/d4955879-53ea-4f74-96dd-eae67dbbe030-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-4hs99\" (UID: \"d4955879-53ea-4f74-96dd-eae67dbbe030\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4hs99" Apr 20 19:08:29.009945 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.009836 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-registry-tls\") pod \"image-registry-9d6cb88f9-f5dkx\" (UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx" Apr 20 19:08:29.009945 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.009860 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d4955879-53ea-4f74-96dd-eae67dbbe030-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4hs99\" (UID: \"d4955879-53ea-4f74-96dd-eae67dbbe030\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4hs99" Apr 20 19:08:29.009945 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.009883 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-trusted-ca\") pod \"image-registry-9d6cb88f9-f5dkx\" (UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx" Apr 20 19:08:29.110652 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.110609 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-65md9\" (UniqueName: \"kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-kube-api-access-65md9\") pod \"image-registry-9d6cb88f9-f5dkx\" (UID: 
\"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx" Apr 20 19:08:29.110838 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.110669 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-registry-certificates\") pod \"image-registry-9d6cb88f9-f5dkx\" (UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx" Apr 20 19:08:29.110838 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.110705 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4937ec4e-cf01-4c28-aa08-5a8a7d722ff9-config-volume\") pod \"dns-default-7kxxk\" (UID: \"4937ec4e-cf01-4c28-aa08-5a8a7d722ff9\") " pod="openshift-dns/dns-default-7kxxk" Apr 20 19:08:29.110838 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.110757 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-ca-trust-extracted\") pod \"image-registry-9d6cb88f9-f5dkx\" (UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx" Apr 20 19:08:29.110838 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.110782 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-installation-pull-secrets\") pod \"image-registry-9d6cb88f9-f5dkx\" (UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx" Apr 20 19:08:29.110838 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.110820 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4937ec4e-cf01-4c28-aa08-5a8a7d722ff9-metrics-tls\") pod \"dns-default-7kxxk\" (UID: \"4937ec4e-cf01-4c28-aa08-5a8a7d722ff9\") " pod="openshift-dns/dns-default-7kxxk" Apr 20 19:08:29.111092 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.110891 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f543f92c-946e-4d67-ad5f-48be19c49af7-cert\") pod \"ingress-canary-mwrz2\" (UID: \"f543f92c-946e-4d67-ad5f-48be19c49af7\") " pod="openshift-ingress-canary/ingress-canary-mwrz2" Apr 20 19:08:29.111092 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.110933 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-image-registry-private-configuration\") pod \"image-registry-9d6cb88f9-f5dkx\" (UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx" Apr 20 19:08:29.111092 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.110965 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d4955879-53ea-4f74-96dd-eae67dbbe030-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-4hs99\" (UID: \"d4955879-53ea-4f74-96dd-eae67dbbe030\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4hs99" Apr 20 19:08:29.111092 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.110996 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4937ec4e-cf01-4c28-aa08-5a8a7d722ff9-tmp-dir\") pod \"dns-default-7kxxk\" (UID: \"4937ec4e-cf01-4c28-aa08-5a8a7d722ff9\") " pod="openshift-dns/dns-default-7kxxk" Apr 20 19:08:29.111092 ip-10-0-134-63 kubenswrapper[2571]: 
I0420 19:08:29.111030 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-registry-tls\") pod \"image-registry-9d6cb88f9-f5dkx\" (UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx" Apr 20 19:08:29.111092 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.111056 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d4955879-53ea-4f74-96dd-eae67dbbe030-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4hs99\" (UID: \"d4955879-53ea-4f74-96dd-eae67dbbe030\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4hs99" Apr 20 19:08:29.111386 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.111117 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnwlh\" (UniqueName: \"kubernetes.io/projected/4937ec4e-cf01-4c28-aa08-5a8a7d722ff9-kube-api-access-jnwlh\") pod \"dns-default-7kxxk\" (UID: \"4937ec4e-cf01-4c28-aa08-5a8a7d722ff9\") " pod="openshift-dns/dns-default-7kxxk" Apr 20 19:08:29.111386 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.111155 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-trusted-ca\") pod \"image-registry-9d6cb88f9-f5dkx\" (UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx" Apr 20 19:08:29.111386 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.111191 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-bound-sa-token\") pod \"image-registry-9d6cb88f9-f5dkx\" 
(UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx" Apr 20 19:08:29.111386 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.111214 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb8hl\" (UniqueName: \"kubernetes.io/projected/f543f92c-946e-4d67-ad5f-48be19c49af7-kube-api-access-bb8hl\") pod \"ingress-canary-mwrz2\" (UID: \"f543f92c-946e-4d67-ad5f-48be19c49af7\") " pod="openshift-ingress-canary/ingress-canary-mwrz2" Apr 20 19:08:29.111386 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.111300 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-ca-trust-extracted\") pod \"image-registry-9d6cb88f9-f5dkx\" (UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx" Apr 20 19:08:29.111386 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:29.111323 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 19:08:29.111658 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:29.111399 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4955879-53ea-4f74-96dd-eae67dbbe030-networking-console-plugin-cert podName:d4955879-53ea-4f74-96dd-eae67dbbe030 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:29.611373765 +0000 UTC m=+34.524869795 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d4955879-53ea-4f74-96dd-eae67dbbe030-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-4hs99" (UID: "d4955879-53ea-4f74-96dd-eae67dbbe030") : secret "networking-console-plugin-cert" not found Apr 20 19:08:29.111658 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.111402 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-registry-certificates\") pod \"image-registry-9d6cb88f9-f5dkx\" (UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx" Apr 20 19:08:29.111658 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:29.111510 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 19:08:29.111658 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:29.111528 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9d6cb88f9-f5dkx: secret "image-registry-tls" not found Apr 20 19:08:29.111658 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:29.111592 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-registry-tls podName:fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:29.611573895 +0000 UTC m=+34.525069938 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-registry-tls") pod "image-registry-9d6cb88f9-f5dkx" (UID: "fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19") : secret "image-registry-tls" not found
Apr 20 19:08:29.112011 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.111983 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d4955879-53ea-4f74-96dd-eae67dbbe030-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-4hs99\" (UID: \"d4955879-53ea-4f74-96dd-eae67dbbe030\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4hs99"
Apr 20 19:08:29.112653 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.112622 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-trusted-ca\") pod \"image-registry-9d6cb88f9-f5dkx\" (UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx"
Apr 20 19:08:29.115470 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.115439 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-installation-pull-secrets\") pod \"image-registry-9d6cb88f9-f5dkx\" (UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx"
Apr 20 19:08:29.115559 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.115470 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-image-registry-private-configuration\") pod \"image-registry-9d6cb88f9-f5dkx\" (UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx"
Apr 20 19:08:29.120225 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.120182 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-65md9\" (UniqueName: \"kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-kube-api-access-65md9\") pod \"image-registry-9d6cb88f9-f5dkx\" (UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx"
Apr 20 19:08:29.132375 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.132349 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-bound-sa-token\") pod \"image-registry-9d6cb88f9-f5dkx\" (UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx"
Apr 20 19:08:29.212078 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.211993 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4937ec4e-cf01-4c28-aa08-5a8a7d722ff9-metrics-tls\") pod \"dns-default-7kxxk\" (UID: \"4937ec4e-cf01-4c28-aa08-5a8a7d722ff9\") " pod="openshift-dns/dns-default-7kxxk"
Apr 20 19:08:29.212078 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.212042 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f543f92c-946e-4d67-ad5f-48be19c49af7-cert\") pod \"ingress-canary-mwrz2\" (UID: \"f543f92c-946e-4d67-ad5f-48be19c49af7\") " pod="openshift-ingress-canary/ingress-canary-mwrz2"
Apr 20 19:08:29.212306 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:29.212156 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 19:08:29.212306 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:29.212175 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 19:08:29.212306 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:29.212225 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f543f92c-946e-4d67-ad5f-48be19c49af7-cert podName:f543f92c-946e-4d67-ad5f-48be19c49af7 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:29.712203936 +0000 UTC m=+34.625699992 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f543f92c-946e-4d67-ad5f-48be19c49af7-cert") pod "ingress-canary-mwrz2" (UID: "f543f92c-946e-4d67-ad5f-48be19c49af7") : secret "canary-serving-cert" not found
Apr 20 19:08:29.212306 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:29.212266 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4937ec4e-cf01-4c28-aa08-5a8a7d722ff9-metrics-tls podName:4937ec4e-cf01-4c28-aa08-5a8a7d722ff9 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:29.712245813 +0000 UTC m=+34.625741842 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4937ec4e-cf01-4c28-aa08-5a8a7d722ff9-metrics-tls") pod "dns-default-7kxxk" (UID: "4937ec4e-cf01-4c28-aa08-5a8a7d722ff9") : secret "dns-default-metrics-tls" not found
Apr 20 19:08:29.212306 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.212176 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4937ec4e-cf01-4c28-aa08-5a8a7d722ff9-tmp-dir\") pod \"dns-default-7kxxk\" (UID: \"4937ec4e-cf01-4c28-aa08-5a8a7d722ff9\") " pod="openshift-dns/dns-default-7kxxk"
Apr 20 19:08:29.212561 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.212339 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jnwlh\" (UniqueName: \"kubernetes.io/projected/4937ec4e-cf01-4c28-aa08-5a8a7d722ff9-kube-api-access-jnwlh\") pod \"dns-default-7kxxk\" (UID: \"4937ec4e-cf01-4c28-aa08-5a8a7d722ff9\") " pod="openshift-dns/dns-default-7kxxk"
Apr 20 19:08:29.212561 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.212385 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bb8hl\" (UniqueName: \"kubernetes.io/projected/f543f92c-946e-4d67-ad5f-48be19c49af7-kube-api-access-bb8hl\") pod \"ingress-canary-mwrz2\" (UID: \"f543f92c-946e-4d67-ad5f-48be19c49af7\") " pod="openshift-ingress-canary/ingress-canary-mwrz2"
Apr 20 19:08:29.212561 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.212442 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4937ec4e-cf01-4c28-aa08-5a8a7d722ff9-config-volume\") pod \"dns-default-7kxxk\" (UID: \"4937ec4e-cf01-4c28-aa08-5a8a7d722ff9\") " pod="openshift-dns/dns-default-7kxxk"
Apr 20 19:08:29.212561 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.212485 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4937ec4e-cf01-4c28-aa08-5a8a7d722ff9-tmp-dir\") pod \"dns-default-7kxxk\" (UID: \"4937ec4e-cf01-4c28-aa08-5a8a7d722ff9\") " pod="openshift-dns/dns-default-7kxxk"
Apr 20 19:08:29.212921 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.212905 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4937ec4e-cf01-4c28-aa08-5a8a7d722ff9-config-volume\") pod \"dns-default-7kxxk\" (UID: \"4937ec4e-cf01-4c28-aa08-5a8a7d722ff9\") " pod="openshift-dns/dns-default-7kxxk"
Apr 20 19:08:29.222823 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.222802 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb8hl\" (UniqueName: \"kubernetes.io/projected/f543f92c-946e-4d67-ad5f-48be19c49af7-kube-api-access-bb8hl\") pod \"ingress-canary-mwrz2\" (UID: \"f543f92c-946e-4d67-ad5f-48be19c49af7\") " pod="openshift-ingress-canary/ingress-canary-mwrz2"
Apr 20 19:08:29.228871 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.228847 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnwlh\" (UniqueName: \"kubernetes.io/projected/4937ec4e-cf01-4c28-aa08-5a8a7d722ff9-kube-api-access-jnwlh\") pod \"dns-default-7kxxk\" (UID: \"4937ec4e-cf01-4c28-aa08-5a8a7d722ff9\") " pod="openshift-dns/dns-default-7kxxk"
Apr 20 19:08:29.414295 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.414252 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cc31ab16-2946-4d9a-baee-c02a00b73aae-metrics-certs\") pod \"network-metrics-daemon-ff5pq\" (UID: \"cc31ab16-2946-4d9a-baee-c02a00b73aae\") " pod="openshift-multus/network-metrics-daemon-ff5pq"
Apr 20 19:08:29.414452 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:29.414410 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:08:29.414496 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:29.414486 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc31ab16-2946-4d9a-baee-c02a00b73aae-metrics-certs podName:cc31ab16-2946-4d9a-baee-c02a00b73aae nodeName:}" failed. No retries permitted until 2026-04-20 19:09:01.414471107 +0000 UTC m=+66.327967137 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cc31ab16-2946-4d9a-baee-c02a00b73aae-metrics-certs") pod "network-metrics-daemon-ff5pq" (UID: "cc31ab16-2946-4d9a-baee-c02a00b73aae") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:08:29.515515 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.515439 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vcm6n\" (UniqueName: \"kubernetes.io/projected/838f1a66-5c9b-4d0f-90b0-35a81df852d0-kube-api-access-vcm6n\") pod \"network-check-target-2qjll\" (UID: \"838f1a66-5c9b-4d0f-90b0-35a81df852d0\") " pod="openshift-network-diagnostics/network-check-target-2qjll"
Apr 20 19:08:29.515653 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:29.515597 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 19:08:29.515653 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:29.515616 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 19:08:29.515653 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:29.515625 2571 projected.go:194] Error preparing data for projected volume kube-api-access-vcm6n for pod openshift-network-diagnostics/network-check-target-2qjll: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:08:29.515747 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:29.515672 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/838f1a66-5c9b-4d0f-90b0-35a81df852d0-kube-api-access-vcm6n podName:838f1a66-5c9b-4d0f-90b0-35a81df852d0 nodeName:}" failed. No retries permitted until 2026-04-20 19:09:01.51565896 +0000 UTC m=+66.429155004 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-vcm6n" (UniqueName: "kubernetes.io/projected/838f1a66-5c9b-4d0f-90b0-35a81df852d0-kube-api-access-vcm6n") pod "network-check-target-2qjll" (UID: "838f1a66-5c9b-4d0f-90b0-35a81df852d0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:08:29.616090 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.616055 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-registry-tls\") pod \"image-registry-9d6cb88f9-f5dkx\" (UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx"
Apr 20 19:08:29.616243 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.616096 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d4955879-53ea-4f74-96dd-eae67dbbe030-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4hs99\" (UID: \"d4955879-53ea-4f74-96dd-eae67dbbe030\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4hs99"
Apr 20 19:08:29.616243 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:29.616200 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 19:08:29.616308 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:29.616261 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4955879-53ea-4f74-96dd-eae67dbbe030-networking-console-plugin-cert podName:d4955879-53ea-4f74-96dd-eae67dbbe030 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:30.616248495 +0000 UTC m=+35.529744525 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d4955879-53ea-4f74-96dd-eae67dbbe030-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-4hs99" (UID: "d4955879-53ea-4f74-96dd-eae67dbbe030") : secret "networking-console-plugin-cert" not found
Apr 20 19:08:29.616308 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:29.616201 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 19:08:29.616308 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:29.616300 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9d6cb88f9-f5dkx: secret "image-registry-tls" not found
Apr 20 19:08:29.616400 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:29.616352 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-registry-tls podName:fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:30.61633857 +0000 UTC m=+35.529834602 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-registry-tls") pod "image-registry-9d6cb88f9-f5dkx" (UID: "fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19") : secret "image-registry-tls" not found
Apr 20 19:08:29.717021 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.716984 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4937ec4e-cf01-4c28-aa08-5a8a7d722ff9-metrics-tls\") pod \"dns-default-7kxxk\" (UID: \"4937ec4e-cf01-4c28-aa08-5a8a7d722ff9\") " pod="openshift-dns/dns-default-7kxxk"
Apr 20 19:08:29.717021 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:29.717023 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f543f92c-946e-4d67-ad5f-48be19c49af7-cert\") pod \"ingress-canary-mwrz2\" (UID: \"f543f92c-946e-4d67-ad5f-48be19c49af7\") " pod="openshift-ingress-canary/ingress-canary-mwrz2"
Apr 20 19:08:29.717228 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:29.717141 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 19:08:29.717228 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:29.717160 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 19:08:29.717228 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:29.717207 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f543f92c-946e-4d67-ad5f-48be19c49af7-cert podName:f543f92c-946e-4d67-ad5f-48be19c49af7 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:30.717188701 +0000 UTC m=+35.630684755 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f543f92c-946e-4d67-ad5f-48be19c49af7-cert") pod "ingress-canary-mwrz2" (UID: "f543f92c-946e-4d67-ad5f-48be19c49af7") : secret "canary-serving-cert" not found
Apr 20 19:08:29.717228 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:29.717222 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4937ec4e-cf01-4c28-aa08-5a8a7d722ff9-metrics-tls podName:4937ec4e-cf01-4c28-aa08-5a8a7d722ff9 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:30.71721579 +0000 UTC m=+35.630711820 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4937ec4e-cf01-4c28-aa08-5a8a7d722ff9-metrics-tls") pod "dns-default-7kxxk" (UID: "4937ec4e-cf01-4c28-aa08-5a8a7d722ff9") : secret "dns-default-metrics-tls" not found
Apr 20 19:08:30.626677 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:30.626643 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-registry-tls\") pod \"image-registry-9d6cb88f9-f5dkx\" (UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx"
Apr 20 19:08:30.626677 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:30.626677 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d4955879-53ea-4f74-96dd-eae67dbbe030-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4hs99\" (UID: \"d4955879-53ea-4f74-96dd-eae67dbbe030\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4hs99"
Apr 20 19:08:30.627209 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:30.626775 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 19:08:30.627209 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:30.626795 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 19:08:30.627209 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:30.626811 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9d6cb88f9-f5dkx: secret "image-registry-tls" not found
Apr 20 19:08:30.627209 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:30.626826 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4955879-53ea-4f74-96dd-eae67dbbe030-networking-console-plugin-cert podName:d4955879-53ea-4f74-96dd-eae67dbbe030 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:32.626811806 +0000 UTC m=+37.540307836 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d4955879-53ea-4f74-96dd-eae67dbbe030-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-4hs99" (UID: "d4955879-53ea-4f74-96dd-eae67dbbe030") : secret "networking-console-plugin-cert" not found
Apr 20 19:08:30.627209 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:30.626855 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-registry-tls podName:fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:32.626841143 +0000 UTC m=+37.540337173 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-registry-tls") pod "image-registry-9d6cb88f9-f5dkx" (UID: "fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19") : secret "image-registry-tls" not found
Apr 20 19:08:30.687261 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:30.687227 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2qjll"
Apr 20 19:08:30.687261 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:30.687253 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff5pq"
Apr 20 19:08:30.687473 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:30.687321 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7dbl5"
Apr 20 19:08:30.691345 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:30.691323 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 20 19:08:30.691461 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:30.691368 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 20 19:08:30.691461 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:30.691388 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 20 19:08:30.691461 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:30.691420 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bdmvj\""
Apr 20 19:08:30.691461 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:30.691430 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wstmx\""
Apr 20 19:08:30.691461 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:30.691439 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 20 19:08:30.727237 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:30.727210 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4937ec4e-cf01-4c28-aa08-5a8a7d722ff9-metrics-tls\") pod \"dns-default-7kxxk\" (UID: \"4937ec4e-cf01-4c28-aa08-5a8a7d722ff9\") " pod="openshift-dns/dns-default-7kxxk"
Apr 20 19:08:30.727355 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:30.727241 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f543f92c-946e-4d67-ad5f-48be19c49af7-cert\") pod \"ingress-canary-mwrz2\" (UID: \"f543f92c-946e-4d67-ad5f-48be19c49af7\") " pod="openshift-ingress-canary/ingress-canary-mwrz2"
Apr 20 19:08:30.727396 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:30.727357 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 19:08:30.727396 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:30.727372 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 19:08:30.727457 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:30.727410 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f543f92c-946e-4d67-ad5f-48be19c49af7-cert podName:f543f92c-946e-4d67-ad5f-48be19c49af7 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:32.72739793 +0000 UTC m=+37.640893960 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f543f92c-946e-4d67-ad5f-48be19c49af7-cert") pod "ingress-canary-mwrz2" (UID: "f543f92c-946e-4d67-ad5f-48be19c49af7") : secret "canary-serving-cert" not found
Apr 20 19:08:30.727457 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:30.727422 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4937ec4e-cf01-4c28-aa08-5a8a7d722ff9-metrics-tls podName:4937ec4e-cf01-4c28-aa08-5a8a7d722ff9 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:32.727416506 +0000 UTC m=+37.640912536 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4937ec4e-cf01-4c28-aa08-5a8a7d722ff9-metrics-tls") pod "dns-default-7kxxk" (UID: "4937ec4e-cf01-4c28-aa08-5a8a7d722ff9") : secret "dns-default-metrics-tls" not found
Apr 20 19:08:30.828892 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:30.828859 2571 generic.go:358] "Generic (PLEG): container finished" podID="b0d252f8-fb8c-486c-aaa7-1197a96b6cfd" containerID="b429d636e75f7abf150a9c832653aac824202e32ba8b9fc353505c529f68fd14" exitCode=0
Apr 20 19:08:30.829049 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:30.828892 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kj7t2" event={"ID":"b0d252f8-fb8c-486c-aaa7-1197a96b6cfd","Type":"ContainerDied","Data":"b429d636e75f7abf150a9c832653aac824202e32ba8b9fc353505c529f68fd14"}
Apr 20 19:08:31.833124 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:31.832929 2571 generic.go:358] "Generic (PLEG): container finished" podID="b0d252f8-fb8c-486c-aaa7-1197a96b6cfd" containerID="725a1b005fe531b536a2302eaacb64e1b336c1d81fdef34f33e64ad4664106d1" exitCode=0
Apr 20 19:08:31.833124 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:31.833014 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kj7t2" event={"ID":"b0d252f8-fb8c-486c-aaa7-1197a96b6cfd","Type":"ContainerDied","Data":"725a1b005fe531b536a2302eaacb64e1b336c1d81fdef34f33e64ad4664106d1"}
Apr 20 19:08:32.640829 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:32.640790 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-registry-tls\") pod \"image-registry-9d6cb88f9-f5dkx\" (UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx"
Apr 20 19:08:32.640936 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:32.640843 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d4955879-53ea-4f74-96dd-eae67dbbe030-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4hs99\" (UID: \"d4955879-53ea-4f74-96dd-eae67dbbe030\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4hs99"
Apr 20 19:08:32.640982 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:32.640944 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 19:08:32.640982 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:32.640959 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9d6cb88f9-f5dkx: secret "image-registry-tls" not found
Apr 20 19:08:32.641052 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:32.641011 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-registry-tls podName:fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:36.640992751 +0000 UTC m=+41.554488781 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-registry-tls") pod "image-registry-9d6cb88f9-f5dkx" (UID: "fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19") : secret "image-registry-tls" not found
Apr 20 19:08:32.641052 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:32.641028 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 19:08:32.641148 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:32.641086 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4955879-53ea-4f74-96dd-eae67dbbe030-networking-console-plugin-cert podName:d4955879-53ea-4f74-96dd-eae67dbbe030 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:36.641069594 +0000 UTC m=+41.554565646 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d4955879-53ea-4f74-96dd-eae67dbbe030-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-4hs99" (UID: "d4955879-53ea-4f74-96dd-eae67dbbe030") : secret "networking-console-plugin-cert" not found
Apr 20 19:08:32.741859 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:32.741820 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4937ec4e-cf01-4c28-aa08-5a8a7d722ff9-metrics-tls\") pod \"dns-default-7kxxk\" (UID: \"4937ec4e-cf01-4c28-aa08-5a8a7d722ff9\") " pod="openshift-dns/dns-default-7kxxk"
Apr 20 19:08:32.741859 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:32.741863 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f543f92c-946e-4d67-ad5f-48be19c49af7-cert\") pod \"ingress-canary-mwrz2\" (UID: \"f543f92c-946e-4d67-ad5f-48be19c49af7\") " pod="openshift-ingress-canary/ingress-canary-mwrz2"
Apr 20 19:08:32.742065 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:32.741983 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 19:08:32.742065 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:32.742028 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f543f92c-946e-4d67-ad5f-48be19c49af7-cert podName:f543f92c-946e-4d67-ad5f-48be19c49af7 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:36.742015323 +0000 UTC m=+41.655511352 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f543f92c-946e-4d67-ad5f-48be19c49af7-cert") pod "ingress-canary-mwrz2" (UID: "f543f92c-946e-4d67-ad5f-48be19c49af7") : secret "canary-serving-cert" not found
Apr 20 19:08:32.742065 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:32.741983 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 19:08:32.742065 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:32.742055 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4937ec4e-cf01-4c28-aa08-5a8a7d722ff9-metrics-tls podName:4937ec4e-cf01-4c28-aa08-5a8a7d722ff9 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:36.742050024 +0000 UTC m=+41.655546053 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4937ec4e-cf01-4c28-aa08-5a8a7d722ff9-metrics-tls") pod "dns-default-7kxxk" (UID: "4937ec4e-cf01-4c28-aa08-5a8a7d722ff9") : secret "dns-default-metrics-tls" not found
Apr 20 19:08:32.837525 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:32.837495 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kj7t2" event={"ID":"b0d252f8-fb8c-486c-aaa7-1197a96b6cfd","Type":"ContainerStarted","Data":"0b2cf92dd9b61b8f55ab0a0de7558acd8896529423739bcf603078e8391dfc19"}
Apr 20 19:08:32.861478 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:32.861389 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-kj7t2" podStartSLOduration=6.456956858 podStartE2EDuration="37.86137534s" podCreationTimestamp="2026-04-20 19:07:55 +0000 UTC" firstStartedPulling="2026-04-20 19:07:58.431284088 +0000 UTC m=+3.344780120" lastFinishedPulling="2026-04-20 19:08:29.835702572 +0000 UTC m=+34.749198602" observedRunningTime="2026-04-20 19:08:32.859503803 +0000 UTC m=+37.772999855" watchObservedRunningTime="2026-04-20 19:08:32.86137534 +0000 UTC m=+37.774871391"
Apr 20 19:08:36.669860 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:36.669797 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-registry-tls\") pod \"image-registry-9d6cb88f9-f5dkx\" (UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx"
Apr 20 19:08:36.669860 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:36.669859 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d4955879-53ea-4f74-96dd-eae67dbbe030-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4hs99\" (UID: \"d4955879-53ea-4f74-96dd-eae67dbbe030\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4hs99"
Apr 20 19:08:36.670367 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:36.669955 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 19:08:36.670367 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:36.669980 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9d6cb88f9-f5dkx: secret "image-registry-tls" not found
Apr 20 19:08:36.670367 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:36.670034 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-registry-tls podName:fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:44.670018189 +0000 UTC m=+49.583514219 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-registry-tls") pod "image-registry-9d6cb88f9-f5dkx" (UID: "fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19") : secret "image-registry-tls" not found
Apr 20 19:08:36.670367 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:36.669969 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 19:08:36.670367 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:36.670127 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4955879-53ea-4f74-96dd-eae67dbbe030-networking-console-plugin-cert podName:d4955879-53ea-4f74-96dd-eae67dbbe030 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:44.670089566 +0000 UTC m=+49.583585612 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d4955879-53ea-4f74-96dd-eae67dbbe030-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-4hs99" (UID: "d4955879-53ea-4f74-96dd-eae67dbbe030") : secret "networking-console-plugin-cert" not found
Apr 20 19:08:36.770862 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:36.770807 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f543f92c-946e-4d67-ad5f-48be19c49af7-cert\") pod \"ingress-canary-mwrz2\" (UID: \"f543f92c-946e-4d67-ad5f-48be19c49af7\") " pod="openshift-ingress-canary/ingress-canary-mwrz2"
Apr 20 19:08:36.771032 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:36.770944 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 19:08:36.771032 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:36.771005 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f543f92c-946e-4d67-ad5f-48be19c49af7-cert podName:f543f92c-946e-4d67-ad5f-48be19c49af7 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:44.770989906 +0000 UTC m=+49.684485936 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f543f92c-946e-4d67-ad5f-48be19c49af7-cert") pod "ingress-canary-mwrz2" (UID: "f543f92c-946e-4d67-ad5f-48be19c49af7") : secret "canary-serving-cert" not found
Apr 20 19:08:36.771158 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:36.771079 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4937ec4e-cf01-4c28-aa08-5a8a7d722ff9-metrics-tls\") pod \"dns-default-7kxxk\" (UID: \"4937ec4e-cf01-4c28-aa08-5a8a7d722ff9\") " pod="openshift-dns/dns-default-7kxxk"
Apr 20 19:08:36.771212 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:36.771165 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 19:08:36.771212 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:36.771195 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4937ec4e-cf01-4c28-aa08-5a8a7d722ff9-metrics-tls podName:4937ec4e-cf01-4c28-aa08-5a8a7d722ff9 nodeName:}" failed. No retries permitted until 2026-04-20 19:08:44.771187254 +0000 UTC m=+49.684683283 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4937ec4e-cf01-4c28-aa08-5a8a7d722ff9-metrics-tls") pod "dns-default-7kxxk" (UID: "4937ec4e-cf01-4c28-aa08-5a8a7d722ff9") : secret "dns-default-metrics-tls" not found
Apr 20 19:08:43.925077 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:43.925035 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/67f46846-12c3-4c76-a68a-b367e050cf51-original-pull-secret\") pod \"global-pull-secret-syncer-7dbl5\" (UID: \"67f46846-12c3-4c76-a68a-b367e050cf51\") " pod="kube-system/global-pull-secret-syncer-7dbl5"
Apr 20 19:08:43.928098 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:43.928077 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/67f46846-12c3-4c76-a68a-b367e050cf51-original-pull-secret\") pod \"global-pull-secret-syncer-7dbl5\" (UID: \"67f46846-12c3-4c76-a68a-b367e050cf51\") " pod="kube-system/global-pull-secret-syncer-7dbl5"
Apr 20 19:08:44.203030 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:44.202948 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-7dbl5" Apr 20 19:08:44.331444 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:44.331415 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7dbl5"] Apr 20 19:08:44.731464 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:44.731428 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-registry-tls\") pod \"image-registry-9d6cb88f9-f5dkx\" (UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx" Apr 20 19:08:44.731464 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:44.731467 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d4955879-53ea-4f74-96dd-eae67dbbe030-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4hs99\" (UID: \"d4955879-53ea-4f74-96dd-eae67dbbe030\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4hs99" Apr 20 19:08:44.731791 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:44.731571 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 19:08:44.731791 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:44.731595 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 19:08:44.731791 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:44.731617 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9d6cb88f9-f5dkx: secret "image-registry-tls" not found Apr 20 19:08:44.731791 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:44.731626 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/d4955879-53ea-4f74-96dd-eae67dbbe030-networking-console-plugin-cert podName:d4955879-53ea-4f74-96dd-eae67dbbe030 nodeName:}" failed. No retries permitted until 2026-04-20 19:09:00.731609887 +0000 UTC m=+65.645105916 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d4955879-53ea-4f74-96dd-eae67dbbe030-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-4hs99" (UID: "d4955879-53ea-4f74-96dd-eae67dbbe030") : secret "networking-console-plugin-cert" not found Apr 20 19:08:44.731791 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:44.731684 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-registry-tls podName:fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19 nodeName:}" failed. No retries permitted until 2026-04-20 19:09:00.731663852 +0000 UTC m=+65.645159903 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-registry-tls") pod "image-registry-9d6cb88f9-f5dkx" (UID: "fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19") : secret "image-registry-tls" not found Apr 20 19:08:44.832262 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:44.832230 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4937ec4e-cf01-4c28-aa08-5a8a7d722ff9-metrics-tls\") pod \"dns-default-7kxxk\" (UID: \"4937ec4e-cf01-4c28-aa08-5a8a7d722ff9\") " pod="openshift-dns/dns-default-7kxxk" Apr 20 19:08:44.832262 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:44.832266 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f543f92c-946e-4d67-ad5f-48be19c49af7-cert\") pod \"ingress-canary-mwrz2\" (UID: \"f543f92c-946e-4d67-ad5f-48be19c49af7\") " 
pod="openshift-ingress-canary/ingress-canary-mwrz2" Apr 20 19:08:44.832422 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:44.832382 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:08:44.832422 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:44.832383 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:08:44.832511 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:44.832431 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f543f92c-946e-4d67-ad5f-48be19c49af7-cert podName:f543f92c-946e-4d67-ad5f-48be19c49af7 nodeName:}" failed. No retries permitted until 2026-04-20 19:09:00.832417832 +0000 UTC m=+65.745913861 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f543f92c-946e-4d67-ad5f-48be19c49af7-cert") pod "ingress-canary-mwrz2" (UID: "f543f92c-946e-4d67-ad5f-48be19c49af7") : secret "canary-serving-cert" not found Apr 20 19:08:44.832511 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:08:44.832442 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4937ec4e-cf01-4c28-aa08-5a8a7d722ff9-metrics-tls podName:4937ec4e-cf01-4c28-aa08-5a8a7d722ff9 nodeName:}" failed. No retries permitted until 2026-04-20 19:09:00.832436951 +0000 UTC m=+65.745932981 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4937ec4e-cf01-4c28-aa08-5a8a7d722ff9-metrics-tls") pod "dns-default-7kxxk" (UID: "4937ec4e-cf01-4c28-aa08-5a8a7d722ff9") : secret "dns-default-metrics-tls" not found Apr 20 19:08:44.861029 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:44.860999 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7dbl5" event={"ID":"67f46846-12c3-4c76-a68a-b367e050cf51","Type":"ContainerStarted","Data":"e015a7f1dc7c8aad37dd0547b2f59ed561e78e06e3cb8530c96a4820d0bfc38f"} Apr 20 19:08:50.874516 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:50.874475 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7dbl5" event={"ID":"67f46846-12c3-4c76-a68a-b367e050cf51","Type":"ContainerStarted","Data":"53978c8b1b4e1dec52e7255b2f480c84bf2f444a6ee4f8a75f4dbf5ad3799113"} Apr 20 19:08:50.891540 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:50.891491 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-7dbl5" podStartSLOduration=33.488893958 podStartE2EDuration="38.891475001s" podCreationTimestamp="2026-04-20 19:08:12 +0000 UTC" firstStartedPulling="2026-04-20 19:08:44.336046676 +0000 UTC m=+49.249542705" lastFinishedPulling="2026-04-20 19:08:49.738627712 +0000 UTC m=+54.652123748" observedRunningTime="2026-04-20 19:08:50.890699795 +0000 UTC m=+55.804195847" watchObservedRunningTime="2026-04-20 19:08:50.891475001 +0000 UTC m=+55.804971030" Apr 20 19:08:54.826177 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:08:54.826150 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j8mkl" Apr 20 19:09:00.761234 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:09:00.761193 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-registry-tls\") pod \"image-registry-9d6cb88f9-f5dkx\" (UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx" Apr 20 19:09:00.761234 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:09:00.761235 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d4955879-53ea-4f74-96dd-eae67dbbe030-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4hs99\" (UID: \"d4955879-53ea-4f74-96dd-eae67dbbe030\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4hs99" Apr 20 19:09:00.761750 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:09:00.761348 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 19:09:00.761750 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:09:00.761364 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 19:09:00.761750 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:09:00.761369 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9d6cb88f9-f5dkx: secret "image-registry-tls" not found Apr 20 19:09:00.761750 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:09:00.761433 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4955879-53ea-4f74-96dd-eae67dbbe030-networking-console-plugin-cert podName:d4955879-53ea-4f74-96dd-eae67dbbe030 nodeName:}" failed. No retries permitted until 2026-04-20 19:09:32.761413711 +0000 UTC m=+97.674909745 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d4955879-53ea-4f74-96dd-eae67dbbe030-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-4hs99" (UID: "d4955879-53ea-4f74-96dd-eae67dbbe030") : secret "networking-console-plugin-cert" not found Apr 20 19:09:00.761750 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:09:00.761450 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-registry-tls podName:fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19 nodeName:}" failed. No retries permitted until 2026-04-20 19:09:32.761442486 +0000 UTC m=+97.674938516 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-registry-tls") pod "image-registry-9d6cb88f9-f5dkx" (UID: "fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19") : secret "image-registry-tls" not found Apr 20 19:09:00.862074 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:09:00.862038 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4937ec4e-cf01-4c28-aa08-5a8a7d722ff9-metrics-tls\") pod \"dns-default-7kxxk\" (UID: \"4937ec4e-cf01-4c28-aa08-5a8a7d722ff9\") " pod="openshift-dns/dns-default-7kxxk" Apr 20 19:09:00.862074 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:09:00.862075 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f543f92c-946e-4d67-ad5f-48be19c49af7-cert\") pod \"ingress-canary-mwrz2\" (UID: \"f543f92c-946e-4d67-ad5f-48be19c49af7\") " pod="openshift-ingress-canary/ingress-canary-mwrz2" Apr 20 19:09:00.862313 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:09:00.862213 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:09:00.862313 ip-10-0-134-63 
kubenswrapper[2571]: E0420 19:09:00.862217 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:09:00.862313 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:09:00.862264 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f543f92c-946e-4d67-ad5f-48be19c49af7-cert podName:f543f92c-946e-4d67-ad5f-48be19c49af7 nodeName:}" failed. No retries permitted until 2026-04-20 19:09:32.862249898 +0000 UTC m=+97.775745928 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f543f92c-946e-4d67-ad5f-48be19c49af7-cert") pod "ingress-canary-mwrz2" (UID: "f543f92c-946e-4d67-ad5f-48be19c49af7") : secret "canary-serving-cert" not found Apr 20 19:09:00.862313 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:09:00.862277 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4937ec4e-cf01-4c28-aa08-5a8a7d722ff9-metrics-tls podName:4937ec4e-cf01-4c28-aa08-5a8a7d722ff9 nodeName:}" failed. No retries permitted until 2026-04-20 19:09:32.862270923 +0000 UTC m=+97.775766953 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4937ec4e-cf01-4c28-aa08-5a8a7d722ff9-metrics-tls") pod "dns-default-7kxxk" (UID: "4937ec4e-cf01-4c28-aa08-5a8a7d722ff9") : secret "dns-default-metrics-tls" not found Apr 20 19:09:01.467615 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:09:01.467583 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cc31ab16-2946-4d9a-baee-c02a00b73aae-metrics-certs\") pod \"network-metrics-daemon-ff5pq\" (UID: \"cc31ab16-2946-4d9a-baee-c02a00b73aae\") " pod="openshift-multus/network-metrics-daemon-ff5pq" Apr 20 19:09:01.470265 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:09:01.470247 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 19:09:01.478438 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:09:01.478410 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 19:09:01.478562 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:09:01.478469 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc31ab16-2946-4d9a-baee-c02a00b73aae-metrics-certs podName:cc31ab16-2946-4d9a-baee-c02a00b73aae nodeName:}" failed. No retries permitted until 2026-04-20 19:10:05.478454106 +0000 UTC m=+130.391950136 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cc31ab16-2946-4d9a-baee-c02a00b73aae-metrics-certs") pod "network-metrics-daemon-ff5pq" (UID: "cc31ab16-2946-4d9a-baee-c02a00b73aae") : secret "metrics-daemon-secret" not found Apr 20 19:09:01.568714 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:09:01.568671 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vcm6n\" (UniqueName: \"kubernetes.io/projected/838f1a66-5c9b-4d0f-90b0-35a81df852d0-kube-api-access-vcm6n\") pod \"network-check-target-2qjll\" (UID: \"838f1a66-5c9b-4d0f-90b0-35a81df852d0\") " pod="openshift-network-diagnostics/network-check-target-2qjll" Apr 20 19:09:01.571567 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:09:01.571547 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 19:09:01.582315 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:09:01.582288 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 19:09:01.594592 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:09:01.594567 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcm6n\" (UniqueName: \"kubernetes.io/projected/838f1a66-5c9b-4d0f-90b0-35a81df852d0-kube-api-access-vcm6n\") pod \"network-check-target-2qjll\" (UID: \"838f1a66-5c9b-4d0f-90b0-35a81df852d0\") " pod="openshift-network-diagnostics/network-check-target-2qjll" Apr 20 19:09:01.600005 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:09:01.599977 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wstmx\"" Apr 20 19:09:01.608162 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:09:01.608144 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2qjll" Apr 20 19:09:01.723101 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:09:01.723033 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-2qjll"] Apr 20 19:09:01.725795 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:09:01.725768 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod838f1a66_5c9b_4d0f_90b0_35a81df852d0.slice/crio-72a08498fd9e5c50a1105f34c71f18ecf044c9d91372be92ea698fe425e0396d WatchSource:0}: Error finding container 72a08498fd9e5c50a1105f34c71f18ecf044c9d91372be92ea698fe425e0396d: Status 404 returned error can't find the container with id 72a08498fd9e5c50a1105f34c71f18ecf044c9d91372be92ea698fe425e0396d Apr 20 19:09:01.897817 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:09:01.897789 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-2qjll" event={"ID":"838f1a66-5c9b-4d0f-90b0-35a81df852d0","Type":"ContainerStarted","Data":"72a08498fd9e5c50a1105f34c71f18ecf044c9d91372be92ea698fe425e0396d"} Apr 20 19:09:04.905541 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:09:04.905505 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-2qjll" event={"ID":"838f1a66-5c9b-4d0f-90b0-35a81df852d0","Type":"ContainerStarted","Data":"c6f7922ca2fdf857551b49541368a4a9e0b8697d47f269bb04eb2c8e624eeaba"} Apr 20 19:09:04.905915 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:09:04.905646 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-2qjll" Apr 20 19:09:04.924120 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:09:04.921820 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-2qjll" 
podStartSLOduration=67.329857677 podStartE2EDuration="1m9.921804354s" podCreationTimestamp="2026-04-20 19:07:55 +0000 UTC" firstStartedPulling="2026-04-20 19:09:01.727788845 +0000 UTC m=+66.641284875" lastFinishedPulling="2026-04-20 19:09:04.319735522 +0000 UTC m=+69.233231552" observedRunningTime="2026-04-20 19:09:04.920545783 +0000 UTC m=+69.834041835" watchObservedRunningTime="2026-04-20 19:09:04.921804354 +0000 UTC m=+69.835300406" Apr 20 19:09:32.806276 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:09:32.806220 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-registry-tls\") pod \"image-registry-9d6cb88f9-f5dkx\" (UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx" Apr 20 19:09:32.806276 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:09:32.806275 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d4955879-53ea-4f74-96dd-eae67dbbe030-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4hs99\" (UID: \"d4955879-53ea-4f74-96dd-eae67dbbe030\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4hs99" Apr 20 19:09:32.806741 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:09:32.806373 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 19:09:32.806741 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:09:32.806374 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 19:09:32.806741 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:09:32.806394 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9d6cb88f9-f5dkx: 
secret "image-registry-tls" not found Apr 20 19:09:32.806741 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:09:32.806429 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4955879-53ea-4f74-96dd-eae67dbbe030-networking-console-plugin-cert podName:d4955879-53ea-4f74-96dd-eae67dbbe030 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:36.806415323 +0000 UTC m=+161.719911356 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d4955879-53ea-4f74-96dd-eae67dbbe030-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-4hs99" (UID: "d4955879-53ea-4f74-96dd-eae67dbbe030") : secret "networking-console-plugin-cert" not found Apr 20 19:09:32.806741 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:09:32.806449 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-registry-tls podName:fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:36.806434728 +0000 UTC m=+161.719930757 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-registry-tls") pod "image-registry-9d6cb88f9-f5dkx" (UID: "fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19") : secret "image-registry-tls" not found Apr 20 19:09:32.906969 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:09:32.906876 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4937ec4e-cf01-4c28-aa08-5a8a7d722ff9-metrics-tls\") pod \"dns-default-7kxxk\" (UID: \"4937ec4e-cf01-4c28-aa08-5a8a7d722ff9\") " pod="openshift-dns/dns-default-7kxxk" Apr 20 19:09:32.906969 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:09:32.906925 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f543f92c-946e-4d67-ad5f-48be19c49af7-cert\") pod \"ingress-canary-mwrz2\" (UID: \"f543f92c-946e-4d67-ad5f-48be19c49af7\") " pod="openshift-ingress-canary/ingress-canary-mwrz2" Apr 20 19:09:32.907185 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:09:32.907029 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:09:32.907185 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:09:32.907085 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f543f92c-946e-4d67-ad5f-48be19c49af7-cert podName:f543f92c-946e-4d67-ad5f-48be19c49af7 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:36.907071557 +0000 UTC m=+161.820567587 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f543f92c-946e-4d67-ad5f-48be19c49af7-cert") pod "ingress-canary-mwrz2" (UID: "f543f92c-946e-4d67-ad5f-48be19c49af7") : secret "canary-serving-cert" not found Apr 20 19:09:32.907185 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:09:32.907029 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:09:32.907185 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:09:32.907164 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4937ec4e-cf01-4c28-aa08-5a8a7d722ff9-metrics-tls podName:4937ec4e-cf01-4c28-aa08-5a8a7d722ff9 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:36.907152237 +0000 UTC m=+161.820648271 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4937ec4e-cf01-4c28-aa08-5a8a7d722ff9-metrics-tls") pod "dns-default-7kxxk" (UID: "4937ec4e-cf01-4c28-aa08-5a8a7d722ff9") : secret "dns-default-metrics-tls" not found Apr 20 19:09:35.910455 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:09:35.910428 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-2qjll" Apr 20 19:10:05.534293 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:05.534235 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cc31ab16-2946-4d9a-baee-c02a00b73aae-metrics-certs\") pod \"network-metrics-daemon-ff5pq\" (UID: \"cc31ab16-2946-4d9a-baee-c02a00b73aae\") " pod="openshift-multus/network-metrics-daemon-ff5pq" Apr 20 19:10:05.534787 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:05.534401 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 19:10:05.534787 ip-10-0-134-63 kubenswrapper[2571]: E0420 
19:10:05.534484 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc31ab16-2946-4d9a-baee-c02a00b73aae-metrics-certs podName:cc31ab16-2946-4d9a-baee-c02a00b73aae nodeName:}" failed. No retries permitted until 2026-04-20 19:12:07.534466605 +0000 UTC m=+252.447962635 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cc31ab16-2946-4d9a-baee-c02a00b73aae-metrics-certs") pod "network-metrics-daemon-ff5pq" (UID: "cc31ab16-2946-4d9a-baee-c02a00b73aae") : secret "metrics-daemon-secret" not found Apr 20 19:10:09.113541 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.113506 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-slmw8"] Apr 20 19:10:09.115292 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.115275 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-slmw8" Apr 20 19:10:09.118049 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.118029 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 19:10:09.118166 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.118033 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 19:10:09.118166 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.118077 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 20 19:10:09.119274 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.119245 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 20 19:10:09.119274 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.119264 2571 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-jw9ff\""
Apr 20 19:10:09.127313 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.127292 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 20 19:10:09.127967 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.127949 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-slmw8"]
Apr 20 19:10:09.162946 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.162909 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/4cf6648c-0d3c-45e5-877a-2ef099dd0653-snapshots\") pod \"insights-operator-585dfdc468-slmw8\" (UID: \"4cf6648c-0d3c-45e5-877a-2ef099dd0653\") " pod="openshift-insights/insights-operator-585dfdc468-slmw8"
Apr 20 19:10:09.163084 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.162951 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4cf6648c-0d3c-45e5-877a-2ef099dd0653-service-ca-bundle\") pod \"insights-operator-585dfdc468-slmw8\" (UID: \"4cf6648c-0d3c-45e5-877a-2ef099dd0653\") " pod="openshift-insights/insights-operator-585dfdc468-slmw8"
Apr 20 19:10:09.163084 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.162984 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kmv6\" (UniqueName: \"kubernetes.io/projected/4cf6648c-0d3c-45e5-877a-2ef099dd0653-kube-api-access-4kmv6\") pod \"insights-operator-585dfdc468-slmw8\" (UID: \"4cf6648c-0d3c-45e5-877a-2ef099dd0653\") " pod="openshift-insights/insights-operator-585dfdc468-slmw8"
Apr 20 19:10:09.163084 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.163069 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4cf6648c-0d3c-45e5-877a-2ef099dd0653-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-slmw8\" (UID: \"4cf6648c-0d3c-45e5-877a-2ef099dd0653\") " pod="openshift-insights/insights-operator-585dfdc468-slmw8"
Apr 20 19:10:09.163214 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.163164 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4cf6648c-0d3c-45e5-877a-2ef099dd0653-tmp\") pod \"insights-operator-585dfdc468-slmw8\" (UID: \"4cf6648c-0d3c-45e5-877a-2ef099dd0653\") " pod="openshift-insights/insights-operator-585dfdc468-slmw8"
Apr 20 19:10:09.163214 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.163185 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4cf6648c-0d3c-45e5-877a-2ef099dd0653-serving-cert\") pod \"insights-operator-585dfdc468-slmw8\" (UID: \"4cf6648c-0d3c-45e5-877a-2ef099dd0653\") " pod="openshift-insights/insights-operator-585dfdc468-slmw8"
Apr 20 19:10:09.214044 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.214014 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-vnjmw"]
Apr 20 19:10:09.215929 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.215916 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vnjmw"
Apr 20 19:10:09.218771 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.218749 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-cvp94\""
Apr 20 19:10:09.218890 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.218836 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 20 19:10:09.218890 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.218876 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 20 19:10:09.218974 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.218940 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 20 19:10:09.219991 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.219971 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 20 19:10:09.224734 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.224714 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-vnjmw"]
Apr 20 19:10:09.264267 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.264231 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/4cf6648c-0d3c-45e5-877a-2ef099dd0653-snapshots\") pod \"insights-operator-585dfdc468-slmw8\" (UID: \"4cf6648c-0d3c-45e5-877a-2ef099dd0653\") " pod="openshift-insights/insights-operator-585dfdc468-slmw8"
Apr 20 19:10:09.264267 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.264272 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4cf6648c-0d3c-45e5-877a-2ef099dd0653-service-ca-bundle\") pod \"insights-operator-585dfdc468-slmw8\" (UID: \"4cf6648c-0d3c-45e5-877a-2ef099dd0653\") " pod="openshift-insights/insights-operator-585dfdc468-slmw8"
Apr 20 19:10:09.264459 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.264297 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/89999f13-84c4-4b08-a865-c986f4298fcb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vnjmw\" (UID: \"89999f13-84c4-4b08-a865-c986f4298fcb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vnjmw"
Apr 20 19:10:09.264459 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.264336 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4kmv6\" (UniqueName: \"kubernetes.io/projected/4cf6648c-0d3c-45e5-877a-2ef099dd0653-kube-api-access-4kmv6\") pod \"insights-operator-585dfdc468-slmw8\" (UID: \"4cf6648c-0d3c-45e5-877a-2ef099dd0653\") " pod="openshift-insights/insights-operator-585dfdc468-slmw8"
Apr 20 19:10:09.264459 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.264355 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/89999f13-84c4-4b08-a865-c986f4298fcb-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-vnjmw\" (UID: \"89999f13-84c4-4b08-a865-c986f4298fcb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vnjmw"
Apr 20 19:10:09.264459 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.264371 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsdfx\" (UniqueName: \"kubernetes.io/projected/89999f13-84c4-4b08-a865-c986f4298fcb-kube-api-access-qsdfx\") pod \"cluster-monitoring-operator-75587bd455-vnjmw\" (UID: \"89999f13-84c4-4b08-a865-c986f4298fcb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vnjmw"
Apr 20 19:10:09.264661 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.264462 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4cf6648c-0d3c-45e5-877a-2ef099dd0653-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-slmw8\" (UID: \"4cf6648c-0d3c-45e5-877a-2ef099dd0653\") " pod="openshift-insights/insights-operator-585dfdc468-slmw8"
Apr 20 19:10:09.264661 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.264560 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4cf6648c-0d3c-45e5-877a-2ef099dd0653-tmp\") pod \"insights-operator-585dfdc468-slmw8\" (UID: \"4cf6648c-0d3c-45e5-877a-2ef099dd0653\") " pod="openshift-insights/insights-operator-585dfdc468-slmw8"
Apr 20 19:10:09.264661 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.264598 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4cf6648c-0d3c-45e5-877a-2ef099dd0653-serving-cert\") pod \"insights-operator-585dfdc468-slmw8\" (UID: \"4cf6648c-0d3c-45e5-877a-2ef099dd0653\") " pod="openshift-insights/insights-operator-585dfdc468-slmw8"
Apr 20 19:10:09.264908 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.264885 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4cf6648c-0d3c-45e5-877a-2ef099dd0653-service-ca-bundle\") pod \"insights-operator-585dfdc468-slmw8\" (UID: \"4cf6648c-0d3c-45e5-877a-2ef099dd0653\") " pod="openshift-insights/insights-operator-585dfdc468-slmw8"
Apr 20 19:10:09.264972 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.264932 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4cf6648c-0d3c-45e5-877a-2ef099dd0653-tmp\") pod \"insights-operator-585dfdc468-slmw8\" (UID: \"4cf6648c-0d3c-45e5-877a-2ef099dd0653\") " pod="openshift-insights/insights-operator-585dfdc468-slmw8"
Apr 20 19:10:09.264972 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.264932 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/4cf6648c-0d3c-45e5-877a-2ef099dd0653-snapshots\") pod \"insights-operator-585dfdc468-slmw8\" (UID: \"4cf6648c-0d3c-45e5-877a-2ef099dd0653\") " pod="openshift-insights/insights-operator-585dfdc468-slmw8"
Apr 20 19:10:09.265295 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.265280 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4cf6648c-0d3c-45e5-877a-2ef099dd0653-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-slmw8\" (UID: \"4cf6648c-0d3c-45e5-877a-2ef099dd0653\") " pod="openshift-insights/insights-operator-585dfdc468-slmw8"
Apr 20 19:10:09.267489 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.267470 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4cf6648c-0d3c-45e5-877a-2ef099dd0653-serving-cert\") pod \"insights-operator-585dfdc468-slmw8\" (UID: \"4cf6648c-0d3c-45e5-877a-2ef099dd0653\") " pod="openshift-insights/insights-operator-585dfdc468-slmw8"
Apr 20 19:10:09.273763 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.273736 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kmv6\" (UniqueName: \"kubernetes.io/projected/4cf6648c-0d3c-45e5-877a-2ef099dd0653-kube-api-access-4kmv6\") pod \"insights-operator-585dfdc468-slmw8\" (UID: \"4cf6648c-0d3c-45e5-877a-2ef099dd0653\") " pod="openshift-insights/insights-operator-585dfdc468-slmw8"
Apr 20 19:10:09.365767 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.365693 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/89999f13-84c4-4b08-a865-c986f4298fcb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vnjmw\" (UID: \"89999f13-84c4-4b08-a865-c986f4298fcb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vnjmw"
Apr 20 19:10:09.365767 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.365733 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/89999f13-84c4-4b08-a865-c986f4298fcb-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-vnjmw\" (UID: \"89999f13-84c4-4b08-a865-c986f4298fcb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vnjmw"
Apr 20 19:10:09.365767 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.365750 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qsdfx\" (UniqueName: \"kubernetes.io/projected/89999f13-84c4-4b08-a865-c986f4298fcb-kube-api-access-qsdfx\") pod \"cluster-monitoring-operator-75587bd455-vnjmw\" (UID: \"89999f13-84c4-4b08-a865-c986f4298fcb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vnjmw"
Apr 20 19:10:09.365989 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:09.365849 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 19:10:09.365989 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:09.365937 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89999f13-84c4-4b08-a865-c986f4298fcb-cluster-monitoring-operator-tls podName:89999f13-84c4-4b08-a865-c986f4298fcb nodeName:}" failed. No retries permitted until 2026-04-20 19:10:09.865916552 +0000 UTC m=+134.779412595 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/89999f13-84c4-4b08-a865-c986f4298fcb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-vnjmw" (UID: "89999f13-84c4-4b08-a865-c986f4298fcb") : secret "cluster-monitoring-operator-tls" not found
Apr 20 19:10:09.366449 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.366432 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/89999f13-84c4-4b08-a865-c986f4298fcb-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-vnjmw\" (UID: \"89999f13-84c4-4b08-a865-c986f4298fcb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vnjmw"
Apr 20 19:10:09.374434 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.374411 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsdfx\" (UniqueName: \"kubernetes.io/projected/89999f13-84c4-4b08-a865-c986f4298fcb-kube-api-access-qsdfx\") pod \"cluster-monitoring-operator-75587bd455-vnjmw\" (UID: \"89999f13-84c4-4b08-a865-c986f4298fcb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vnjmw"
Apr 20 19:10:09.425569 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.425534 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-slmw8"
Apr 20 19:10:09.541400 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.541251 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-slmw8"]
Apr 20 19:10:09.543787 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:10:09.543752 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cf6648c_0d3c_45e5_877a_2ef099dd0653.slice/crio-1065a06060db5b147982d7a5f434cee695e2f526d99544db4469971e296509d5 WatchSource:0}: Error finding container 1065a06060db5b147982d7a5f434cee695e2f526d99544db4469971e296509d5: Status 404 returned error can't find the container with id 1065a06060db5b147982d7a5f434cee695e2f526d99544db4469971e296509d5
Apr 20 19:10:09.869276 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:09.869237 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/89999f13-84c4-4b08-a865-c986f4298fcb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vnjmw\" (UID: \"89999f13-84c4-4b08-a865-c986f4298fcb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vnjmw"
Apr 20 19:10:09.869449 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:09.869367 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 19:10:09.869449 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:09.869429 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89999f13-84c4-4b08-a865-c986f4298fcb-cluster-monitoring-operator-tls podName:89999f13-84c4-4b08-a865-c986f4298fcb nodeName:}" failed. No retries permitted until 2026-04-20 19:10:10.869414455 +0000 UTC m=+135.782910485 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/89999f13-84c4-4b08-a865-c986f4298fcb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-vnjmw" (UID: "89999f13-84c4-4b08-a865-c986f4298fcb") : secret "cluster-monitoring-operator-tls" not found
Apr 20 19:10:10.032276 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:10.032239 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-slmw8" event={"ID":"4cf6648c-0d3c-45e5-877a-2ef099dd0653","Type":"ContainerStarted","Data":"1065a06060db5b147982d7a5f434cee695e2f526d99544db4469971e296509d5"}
Apr 20 19:10:10.877480 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:10.877434 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/89999f13-84c4-4b08-a865-c986f4298fcb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vnjmw\" (UID: \"89999f13-84c4-4b08-a865-c986f4298fcb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vnjmw"
Apr 20 19:10:10.878043 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:10.877601 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 19:10:10.878043 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:10.877698 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89999f13-84c4-4b08-a865-c986f4298fcb-cluster-monitoring-operator-tls podName:89999f13-84c4-4b08-a865-c986f4298fcb nodeName:}" failed. No retries permitted until 2026-04-20 19:10:12.877676124 +0000 UTC m=+137.791172157 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/89999f13-84c4-4b08-a865-c986f4298fcb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-vnjmw" (UID: "89999f13-84c4-4b08-a865-c986f4298fcb") : secret "cluster-monitoring-operator-tls" not found
Apr 20 19:10:12.037517 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:12.037477 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-slmw8" event={"ID":"4cf6648c-0d3c-45e5-877a-2ef099dd0653","Type":"ContainerStarted","Data":"340ceec1527b12f6b601a950b7b984cbebc4a884e2c6aa2532277d3c91926505"}
Apr 20 19:10:12.055667 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:12.055619 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-slmw8" podStartSLOduration=1.200322324 podStartE2EDuration="3.055605254s" podCreationTimestamp="2026-04-20 19:10:09 +0000 UTC" firstStartedPulling="2026-04-20 19:10:09.545479143 +0000 UTC m=+134.458975173" lastFinishedPulling="2026-04-20 19:10:11.400762069 +0000 UTC m=+136.314258103" observedRunningTime="2026-04-20 19:10:12.055418785 +0000 UTC m=+136.968914860" watchObservedRunningTime="2026-04-20 19:10:12.055605254 +0000 UTC m=+136.969101296"
Apr 20 19:10:12.893826 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:12.893790 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/89999f13-84c4-4b08-a865-c986f4298fcb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vnjmw\" (UID: \"89999f13-84c4-4b08-a865-c986f4298fcb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vnjmw"
Apr 20 19:10:12.893988 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:12.893912 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 19:10:12.893988 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:12.893970 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89999f13-84c4-4b08-a865-c986f4298fcb-cluster-monitoring-operator-tls podName:89999f13-84c4-4b08-a865-c986f4298fcb nodeName:}" failed. No retries permitted until 2026-04-20 19:10:16.893953176 +0000 UTC m=+141.807449208 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/89999f13-84c4-4b08-a865-c986f4298fcb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-vnjmw" (UID: "89999f13-84c4-4b08-a865-c986f4298fcb") : secret "cluster-monitoring-operator-tls" not found
Apr 20 19:10:14.050143 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:14.050091 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-k97mn_79e9d9f2-39b0-4a1d-9e82-98ceca85b745/dns-node-resolver/0.log"
Apr 20 19:10:15.049701 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:15.049672 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jq7q4_6afa7914-d2d0-4077-b293-73873dd1cb3e/node-ca/0.log"
Apr 20 19:10:16.925346 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:16.925303 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/89999f13-84c4-4b08-a865-c986f4298fcb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vnjmw\" (UID: \"89999f13-84c4-4b08-a865-c986f4298fcb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vnjmw"
Apr 20 19:10:16.925731 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:16.925449 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 19:10:16.925731 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:16.925515 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89999f13-84c4-4b08-a865-c986f4298fcb-cluster-monitoring-operator-tls podName:89999f13-84c4-4b08-a865-c986f4298fcb nodeName:}" failed. No retries permitted until 2026-04-20 19:10:24.925497319 +0000 UTC m=+149.838993349 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/89999f13-84c4-4b08-a865-c986f4298fcb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-vnjmw" (UID: "89999f13-84c4-4b08-a865-c986f4298fcb") : secret "cluster-monitoring-operator-tls" not found
Apr 20 19:10:16.968120 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:16.968076 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ppjpz"]
Apr 20 19:10:16.969949 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:16.969934 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ppjpz"
Apr 20 19:10:16.972651 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:16.972630 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 20 19:10:16.973805 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:16.973787 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 20 19:10:16.973889 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:16.973872 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-htk9d\""
Apr 20 19:10:16.978920 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:16.978900 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ppjpz"]
Apr 20 19:10:17.026487 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:17.026437 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm26x\" (UniqueName: \"kubernetes.io/projected/f12f4cad-52fc-4e03-8ed1-7fe6e9596b7c-kube-api-access-cm26x\") pod \"volume-data-source-validator-7c6cbb6c87-ppjpz\" (UID: \"f12f4cad-52fc-4e03-8ed1-7fe6e9596b7c\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ppjpz"
Apr 20 19:10:17.104898 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:17.104866 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t2m6s"]
Apr 20 19:10:17.106784 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:17.106764 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t2m6s"
Apr 20 19:10:17.110081 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:17.110063 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-g8xpd\""
Apr 20 19:10:17.110596 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:17.110568 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 20 19:10:17.111140 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:17.111125 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 20 19:10:17.111732 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:17.111716 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 20 19:10:17.127770 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:17.127741 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cm26x\" (UniqueName: \"kubernetes.io/projected/f12f4cad-52fc-4e03-8ed1-7fe6e9596b7c-kube-api-access-cm26x\") pod \"volume-data-source-validator-7c6cbb6c87-ppjpz\" (UID: \"f12f4cad-52fc-4e03-8ed1-7fe6e9596b7c\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ppjpz"
Apr 20 19:10:17.132791 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:17.132767 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t2m6s"]
Apr 20 19:10:17.150386 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:17.150361 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm26x\" (UniqueName: \"kubernetes.io/projected/f12f4cad-52fc-4e03-8ed1-7fe6e9596b7c-kube-api-access-cm26x\") pod \"volume-data-source-validator-7c6cbb6c87-ppjpz\" (UID: \"f12f4cad-52fc-4e03-8ed1-7fe6e9596b7c\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ppjpz"
Apr 20 19:10:17.228496 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:17.228414 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e4971f32-3960-4d1c-8044-7422cab605b1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-t2m6s\" (UID: \"e4971f32-3960-4d1c-8044-7422cab605b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t2m6s"
Apr 20 19:10:17.228496 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:17.228450 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5749g\" (UniqueName: \"kubernetes.io/projected/e4971f32-3960-4d1c-8044-7422cab605b1-kube-api-access-5749g\") pod \"cluster-samples-operator-6dc5bdb6b4-t2m6s\" (UID: \"e4971f32-3960-4d1c-8044-7422cab605b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t2m6s"
Apr 20 19:10:17.277861 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:17.277822 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ppjpz"
Apr 20 19:10:17.329081 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:17.329051 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e4971f32-3960-4d1c-8044-7422cab605b1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-t2m6s\" (UID: \"e4971f32-3960-4d1c-8044-7422cab605b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t2m6s"
Apr 20 19:10:17.329233 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:17.329089 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5749g\" (UniqueName: \"kubernetes.io/projected/e4971f32-3960-4d1c-8044-7422cab605b1-kube-api-access-5749g\") pod \"cluster-samples-operator-6dc5bdb6b4-t2m6s\" (UID: \"e4971f32-3960-4d1c-8044-7422cab605b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t2m6s"
Apr 20 19:10:17.329294 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:17.329253 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 20 19:10:17.329343 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:17.329327 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4971f32-3960-4d1c-8044-7422cab605b1-samples-operator-tls podName:e4971f32-3960-4d1c-8044-7422cab605b1 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:17.829305723 +0000 UTC m=+142.742801773 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e4971f32-3960-4d1c-8044-7422cab605b1-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-t2m6s" (UID: "e4971f32-3960-4d1c-8044-7422cab605b1") : secret "samples-operator-tls" not found
Apr 20 19:10:17.338456 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:17.338423 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5749g\" (UniqueName: \"kubernetes.io/projected/e4971f32-3960-4d1c-8044-7422cab605b1-kube-api-access-5749g\") pod \"cluster-samples-operator-6dc5bdb6b4-t2m6s\" (UID: \"e4971f32-3960-4d1c-8044-7422cab605b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t2m6s"
Apr 20 19:10:17.395010 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:17.394980 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ppjpz"]
Apr 20 19:10:17.397821 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:10:17.397788 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf12f4cad_52fc_4e03_8ed1_7fe6e9596b7c.slice/crio-9b9b779439453df468dc5eb08ec3be67cfce08c2e6185a5a2130be01263500fe WatchSource:0}: Error finding container 9b9b779439453df468dc5eb08ec3be67cfce08c2e6185a5a2130be01263500fe: Status 404 returned error can't find the container with id 9b9b779439453df468dc5eb08ec3be67cfce08c2e6185a5a2130be01263500fe
Apr 20 19:10:17.832627 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:17.832588 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e4971f32-3960-4d1c-8044-7422cab605b1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-t2m6s\" (UID: \"e4971f32-3960-4d1c-8044-7422cab605b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t2m6s"
Apr 20 19:10:17.832806 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:17.832748 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 20 19:10:17.832848 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:17.832827 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4971f32-3960-4d1c-8044-7422cab605b1-samples-operator-tls podName:e4971f32-3960-4d1c-8044-7422cab605b1 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:18.83281091 +0000 UTC m=+143.746306940 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e4971f32-3960-4d1c-8044-7422cab605b1-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-t2m6s" (UID: "e4971f32-3960-4d1c-8044-7422cab605b1") : secret "samples-operator-tls" not found
Apr 20 19:10:18.049461 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:18.049422 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ppjpz" event={"ID":"f12f4cad-52fc-4e03-8ed1-7fe6e9596b7c","Type":"ContainerStarted","Data":"9b9b779439453df468dc5eb08ec3be67cfce08c2e6185a5a2130be01263500fe"}
Apr 20 19:10:18.840777 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:18.840734 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e4971f32-3960-4d1c-8044-7422cab605b1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-t2m6s\" (UID: \"e4971f32-3960-4d1c-8044-7422cab605b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t2m6s"
Apr 20 19:10:18.840983 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:18.840917 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 20 19:10:18.841047 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:18.841002 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4971f32-3960-4d1c-8044-7422cab605b1-samples-operator-tls podName:e4971f32-3960-4d1c-8044-7422cab605b1 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:20.84098062 +0000 UTC m=+145.754476667 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e4971f32-3960-4d1c-8044-7422cab605b1-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-t2m6s" (UID: "e4971f32-3960-4d1c-8044-7422cab605b1") : secret "samples-operator-tls" not found
Apr 20 19:10:19.052161 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:19.052126 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ppjpz" event={"ID":"f12f4cad-52fc-4e03-8ed1-7fe6e9596b7c","Type":"ContainerStarted","Data":"8bf01bfa307a15a9cfb31a25b1fc79b45fb7720daae617e545c761be977addea"}
Apr 20 19:10:19.067425 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:19.067370 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ppjpz" podStartSLOduration=1.802920718 podStartE2EDuration="3.067355304s" podCreationTimestamp="2026-04-20 19:10:16 +0000 UTC" firstStartedPulling="2026-04-20 19:10:17.399657645 +0000 UTC m=+142.313153690" lastFinishedPulling="2026-04-20 19:10:18.664092242 +0000 UTC m=+143.577588276" observedRunningTime="2026-04-20 19:10:19.066392101 +0000 UTC m=+143.979888156" watchObservedRunningTime="2026-04-20 19:10:19.067355304 +0000 UTC m=+143.980851355"
Apr 20 19:10:20.856832 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:20.856793 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e4971f32-3960-4d1c-8044-7422cab605b1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-t2m6s\" (UID: \"e4971f32-3960-4d1c-8044-7422cab605b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t2m6s"
Apr 20 19:10:20.857335 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:20.856979 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 20 19:10:20.857335 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:20.857065 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4971f32-3960-4d1c-8044-7422cab605b1-samples-operator-tls podName:e4971f32-3960-4d1c-8044-7422cab605b1 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:24.857043182 +0000 UTC m=+149.770539234 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e4971f32-3960-4d1c-8044-7422cab605b1-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-t2m6s" (UID: "e4971f32-3960-4d1c-8044-7422cab605b1") : secret "samples-operator-tls" not found
Apr 20 19:10:22.869653 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:22.869627 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-44chr"]
Apr 20 19:10:22.871497 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:22.871482 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-44chr"
Apr 20 19:10:22.873965 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:22.873946 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 20 19:10:22.874242 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:22.874226 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 20 19:10:22.875249 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:22.875229 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-s9gkn\""
Apr 20 19:10:22.885167 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:22.885146 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-44chr"]
Apr 20 19:10:22.976102 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:22.976070 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p92q\" (UniqueName: \"kubernetes.io/projected/37fe3341-d630-483c-b4c5-f4ce6bedf386-kube-api-access-4p92q\") pod \"migrator-74bb7799d9-44chr\" (UID: \"37fe3341-d630-483c-b4c5-f4ce6bedf386\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-44chr"
Apr 20 19:10:23.076943 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:23.076907 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4p92q\" (UniqueName: \"kubernetes.io/projected/37fe3341-d630-483c-b4c5-f4ce6bedf386-kube-api-access-4p92q\") pod \"migrator-74bb7799d9-44chr\" (UID: \"37fe3341-d630-483c-b4c5-f4ce6bedf386\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-44chr"
Apr 20 19:10:23.085247 ip-10-0-134-63 kubenswrapper[2571]:
I0420 19:10:23.085227 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p92q\" (UniqueName: \"kubernetes.io/projected/37fe3341-d630-483c-b4c5-f4ce6bedf386-kube-api-access-4p92q\") pod \"migrator-74bb7799d9-44chr\" (UID: \"37fe3341-d630-483c-b4c5-f4ce6bedf386\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-44chr" Apr 20 19:10:23.180428 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:23.180393 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-44chr" Apr 20 19:10:23.294513 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:23.294479 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-44chr"] Apr 20 19:10:23.297139 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:10:23.297095 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37fe3341_d630_483c_b4c5_f4ce6bedf386.slice/crio-77584b9fa30f19053654819fbc11c01db62febcb18bd30d3f1de4d697a5c5f1d WatchSource:0}: Error finding container 77584b9fa30f19053654819fbc11c01db62febcb18bd30d3f1de4d697a5c5f1d: Status 404 returned error can't find the container with id 77584b9fa30f19053654819fbc11c01db62febcb18bd30d3f1de4d697a5c5f1d Apr 20 19:10:24.062493 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:24.062452 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-44chr" event={"ID":"37fe3341-d630-483c-b4c5-f4ce6bedf386","Type":"ContainerStarted","Data":"77584b9fa30f19053654819fbc11c01db62febcb18bd30d3f1de4d697a5c5f1d"} Apr 20 19:10:24.891145 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:24.891090 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/e4971f32-3960-4d1c-8044-7422cab605b1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-t2m6s\" (UID: \"e4971f32-3960-4d1c-8044-7422cab605b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t2m6s" Apr 20 19:10:24.891289 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:24.891242 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 19:10:24.891343 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:24.891306 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4971f32-3960-4d1c-8044-7422cab605b1-samples-operator-tls podName:e4971f32-3960-4d1c-8044-7422cab605b1 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:32.891289572 +0000 UTC m=+157.804785601 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e4971f32-3960-4d1c-8044-7422cab605b1-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-t2m6s" (UID: "e4971f32-3960-4d1c-8044-7422cab605b1") : secret "samples-operator-tls" not found Apr 20 19:10:24.991915 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:24.991821 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/89999f13-84c4-4b08-a865-c986f4298fcb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vnjmw\" (UID: \"89999f13-84c4-4b08-a865-c986f4298fcb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vnjmw" Apr 20 19:10:24.992077 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:24.991985 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 19:10:24.992077 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:24.992067 2571 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89999f13-84c4-4b08-a865-c986f4298fcb-cluster-monitoring-operator-tls podName:89999f13-84c4-4b08-a865-c986f4298fcb nodeName:}" failed. No retries permitted until 2026-04-20 19:10:40.99204736 +0000 UTC m=+165.905543410 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/89999f13-84c4-4b08-a865-c986f4298fcb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-vnjmw" (UID: "89999f13-84c4-4b08-a865-c986f4298fcb") : secret "cluster-monitoring-operator-tls" not found Apr 20 19:10:25.066363 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.066324 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-44chr" event={"ID":"37fe3341-d630-483c-b4c5-f4ce6bedf386","Type":"ContainerStarted","Data":"20acc0db883b189dbf307ca6ebac6ad7d4c2e386558a2d3d1e5a0e1d0ee2c7e2"} Apr 20 19:10:25.066363 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.066367 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-44chr" event={"ID":"37fe3341-d630-483c-b4c5-f4ce6bedf386","Type":"ContainerStarted","Data":"a1ae74a8c2079e5493d689d1c288333a09e7c951b077981c69fde8dc65a9f20f"} Apr 20 19:10:25.083668 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.083621 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-44chr" podStartSLOduration=1.6634265990000001 podStartE2EDuration="3.083605748s" podCreationTimestamp="2026-04-20 19:10:22 +0000 UTC" firstStartedPulling="2026-04-20 19:10:23.298867493 +0000 UTC m=+148.212363523" lastFinishedPulling="2026-04-20 19:10:24.719046636 +0000 UTC m=+149.632542672" observedRunningTime="2026-04-20 19:10:25.082034397 +0000 UTC m=+149.995530449" 
watchObservedRunningTime="2026-04-20 19:10:25.083605748 +0000 UTC m=+149.997101851" Apr 20 19:10:25.698658 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.696411 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-5bdwt"] Apr 20 19:10:25.699157 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.699141 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-5bdwt" Apr 20 19:10:25.701956 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.701929 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 20 19:10:25.702084 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.701982 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 20 19:10:25.702084 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.701983 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 20 19:10:25.703072 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.703055 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-5hmvn\"" Apr 20 19:10:25.703169 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.703142 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 20 19:10:25.706237 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.706215 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-5bdwt"] Apr 20 19:10:25.716605 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.716583 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-wzddr"] Apr 20 19:10:25.718635 ip-10-0-134-63 kubenswrapper[2571]: I0420 
19:10:25.718621 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-wzddr" Apr 20 19:10:25.721411 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.721392 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-qwcxh\"" Apr 20 19:10:25.721506 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.721447 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 19:10:25.721726 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.721710 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 19:10:25.730408 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.730379 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-wzddr"] Apr 20 19:10:25.800510 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.800472 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5b60c956-b51d-42bf-832c-f7bc88f97e12-signing-key\") pod \"service-ca-865cb79987-5bdwt\" (UID: \"5b60c956-b51d-42bf-832c-f7bc88f97e12\") " pod="openshift-service-ca/service-ca-865cb79987-5bdwt" Apr 20 19:10:25.800510 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.800509 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5b60c956-b51d-42bf-832c-f7bc88f97e12-signing-cabundle\") pod \"service-ca-865cb79987-5bdwt\" (UID: \"5b60c956-b51d-42bf-832c-f7bc88f97e12\") " pod="openshift-service-ca/service-ca-865cb79987-5bdwt" Apr 20 19:10:25.800758 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.800555 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v88j4\" (UniqueName: \"kubernetes.io/projected/5b60c956-b51d-42bf-832c-f7bc88f97e12-kube-api-access-v88j4\") pod \"service-ca-865cb79987-5bdwt\" (UID: \"5b60c956-b51d-42bf-832c-f7bc88f97e12\") " pod="openshift-service-ca/service-ca-865cb79987-5bdwt" Apr 20 19:10:25.800758 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.800625 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/53834f86-30e8-4fea-bc1c-05405718e03a-data-volume\") pod \"insights-runtime-extractor-wzddr\" (UID: \"53834f86-30e8-4fea-bc1c-05405718e03a\") " pod="openshift-insights/insights-runtime-extractor-wzddr" Apr 20 19:10:25.800758 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.800669 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/53834f86-30e8-4fea-bc1c-05405718e03a-crio-socket\") pod \"insights-runtime-extractor-wzddr\" (UID: \"53834f86-30e8-4fea-bc1c-05405718e03a\") " pod="openshift-insights/insights-runtime-extractor-wzddr" Apr 20 19:10:25.800758 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.800707 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f77vj\" (UniqueName: \"kubernetes.io/projected/53834f86-30e8-4fea-bc1c-05405718e03a-kube-api-access-f77vj\") pod \"insights-runtime-extractor-wzddr\" (UID: \"53834f86-30e8-4fea-bc1c-05405718e03a\") " pod="openshift-insights/insights-runtime-extractor-wzddr" Apr 20 19:10:25.800969 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.800768 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/53834f86-30e8-4fea-bc1c-05405718e03a-kube-rbac-proxy-cm\") pod 
\"insights-runtime-extractor-wzddr\" (UID: \"53834f86-30e8-4fea-bc1c-05405718e03a\") " pod="openshift-insights/insights-runtime-extractor-wzddr" Apr 20 19:10:25.800969 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.800805 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/53834f86-30e8-4fea-bc1c-05405718e03a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wzddr\" (UID: \"53834f86-30e8-4fea-bc1c-05405718e03a\") " pod="openshift-insights/insights-runtime-extractor-wzddr" Apr 20 19:10:25.901162 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.901122 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v88j4\" (UniqueName: \"kubernetes.io/projected/5b60c956-b51d-42bf-832c-f7bc88f97e12-kube-api-access-v88j4\") pod \"service-ca-865cb79987-5bdwt\" (UID: \"5b60c956-b51d-42bf-832c-f7bc88f97e12\") " pod="openshift-service-ca/service-ca-865cb79987-5bdwt" Apr 20 19:10:25.901371 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.901173 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/53834f86-30e8-4fea-bc1c-05405718e03a-data-volume\") pod \"insights-runtime-extractor-wzddr\" (UID: \"53834f86-30e8-4fea-bc1c-05405718e03a\") " pod="openshift-insights/insights-runtime-extractor-wzddr" Apr 20 19:10:25.901371 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.901195 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/53834f86-30e8-4fea-bc1c-05405718e03a-crio-socket\") pod \"insights-runtime-extractor-wzddr\" (UID: \"53834f86-30e8-4fea-bc1c-05405718e03a\") " pod="openshift-insights/insights-runtime-extractor-wzddr" Apr 20 19:10:25.901371 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.901219 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-f77vj\" (UniqueName: \"kubernetes.io/projected/53834f86-30e8-4fea-bc1c-05405718e03a-kube-api-access-f77vj\") pod \"insights-runtime-extractor-wzddr\" (UID: \"53834f86-30e8-4fea-bc1c-05405718e03a\") " pod="openshift-insights/insights-runtime-extractor-wzddr" Apr 20 19:10:25.901371 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.901305 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/53834f86-30e8-4fea-bc1c-05405718e03a-crio-socket\") pod \"insights-runtime-extractor-wzddr\" (UID: \"53834f86-30e8-4fea-bc1c-05405718e03a\") " pod="openshift-insights/insights-runtime-extractor-wzddr" Apr 20 19:10:25.901622 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.901388 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/53834f86-30e8-4fea-bc1c-05405718e03a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wzddr\" (UID: \"53834f86-30e8-4fea-bc1c-05405718e03a\") " pod="openshift-insights/insights-runtime-extractor-wzddr" Apr 20 19:10:25.901622 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.901429 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/53834f86-30e8-4fea-bc1c-05405718e03a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wzddr\" (UID: \"53834f86-30e8-4fea-bc1c-05405718e03a\") " pod="openshift-insights/insights-runtime-extractor-wzddr" Apr 20 19:10:25.901622 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.901516 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/53834f86-30e8-4fea-bc1c-05405718e03a-data-volume\") pod \"insights-runtime-extractor-wzddr\" (UID: \"53834f86-30e8-4fea-bc1c-05405718e03a\") " 
pod="openshift-insights/insights-runtime-extractor-wzddr" Apr 20 19:10:25.901622 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.901535 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5b60c956-b51d-42bf-832c-f7bc88f97e12-signing-key\") pod \"service-ca-865cb79987-5bdwt\" (UID: \"5b60c956-b51d-42bf-832c-f7bc88f97e12\") " pod="openshift-service-ca/service-ca-865cb79987-5bdwt" Apr 20 19:10:25.901622 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:25.901522 2571 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 20 19:10:25.901622 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.901557 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5b60c956-b51d-42bf-832c-f7bc88f97e12-signing-cabundle\") pod \"service-ca-865cb79987-5bdwt\" (UID: \"5b60c956-b51d-42bf-832c-f7bc88f97e12\") " pod="openshift-service-ca/service-ca-865cb79987-5bdwt" Apr 20 19:10:25.901622 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:25.901609 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53834f86-30e8-4fea-bc1c-05405718e03a-insights-runtime-extractor-tls podName:53834f86-30e8-4fea-bc1c-05405718e03a nodeName:}" failed. No retries permitted until 2026-04-20 19:10:26.401589851 +0000 UTC m=+151.315086015 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/53834f86-30e8-4fea-bc1c-05405718e03a-insights-runtime-extractor-tls") pod "insights-runtime-extractor-wzddr" (UID: "53834f86-30e8-4fea-bc1c-05405718e03a") : secret "insights-runtime-extractor-tls" not found Apr 20 19:10:25.901999 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.901835 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/53834f86-30e8-4fea-bc1c-05405718e03a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wzddr\" (UID: \"53834f86-30e8-4fea-bc1c-05405718e03a\") " pod="openshift-insights/insights-runtime-extractor-wzddr" Apr 20 19:10:25.902304 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.902283 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5b60c956-b51d-42bf-832c-f7bc88f97e12-signing-cabundle\") pod \"service-ca-865cb79987-5bdwt\" (UID: \"5b60c956-b51d-42bf-832c-f7bc88f97e12\") " pod="openshift-service-ca/service-ca-865cb79987-5bdwt" Apr 20 19:10:25.904140 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.904103 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5b60c956-b51d-42bf-832c-f7bc88f97e12-signing-key\") pod \"service-ca-865cb79987-5bdwt\" (UID: \"5b60c956-b51d-42bf-832c-f7bc88f97e12\") " pod="openshift-service-ca/service-ca-865cb79987-5bdwt" Apr 20 19:10:25.911926 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.911902 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v88j4\" (UniqueName: \"kubernetes.io/projected/5b60c956-b51d-42bf-832c-f7bc88f97e12-kube-api-access-v88j4\") pod \"service-ca-865cb79987-5bdwt\" (UID: \"5b60c956-b51d-42bf-832c-f7bc88f97e12\") " pod="openshift-service-ca/service-ca-865cb79987-5bdwt" Apr 20 19:10:25.912037 
ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:25.912019 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f77vj\" (UniqueName: \"kubernetes.io/projected/53834f86-30e8-4fea-bc1c-05405718e03a-kube-api-access-f77vj\") pod \"insights-runtime-extractor-wzddr\" (UID: \"53834f86-30e8-4fea-bc1c-05405718e03a\") " pod="openshift-insights/insights-runtime-extractor-wzddr" Apr 20 19:10:26.007912 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:26.007828 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-5bdwt" Apr 20 19:10:26.122143 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:26.122093 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-5bdwt"] Apr 20 19:10:26.126103 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:10:26.126076 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b60c956_b51d_42bf_832c_f7bc88f97e12.slice/crio-288d4b39986aa0ebd31637c7739401210020201802476173112b53d46863220d WatchSource:0}: Error finding container 288d4b39986aa0ebd31637c7739401210020201802476173112b53d46863220d: Status 404 returned error can't find the container with id 288d4b39986aa0ebd31637c7739401210020201802476173112b53d46863220d Apr 20 19:10:26.404029 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:26.403938 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/53834f86-30e8-4fea-bc1c-05405718e03a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wzddr\" (UID: \"53834f86-30e8-4fea-bc1c-05405718e03a\") " pod="openshift-insights/insights-runtime-extractor-wzddr" Apr 20 19:10:26.404204 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:26.404090 2571 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret 
"insights-runtime-extractor-tls" not found Apr 20 19:10:26.404204 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:26.404169 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53834f86-30e8-4fea-bc1c-05405718e03a-insights-runtime-extractor-tls podName:53834f86-30e8-4fea-bc1c-05405718e03a nodeName:}" failed. No retries permitted until 2026-04-20 19:10:27.404153169 +0000 UTC m=+152.317649198 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/53834f86-30e8-4fea-bc1c-05405718e03a-insights-runtime-extractor-tls") pod "insights-runtime-extractor-wzddr" (UID: "53834f86-30e8-4fea-bc1c-05405718e03a") : secret "insights-runtime-extractor-tls" not found Apr 20 19:10:27.074952 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:27.074907 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-5bdwt" event={"ID":"5b60c956-b51d-42bf-832c-f7bc88f97e12","Type":"ContainerStarted","Data":"288d4b39986aa0ebd31637c7739401210020201802476173112b53d46863220d"} Apr 20 19:10:27.410666 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:27.410572 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/53834f86-30e8-4fea-bc1c-05405718e03a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wzddr\" (UID: \"53834f86-30e8-4fea-bc1c-05405718e03a\") " pod="openshift-insights/insights-runtime-extractor-wzddr" Apr 20 19:10:27.411065 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:27.410755 2571 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 20 19:10:27.411065 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:27.410859 2571 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/53834f86-30e8-4fea-bc1c-05405718e03a-insights-runtime-extractor-tls podName:53834f86-30e8-4fea-bc1c-05405718e03a nodeName:}" failed. No retries permitted until 2026-04-20 19:10:29.410825846 +0000 UTC m=+154.324321880 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/53834f86-30e8-4fea-bc1c-05405718e03a-insights-runtime-extractor-tls") pod "insights-runtime-extractor-wzddr" (UID: "53834f86-30e8-4fea-bc1c-05405718e03a") : secret "insights-runtime-extractor-tls" not found Apr 20 19:10:28.078670 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:28.078576 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-5bdwt" event={"ID":"5b60c956-b51d-42bf-832c-f7bc88f97e12","Type":"ContainerStarted","Data":"1d8af229cf4ac4a23b6f96e707776ca8a5efeb2beb68e9f06c5919af72d2b823"} Apr 20 19:10:29.429251 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:29.429210 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/53834f86-30e8-4fea-bc1c-05405718e03a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wzddr\" (UID: \"53834f86-30e8-4fea-bc1c-05405718e03a\") " pod="openshift-insights/insights-runtime-extractor-wzddr" Apr 20 19:10:29.429642 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:29.429386 2571 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 20 19:10:29.429642 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:29.429455 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53834f86-30e8-4fea-bc1c-05405718e03a-insights-runtime-extractor-tls podName:53834f86-30e8-4fea-bc1c-05405718e03a nodeName:}" failed. 
No retries permitted until 2026-04-20 19:10:33.429438101 +0000 UTC m=+158.342934131 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/53834f86-30e8-4fea-bc1c-05405718e03a-insights-runtime-extractor-tls") pod "insights-runtime-extractor-wzddr" (UID: "53834f86-30e8-4fea-bc1c-05405718e03a") : secret "insights-runtime-extractor-tls" not found
Apr 20 19:10:31.923091 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:31.923043 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx" podUID="fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19"
Apr 20 19:10:31.935217 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:31.935183 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-4hs99" podUID="d4955879-53ea-4f74-96dd-eae67dbbe030"
Apr 20 19:10:31.945324 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:31.945300 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-7kxxk" podUID="4937ec4e-cf01-4c28-aa08-5a8a7d722ff9"
Apr 20 19:10:31.952507 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:31.952472 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-mwrz2" podUID="f543f92c-946e-4d67-ad5f-48be19c49af7"
Apr 20 19:10:32.090184 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:32.090151 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7kxxk"
Apr 20 19:10:32.090376 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:32.090150 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx"
Apr 20 19:10:32.961828 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:32.961787 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e4971f32-3960-4d1c-8044-7422cab605b1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-t2m6s\" (UID: \"e4971f32-3960-4d1c-8044-7422cab605b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t2m6s"
Apr 20 19:10:32.964245 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:32.964222 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e4971f32-3960-4d1c-8044-7422cab605b1-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-t2m6s\" (UID: \"e4971f32-3960-4d1c-8044-7422cab605b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t2m6s"
Apr 20 19:10:33.014911 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:33.014870 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t2m6s"
Apr 20 19:10:33.142692 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:33.142636 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-5bdwt" podStartSLOduration=6.582474002 podStartE2EDuration="8.142619907s" podCreationTimestamp="2026-04-20 19:10:25 +0000 UTC" firstStartedPulling="2026-04-20 19:10:26.127926237 +0000 UTC m=+151.041422266" lastFinishedPulling="2026-04-20 19:10:27.688072127 +0000 UTC m=+152.601568171" observedRunningTime="2026-04-20 19:10:28.094596849 +0000 UTC m=+153.008092901" watchObservedRunningTime="2026-04-20 19:10:33.142619907 +0000 UTC m=+158.056116236"
Apr 20 19:10:33.143633 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:33.143605 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t2m6s"]
Apr 20 19:10:33.465952 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:33.465914 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/53834f86-30e8-4fea-bc1c-05405718e03a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wzddr\" (UID: \"53834f86-30e8-4fea-bc1c-05405718e03a\") " pod="openshift-insights/insights-runtime-extractor-wzddr"
Apr 20 19:10:33.466143 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:33.466099 2571 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 20 19:10:33.466222 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:33.466209 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53834f86-30e8-4fea-bc1c-05405718e03a-insights-runtime-extractor-tls podName:53834f86-30e8-4fea-bc1c-05405718e03a nodeName:}" failed. No retries permitted until 2026-04-20 19:10:41.466187352 +0000 UTC m=+166.379683386 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/53834f86-30e8-4fea-bc1c-05405718e03a-insights-runtime-extractor-tls") pod "insights-runtime-extractor-wzddr" (UID: "53834f86-30e8-4fea-bc1c-05405718e03a") : secret "insights-runtime-extractor-tls" not found
Apr 20 19:10:33.707122 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:33.707061 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-ff5pq" podUID="cc31ab16-2946-4d9a-baee-c02a00b73aae"
Apr 20 19:10:34.097335 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:34.097299 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t2m6s" event={"ID":"e4971f32-3960-4d1c-8044-7422cab605b1","Type":"ContainerStarted","Data":"042cef2523560beebbec05bdc28ed53539b0d332ed097d84fbd3b8f2125496ff"}
Apr 20 19:10:35.101289 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:35.101197 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t2m6s" event={"ID":"e4971f32-3960-4d1c-8044-7422cab605b1","Type":"ContainerStarted","Data":"f2b8a0a0ee2095b270ec6f439f22dc9275a32ba16147e894445b4defafc6c2dc"}
Apr 20 19:10:35.101289 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:35.101238 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t2m6s" event={"ID":"e4971f32-3960-4d1c-8044-7422cab605b1","Type":"ContainerStarted","Data":"3d95e4ca57d5cfcab8d3f1246c716d2df8985176039879cc83029097732b45d4"}
Apr 20 19:10:35.119248 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:35.119199 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-t2m6s" podStartSLOduration=16.608695713 podStartE2EDuration="18.119186533s" podCreationTimestamp="2026-04-20 19:10:17 +0000 UTC" firstStartedPulling="2026-04-20 19:10:33.179853691 +0000 UTC m=+158.093349721" lastFinishedPulling="2026-04-20 19:10:34.690344511 +0000 UTC m=+159.603840541" observedRunningTime="2026-04-20 19:10:35.117694723 +0000 UTC m=+160.031190776" watchObservedRunningTime="2026-04-20 19:10:35.119186533 +0000 UTC m=+160.032682682"
Apr 20 19:10:36.895657 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:36.895610 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-registry-tls\") pod \"image-registry-9d6cb88f9-f5dkx\" (UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx"
Apr 20 19:10:36.895657 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:36.895659 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d4955879-53ea-4f74-96dd-eae67dbbe030-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4hs99\" (UID: \"d4955879-53ea-4f74-96dd-eae67dbbe030\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4hs99"
Apr 20 19:10:36.896081 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:36.895770 2571 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 19:10:36.896081 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:36.895836 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4955879-53ea-4f74-96dd-eae67dbbe030-networking-console-plugin-cert podName:d4955879-53ea-4f74-96dd-eae67dbbe030 nodeName:}" failed. No retries permitted until 2026-04-20 19:12:38.895818891 +0000 UTC m=+283.809314939 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d4955879-53ea-4f74-96dd-eae67dbbe030-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-4hs99" (UID: "d4955879-53ea-4f74-96dd-eae67dbbe030") : secret "networking-console-plugin-cert" not found
Apr 20 19:10:36.898052 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:36.898019 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-registry-tls\") pod \"image-registry-9d6cb88f9-f5dkx\" (UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx"
Apr 20 19:10:36.996386 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:36.996353 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4937ec4e-cf01-4c28-aa08-5a8a7d722ff9-metrics-tls\") pod \"dns-default-7kxxk\" (UID: \"4937ec4e-cf01-4c28-aa08-5a8a7d722ff9\") " pod="openshift-dns/dns-default-7kxxk"
Apr 20 19:10:36.996386 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:36.996392 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f543f92c-946e-4d67-ad5f-48be19c49af7-cert\") pod \"ingress-canary-mwrz2\" (UID: \"f543f92c-946e-4d67-ad5f-48be19c49af7\") " pod="openshift-ingress-canary/ingress-canary-mwrz2"
Apr 20 19:10:36.998787 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:36.998761 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f543f92c-946e-4d67-ad5f-48be19c49af7-cert\") pod \"ingress-canary-mwrz2\" (UID: \"f543f92c-946e-4d67-ad5f-48be19c49af7\") " pod="openshift-ingress-canary/ingress-canary-mwrz2"
Apr 20 19:10:36.999234 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:36.999217 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4937ec4e-cf01-4c28-aa08-5a8a7d722ff9-metrics-tls\") pod \"dns-default-7kxxk\" (UID: \"4937ec4e-cf01-4c28-aa08-5a8a7d722ff9\") " pod="openshift-dns/dns-default-7kxxk"
Apr 20 19:10:37.194605 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:37.194569 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ggf2q\""
Apr 20 19:10:37.194605 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:37.194574 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-w68rf\""
Apr 20 19:10:37.201053 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:37.201032 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7kxxk"
Apr 20 19:10:37.201203 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:37.201067 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx"
Apr 20 19:10:37.331646 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:37.331622 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7kxxk"]
Apr 20 19:10:37.333962 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:10:37.333935 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4937ec4e_cf01_4c28_aa08_5a8a7d722ff9.slice/crio-fc7f546398e66700dce7522076d61203c0c49c8a05ce1fcc42115497ac5195fd WatchSource:0}: Error finding container fc7f546398e66700dce7522076d61203c0c49c8a05ce1fcc42115497ac5195fd: Status 404 returned error can't find the container with id fc7f546398e66700dce7522076d61203c0c49c8a05ce1fcc42115497ac5195fd
Apr 20 19:10:37.349093 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:37.349063 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-9d6cb88f9-f5dkx"]
Apr 20 19:10:37.351494 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:10:37.351469 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd54e0b0_d5ef_4e48_92e1_1f6a9c491f19.slice/crio-19b5ce152254f8bdaa6b950e0acd16ae83682ab48b9afbeb5f93c526141de77d WatchSource:0}: Error finding container 19b5ce152254f8bdaa6b950e0acd16ae83682ab48b9afbeb5f93c526141de77d: Status 404 returned error can't find the container with id 19b5ce152254f8bdaa6b950e0acd16ae83682ab48b9afbeb5f93c526141de77d
Apr 20 19:10:38.110702 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:38.110612 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx" event={"ID":"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19","Type":"ContainerStarted","Data":"b8c16a3a808358a6715cd29d88c8b25d85d00be634cfdd94675757dd57c0d32e"}
Apr 20 19:10:38.110702 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:38.110661 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx" event={"ID":"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19","Type":"ContainerStarted","Data":"19b5ce152254f8bdaa6b950e0acd16ae83682ab48b9afbeb5f93c526141de77d"}
Apr 20 19:10:38.111264 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:38.111177 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx"
Apr 20 19:10:38.116558 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:38.116532 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7kxxk" event={"ID":"4937ec4e-cf01-4c28-aa08-5a8a7d722ff9","Type":"ContainerStarted","Data":"fc7f546398e66700dce7522076d61203c0c49c8a05ce1fcc42115497ac5195fd"}
Apr 20 19:10:38.133683 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:38.133628 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx" podStartSLOduration=162.133610448 podStartE2EDuration="2m42.133610448s" podCreationTimestamp="2026-04-20 19:07:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:10:38.132313216 +0000 UTC m=+163.045809281" watchObservedRunningTime="2026-04-20 19:10:38.133610448 +0000 UTC m=+163.047106501"
Apr 20 19:10:39.122339 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:39.122281 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7kxxk" event={"ID":"4937ec4e-cf01-4c28-aa08-5a8a7d722ff9","Type":"ContainerStarted","Data":"0bddb5f2ff8afcad16af43b88fbb52ff77025b5bd2873ec6fd87a1add597acfc"}
Apr 20 19:10:39.122339 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:39.122345 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7kxxk" event={"ID":"4937ec4e-cf01-4c28-aa08-5a8a7d722ff9","Type":"ContainerStarted","Data":"83b21fd9cc714453c765f39b69b7b05d1b9b8b04d37e06d5032b0175a60d0165"}
Apr 20 19:10:39.140628 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:39.140572 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-7kxxk" podStartSLOduration=129.817380296 podStartE2EDuration="2m11.140557163s" podCreationTimestamp="2026-04-20 19:08:28 +0000 UTC" firstStartedPulling="2026-04-20 19:10:37.335822857 +0000 UTC m=+162.249318888" lastFinishedPulling="2026-04-20 19:10:38.658999711 +0000 UTC m=+163.572495755" observedRunningTime="2026-04-20 19:10:39.140142919 +0000 UTC m=+164.053638972" watchObservedRunningTime="2026-04-20 19:10:39.140557163 +0000 UTC m=+164.054053214"
Apr 20 19:10:40.124625 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:40.124583 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-7kxxk"
Apr 20 19:10:41.033715 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:41.033673 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/89999f13-84c4-4b08-a865-c986f4298fcb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vnjmw\" (UID: \"89999f13-84c4-4b08-a865-c986f4298fcb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vnjmw"
Apr 20 19:10:41.036019 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:41.035998 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/89999f13-84c4-4b08-a865-c986f4298fcb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vnjmw\" (UID: \"89999f13-84c4-4b08-a865-c986f4298fcb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vnjmw"
Apr 20 19:10:41.324758 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:41.324689 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vnjmw"
Apr 20 19:10:41.440995 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:41.440959 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-vnjmw"]
Apr 20 19:10:41.445123 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:10:41.445078 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89999f13_84c4_4b08_a865_c986f4298fcb.slice/crio-7f286f2ed28663e65f9eea78d542b767f0ed0ddb02b94eeb921e52bb41c91dca WatchSource:0}: Error finding container 7f286f2ed28663e65f9eea78d542b767f0ed0ddb02b94eeb921e52bb41c91dca: Status 404 returned error can't find the container with id 7f286f2ed28663e65f9eea78d542b767f0ed0ddb02b94eeb921e52bb41c91dca
Apr 20 19:10:41.538704 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:41.538669 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/53834f86-30e8-4fea-bc1c-05405718e03a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wzddr\" (UID: \"53834f86-30e8-4fea-bc1c-05405718e03a\") " pod="openshift-insights/insights-runtime-extractor-wzddr"
Apr 20 19:10:41.540892 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:41.540863 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/53834f86-30e8-4fea-bc1c-05405718e03a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wzddr\" (UID: \"53834f86-30e8-4fea-bc1c-05405718e03a\") " pod="openshift-insights/insights-runtime-extractor-wzddr"
Apr 20 19:10:41.627023 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:41.626930 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-wzddr"
Apr 20 19:10:41.741977 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:41.741950 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-wzddr"]
Apr 20 19:10:41.744787 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:10:41.744756 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53834f86_30e8_4fea_bc1c_05405718e03a.slice/crio-ee48601e526cad4a89ab32e2b89a10aca257570b00d239c2399bad144ad784e9 WatchSource:0}: Error finding container ee48601e526cad4a89ab32e2b89a10aca257570b00d239c2399bad144ad784e9: Status 404 returned error can't find the container with id ee48601e526cad4a89ab32e2b89a10aca257570b00d239c2399bad144ad784e9
Apr 20 19:10:42.132457 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:42.132418 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wzddr" event={"ID":"53834f86-30e8-4fea-bc1c-05405718e03a","Type":"ContainerStarted","Data":"7cec275ec464d35b8e1831fd3ec7aabd3b4c77b0d08a972dcada14c3dba7d4e6"}
Apr 20 19:10:42.132609 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:42.132464 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wzddr" event={"ID":"53834f86-30e8-4fea-bc1c-05405718e03a","Type":"ContainerStarted","Data":"ee48601e526cad4a89ab32e2b89a10aca257570b00d239c2399bad144ad784e9"}
Apr 20 19:10:42.133327 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:42.133302 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vnjmw" event={"ID":"89999f13-84c4-4b08-a865-c986f4298fcb","Type":"ContainerStarted","Data":"7f286f2ed28663e65f9eea78d542b767f0ed0ddb02b94eeb921e52bb41c91dca"}
Apr 20 19:10:42.687077 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:42.687036 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-4hs99"
Apr 20 19:10:42.687077 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:42.687070 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mwrz2"
Apr 20 19:10:42.690164 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:42.690142 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-gtxhr\""
Apr 20 19:10:42.698214 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:42.698195 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mwrz2"
Apr 20 19:10:43.091903 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:43.091869 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mwrz2"]
Apr 20 19:10:43.095338 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:10:43.095299 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf543f92c_946e_4d67_ad5f_48be19c49af7.slice/crio-73f9fa5deb032d9c5d7ca5bf4382effd5caebf9da312972481b1899657c88698 WatchSource:0}: Error finding container 73f9fa5deb032d9c5d7ca5bf4382effd5caebf9da312972481b1899657c88698: Status 404 returned error can't find the container with id 73f9fa5deb032d9c5d7ca5bf4382effd5caebf9da312972481b1899657c88698
Apr 20 19:10:43.137469 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:43.137419 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vnjmw" event={"ID":"89999f13-84c4-4b08-a865-c986f4298fcb","Type":"ContainerStarted","Data":"d3daef18c6720b38ca3f88eb880a8fa5641cdde7ba72dd33e5174f46825452e7"}
Apr 20 19:10:43.139169 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:43.139140 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wzddr" event={"ID":"53834f86-30e8-4fea-bc1c-05405718e03a","Type":"ContainerStarted","Data":"f8e1b384274a524fe01b1a2878eeb1031a5c06f5263b3347af2c239d22756ba7"}
Apr 20 19:10:43.140328 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:43.140297 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mwrz2" event={"ID":"f543f92c-946e-4d67-ad5f-48be19c49af7","Type":"ContainerStarted","Data":"73f9fa5deb032d9c5d7ca5bf4382effd5caebf9da312972481b1899657c88698"}
Apr 20 19:10:43.156894 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:43.156495 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vnjmw" podStartSLOduration=32.608373591 podStartE2EDuration="34.156478904s" podCreationTimestamp="2026-04-20 19:10:09 +0000 UTC" firstStartedPulling="2026-04-20 19:10:41.446976566 +0000 UTC m=+166.360472610" lastFinishedPulling="2026-04-20 19:10:42.99508189 +0000 UTC m=+167.908577923" observedRunningTime="2026-04-20 19:10:43.154250888 +0000 UTC m=+168.067746941" watchObservedRunningTime="2026-04-20 19:10:43.156478904 +0000 UTC m=+168.069974959"
Apr 20 19:10:44.687615 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:44.687579 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff5pq"
Apr 20 19:10:45.147996 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:45.147906 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wzddr" event={"ID":"53834f86-30e8-4fea-bc1c-05405718e03a","Type":"ContainerStarted","Data":"cde8a3608fbb48992a4c699760ab3918a8ff46cf4bf5967f94f499cfed9c1931"}
Apr 20 19:10:45.149221 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:45.149184 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mwrz2" event={"ID":"f543f92c-946e-4d67-ad5f-48be19c49af7","Type":"ContainerStarted","Data":"ff1c20024c84233d56fd3475da6a5a99ca85b64bd054fd2970beeb18dc4bb718"}
Apr 20 19:10:45.167696 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:45.167657 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-wzddr" podStartSLOduration=17.866972083 podStartE2EDuration="20.1676457s" podCreationTimestamp="2026-04-20 19:10:25 +0000 UTC" firstStartedPulling="2026-04-20 19:10:41.794706325 +0000 UTC m=+166.708202354" lastFinishedPulling="2026-04-20 19:10:44.095379938 +0000 UTC m=+169.008875971" observedRunningTime="2026-04-20 19:10:45.167326176 +0000 UTC m=+170.080822229" watchObservedRunningTime="2026-04-20 19:10:45.1676457 +0000 UTC m=+170.081141744"
Apr 20 19:10:45.182503 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:45.182462 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-mwrz2" podStartSLOduration=135.52730574 podStartE2EDuration="2m17.18245089s" podCreationTimestamp="2026-04-20 19:08:28 +0000 UTC" firstStartedPulling="2026-04-20 19:10:43.100471101 +0000 UTC m=+168.013967144" lastFinishedPulling="2026-04-20 19:10:44.755616264 +0000 UTC m=+169.669112294" observedRunningTime="2026-04-20 19:10:45.181995562 +0000 UTC m=+170.095491604" watchObservedRunningTime="2026-04-20 19:10:45.18245089 +0000 UTC m=+170.095946957"
Apr 20 19:10:46.824830 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:46.824794 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-6p5l6"]
Apr 20 19:10:46.827309 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:46.827286 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-6p5l6"
Apr 20 19:10:46.831340 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:46.831316 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 20 19:10:46.831850 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:46.831832 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-cwkxt\""
Apr 20 19:10:46.832740 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:46.832707 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 20 19:10:46.838398 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:46.838378 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-99t2q"]
Apr 20 19:10:46.840345 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:46.840324 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-99t2q"
Apr 20 19:10:46.843158 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:46.843136 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-45x7f\""
Apr 20 19:10:46.843619 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:46.843595 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 20 19:10:46.845088 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:46.845065 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-6p5l6"]
Apr 20 19:10:46.848721 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:46.848700 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-99t2q"]
Apr 20 19:10:46.880633 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:46.880590 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c3efa91a-e01d-4412-96f9-53621efefdd3-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-99t2q\" (UID: \"c3efa91a-e01d-4412-96f9-53621efefdd3\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-99t2q"
Apr 20 19:10:46.880795 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:46.880653 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf52s\" (UniqueName: \"kubernetes.io/projected/b945cabb-3adb-4bb8-868d-60d1d730cf72-kube-api-access-nf52s\") pod \"downloads-6bcc868b7-6p5l6\" (UID: \"b945cabb-3adb-4bb8-868d-60d1d730cf72\") " pod="openshift-console/downloads-6bcc868b7-6p5l6"
Apr 20 19:10:46.982003 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:46.981966 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c3efa91a-e01d-4412-96f9-53621efefdd3-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-99t2q\" (UID: \"c3efa91a-e01d-4412-96f9-53621efefdd3\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-99t2q"
Apr 20 19:10:46.982212 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:46.982032 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nf52s\" (UniqueName: \"kubernetes.io/projected/b945cabb-3adb-4bb8-868d-60d1d730cf72-kube-api-access-nf52s\") pod \"downloads-6bcc868b7-6p5l6\" (UID: \"b945cabb-3adb-4bb8-868d-60d1d730cf72\") " pod="openshift-console/downloads-6bcc868b7-6p5l6"
Apr 20 19:10:46.984686 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:46.984658 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/c3efa91a-e01d-4412-96f9-53621efefdd3-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-99t2q\" (UID: \"c3efa91a-e01d-4412-96f9-53621efefdd3\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-99t2q"
Apr 20 19:10:46.991545 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:46.991520 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf52s\" (UniqueName: \"kubernetes.io/projected/b945cabb-3adb-4bb8-868d-60d1d730cf72-kube-api-access-nf52s\") pod \"downloads-6bcc868b7-6p5l6\" (UID: \"b945cabb-3adb-4bb8-868d-60d1d730cf72\") " pod="openshift-console/downloads-6bcc868b7-6p5l6"
Apr 20 19:10:47.140873 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:47.140792 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-6p5l6"
Apr 20 19:10:47.150537 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:47.150504 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-99t2q"
Apr 20 19:10:47.276359 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:47.276336 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-6p5l6"]
Apr 20 19:10:47.277083 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:10:47.277055 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb945cabb_3adb_4bb8_868d_60d1d730cf72.slice/crio-6001c6e39988baf509768945f1ec34fbed5d987c0ee29e19a2d1525f30fdff65 WatchSource:0}: Error finding container 6001c6e39988baf509768945f1ec34fbed5d987c0ee29e19a2d1525f30fdff65: Status 404 returned error can't find the container with id 6001c6e39988baf509768945f1ec34fbed5d987c0ee29e19a2d1525f30fdff65
Apr 20 19:10:47.300735 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:47.300711 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-99t2q"]
Apr 20 19:10:47.302966 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:10:47.302943 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3efa91a_e01d_4412_96f9_53621efefdd3.slice/crio-4aca42812c4538c2dde464604ecd285e71d78777f30eab59be0293a6edd6992a WatchSource:0}: Error finding container 4aca42812c4538c2dde464604ecd285e71d78777f30eab59be0293a6edd6992a: Status 404 returned error can't find the container with id 4aca42812c4538c2dde464604ecd285e71d78777f30eab59be0293a6edd6992a
Apr 20 19:10:48.158891 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:48.158857 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-6p5l6" event={"ID":"b945cabb-3adb-4bb8-868d-60d1d730cf72","Type":"ContainerStarted","Data":"6001c6e39988baf509768945f1ec34fbed5d987c0ee29e19a2d1525f30fdff65"}
Apr 20 19:10:48.160379 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:48.160350 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-99t2q" event={"ID":"c3efa91a-e01d-4412-96f9-53621efefdd3","Type":"ContainerStarted","Data":"4aca42812c4538c2dde464604ecd285e71d78777f30eab59be0293a6edd6992a"}
Apr 20 19:10:49.164638 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:49.164593 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-99t2q" event={"ID":"c3efa91a-e01d-4412-96f9-53621efefdd3","Type":"ContainerStarted","Data":"a1ff2f89b61ced4794c15059e02278308146bc8a5a7614a9217411176a0e252d"}
Apr 20 19:10:49.165062 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:49.164800 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-99t2q"
Apr 20 19:10:49.171186 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:49.171155 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-99t2q"
Apr 20 19:10:49.182589 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:49.182546 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-99t2q" podStartSLOduration=2.042516571 podStartE2EDuration="3.182532138s" podCreationTimestamp="2026-04-20 19:10:46 +0000 UTC" firstStartedPulling="2026-04-20 19:10:47.30461667 +0000 UTC m=+172.218112701" lastFinishedPulling="2026-04-20 19:10:48.444632233 +0000 UTC m=+173.358128268" observedRunningTime="2026-04-20 19:10:49.181573818 +0000 UTC m=+174.095069891" watchObservedRunningTime="2026-04-20 19:10:49.182532138 +0000 UTC m=+174.096028190"
Apr 20 19:10:49.616664 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:49.616578 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-9nwvw"]
Apr 20 19:10:49.618852 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:49.618829 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-9nwvw"
Apr 20 19:10:49.621744 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:49.621714 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 20 19:10:49.621744 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:49.621719 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 20 19:10:49.622008 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:49.621988 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 20 19:10:49.622826 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:49.622808 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-7ttcx\""
Apr 20 19:10:49.633342 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:49.633296 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-9nwvw"]
Apr 20 19:10:49.706164 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:49.706132 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c1364ffd-2a9a-4f02-941b-d6c600aaebd2-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-9nwvw\" (UID: \"c1364ffd-2a9a-4f02-941b-d6c600aaebd2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9nwvw"
Apr 20 19:10:49.706351 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:49.706254 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8dr8\" (UniqueName: \"kubernetes.io/projected/c1364ffd-2a9a-4f02-941b-d6c600aaebd2-kube-api-access-l8dr8\") pod \"prometheus-operator-5676c8c784-9nwvw\" (UID: \"c1364ffd-2a9a-4f02-941b-d6c600aaebd2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9nwvw"
Apr 20 19:10:49.706351 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:49.706285 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/c1364ffd-2a9a-4f02-941b-d6c600aaebd2-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-9nwvw\" (UID: \"c1364ffd-2a9a-4f02-941b-d6c600aaebd2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9nwvw"
Apr 20 19:10:49.706351 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:49.706305 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c1364ffd-2a9a-4f02-941b-d6c600aaebd2-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-9nwvw\" (UID: \"c1364ffd-2a9a-4f02-941b-d6c600aaebd2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9nwvw"
Apr 20 19:10:49.806989 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:49.806952 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8dr8\" (UniqueName: \"kubernetes.io/projected/c1364ffd-2a9a-4f02-941b-d6c600aaebd2-kube-api-access-l8dr8\") pod \"prometheus-operator-5676c8c784-9nwvw\" (UID: \"c1364ffd-2a9a-4f02-941b-d6c600aaebd2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9nwvw"
Apr 20 19:10:49.807227 ip-10-0-134-63
kubenswrapper[2571]: I0420 19:10:49.806996 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/c1364ffd-2a9a-4f02-941b-d6c600aaebd2-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-9nwvw\" (UID: \"c1364ffd-2a9a-4f02-941b-d6c600aaebd2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9nwvw" Apr 20 19:10:49.807227 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:49.807029 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c1364ffd-2a9a-4f02-941b-d6c600aaebd2-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-9nwvw\" (UID: \"c1364ffd-2a9a-4f02-941b-d6c600aaebd2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9nwvw" Apr 20 19:10:49.807227 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:49.807059 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c1364ffd-2a9a-4f02-941b-d6c600aaebd2-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-9nwvw\" (UID: \"c1364ffd-2a9a-4f02-941b-d6c600aaebd2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9nwvw" Apr 20 19:10:49.807227 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:49.807140 2571 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 20 19:10:49.807227 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:49.807214 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1364ffd-2a9a-4f02-941b-d6c600aaebd2-prometheus-operator-tls podName:c1364ffd-2a9a-4f02-941b-d6c600aaebd2 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:50.307192648 +0000 UTC m=+175.220688679 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/c1364ffd-2a9a-4f02-941b-d6c600aaebd2-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-9nwvw" (UID: "c1364ffd-2a9a-4f02-941b-d6c600aaebd2") : secret "prometheus-operator-tls" not found Apr 20 19:10:49.807883 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:49.807856 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c1364ffd-2a9a-4f02-941b-d6c600aaebd2-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-9nwvw\" (UID: \"c1364ffd-2a9a-4f02-941b-d6c600aaebd2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9nwvw" Apr 20 19:10:49.809951 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:49.809925 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c1364ffd-2a9a-4f02-941b-d6c600aaebd2-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-9nwvw\" (UID: \"c1364ffd-2a9a-4f02-941b-d6c600aaebd2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9nwvw" Apr 20 19:10:49.816868 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:49.816836 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8dr8\" (UniqueName: \"kubernetes.io/projected/c1364ffd-2a9a-4f02-941b-d6c600aaebd2-kube-api-access-l8dr8\") pod \"prometheus-operator-5676c8c784-9nwvw\" (UID: \"c1364ffd-2a9a-4f02-941b-d6c600aaebd2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9nwvw" Apr 20 19:10:50.129240 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:50.129203 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-7kxxk" Apr 20 19:10:50.311366 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:50.311319 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/c1364ffd-2a9a-4f02-941b-d6c600aaebd2-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-9nwvw\" (UID: \"c1364ffd-2a9a-4f02-941b-d6c600aaebd2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9nwvw" Apr 20 19:10:50.314174 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:50.314147 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/c1364ffd-2a9a-4f02-941b-d6c600aaebd2-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-9nwvw\" (UID: \"c1364ffd-2a9a-4f02-941b-d6c600aaebd2\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9nwvw" Apr 20 19:10:50.529960 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:50.529926 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-9nwvw" Apr 20 19:10:50.661918 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:50.661890 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-9nwvw"] Apr 20 19:10:50.664387 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:10:50.664353 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1364ffd_2a9a_4f02_941b_d6c600aaebd2.slice/crio-b4e8beedcb029c072362ef75e0b62b143eb63e9d83ed220c9f767b8671b8b275 WatchSource:0}: Error finding container b4e8beedcb029c072362ef75e0b62b143eb63e9d83ed220c9f767b8671b8b275: Status 404 returned error can't find the container with id b4e8beedcb029c072362ef75e0b62b143eb63e9d83ed220c9f767b8671b8b275 Apr 20 19:10:51.175907 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:51.175869 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-9nwvw" 
event={"ID":"c1364ffd-2a9a-4f02-941b-d6c600aaebd2","Type":"ContainerStarted","Data":"b4e8beedcb029c072362ef75e0b62b143eb63e9d83ed220c9f767b8671b8b275"} Apr 20 19:10:52.180648 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:52.180528 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-9nwvw" event={"ID":"c1364ffd-2a9a-4f02-941b-d6c600aaebd2","Type":"ContainerStarted","Data":"ecd3f68faf4590204cc2c00953fdd3784901fa92e0adeb77808b062eb3e07d3f"} Apr 20 19:10:52.180648 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:52.180563 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-9nwvw" event={"ID":"c1364ffd-2a9a-4f02-941b-d6c600aaebd2","Type":"ContainerStarted","Data":"b25aac953b2607851438546cd0ccb2fa405491ff8b2508343de79b79b79431d0"} Apr 20 19:10:52.200243 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:52.200182 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-9nwvw" podStartSLOduration=1.944812797 podStartE2EDuration="3.200161981s" podCreationTimestamp="2026-04-20 19:10:49 +0000 UTC" firstStartedPulling="2026-04-20 19:10:50.666499521 +0000 UTC m=+175.579995555" lastFinishedPulling="2026-04-20 19:10:51.9218487 +0000 UTC m=+176.835344739" observedRunningTime="2026-04-20 19:10:52.198400312 +0000 UTC m=+177.111896363" watchObservedRunningTime="2026-04-20 19:10:52.200161981 +0000 UTC m=+177.113658034" Apr 20 19:10:53.985547 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:53.985506 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-2sxrw"] Apr 20 19:10:53.988100 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:53.988079 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-2sxrw" Apr 20 19:10:53.991193 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:53.991171 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 20 19:10:53.993096 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:53.992898 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 20 19:10:53.993096 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:53.992938 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-glpxc\"" Apr 20 19:10:53.993264 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:53.993199 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 20 19:10:54.011970 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.011930 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-zglz9"] Apr 20 19:10:54.014582 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.014550 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-2sxrw"] Apr 20 19:10:54.014717 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.014701 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-zglz9" Apr 20 19:10:54.020120 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.017783 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 19:10:54.020120 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.017917 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-lszj4\"" Apr 20 19:10:54.020120 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.018213 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 19:10:54.020120 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.018674 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 19:10:54.045548 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.045516 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2h6l\" (UniqueName: \"kubernetes.io/projected/1bf31385-2f55-475c-849b-f4ceb3ac894b-kube-api-access-z2h6l\") pod \"kube-state-metrics-69db897b98-2sxrw\" (UID: \"1bf31385-2f55-475c-849b-f4ceb3ac894b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2sxrw" Apr 20 19:10:54.045702 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.045566 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/817e7306-d6d2-47c0-895a-14dc5408d1d6-node-exporter-textfile\") pod \"node-exporter-zglz9\" (UID: \"817e7306-d6d2-47c0-895a-14dc5408d1d6\") " pod="openshift-monitoring/node-exporter-zglz9" Apr 20 19:10:54.045702 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.045595 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/817e7306-d6d2-47c0-895a-14dc5408d1d6-metrics-client-ca\") pod \"node-exporter-zglz9\" (UID: \"817e7306-d6d2-47c0-895a-14dc5408d1d6\") " pod="openshift-monitoring/node-exporter-zglz9" Apr 20 19:10:54.045702 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.045626 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1bf31385-2f55-475c-849b-f4ceb3ac894b-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-2sxrw\" (UID: \"1bf31385-2f55-475c-849b-f4ceb3ac894b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2sxrw" Apr 20 19:10:54.045702 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.045658 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/1bf31385-2f55-475c-849b-f4ceb3ac894b-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-2sxrw\" (UID: \"1bf31385-2f55-475c-849b-f4ceb3ac894b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2sxrw" Apr 20 19:10:54.045943 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.045714 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/817e7306-d6d2-47c0-895a-14dc5408d1d6-root\") pod \"node-exporter-zglz9\" (UID: \"817e7306-d6d2-47c0-895a-14dc5408d1d6\") " pod="openshift-monitoring/node-exporter-zglz9" Apr 20 19:10:54.045943 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.045751 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/1bf31385-2f55-475c-849b-f4ceb3ac894b-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-2sxrw\" (UID: \"1bf31385-2f55-475c-849b-f4ceb3ac894b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2sxrw" Apr 20 19:10:54.045943 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.045783 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1bf31385-2f55-475c-849b-f4ceb3ac894b-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-2sxrw\" (UID: \"1bf31385-2f55-475c-849b-f4ceb3ac894b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2sxrw" Apr 20 19:10:54.045943 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.045824 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/817e7306-d6d2-47c0-895a-14dc5408d1d6-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zglz9\" (UID: \"817e7306-d6d2-47c0-895a-14dc5408d1d6\") " pod="openshift-monitoring/node-exporter-zglz9" Apr 20 19:10:54.045943 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.045855 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/817e7306-d6d2-47c0-895a-14dc5408d1d6-sys\") pod \"node-exporter-zglz9\" (UID: \"817e7306-d6d2-47c0-895a-14dc5408d1d6\") " pod="openshift-monitoring/node-exporter-zglz9" Apr 20 19:10:54.045943 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.045878 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/817e7306-d6d2-47c0-895a-14dc5408d1d6-node-exporter-accelerators-collector-config\") pod \"node-exporter-zglz9\" (UID: 
\"817e7306-d6d2-47c0-895a-14dc5408d1d6\") " pod="openshift-monitoring/node-exporter-zglz9" Apr 20 19:10:54.045943 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.045908 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/817e7306-d6d2-47c0-895a-14dc5408d1d6-node-exporter-wtmp\") pod \"node-exporter-zglz9\" (UID: \"817e7306-d6d2-47c0-895a-14dc5408d1d6\") " pod="openshift-monitoring/node-exporter-zglz9" Apr 20 19:10:54.045943 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.045938 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/1bf31385-2f55-475c-849b-f4ceb3ac894b-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-2sxrw\" (UID: \"1bf31385-2f55-475c-849b-f4ceb3ac894b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2sxrw" Apr 20 19:10:54.046350 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.045985 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/817e7306-d6d2-47c0-895a-14dc5408d1d6-node-exporter-tls\") pod \"node-exporter-zglz9\" (UID: \"817e7306-d6d2-47c0-895a-14dc5408d1d6\") " pod="openshift-monitoring/node-exporter-zglz9" Apr 20 19:10:54.046350 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.046026 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s98g\" (UniqueName: \"kubernetes.io/projected/817e7306-d6d2-47c0-895a-14dc5408d1d6-kube-api-access-5s98g\") pod \"node-exporter-zglz9\" (UID: \"817e7306-d6d2-47c0-895a-14dc5408d1d6\") " pod="openshift-monitoring/node-exporter-zglz9" Apr 20 19:10:54.147259 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.147219 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"root\" (UniqueName: \"kubernetes.io/host-path/817e7306-d6d2-47c0-895a-14dc5408d1d6-root\") pod \"node-exporter-zglz9\" (UID: \"817e7306-d6d2-47c0-895a-14dc5408d1d6\") " pod="openshift-monitoring/node-exporter-zglz9" Apr 20 19:10:54.147491 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.147271 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1bf31385-2f55-475c-849b-f4ceb3ac894b-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-2sxrw\" (UID: \"1bf31385-2f55-475c-849b-f4ceb3ac894b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2sxrw" Apr 20 19:10:54.147491 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.147305 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1bf31385-2f55-475c-849b-f4ceb3ac894b-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-2sxrw\" (UID: \"1bf31385-2f55-475c-849b-f4ceb3ac894b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2sxrw" Apr 20 19:10:54.147491 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.147338 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/817e7306-d6d2-47c0-895a-14dc5408d1d6-root\") pod \"node-exporter-zglz9\" (UID: \"817e7306-d6d2-47c0-895a-14dc5408d1d6\") " pod="openshift-monitoring/node-exporter-zglz9" Apr 20 19:10:54.147491 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.147350 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/817e7306-d6d2-47c0-895a-14dc5408d1d6-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zglz9\" (UID: \"817e7306-d6d2-47c0-895a-14dc5408d1d6\") " pod="openshift-monitoring/node-exporter-zglz9" Apr 20 19:10:54.147491 
ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.147395 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/817e7306-d6d2-47c0-895a-14dc5408d1d6-sys\") pod \"node-exporter-zglz9\" (UID: \"817e7306-d6d2-47c0-895a-14dc5408d1d6\") " pod="openshift-monitoring/node-exporter-zglz9" Apr 20 19:10:54.147491 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.147420 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/817e7306-d6d2-47c0-895a-14dc5408d1d6-node-exporter-accelerators-collector-config\") pod \"node-exporter-zglz9\" (UID: \"817e7306-d6d2-47c0-895a-14dc5408d1d6\") " pod="openshift-monitoring/node-exporter-zglz9" Apr 20 19:10:54.147491 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.147448 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/817e7306-d6d2-47c0-895a-14dc5408d1d6-node-exporter-wtmp\") pod \"node-exporter-zglz9\" (UID: \"817e7306-d6d2-47c0-895a-14dc5408d1d6\") " pod="openshift-monitoring/node-exporter-zglz9" Apr 20 19:10:54.147491 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:54.147454 2571 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 20 19:10:54.147491 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.147477 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/1bf31385-2f55-475c-849b-f4ceb3ac894b-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-2sxrw\" (UID: \"1bf31385-2f55-475c-849b-f4ceb3ac894b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2sxrw" Apr 20 19:10:54.147929 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:54.147514 2571 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/1bf31385-2f55-475c-849b-f4ceb3ac894b-kube-state-metrics-tls podName:1bf31385-2f55-475c-849b-f4ceb3ac894b nodeName:}" failed. No retries permitted until 2026-04-20 19:10:54.647494387 +0000 UTC m=+179.560990419 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/1bf31385-2f55-475c-849b-f4ceb3ac894b-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-2sxrw" (UID: "1bf31385-2f55-475c-849b-f4ceb3ac894b") : secret "kube-state-metrics-tls" not found Apr 20 19:10:54.147929 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.147572 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/817e7306-d6d2-47c0-895a-14dc5408d1d6-node-exporter-tls\") pod \"node-exporter-zglz9\" (UID: \"817e7306-d6d2-47c0-895a-14dc5408d1d6\") " pod="openshift-monitoring/node-exporter-zglz9" Apr 20 19:10:54.147929 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.147620 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5s98g\" (UniqueName: \"kubernetes.io/projected/817e7306-d6d2-47c0-895a-14dc5408d1d6-kube-api-access-5s98g\") pod \"node-exporter-zglz9\" (UID: \"817e7306-d6d2-47c0-895a-14dc5408d1d6\") " pod="openshift-monitoring/node-exporter-zglz9" Apr 20 19:10:54.147929 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.147655 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2h6l\" (UniqueName: \"kubernetes.io/projected/1bf31385-2f55-475c-849b-f4ceb3ac894b-kube-api-access-z2h6l\") pod \"kube-state-metrics-69db897b98-2sxrw\" (UID: \"1bf31385-2f55-475c-849b-f4ceb3ac894b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2sxrw" Apr 20 19:10:54.147929 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.147691 2571 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/817e7306-d6d2-47c0-895a-14dc5408d1d6-node-exporter-textfile\") pod \"node-exporter-zglz9\" (UID: \"817e7306-d6d2-47c0-895a-14dc5408d1d6\") " pod="openshift-monitoring/node-exporter-zglz9" Apr 20 19:10:54.147929 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.147718 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/817e7306-d6d2-47c0-895a-14dc5408d1d6-metrics-client-ca\") pod \"node-exporter-zglz9\" (UID: \"817e7306-d6d2-47c0-895a-14dc5408d1d6\") " pod="openshift-monitoring/node-exporter-zglz9" Apr 20 19:10:54.147929 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.147748 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1bf31385-2f55-475c-849b-f4ceb3ac894b-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-2sxrw\" (UID: \"1bf31385-2f55-475c-849b-f4ceb3ac894b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2sxrw" Apr 20 19:10:54.147929 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.147786 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/1bf31385-2f55-475c-849b-f4ceb3ac894b-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-2sxrw\" (UID: \"1bf31385-2f55-475c-849b-f4ceb3ac894b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2sxrw" Apr 20 19:10:54.147929 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.147816 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/817e7306-d6d2-47c0-895a-14dc5408d1d6-sys\") pod \"node-exporter-zglz9\" (UID: \"817e7306-d6d2-47c0-895a-14dc5408d1d6\") " pod="openshift-monitoring/node-exporter-zglz9" Apr 20 
19:10:54.148371 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.148093 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/1bf31385-2f55-475c-849b-f4ceb3ac894b-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-2sxrw\" (UID: \"1bf31385-2f55-475c-849b-f4ceb3ac894b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2sxrw" Apr 20 19:10:54.148371 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.148249 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/817e7306-d6d2-47c0-895a-14dc5408d1d6-node-exporter-wtmp\") pod \"node-exporter-zglz9\" (UID: \"817e7306-d6d2-47c0-895a-14dc5408d1d6\") " pod="openshift-monitoring/node-exporter-zglz9" Apr 20 19:10:54.148479 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.148451 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/817e7306-d6d2-47c0-895a-14dc5408d1d6-metrics-client-ca\") pod \"node-exporter-zglz9\" (UID: \"817e7306-d6d2-47c0-895a-14dc5408d1d6\") " pod="openshift-monitoring/node-exporter-zglz9" Apr 20 19:10:54.148563 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:54.148548 2571 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 20 19:10:54.148632 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.148561 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/1bf31385-2f55-475c-849b-f4ceb3ac894b-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-2sxrw\" (UID: \"1bf31385-2f55-475c-849b-f4ceb3ac894b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2sxrw" Apr 20 19:10:54.148632 ip-10-0-134-63 kubenswrapper[2571]: E0420 
19:10:54.148615 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/817e7306-d6d2-47c0-895a-14dc5408d1d6-node-exporter-tls podName:817e7306-d6d2-47c0-895a-14dc5408d1d6 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:54.648590157 +0000 UTC m=+179.562086189 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/817e7306-d6d2-47c0-895a-14dc5408d1d6-node-exporter-tls") pod "node-exporter-zglz9" (UID: "817e7306-d6d2-47c0-895a-14dc5408d1d6") : secret "node-exporter-tls" not found Apr 20 19:10:54.148917 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.148891 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/817e7306-d6d2-47c0-895a-14dc5408d1d6-node-exporter-textfile\") pod \"node-exporter-zglz9\" (UID: \"817e7306-d6d2-47c0-895a-14dc5408d1d6\") " pod="openshift-monitoring/node-exporter-zglz9" Apr 20 19:10:54.148983 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.148957 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/817e7306-d6d2-47c0-895a-14dc5408d1d6-node-exporter-accelerators-collector-config\") pod \"node-exporter-zglz9\" (UID: \"817e7306-d6d2-47c0-895a-14dc5408d1d6\") " pod="openshift-monitoring/node-exporter-zglz9" Apr 20 19:10:54.149326 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.149295 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1bf31385-2f55-475c-849b-f4ceb3ac894b-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-2sxrw\" (UID: \"1bf31385-2f55-475c-849b-f4ceb3ac894b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2sxrw" Apr 20 19:10:54.150426 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.150405 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1bf31385-2f55-475c-849b-f4ceb3ac894b-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-2sxrw\" (UID: \"1bf31385-2f55-475c-849b-f4ceb3ac894b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2sxrw" Apr 20 19:10:54.151309 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.151287 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/817e7306-d6d2-47c0-895a-14dc5408d1d6-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zglz9\" (UID: \"817e7306-d6d2-47c0-895a-14dc5408d1d6\") " pod="openshift-monitoring/node-exporter-zglz9" Apr 20 19:10:54.156923 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.156881 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2h6l\" (UniqueName: \"kubernetes.io/projected/1bf31385-2f55-475c-849b-f4ceb3ac894b-kube-api-access-z2h6l\") pod \"kube-state-metrics-69db897b98-2sxrw\" (UID: \"1bf31385-2f55-475c-849b-f4ceb3ac894b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2sxrw" Apr 20 19:10:54.157984 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.157966 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s98g\" (UniqueName: \"kubernetes.io/projected/817e7306-d6d2-47c0-895a-14dc5408d1d6-kube-api-access-5s98g\") pod \"node-exporter-zglz9\" (UID: \"817e7306-d6d2-47c0-895a-14dc5408d1d6\") " pod="openshift-monitoring/node-exporter-zglz9" Apr 20 19:10:54.652560 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.652497 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/817e7306-d6d2-47c0-895a-14dc5408d1d6-node-exporter-tls\") pod \"node-exporter-zglz9\" (UID: 
\"817e7306-d6d2-47c0-895a-14dc5408d1d6\") " pod="openshift-monitoring/node-exporter-zglz9" Apr 20 19:10:54.652756 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.652647 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1bf31385-2f55-475c-849b-f4ceb3ac894b-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-2sxrw\" (UID: \"1bf31385-2f55-475c-849b-f4ceb3ac894b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2sxrw" Apr 20 19:10:54.652826 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:54.652764 2571 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 20 19:10:54.652930 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:54.652912 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/817e7306-d6d2-47c0-895a-14dc5408d1d6-node-exporter-tls podName:817e7306-d6d2-47c0-895a-14dc5408d1d6 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:55.652884536 +0000 UTC m=+180.566380589 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/817e7306-d6d2-47c0-895a-14dc5408d1d6-node-exporter-tls") pod "node-exporter-zglz9" (UID: "817e7306-d6d2-47c0-895a-14dc5408d1d6") : secret "node-exporter-tls" not found Apr 20 19:10:54.655516 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.655491 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1bf31385-2f55-475c-849b-f4ceb3ac894b-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-2sxrw\" (UID: \"1bf31385-2f55-475c-849b-f4ceb3ac894b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2sxrw" Apr 20 19:10:54.909471 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:54.909387 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-2sxrw" Apr 20 19:10:55.093450 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.093417 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 19:10:55.097570 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.097543 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.103726 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.102473 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 20 19:10:55.103726 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.102701 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 20 19:10:55.103726 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.102743 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 20 19:10:55.103726 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.102907 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 20 19:10:55.103726 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.103083 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 20 19:10:55.103726 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.103175 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 20 19:10:55.103726 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.103305 2571 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 20 19:10:55.103726 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.103361 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 20 19:10:55.103726 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.103610 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-2z7mp\"" Apr 20 19:10:55.107162 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.106816 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 20 19:10:55.111997 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.111030 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 19:10:55.119377 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.119332 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-2sxrw"] Apr 20 19:10:55.123365 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:10:55.123326 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bf31385_2f55_475c_849b_f4ceb3ac894b.slice/crio-640ba5e116fc5702829ddb4a75d68b7e8503c07d54e7c42589c168269a68578c WatchSource:0}: Error finding container 640ba5e116fc5702829ddb4a75d68b7e8503c07d54e7c42589c168269a68578c: Status 404 returned error can't find the container with id 640ba5e116fc5702829ddb4a75d68b7e8503c07d54e7c42589c168269a68578c Apr 20 19:10:55.158566 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.158481 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bb88d695-a7a4-44c7-a5d1-1399aa607235-tls-assets\") pod \"alertmanager-main-0\" 
(UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.158566 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.158524 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-web-config\") pod \"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.158759 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.158662 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-config-volume\") pod \"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.158759 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.158699 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bb88d695-a7a4-44c7-a5d1-1399aa607235-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.158759 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.158724 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bb88d695-a7a4-44c7-a5d1-1399aa607235-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.158921 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.158766 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv5x8\" (UniqueName: 
\"kubernetes.io/projected/bb88d695-a7a4-44c7-a5d1-1399aa607235-kube-api-access-kv5x8\") pod \"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.158921 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.158796 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.158921 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.158830 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb88d695-a7a4-44c7-a5d1-1399aa607235-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.158921 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.158864 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.158921 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.158893 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.159191 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.158955 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bb88d695-a7a4-44c7-a5d1-1399aa607235-config-out\") pod \"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.159191 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.158978 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.159191 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.159031 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.194671 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.194628 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-2sxrw" event={"ID":"1bf31385-2f55-475c-849b-f4ceb3ac894b","Type":"ContainerStarted","Data":"640ba5e116fc5702829ddb4a75d68b7e8503c07d54e7c42589c168269a68578c"} Apr 20 19:10:55.259898 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.259862 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-config-volume\") pod 
\"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.259898 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.259907 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bb88d695-a7a4-44c7-a5d1-1399aa607235-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.260179 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.259934 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bb88d695-a7a4-44c7-a5d1-1399aa607235-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.260179 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.259970 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kv5x8\" (UniqueName: \"kubernetes.io/projected/bb88d695-a7a4-44c7-a5d1-1399aa607235-kube-api-access-kv5x8\") pod \"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.260179 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.260004 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.260179 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.260038 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/bb88d695-a7a4-44c7-a5d1-1399aa607235-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.260179 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.260075 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.260179 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.260103 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.260488 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.260183 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bb88d695-a7a4-44c7-a5d1-1399aa607235-config-out\") pod \"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.260488 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.260206 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.260488 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:55.260217 2571 
secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 20 19:10:55.260488 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.260241 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.260488 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:55.260287 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-secret-alertmanager-main-tls podName:bb88d695-a7a4-44c7-a5d1-1399aa607235 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:55.760264727 +0000 UTC m=+180.673760779 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "bb88d695-a7a4-44c7-a5d1-1399aa607235") : secret "alertmanager-main-tls" not found Apr 20 19:10:55.260488 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.260313 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bb88d695-a7a4-44c7-a5d1-1399aa607235-tls-assets\") pod \"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.260488 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.260344 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-web-config\") pod \"alertmanager-main-0\" (UID: 
\"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.260488 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.260353 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bb88d695-a7a4-44c7-a5d1-1399aa607235-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.260488 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:10:55.260369 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bb88d695-a7a4-44c7-a5d1-1399aa607235-alertmanager-trusted-ca-bundle podName:bb88d695-a7a4-44c7-a5d1-1399aa607235 nodeName:}" failed. No retries permitted until 2026-04-20 19:10:55.760355721 +0000 UTC m=+180.673851770 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/bb88d695-a7a4-44c7-a5d1-1399aa607235-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "bb88d695-a7a4-44c7-a5d1-1399aa607235") : configmap references non-existent config key: ca-bundle.crt Apr 20 19:10:55.261577 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.261554 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bb88d695-a7a4-44c7-a5d1-1399aa607235-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.264071 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.264047 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.264214 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.264191 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.264619 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.264569 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bb88d695-a7a4-44c7-a5d1-1399aa607235-config-out\") pod \"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.266362 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.266323 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.266962 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.266523 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bb88d695-a7a4-44c7-a5d1-1399aa607235-tls-assets\") pod \"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.267383 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.267342 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.267756 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.267684 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-web-config\") pod \"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.268049 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.267996 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-config-volume\") pod \"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.269336 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.269310 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv5x8\" (UniqueName: \"kubernetes.io/projected/bb88d695-a7a4-44c7-a5d1-1399aa607235-kube-api-access-kv5x8\") pod \"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.665053 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.665026 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/817e7306-d6d2-47c0-895a-14dc5408d1d6-node-exporter-tls\") pod \"node-exporter-zglz9\" (UID: \"817e7306-d6d2-47c0-895a-14dc5408d1d6\") " pod="openshift-monitoring/node-exporter-zglz9" Apr 20 19:10:55.667899 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.667871 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/817e7306-d6d2-47c0-895a-14dc5408d1d6-node-exporter-tls\") pod \"node-exporter-zglz9\" (UID: \"817e7306-d6d2-47c0-895a-14dc5408d1d6\") " pod="openshift-monitoring/node-exporter-zglz9" Apr 20 19:10:55.765578 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.765526 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.765762 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.765591 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb88d695-a7a4-44c7-a5d1-1399aa607235-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.767026 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.766997 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb88d695-a7a4-44c7-a5d1-1399aa607235-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.768715 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.768692 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:55.830189 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.830152 2571 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-lszj4\"" Apr 20 19:10:55.837031 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:55.836999 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-zglz9" Apr 20 19:10:55.851218 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:10:55.851184 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod817e7306_d6d2_47c0_895a_14dc5408d1d6.slice/crio-37eba0da4d47d2d68a1e6233cda9ae854fb71fa2000eca564e4281782f034a73 WatchSource:0}: Error finding container 37eba0da4d47d2d68a1e6233cda9ae854fb71fa2000eca564e4281782f034a73: Status 404 returned error can't find the container with id 37eba0da4d47d2d68a1e6233cda9ae854fb71fa2000eca564e4281782f034a73 Apr 20 19:10:56.015594 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:56.015503 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-2z7mp\"" Apr 20 19:10:56.023843 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:56.023811 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:10:56.185084 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:56.185033 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 19:10:56.199830 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:56.199795 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zglz9" event={"ID":"817e7306-d6d2-47c0-895a-14dc5408d1d6","Type":"ContainerStarted","Data":"37eba0da4d47d2d68a1e6233cda9ae854fb71fa2000eca564e4281782f034a73"} Apr 20 19:10:56.429393 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:10:56.429355 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb88d695_a7a4_44c7_a5d1_1399aa607235.slice/crio-ace7526719f374cb82df7dfce6bc092a22f67b20a597bb2f70b42854f383be4a WatchSource:0}: Error finding container ace7526719f374cb82df7dfce6bc092a22f67b20a597bb2f70b42854f383be4a: Status 404 returned error can't find the container with id ace7526719f374cb82df7dfce6bc092a22f67b20a597bb2f70b42854f383be4a Apr 20 19:10:57.207261 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:57.207117 2571 patch_prober.go:28] interesting pod/image-registry-9d6cb88f9-f5dkx container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 20 19:10:57.207261 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:57.207179 2571 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx" podUID="fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 19:10:57.207755 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:57.207437 2571 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-2sxrw" event={"ID":"1bf31385-2f55-475c-849b-f4ceb3ac894b","Type":"ContainerStarted","Data":"a9596ad5a908d23001ecb5d3e04c92569e48f488274c630bf61c7245d7ea4c0d"} Apr 20 19:10:57.207755 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:57.207468 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-2sxrw" event={"ID":"1bf31385-2f55-475c-849b-f4ceb3ac894b","Type":"ContainerStarted","Data":"793a9de35da251c6add67ab06b3c35c74ac4f3256a69e728253a19581da1d7a5"} Apr 20 19:10:57.207755 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:57.207483 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-2sxrw" event={"ID":"1bf31385-2f55-475c-849b-f4ceb3ac894b","Type":"ContainerStarted","Data":"641d0f3cfbf2345d7113f5bd77ad67c99942129a7db875fb055562f38e5affa3"} Apr 20 19:10:57.211443 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:57.210621 2571 generic.go:358] "Generic (PLEG): container finished" podID="817e7306-d6d2-47c0-895a-14dc5408d1d6" containerID="2f321744291f07137342ef639620136bd44ae9825ecd5f7c42216d6d74ba3fbc" exitCode=0 Apr 20 19:10:57.211443 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:57.210729 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zglz9" event={"ID":"817e7306-d6d2-47c0-895a-14dc5408d1d6","Type":"ContainerDied","Data":"2f321744291f07137342ef639620136bd44ae9825ecd5f7c42216d6d74ba3fbc"} Apr 20 19:10:57.213038 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:57.213012 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb88d695-a7a4-44c7-a5d1-1399aa607235","Type":"ContainerStarted","Data":"ace7526719f374cb82df7dfce6bc092a22f67b20a597bb2f70b42854f383be4a"} Apr 20 19:10:57.229414 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:57.229365 2571 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-2sxrw" podStartSLOduration=2.890043021 podStartE2EDuration="4.229347913s" podCreationTimestamp="2026-04-20 19:10:53 +0000 UTC" firstStartedPulling="2026-04-20 19:10:55.125655518 +0000 UTC m=+180.039151555" lastFinishedPulling="2026-04-20 19:10:56.464960402 +0000 UTC m=+181.378456447" observedRunningTime="2026-04-20 19:10:57.22698523 +0000 UTC m=+182.140481313" watchObservedRunningTime="2026-04-20 19:10:57.229347913 +0000 UTC m=+182.142843966" Apr 20 19:10:58.217784 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.217749 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zglz9" event={"ID":"817e7306-d6d2-47c0-895a-14dc5408d1d6","Type":"ContainerStarted","Data":"e1bb1115038982e4d385cc1c6cf3da7a6113813e89a02e69cd5432a1d26cc97c"} Apr 20 19:10:58.293025 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.292995 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-8659f8b99f-8q6qm"] Apr 20 19:10:58.296204 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.296179 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-8659f8b99f-8q6qm" Apr 20 19:10:58.300121 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.300068 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 20 19:10:58.300292 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.300276 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 20 19:10:58.300376 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.300348 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 20 19:10:58.300454 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.300420 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 20 19:10:58.300454 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.300275 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-869gae9t97u7\"" Apr 20 19:10:58.300638 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.300614 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-m8mt9\"" Apr 20 19:10:58.307603 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.307583 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-8659f8b99f-8q6qm"] Apr 20 19:10:58.390786 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.390749 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/4e7f8208-d67f-45c0-92a0-ee2371caa82c-metrics-server-audit-profiles\") pod \"metrics-server-8659f8b99f-8q6qm\" (UID: \"4e7f8208-d67f-45c0-92a0-ee2371caa82c\") " 
pod="openshift-monitoring/metrics-server-8659f8b99f-8q6qm" Apr 20 19:10:58.390964 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.390811 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/4e7f8208-d67f-45c0-92a0-ee2371caa82c-audit-log\") pod \"metrics-server-8659f8b99f-8q6qm\" (UID: \"4e7f8208-d67f-45c0-92a0-ee2371caa82c\") " pod="openshift-monitoring/metrics-server-8659f8b99f-8q6qm" Apr 20 19:10:58.390964 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.390829 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/4e7f8208-d67f-45c0-92a0-ee2371caa82c-secret-metrics-server-tls\") pod \"metrics-server-8659f8b99f-8q6qm\" (UID: \"4e7f8208-d67f-45c0-92a0-ee2371caa82c\") " pod="openshift-monitoring/metrics-server-8659f8b99f-8q6qm" Apr 20 19:10:58.390964 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.390931 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e7f8208-d67f-45c0-92a0-ee2371caa82c-client-ca-bundle\") pod \"metrics-server-8659f8b99f-8q6qm\" (UID: \"4e7f8208-d67f-45c0-92a0-ee2371caa82c\") " pod="openshift-monitoring/metrics-server-8659f8b99f-8q6qm" Apr 20 19:10:58.391100 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.390977 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/4e7f8208-d67f-45c0-92a0-ee2371caa82c-secret-metrics-server-client-certs\") pod \"metrics-server-8659f8b99f-8q6qm\" (UID: \"4e7f8208-d67f-45c0-92a0-ee2371caa82c\") " pod="openshift-monitoring/metrics-server-8659f8b99f-8q6qm" Apr 20 19:10:58.391100 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.391016 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8txz\" (UniqueName: \"kubernetes.io/projected/4e7f8208-d67f-45c0-92a0-ee2371caa82c-kube-api-access-s8txz\") pod \"metrics-server-8659f8b99f-8q6qm\" (UID: \"4e7f8208-d67f-45c0-92a0-ee2371caa82c\") " pod="openshift-monitoring/metrics-server-8659f8b99f-8q6qm" Apr 20 19:10:58.395745 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.391037 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e7f8208-d67f-45c0-92a0-ee2371caa82c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-8659f8b99f-8q6qm\" (UID: \"4e7f8208-d67f-45c0-92a0-ee2371caa82c\") " pod="openshift-monitoring/metrics-server-8659f8b99f-8q6qm" Apr 20 19:10:58.497070 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.496968 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e7f8208-d67f-45c0-92a0-ee2371caa82c-client-ca-bundle\") pod \"metrics-server-8659f8b99f-8q6qm\" (UID: \"4e7f8208-d67f-45c0-92a0-ee2371caa82c\") " pod="openshift-monitoring/metrics-server-8659f8b99f-8q6qm" Apr 20 19:10:58.497070 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.497041 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/4e7f8208-d67f-45c0-92a0-ee2371caa82c-secret-metrics-server-client-certs\") pod \"metrics-server-8659f8b99f-8q6qm\" (UID: \"4e7f8208-d67f-45c0-92a0-ee2371caa82c\") " pod="openshift-monitoring/metrics-server-8659f8b99f-8q6qm" Apr 20 19:10:58.497327 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.497090 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s8txz\" (UniqueName: 
\"kubernetes.io/projected/4e7f8208-d67f-45c0-92a0-ee2371caa82c-kube-api-access-s8txz\") pod \"metrics-server-8659f8b99f-8q6qm\" (UID: \"4e7f8208-d67f-45c0-92a0-ee2371caa82c\") " pod="openshift-monitoring/metrics-server-8659f8b99f-8q6qm" Apr 20 19:10:58.497327 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.497150 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e7f8208-d67f-45c0-92a0-ee2371caa82c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-8659f8b99f-8q6qm\" (UID: \"4e7f8208-d67f-45c0-92a0-ee2371caa82c\") " pod="openshift-monitoring/metrics-server-8659f8b99f-8q6qm" Apr 20 19:10:58.497327 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.497244 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/4e7f8208-d67f-45c0-92a0-ee2371caa82c-metrics-server-audit-profiles\") pod \"metrics-server-8659f8b99f-8q6qm\" (UID: \"4e7f8208-d67f-45c0-92a0-ee2371caa82c\") " pod="openshift-monitoring/metrics-server-8659f8b99f-8q6qm" Apr 20 19:10:58.497327 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.497297 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/4e7f8208-d67f-45c0-92a0-ee2371caa82c-audit-log\") pod \"metrics-server-8659f8b99f-8q6qm\" (UID: \"4e7f8208-d67f-45c0-92a0-ee2371caa82c\") " pod="openshift-monitoring/metrics-server-8659f8b99f-8q6qm" Apr 20 19:10:58.497327 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.497319 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/4e7f8208-d67f-45c0-92a0-ee2371caa82c-secret-metrics-server-tls\") pod \"metrics-server-8659f8b99f-8q6qm\" (UID: \"4e7f8208-d67f-45c0-92a0-ee2371caa82c\") " 
pod="openshift-monitoring/metrics-server-8659f8b99f-8q6qm" Apr 20 19:10:58.497817 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.497785 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/4e7f8208-d67f-45c0-92a0-ee2371caa82c-audit-log\") pod \"metrics-server-8659f8b99f-8q6qm\" (UID: \"4e7f8208-d67f-45c0-92a0-ee2371caa82c\") " pod="openshift-monitoring/metrics-server-8659f8b99f-8q6qm" Apr 20 19:10:58.498032 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.497980 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e7f8208-d67f-45c0-92a0-ee2371caa82c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-8659f8b99f-8q6qm\" (UID: \"4e7f8208-d67f-45c0-92a0-ee2371caa82c\") " pod="openshift-monitoring/metrics-server-8659f8b99f-8q6qm" Apr 20 19:10:58.498415 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.498372 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/4e7f8208-d67f-45c0-92a0-ee2371caa82c-metrics-server-audit-profiles\") pod \"metrics-server-8659f8b99f-8q6qm\" (UID: \"4e7f8208-d67f-45c0-92a0-ee2371caa82c\") " pod="openshift-monitoring/metrics-server-8659f8b99f-8q6qm" Apr 20 19:10:58.499985 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.499957 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/4e7f8208-d67f-45c0-92a0-ee2371caa82c-secret-metrics-server-client-certs\") pod \"metrics-server-8659f8b99f-8q6qm\" (UID: \"4e7f8208-d67f-45c0-92a0-ee2371caa82c\") " pod="openshift-monitoring/metrics-server-8659f8b99f-8q6qm" Apr 20 19:10:58.500128 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.500092 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/4e7f8208-d67f-45c0-92a0-ee2371caa82c-client-ca-bundle\") pod \"metrics-server-8659f8b99f-8q6qm\" (UID: \"4e7f8208-d67f-45c0-92a0-ee2371caa82c\") " pod="openshift-monitoring/metrics-server-8659f8b99f-8q6qm" Apr 20 19:10:58.500256 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.500227 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/4e7f8208-d67f-45c0-92a0-ee2371caa82c-secret-metrics-server-tls\") pod \"metrics-server-8659f8b99f-8q6qm\" (UID: \"4e7f8208-d67f-45c0-92a0-ee2371caa82c\") " pod="openshift-monitoring/metrics-server-8659f8b99f-8q6qm" Apr 20 19:10:58.506323 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.506304 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8txz\" (UniqueName: \"kubernetes.io/projected/4e7f8208-d67f-45c0-92a0-ee2371caa82c-kube-api-access-s8txz\") pod \"metrics-server-8659f8b99f-8q6qm\" (UID: \"4e7f8208-d67f-45c0-92a0-ee2371caa82c\") " pod="openshift-monitoring/metrics-server-8659f8b99f-8q6qm" Apr 20 19:10:58.608008 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.607960 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-8659f8b99f-8q6qm" Apr 20 19:10:58.745317 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.745282 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-hffb4"] Apr 20 19:10:58.748556 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.748492 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hffb4" Apr 20 19:10:58.751251 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.751226 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-887cf\"" Apr 20 19:10:58.751390 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.751265 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 20 19:10:58.755833 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.755804 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-hffb4"] Apr 20 19:10:58.800155 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.800096 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5065e954-b3de-4d40-9814-b493ad51776a-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-hffb4\" (UID: \"5065e954-b3de-4d40-9814-b493ad51776a\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hffb4" Apr 20 19:10:58.901074 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.901043 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5065e954-b3de-4d40-9814-b493ad51776a-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-hffb4\" (UID: \"5065e954-b3de-4d40-9814-b493ad51776a\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hffb4" Apr 20 19:10:58.904430 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:58.904407 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5065e954-b3de-4d40-9814-b493ad51776a-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-hffb4\" (UID: \"5065e954-b3de-4d40-9814-b493ad51776a\") " 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hffb4" Apr 20 19:10:59.061550 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:59.061447 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hffb4" Apr 20 19:10:59.127843 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:10:59.127814 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx" Apr 20 19:11:04.758430 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:04.758351 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-hffb4"] Apr 20 19:11:04.792881 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:04.792846 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-8659f8b99f-8q6qm"] Apr 20 19:11:04.799152 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:11:04.799103 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e7f8208_d67f_45c0_92a0_ee2371caa82c.slice/crio-af2fe62e9be53f5a895cdb805022540d6f616e0463bdfef1afa7bc7bc40de2b3 WatchSource:0}: Error finding container af2fe62e9be53f5a895cdb805022540d6f616e0463bdfef1afa7bc7bc40de2b3: Status 404 returned error can't find the container with id af2fe62e9be53f5a895cdb805022540d6f616e0463bdfef1afa7bc7bc40de2b3 Apr 20 19:11:04.799352 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:11:04.799330 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5065e954_b3de_4d40_9814_b493ad51776a.slice/crio-041374b230f1e2f0b3e0fb3ed3a81adeac8dad89a4d69465f8d740636c883472 WatchSource:0}: Error finding container 041374b230f1e2f0b3e0fb3ed3a81adeac8dad89a4d69465f8d740636c883472: Status 404 returned error can't find the container with id 
041374b230f1e2f0b3e0fb3ed3a81adeac8dad89a4d69465f8d740636c883472 Apr 20 19:11:05.243862 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:05.243814 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zglz9" event={"ID":"817e7306-d6d2-47c0-895a-14dc5408d1d6","Type":"ContainerStarted","Data":"20177790a82366dcf0abdb4c729caec0d6051e13eb9e92459d9adcb8a823fac9"} Apr 20 19:11:05.245173 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:05.245133 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-8659f8b99f-8q6qm" event={"ID":"4e7f8208-d67f-45c0-92a0-ee2371caa82c","Type":"ContainerStarted","Data":"af2fe62e9be53f5a895cdb805022540d6f616e0463bdfef1afa7bc7bc40de2b3"} Apr 20 19:11:05.246783 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:05.246746 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-6p5l6" event={"ID":"b945cabb-3adb-4bb8-868d-60d1d730cf72","Type":"ContainerStarted","Data":"2c5674f78d7a7fbfb6edf29a0a33d4c812f7b009ef9afb84586ea200b751f7ed"} Apr 20 19:11:05.246972 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:05.246949 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-6p5l6" Apr 20 19:11:05.248420 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:05.248395 2571 generic.go:358] "Generic (PLEG): container finished" podID="bb88d695-a7a4-44c7-a5d1-1399aa607235" containerID="2db25dea45022b0fcdb2f6a750bc688d9f2963253782d8af7b6817f62d9ec9c0" exitCode=0 Apr 20 19:11:05.248530 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:05.248484 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb88d695-a7a4-44c7-a5d1-1399aa607235","Type":"ContainerDied","Data":"2db25dea45022b0fcdb2f6a750bc688d9f2963253782d8af7b6817f62d9ec9c0"} Apr 20 19:11:05.249708 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:05.249644 2571 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hffb4" event={"ID":"5065e954-b3de-4d40-9814-b493ad51776a","Type":"ContainerStarted","Data":"041374b230f1e2f0b3e0fb3ed3a81adeac8dad89a4d69465f8d740636c883472"} Apr 20 19:11:05.261259 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:05.261225 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-6p5l6" Apr 20 19:11:05.266518 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:05.266461 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-zglz9" podStartSLOduration=11.153915256 podStartE2EDuration="12.266447264s" podCreationTimestamp="2026-04-20 19:10:53 +0000 UTC" firstStartedPulling="2026-04-20 19:10:55.853658785 +0000 UTC m=+180.767154822" lastFinishedPulling="2026-04-20 19:10:56.966190793 +0000 UTC m=+181.879686830" observedRunningTime="2026-04-20 19:11:05.264393139 +0000 UTC m=+190.177889216" watchObservedRunningTime="2026-04-20 19:11:05.266447264 +0000 UTC m=+190.179943329" Apr 20 19:11:05.282070 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:05.282018 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-6p5l6" podStartSLOduration=1.893130296 podStartE2EDuration="19.282002043s" podCreationTimestamp="2026-04-20 19:10:46 +0000 UTC" firstStartedPulling="2026-04-20 19:10:47.27879971 +0000 UTC m=+172.192295739" lastFinishedPulling="2026-04-20 19:11:04.667671454 +0000 UTC m=+189.581167486" observedRunningTime="2026-04-20 19:11:05.280796255 +0000 UTC m=+190.194292319" watchObservedRunningTime="2026-04-20 19:11:05.282002043 +0000 UTC m=+190.195498095" Apr 20 19:11:08.267559 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:08.267516 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-8659f8b99f-8q6qm" 
event={"ID":"4e7f8208-d67f-45c0-92a0-ee2371caa82c","Type":"ContainerStarted","Data":"94e63415fcebc894c18d3e8a1f763836bbf797658acd98459fd4d599705e1064"} Apr 20 19:11:08.270138 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:08.270066 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb88d695-a7a4-44c7-a5d1-1399aa607235","Type":"ContainerStarted","Data":"6d180dadf161a7a173285cafe6868275927c9908ada902bc0a0f8548871763b6"} Apr 20 19:11:08.270256 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:08.270150 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb88d695-a7a4-44c7-a5d1-1399aa607235","Type":"ContainerStarted","Data":"656b1082a72e3f67efc0005f545f7c4260ad7e27899c605213c46726905cc3dd"} Apr 20 19:11:08.272503 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:08.272476 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hffb4" event={"ID":"5065e954-b3de-4d40-9814-b493ad51776a","Type":"ContainerStarted","Data":"4f873e51bd2592e00c53c9ae8defecbf4ee9390324b1849dd61d8477cd6b9bd5"} Apr 20 19:11:08.272952 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:08.272931 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hffb4" Apr 20 19:11:08.279186 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:08.279165 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hffb4" Apr 20 19:11:08.286212 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:08.286167 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-8659f8b99f-8q6qm" podStartSLOduration=7.135660836 podStartE2EDuration="10.28615408s" podCreationTimestamp="2026-04-20 19:10:58 +0000 UTC" firstStartedPulling="2026-04-20 19:11:04.80117685 +0000 
UTC m=+189.714672897" lastFinishedPulling="2026-04-20 19:11:07.951670096 +0000 UTC m=+192.865166141" observedRunningTime="2026-04-20 19:11:08.283618747 +0000 UTC m=+193.197114799" watchObservedRunningTime="2026-04-20 19:11:08.28615408 +0000 UTC m=+193.199650132" Apr 20 19:11:08.297659 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:08.297614 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hffb4" podStartSLOduration=7.14646649 podStartE2EDuration="10.297602486s" podCreationTimestamp="2026-04-20 19:10:58 +0000 UTC" firstStartedPulling="2026-04-20 19:11:04.801059894 +0000 UTC m=+189.714555923" lastFinishedPulling="2026-04-20 19:11:07.952195889 +0000 UTC m=+192.865691919" observedRunningTime="2026-04-20 19:11:08.297086996 +0000 UTC m=+193.210583047" watchObservedRunningTime="2026-04-20 19:11:08.297602486 +0000 UTC m=+193.211098538" Apr 20 19:11:09.282160 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:09.282122 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb88d695-a7a4-44c7-a5d1-1399aa607235","Type":"ContainerStarted","Data":"58ec7304d65b890271bf9dae076330d63ff066083a7ff70810c2fcc3946da7e2"} Apr 20 19:11:09.282160 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:09.282164 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb88d695-a7a4-44c7-a5d1-1399aa607235","Type":"ContainerStarted","Data":"0094f16c3e2d48bc15321e068a8dc30b1be53665c096f59b6e66578d0c025b4f"} Apr 20 19:11:09.282739 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:09.282180 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb88d695-a7a4-44c7-a5d1-1399aa607235","Type":"ContainerStarted","Data":"aa42a9a9427a60c0a64b1d66b376333acdc7702cb393bbc8d379510138807c6e"} Apr 20 19:11:09.336809 ip-10-0-134-63 kubenswrapper[2571]: I0420 
19:11:09.336771 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-9d6cb88f9-f5dkx"] Apr 20 19:11:11.302400 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:11.302359 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb88d695-a7a4-44c7-a5d1-1399aa607235","Type":"ContainerStarted","Data":"d2cf63c5012af4f8d4fd37b661893ef7bbf0bc789f67d32769572e255732de3c"} Apr 20 19:11:11.336646 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:11.336577 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.604863752 podStartE2EDuration="16.336559797s" podCreationTimestamp="2026-04-20 19:10:55 +0000 UTC" firstStartedPulling="2026-04-20 19:10:56.43158114 +0000 UTC m=+181.345077170" lastFinishedPulling="2026-04-20 19:11:10.163277183 +0000 UTC m=+195.076773215" observedRunningTime="2026-04-20 19:11:11.335098161 +0000 UTC m=+196.248594216" watchObservedRunningTime="2026-04-20 19:11:11.336559797 +0000 UTC m=+196.250055848" Apr 20 19:11:18.608137 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:18.608070 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-8659f8b99f-8q6qm" Apr 20 19:11:18.608137 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:18.608144 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-8659f8b99f-8q6qm" Apr 20 19:11:34.365451 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:34.365384 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx" podUID="fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19" containerName="registry" containerID="cri-o://b8c16a3a808358a6715cd29d88c8b25d85d00be634cfdd94675757dd57c0d32e" gracePeriod=30 Apr 20 19:11:34.614054 ip-10-0-134-63 kubenswrapper[2571]: 
I0420 19:11:34.614033 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx" Apr 20 19:11:34.735218 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:34.735187 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65md9\" (UniqueName: \"kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-kube-api-access-65md9\") pod \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\" (UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " Apr 20 19:11:34.735218 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:34.735224 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-registry-tls\") pod \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\" (UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " Apr 20 19:11:34.735453 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:34.735258 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-image-registry-private-configuration\") pod \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\" (UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " Apr 20 19:11:34.735453 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:34.735282 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-bound-sa-token\") pod \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\" (UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " Apr 20 19:11:34.735453 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:34.735428 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-installation-pull-secrets\") pod \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\" (UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " Apr 20 19:11:34.735621 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:34.735483 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-trusted-ca\") pod \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\" (UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " Apr 20 19:11:34.735621 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:34.735548 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-ca-trust-extracted\") pod \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\" (UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " Apr 20 19:11:34.735621 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:34.735602 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-registry-certificates\") pod \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\" (UID: \"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19\") " Apr 20 19:11:34.736448 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:34.735959 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19" (UID: "fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:11:34.736448 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:34.736370 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19" (UID: "fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:11:34.737925 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:34.737854 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-kube-api-access-65md9" (OuterVolumeSpecName: "kube-api-access-65md9") pod "fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19" (UID: "fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19"). InnerVolumeSpecName "kube-api-access-65md9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:11:34.737925 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:34.737863 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19" (UID: "fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:11:34.738141 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:34.738013 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19" (UID: "fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:11:34.738141 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:34.738022 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19" (UID: "fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:11:34.738141 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:34.738037 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19" (UID: "fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:11:34.744549 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:34.744528 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19" (UID: "fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:11:34.836718 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:34.836677 2571 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-ca-trust-extracted\") on node \"ip-10-0-134-63.ec2.internal\" DevicePath \"\"" Apr 20 19:11:34.836718 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:34.836709 2571 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-registry-certificates\") on node \"ip-10-0-134-63.ec2.internal\" DevicePath \"\"" Apr 20 19:11:34.836718 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:34.836722 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-65md9\" (UniqueName: \"kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-kube-api-access-65md9\") on node \"ip-10-0-134-63.ec2.internal\" DevicePath \"\"" Apr 20 19:11:34.836953 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:34.836736 2571 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-registry-tls\") on node \"ip-10-0-134-63.ec2.internal\" DevicePath \"\"" Apr 20 19:11:34.836953 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:34.836749 2571 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-image-registry-private-configuration\") on node \"ip-10-0-134-63.ec2.internal\" DevicePath \"\"" Apr 20 19:11:34.836953 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:34.836763 2571 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-bound-sa-token\") on node \"ip-10-0-134-63.ec2.internal\" DevicePath \"\"" Apr 20 
19:11:34.836953 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:34.836775 2571 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-installation-pull-secrets\") on node \"ip-10-0-134-63.ec2.internal\" DevicePath \"\"" Apr 20 19:11:34.836953 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:34.836787 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19-trusted-ca\") on node \"ip-10-0-134-63.ec2.internal\" DevicePath \"\"" Apr 20 19:11:35.378979 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:35.378947 2571 generic.go:358] "Generic (PLEG): container finished" podID="fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19" containerID="b8c16a3a808358a6715cd29d88c8b25d85d00be634cfdd94675757dd57c0d32e" exitCode=0 Apr 20 19:11:35.379413 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:35.379008 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx" Apr 20 19:11:35.379413 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:35.379031 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx" event={"ID":"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19","Type":"ContainerDied","Data":"b8c16a3a808358a6715cd29d88c8b25d85d00be634cfdd94675757dd57c0d32e"} Apr 20 19:11:35.379413 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:35.379070 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-9d6cb88f9-f5dkx" event={"ID":"fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19","Type":"ContainerDied","Data":"19b5ce152254f8bdaa6b950e0acd16ae83682ab48b9afbeb5f93c526141de77d"} Apr 20 19:11:35.379413 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:35.379086 2571 scope.go:117] "RemoveContainer" containerID="b8c16a3a808358a6715cd29d88c8b25d85d00be634cfdd94675757dd57c0d32e" Apr 20 19:11:35.387618 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:35.387601 2571 scope.go:117] "RemoveContainer" containerID="b8c16a3a808358a6715cd29d88c8b25d85d00be634cfdd94675757dd57c0d32e" Apr 20 19:11:35.387864 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:11:35.387846 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8c16a3a808358a6715cd29d88c8b25d85d00be634cfdd94675757dd57c0d32e\": container with ID starting with b8c16a3a808358a6715cd29d88c8b25d85d00be634cfdd94675757dd57c0d32e not found: ID does not exist" containerID="b8c16a3a808358a6715cd29d88c8b25d85d00be634cfdd94675757dd57c0d32e" Apr 20 19:11:35.387907 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:35.387871 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8c16a3a808358a6715cd29d88c8b25d85d00be634cfdd94675757dd57c0d32e"} err="failed to get container status 
\"b8c16a3a808358a6715cd29d88c8b25d85d00be634cfdd94675757dd57c0d32e\": rpc error: code = NotFound desc = could not find container \"b8c16a3a808358a6715cd29d88c8b25d85d00be634cfdd94675757dd57c0d32e\": container with ID starting with b8c16a3a808358a6715cd29d88c8b25d85d00be634cfdd94675757dd57c0d32e not found: ID does not exist" Apr 20 19:11:35.404081 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:35.404051 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-9d6cb88f9-f5dkx"] Apr 20 19:11:35.407385 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:35.407363 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-9d6cb88f9-f5dkx"] Apr 20 19:11:35.691489 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:35.691447 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19" path="/var/lib/kubelet/pods/fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19/volumes" Apr 20 19:11:37.386610 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:37.386577 2571 generic.go:358] "Generic (PLEG): container finished" podID="4cf6648c-0d3c-45e5-877a-2ef099dd0653" containerID="340ceec1527b12f6b601a950b7b984cbebc4a884e2c6aa2532277d3c91926505" exitCode=0 Apr 20 19:11:37.387080 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:37.386637 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-slmw8" event={"ID":"4cf6648c-0d3c-45e5-877a-2ef099dd0653","Type":"ContainerDied","Data":"340ceec1527b12f6b601a950b7b984cbebc4a884e2c6aa2532277d3c91926505"} Apr 20 19:11:37.387080 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:37.387043 2571 scope.go:117] "RemoveContainer" containerID="340ceec1527b12f6b601a950b7b984cbebc4a884e2c6aa2532277d3c91926505" Apr 20 19:11:38.391326 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:38.391287 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-operator-585dfdc468-slmw8" event={"ID":"4cf6648c-0d3c-45e5-877a-2ef099dd0653","Type":"ContainerStarted","Data":"a0bd53f92024ccf9d6a1b8a8fd4feb13aa3e1b1ea010f7dba91361a15999eb44"} Apr 20 19:11:38.613668 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:38.613638 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-8659f8b99f-8q6qm" Apr 20 19:11:38.617621 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:11:38.617598 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-8659f8b99f-8q6qm" Apr 20 19:12:07.618295 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:07.618258 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cc31ab16-2946-4d9a-baee-c02a00b73aae-metrics-certs\") pod \"network-metrics-daemon-ff5pq\" (UID: \"cc31ab16-2946-4d9a-baee-c02a00b73aae\") " pod="openshift-multus/network-metrics-daemon-ff5pq" Apr 20 19:12:07.620590 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:07.620564 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cc31ab16-2946-4d9a-baee-c02a00b73aae-metrics-certs\") pod \"network-metrics-daemon-ff5pq\" (UID: \"cc31ab16-2946-4d9a-baee-c02a00b73aae\") " pod="openshift-multus/network-metrics-daemon-ff5pq" Apr 20 19:12:07.791485 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:07.791452 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bdmvj\"" Apr 20 19:12:07.799362 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:07.799334 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ff5pq" Apr 20 19:12:07.937089 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:07.937065 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ff5pq"] Apr 20 19:12:07.939877 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:12:07.939848 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc31ab16_2946_4d9a_baee_c02a00b73aae.slice/crio-446c884727378d56153d367c8fc1f09090ea16d3607099959a0f24ea1a90134e WatchSource:0}: Error finding container 446c884727378d56153d367c8fc1f09090ea16d3607099959a0f24ea1a90134e: Status 404 returned error can't find the container with id 446c884727378d56153d367c8fc1f09090ea16d3607099959a0f24ea1a90134e Apr 20 19:12:08.477353 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:08.477313 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ff5pq" event={"ID":"cc31ab16-2946-4d9a-baee-c02a00b73aae","Type":"ContainerStarted","Data":"446c884727378d56153d367c8fc1f09090ea16d3607099959a0f24ea1a90134e"} Apr 20 19:12:09.482689 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:09.482649 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ff5pq" event={"ID":"cc31ab16-2946-4d9a-baee-c02a00b73aae","Type":"ContainerStarted","Data":"3719205a43385b826d0e2562f686c5ca3327ab97355697abca17eca265fdbc19"} Apr 20 19:12:09.482689 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:09.482690 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ff5pq" event={"ID":"cc31ab16-2946-4d9a-baee-c02a00b73aae","Type":"ContainerStarted","Data":"5a46c5af1a812e631d52d256ecf020df2eaa846d6e30245e76f8819445f24166"} Apr 20 19:12:09.499431 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:09.499386 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-ff5pq" podStartSLOduration=253.598939287 podStartE2EDuration="4m14.499370566s" podCreationTimestamp="2026-04-20 19:07:55 +0000 UTC" firstStartedPulling="2026-04-20 19:12:07.942526022 +0000 UTC m=+252.856022055" lastFinishedPulling="2026-04-20 19:12:08.842957304 +0000 UTC m=+253.756453334" observedRunningTime="2026-04-20 19:12:09.498268177 +0000 UTC m=+254.411764247" watchObservedRunningTime="2026-04-20 19:12:09.499370566 +0000 UTC m=+254.412866619" Apr 20 19:12:14.212547 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.212464 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 19:12:14.212955 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.212913 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bb88d695-a7a4-44c7-a5d1-1399aa607235" containerName="alertmanager" containerID="cri-o://656b1082a72e3f67efc0005f545f7c4260ad7e27899c605213c46726905cc3dd" gracePeriod=120 Apr 20 19:12:14.213045 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.213013 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bb88d695-a7a4-44c7-a5d1-1399aa607235" containerName="kube-rbac-proxy-web" containerID="cri-o://aa42a9a9427a60c0a64b1d66b376333acdc7702cb393bbc8d379510138807c6e" gracePeriod=120 Apr 20 19:12:14.213286 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.213013 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bb88d695-a7a4-44c7-a5d1-1399aa607235" containerName="kube-rbac-proxy-metric" containerID="cri-o://58ec7304d65b890271bf9dae076330d63ff066083a7ff70810c2fcc3946da7e2" gracePeriod=120 Apr 20 19:12:14.213387 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.213036 2571 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-monitoring/alertmanager-main-0" podUID="bb88d695-a7a4-44c7-a5d1-1399aa607235" containerName="kube-rbac-proxy" containerID="cri-o://0094f16c3e2d48bc15321e068a8dc30b1be53665c096f59b6e66578d0c025b4f" gracePeriod=120 Apr 20 19:12:14.213387 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.213067 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bb88d695-a7a4-44c7-a5d1-1399aa607235" containerName="prom-label-proxy" containerID="cri-o://d2cf63c5012af4f8d4fd37b661893ef7bbf0bc789f67d32769572e255732de3c" gracePeriod=120 Apr 20 19:12:14.213813 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.213425 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bb88d695-a7a4-44c7-a5d1-1399aa607235" containerName="config-reloader" containerID="cri-o://6d180dadf161a7a173285cafe6868275927c9908ada902bc0a0f8548871763b6" gracePeriod=120 Apr 20 19:12:14.443482 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.443459 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.479970 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.479879 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kv5x8\" (UniqueName: \"kubernetes.io/projected/bb88d695-a7a4-44c7-a5d1-1399aa607235-kube-api-access-kv5x8\") pod \"bb88d695-a7a4-44c7-a5d1-1399aa607235\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " Apr 20 19:12:14.479970 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.479937 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bb88d695-a7a4-44c7-a5d1-1399aa607235-metrics-client-ca\") pod \"bb88d695-a7a4-44c7-a5d1-1399aa607235\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " Apr 20 19:12:14.479970 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.479965 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-secret-alertmanager-main-tls\") pod \"bb88d695-a7a4-44c7-a5d1-1399aa607235\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " Apr 20 19:12:14.480286 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.480009 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-secret-alertmanager-kube-rbac-proxy-metric\") pod \"bb88d695-a7a4-44c7-a5d1-1399aa607235\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " Apr 20 19:12:14.480370 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.480347 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-secret-alertmanager-kube-rbac-proxy-web\") pod 
\"bb88d695-a7a4-44c7-a5d1-1399aa607235\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " Apr 20 19:12:14.480426 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.480400 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-config-volume\") pod \"bb88d695-a7a4-44c7-a5d1-1399aa607235\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " Apr 20 19:12:14.480478 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.480377 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb88d695-a7a4-44c7-a5d1-1399aa607235-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "bb88d695-a7a4-44c7-a5d1-1399aa607235" (UID: "bb88d695-a7a4-44c7-a5d1-1399aa607235"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:12:14.480478 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.480437 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-cluster-tls-config\") pod \"bb88d695-a7a4-44c7-a5d1-1399aa607235\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " Apr 20 19:12:14.480478 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.480468 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bb88d695-a7a4-44c7-a5d1-1399aa607235-alertmanager-main-db\") pod \"bb88d695-a7a4-44c7-a5d1-1399aa607235\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " Apr 20 19:12:14.480628 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.480507 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb88d695-a7a4-44c7-a5d1-1399aa607235-alertmanager-trusted-ca-bundle\") 
pod \"bb88d695-a7a4-44c7-a5d1-1399aa607235\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " Apr 20 19:12:14.480628 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.480531 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-secret-alertmanager-kube-rbac-proxy\") pod \"bb88d695-a7a4-44c7-a5d1-1399aa607235\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " Apr 20 19:12:14.480628 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.480580 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-web-config\") pod \"bb88d695-a7a4-44c7-a5d1-1399aa607235\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " Apr 20 19:12:14.480628 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.480601 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bb88d695-a7a4-44c7-a5d1-1399aa607235-config-out\") pod \"bb88d695-a7a4-44c7-a5d1-1399aa607235\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " Apr 20 19:12:14.480840 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.480647 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bb88d695-a7a4-44c7-a5d1-1399aa607235-tls-assets\") pod \"bb88d695-a7a4-44c7-a5d1-1399aa607235\" (UID: \"bb88d695-a7a4-44c7-a5d1-1399aa607235\") " Apr 20 19:12:14.480918 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.480898 2571 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bb88d695-a7a4-44c7-a5d1-1399aa607235-metrics-client-ca\") on node \"ip-10-0-134-63.ec2.internal\" DevicePath \"\"" Apr 20 19:12:14.482214 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.481635 2571 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb88d695-a7a4-44c7-a5d1-1399aa607235-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "bb88d695-a7a4-44c7-a5d1-1399aa607235" (UID: "bb88d695-a7a4-44c7-a5d1-1399aa607235"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 19:12:14.482214 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.481961 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb88d695-a7a4-44c7-a5d1-1399aa607235-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "bb88d695-a7a4-44c7-a5d1-1399aa607235" (UID: "bb88d695-a7a4-44c7-a5d1-1399aa607235"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:12:14.483261 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.483222 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb88d695-a7a4-44c7-a5d1-1399aa607235-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "bb88d695-a7a4-44c7-a5d1-1399aa607235" (UID: "bb88d695-a7a4-44c7-a5d1-1399aa607235"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:12:14.483785 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.483746 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "bb88d695-a7a4-44c7-a5d1-1399aa607235" (UID: "bb88d695-a7a4-44c7-a5d1-1399aa607235"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:12:14.484879 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.484840 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb88d695-a7a4-44c7-a5d1-1399aa607235-kube-api-access-kv5x8" (OuterVolumeSpecName: "kube-api-access-kv5x8") pod "bb88d695-a7a4-44c7-a5d1-1399aa607235" (UID: "bb88d695-a7a4-44c7-a5d1-1399aa607235"). InnerVolumeSpecName "kube-api-access-kv5x8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:12:14.485002 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.484887 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-config-volume" (OuterVolumeSpecName: "config-volume") pod "bb88d695-a7a4-44c7-a5d1-1399aa607235" (UID: "bb88d695-a7a4-44c7-a5d1-1399aa607235"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:12:14.485073 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.484984 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "bb88d695-a7a4-44c7-a5d1-1399aa607235" (UID: "bb88d695-a7a4-44c7-a5d1-1399aa607235"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:12:14.485369 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.485296 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb88d695-a7a4-44c7-a5d1-1399aa607235-config-out" (OuterVolumeSpecName: "config-out") pod "bb88d695-a7a4-44c7-a5d1-1399aa607235" (UID: "bb88d695-a7a4-44c7-a5d1-1399aa607235"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 19:12:14.486010 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.485971 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "bb88d695-a7a4-44c7-a5d1-1399aa607235" (UID: "bb88d695-a7a4-44c7-a5d1-1399aa607235"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:12:14.486089 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.486030 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "bb88d695-a7a4-44c7-a5d1-1399aa607235" (UID: "bb88d695-a7a4-44c7-a5d1-1399aa607235"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:12:14.491874 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.491848 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "bb88d695-a7a4-44c7-a5d1-1399aa607235" (UID: "bb88d695-a7a4-44c7-a5d1-1399aa607235"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:12:14.498014 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.497990 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-web-config" (OuterVolumeSpecName: "web-config") pod "bb88d695-a7a4-44c7-a5d1-1399aa607235" (UID: "bb88d695-a7a4-44c7-a5d1-1399aa607235"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:12:14.503673 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.503648 2571 generic.go:358] "Generic (PLEG): container finished" podID="bb88d695-a7a4-44c7-a5d1-1399aa607235" containerID="d2cf63c5012af4f8d4fd37b661893ef7bbf0bc789f67d32769572e255732de3c" exitCode=0 Apr 20 19:12:14.503673 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.503671 2571 generic.go:358] "Generic (PLEG): container finished" podID="bb88d695-a7a4-44c7-a5d1-1399aa607235" containerID="58ec7304d65b890271bf9dae076330d63ff066083a7ff70810c2fcc3946da7e2" exitCode=0 Apr 20 19:12:14.503806 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.503677 2571 generic.go:358] "Generic (PLEG): container finished" podID="bb88d695-a7a4-44c7-a5d1-1399aa607235" containerID="0094f16c3e2d48bc15321e068a8dc30b1be53665c096f59b6e66578d0c025b4f" exitCode=0 Apr 20 19:12:14.503806 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.503683 2571 generic.go:358] "Generic (PLEG): container finished" podID="bb88d695-a7a4-44c7-a5d1-1399aa607235" containerID="aa42a9a9427a60c0a64b1d66b376333acdc7702cb393bbc8d379510138807c6e" exitCode=0 Apr 20 19:12:14.503806 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.503688 2571 generic.go:358] "Generic (PLEG): container finished" podID="bb88d695-a7a4-44c7-a5d1-1399aa607235" containerID="6d180dadf161a7a173285cafe6868275927c9908ada902bc0a0f8548871763b6" exitCode=0 Apr 20 19:12:14.503806 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.503693 2571 generic.go:358] "Generic (PLEG): container finished" podID="bb88d695-a7a4-44c7-a5d1-1399aa607235" containerID="656b1082a72e3f67efc0005f545f7c4260ad7e27899c605213c46726905cc3dd" exitCode=0 Apr 20 19:12:14.503806 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.503729 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"bb88d695-a7a4-44c7-a5d1-1399aa607235","Type":"ContainerDied","Data":"d2cf63c5012af4f8d4fd37b661893ef7bbf0bc789f67d32769572e255732de3c"} Apr 20 19:12:14.503806 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.503773 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.503806 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.503787 2571 scope.go:117] "RemoveContainer" containerID="d2cf63c5012af4f8d4fd37b661893ef7bbf0bc789f67d32769572e255732de3c" Apr 20 19:12:14.504183 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.503772 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb88d695-a7a4-44c7-a5d1-1399aa607235","Type":"ContainerDied","Data":"58ec7304d65b890271bf9dae076330d63ff066083a7ff70810c2fcc3946da7e2"} Apr 20 19:12:14.504183 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.503920 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb88d695-a7a4-44c7-a5d1-1399aa607235","Type":"ContainerDied","Data":"0094f16c3e2d48bc15321e068a8dc30b1be53665c096f59b6e66578d0c025b4f"} Apr 20 19:12:14.504183 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.503941 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb88d695-a7a4-44c7-a5d1-1399aa607235","Type":"ContainerDied","Data":"aa42a9a9427a60c0a64b1d66b376333acdc7702cb393bbc8d379510138807c6e"} Apr 20 19:12:14.504183 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.504007 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb88d695-a7a4-44c7-a5d1-1399aa607235","Type":"ContainerDied","Data":"6d180dadf161a7a173285cafe6868275927c9908ada902bc0a0f8548871763b6"} Apr 20 19:12:14.504183 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.504020 2571 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb88d695-a7a4-44c7-a5d1-1399aa607235","Type":"ContainerDied","Data":"656b1082a72e3f67efc0005f545f7c4260ad7e27899c605213c46726905cc3dd"} Apr 20 19:12:14.504183 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.504029 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bb88d695-a7a4-44c7-a5d1-1399aa607235","Type":"ContainerDied","Data":"ace7526719f374cb82df7dfce6bc092a22f67b20a597bb2f70b42854f383be4a"} Apr 20 19:12:14.511248 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.511224 2571 scope.go:117] "RemoveContainer" containerID="58ec7304d65b890271bf9dae076330d63ff066083a7ff70810c2fcc3946da7e2" Apr 20 19:12:14.518156 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.518101 2571 scope.go:117] "RemoveContainer" containerID="0094f16c3e2d48bc15321e068a8dc30b1be53665c096f59b6e66578d0c025b4f" Apr 20 19:12:14.525074 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.525054 2571 scope.go:117] "RemoveContainer" containerID="aa42a9a9427a60c0a64b1d66b376333acdc7702cb393bbc8d379510138807c6e" Apr 20 19:12:14.530490 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.530467 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 19:12:14.532395 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.532374 2571 scope.go:117] "RemoveContainer" containerID="6d180dadf161a7a173285cafe6868275927c9908ada902bc0a0f8548871763b6" Apr 20 19:12:14.538378 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.538357 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 19:12:14.539339 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.539323 2571 scope.go:117] "RemoveContainer" containerID="656b1082a72e3f67efc0005f545f7c4260ad7e27899c605213c46726905cc3dd" Apr 20 19:12:14.545564 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.545549 2571 
scope.go:117] "RemoveContainer" containerID="2db25dea45022b0fcdb2f6a750bc688d9f2963253782d8af7b6817f62d9ec9c0" Apr 20 19:12:14.551708 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.551688 2571 scope.go:117] "RemoveContainer" containerID="d2cf63c5012af4f8d4fd37b661893ef7bbf0bc789f67d32769572e255732de3c" Apr 20 19:12:14.551954 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:12:14.551936 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2cf63c5012af4f8d4fd37b661893ef7bbf0bc789f67d32769572e255732de3c\": container with ID starting with d2cf63c5012af4f8d4fd37b661893ef7bbf0bc789f67d32769572e255732de3c not found: ID does not exist" containerID="d2cf63c5012af4f8d4fd37b661893ef7bbf0bc789f67d32769572e255732de3c" Apr 20 19:12:14.552005 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.551964 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2cf63c5012af4f8d4fd37b661893ef7bbf0bc789f67d32769572e255732de3c"} err="failed to get container status \"d2cf63c5012af4f8d4fd37b661893ef7bbf0bc789f67d32769572e255732de3c\": rpc error: code = NotFound desc = could not find container \"d2cf63c5012af4f8d4fd37b661893ef7bbf0bc789f67d32769572e255732de3c\": container with ID starting with d2cf63c5012af4f8d4fd37b661893ef7bbf0bc789f67d32769572e255732de3c not found: ID does not exist" Apr 20 19:12:14.552005 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.551985 2571 scope.go:117] "RemoveContainer" containerID="58ec7304d65b890271bf9dae076330d63ff066083a7ff70810c2fcc3946da7e2" Apr 20 19:12:14.552269 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:12:14.552245 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58ec7304d65b890271bf9dae076330d63ff066083a7ff70810c2fcc3946da7e2\": container with ID starting with 58ec7304d65b890271bf9dae076330d63ff066083a7ff70810c2fcc3946da7e2 not found: ID does not 
exist" containerID="58ec7304d65b890271bf9dae076330d63ff066083a7ff70810c2fcc3946da7e2" Apr 20 19:12:14.552373 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.552276 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58ec7304d65b890271bf9dae076330d63ff066083a7ff70810c2fcc3946da7e2"} err="failed to get container status \"58ec7304d65b890271bf9dae076330d63ff066083a7ff70810c2fcc3946da7e2\": rpc error: code = NotFound desc = could not find container \"58ec7304d65b890271bf9dae076330d63ff066083a7ff70810c2fcc3946da7e2\": container with ID starting with 58ec7304d65b890271bf9dae076330d63ff066083a7ff70810c2fcc3946da7e2 not found: ID does not exist" Apr 20 19:12:14.552373 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.552298 2571 scope.go:117] "RemoveContainer" containerID="0094f16c3e2d48bc15321e068a8dc30b1be53665c096f59b6e66578d0c025b4f" Apr 20 19:12:14.552742 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:12:14.552717 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0094f16c3e2d48bc15321e068a8dc30b1be53665c096f59b6e66578d0c025b4f\": container with ID starting with 0094f16c3e2d48bc15321e068a8dc30b1be53665c096f59b6e66578d0c025b4f not found: ID does not exist" containerID="0094f16c3e2d48bc15321e068a8dc30b1be53665c096f59b6e66578d0c025b4f" Apr 20 19:12:14.552820 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.552749 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0094f16c3e2d48bc15321e068a8dc30b1be53665c096f59b6e66578d0c025b4f"} err="failed to get container status \"0094f16c3e2d48bc15321e068a8dc30b1be53665c096f59b6e66578d0c025b4f\": rpc error: code = NotFound desc = could not find container \"0094f16c3e2d48bc15321e068a8dc30b1be53665c096f59b6e66578d0c025b4f\": container with ID starting with 0094f16c3e2d48bc15321e068a8dc30b1be53665c096f59b6e66578d0c025b4f not found: ID does not exist" Apr 20 
19:12:14.552820 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.552765 2571 scope.go:117] "RemoveContainer" containerID="aa42a9a9427a60c0a64b1d66b376333acdc7702cb393bbc8d379510138807c6e" Apr 20 19:12:14.553022 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:12:14.553003 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa42a9a9427a60c0a64b1d66b376333acdc7702cb393bbc8d379510138807c6e\": container with ID starting with aa42a9a9427a60c0a64b1d66b376333acdc7702cb393bbc8d379510138807c6e not found: ID does not exist" containerID="aa42a9a9427a60c0a64b1d66b376333acdc7702cb393bbc8d379510138807c6e" Apr 20 19:12:14.553078 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.553030 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa42a9a9427a60c0a64b1d66b376333acdc7702cb393bbc8d379510138807c6e"} err="failed to get container status \"aa42a9a9427a60c0a64b1d66b376333acdc7702cb393bbc8d379510138807c6e\": rpc error: code = NotFound desc = could not find container \"aa42a9a9427a60c0a64b1d66b376333acdc7702cb393bbc8d379510138807c6e\": container with ID starting with aa42a9a9427a60c0a64b1d66b376333acdc7702cb393bbc8d379510138807c6e not found: ID does not exist" Apr 20 19:12:14.553078 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.553052 2571 scope.go:117] "RemoveContainer" containerID="6d180dadf161a7a173285cafe6868275927c9908ada902bc0a0f8548871763b6" Apr 20 19:12:14.553313 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:12:14.553296 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d180dadf161a7a173285cafe6868275927c9908ada902bc0a0f8548871763b6\": container with ID starting with 6d180dadf161a7a173285cafe6868275927c9908ada902bc0a0f8548871763b6 not found: ID does not exist" containerID="6d180dadf161a7a173285cafe6868275927c9908ada902bc0a0f8548871763b6" Apr 20 19:12:14.553357 
ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.553320 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d180dadf161a7a173285cafe6868275927c9908ada902bc0a0f8548871763b6"} err="failed to get container status \"6d180dadf161a7a173285cafe6868275927c9908ada902bc0a0f8548871763b6\": rpc error: code = NotFound desc = could not find container \"6d180dadf161a7a173285cafe6868275927c9908ada902bc0a0f8548871763b6\": container with ID starting with 6d180dadf161a7a173285cafe6868275927c9908ada902bc0a0f8548871763b6 not found: ID does not exist" Apr 20 19:12:14.553357 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.553335 2571 scope.go:117] "RemoveContainer" containerID="656b1082a72e3f67efc0005f545f7c4260ad7e27899c605213c46726905cc3dd" Apr 20 19:12:14.553540 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:12:14.553523 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"656b1082a72e3f67efc0005f545f7c4260ad7e27899c605213c46726905cc3dd\": container with ID starting with 656b1082a72e3f67efc0005f545f7c4260ad7e27899c605213c46726905cc3dd not found: ID does not exist" containerID="656b1082a72e3f67efc0005f545f7c4260ad7e27899c605213c46726905cc3dd" Apr 20 19:12:14.553598 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.553547 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"656b1082a72e3f67efc0005f545f7c4260ad7e27899c605213c46726905cc3dd"} err="failed to get container status \"656b1082a72e3f67efc0005f545f7c4260ad7e27899c605213c46726905cc3dd\": rpc error: code = NotFound desc = could not find container \"656b1082a72e3f67efc0005f545f7c4260ad7e27899c605213c46726905cc3dd\": container with ID starting with 656b1082a72e3f67efc0005f545f7c4260ad7e27899c605213c46726905cc3dd not found: ID does not exist" Apr 20 19:12:14.553598 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.553570 2571 scope.go:117] "RemoveContainer" 
containerID="2db25dea45022b0fcdb2f6a750bc688d9f2963253782d8af7b6817f62d9ec9c0" Apr 20 19:12:14.553772 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:12:14.553758 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2db25dea45022b0fcdb2f6a750bc688d9f2963253782d8af7b6817f62d9ec9c0\": container with ID starting with 2db25dea45022b0fcdb2f6a750bc688d9f2963253782d8af7b6817f62d9ec9c0 not found: ID does not exist" containerID="2db25dea45022b0fcdb2f6a750bc688d9f2963253782d8af7b6817f62d9ec9c0" Apr 20 19:12:14.553811 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.553776 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2db25dea45022b0fcdb2f6a750bc688d9f2963253782d8af7b6817f62d9ec9c0"} err="failed to get container status \"2db25dea45022b0fcdb2f6a750bc688d9f2963253782d8af7b6817f62d9ec9c0\": rpc error: code = NotFound desc = could not find container \"2db25dea45022b0fcdb2f6a750bc688d9f2963253782d8af7b6817f62d9ec9c0\": container with ID starting with 2db25dea45022b0fcdb2f6a750bc688d9f2963253782d8af7b6817f62d9ec9c0 not found: ID does not exist" Apr 20 19:12:14.553811 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.553789 2571 scope.go:117] "RemoveContainer" containerID="d2cf63c5012af4f8d4fd37b661893ef7bbf0bc789f67d32769572e255732de3c" Apr 20 19:12:14.553962 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.553946 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2cf63c5012af4f8d4fd37b661893ef7bbf0bc789f67d32769572e255732de3c"} err="failed to get container status \"d2cf63c5012af4f8d4fd37b661893ef7bbf0bc789f67d32769572e255732de3c\": rpc error: code = NotFound desc = could not find container \"d2cf63c5012af4f8d4fd37b661893ef7bbf0bc789f67d32769572e255732de3c\": container with ID starting with d2cf63c5012af4f8d4fd37b661893ef7bbf0bc789f67d32769572e255732de3c not found: ID does not exist" Apr 20 
19:12:14.554007 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.553962 2571 scope.go:117] "RemoveContainer" containerID="58ec7304d65b890271bf9dae076330d63ff066083a7ff70810c2fcc3946da7e2" Apr 20 19:12:14.554143 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.554127 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58ec7304d65b890271bf9dae076330d63ff066083a7ff70810c2fcc3946da7e2"} err="failed to get container status \"58ec7304d65b890271bf9dae076330d63ff066083a7ff70810c2fcc3946da7e2\": rpc error: code = NotFound desc = could not find container \"58ec7304d65b890271bf9dae076330d63ff066083a7ff70810c2fcc3946da7e2\": container with ID starting with 58ec7304d65b890271bf9dae076330d63ff066083a7ff70810c2fcc3946da7e2 not found: ID does not exist" Apr 20 19:12:14.554196 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.554144 2571 scope.go:117] "RemoveContainer" containerID="0094f16c3e2d48bc15321e068a8dc30b1be53665c096f59b6e66578d0c025b4f" Apr 20 19:12:14.554306 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.554290 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0094f16c3e2d48bc15321e068a8dc30b1be53665c096f59b6e66578d0c025b4f"} err="failed to get container status \"0094f16c3e2d48bc15321e068a8dc30b1be53665c096f59b6e66578d0c025b4f\": rpc error: code = NotFound desc = could not find container \"0094f16c3e2d48bc15321e068a8dc30b1be53665c096f59b6e66578d0c025b4f\": container with ID starting with 0094f16c3e2d48bc15321e068a8dc30b1be53665c096f59b6e66578d0c025b4f not found: ID does not exist" Apr 20 19:12:14.554354 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.554306 2571 scope.go:117] "RemoveContainer" containerID="aa42a9a9427a60c0a64b1d66b376333acdc7702cb393bbc8d379510138807c6e" Apr 20 19:12:14.554467 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.554454 2571 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"aa42a9a9427a60c0a64b1d66b376333acdc7702cb393bbc8d379510138807c6e"} err="failed to get container status \"aa42a9a9427a60c0a64b1d66b376333acdc7702cb393bbc8d379510138807c6e\": rpc error: code = NotFound desc = could not find container \"aa42a9a9427a60c0a64b1d66b376333acdc7702cb393bbc8d379510138807c6e\": container with ID starting with aa42a9a9427a60c0a64b1d66b376333acdc7702cb393bbc8d379510138807c6e not found: ID does not exist" Apr 20 19:12:14.554513 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.554467 2571 scope.go:117] "RemoveContainer" containerID="6d180dadf161a7a173285cafe6868275927c9908ada902bc0a0f8548871763b6" Apr 20 19:12:14.554625 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.554611 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d180dadf161a7a173285cafe6868275927c9908ada902bc0a0f8548871763b6"} err="failed to get container status \"6d180dadf161a7a173285cafe6868275927c9908ada902bc0a0f8548871763b6\": rpc error: code = NotFound desc = could not find container \"6d180dadf161a7a173285cafe6868275927c9908ada902bc0a0f8548871763b6\": container with ID starting with 6d180dadf161a7a173285cafe6868275927c9908ada902bc0a0f8548871763b6 not found: ID does not exist" Apr 20 19:12:14.554671 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.554625 2571 scope.go:117] "RemoveContainer" containerID="656b1082a72e3f67efc0005f545f7c4260ad7e27899c605213c46726905cc3dd" Apr 20 19:12:14.554817 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.554799 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"656b1082a72e3f67efc0005f545f7c4260ad7e27899c605213c46726905cc3dd"} err="failed to get container status \"656b1082a72e3f67efc0005f545f7c4260ad7e27899c605213c46726905cc3dd\": rpc error: code = NotFound desc = could not find container \"656b1082a72e3f67efc0005f545f7c4260ad7e27899c605213c46726905cc3dd\": container with ID starting with 
656b1082a72e3f67efc0005f545f7c4260ad7e27899c605213c46726905cc3dd not found: ID does not exist" Apr 20 19:12:14.554884 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.554820 2571 scope.go:117] "RemoveContainer" containerID="2db25dea45022b0fcdb2f6a750bc688d9f2963253782d8af7b6817f62d9ec9c0" Apr 20 19:12:14.555003 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.554988 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2db25dea45022b0fcdb2f6a750bc688d9f2963253782d8af7b6817f62d9ec9c0"} err="failed to get container status \"2db25dea45022b0fcdb2f6a750bc688d9f2963253782d8af7b6817f62d9ec9c0\": rpc error: code = NotFound desc = could not find container \"2db25dea45022b0fcdb2f6a750bc688d9f2963253782d8af7b6817f62d9ec9c0\": container with ID starting with 2db25dea45022b0fcdb2f6a750bc688d9f2963253782d8af7b6817f62d9ec9c0 not found: ID does not exist" Apr 20 19:12:14.555048 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.555003 2571 scope.go:117] "RemoveContainer" containerID="d2cf63c5012af4f8d4fd37b661893ef7bbf0bc789f67d32769572e255732de3c" Apr 20 19:12:14.555212 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.555190 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2cf63c5012af4f8d4fd37b661893ef7bbf0bc789f67d32769572e255732de3c"} err="failed to get container status \"d2cf63c5012af4f8d4fd37b661893ef7bbf0bc789f67d32769572e255732de3c\": rpc error: code = NotFound desc = could not find container \"d2cf63c5012af4f8d4fd37b661893ef7bbf0bc789f67d32769572e255732de3c\": container with ID starting with d2cf63c5012af4f8d4fd37b661893ef7bbf0bc789f67d32769572e255732de3c not found: ID does not exist" Apr 20 19:12:14.555278 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.555214 2571 scope.go:117] "RemoveContainer" containerID="58ec7304d65b890271bf9dae076330d63ff066083a7ff70810c2fcc3946da7e2" Apr 20 19:12:14.555447 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.555429 2571 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58ec7304d65b890271bf9dae076330d63ff066083a7ff70810c2fcc3946da7e2"} err="failed to get container status \"58ec7304d65b890271bf9dae076330d63ff066083a7ff70810c2fcc3946da7e2\": rpc error: code = NotFound desc = could not find container \"58ec7304d65b890271bf9dae076330d63ff066083a7ff70810c2fcc3946da7e2\": container with ID starting with 58ec7304d65b890271bf9dae076330d63ff066083a7ff70810c2fcc3946da7e2 not found: ID does not exist" Apr 20 19:12:14.555493 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.555449 2571 scope.go:117] "RemoveContainer" containerID="0094f16c3e2d48bc15321e068a8dc30b1be53665c096f59b6e66578d0c025b4f" Apr 20 19:12:14.555648 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.555631 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0094f16c3e2d48bc15321e068a8dc30b1be53665c096f59b6e66578d0c025b4f"} err="failed to get container status \"0094f16c3e2d48bc15321e068a8dc30b1be53665c096f59b6e66578d0c025b4f\": rpc error: code = NotFound desc = could not find container \"0094f16c3e2d48bc15321e068a8dc30b1be53665c096f59b6e66578d0c025b4f\": container with ID starting with 0094f16c3e2d48bc15321e068a8dc30b1be53665c096f59b6e66578d0c025b4f not found: ID does not exist" Apr 20 19:12:14.555713 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.555651 2571 scope.go:117] "RemoveContainer" containerID="aa42a9a9427a60c0a64b1d66b376333acdc7702cb393bbc8d379510138807c6e" Apr 20 19:12:14.555834 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.555819 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa42a9a9427a60c0a64b1d66b376333acdc7702cb393bbc8d379510138807c6e"} err="failed to get container status \"aa42a9a9427a60c0a64b1d66b376333acdc7702cb393bbc8d379510138807c6e\": rpc error: code = NotFound desc = could not find container 
\"aa42a9a9427a60c0a64b1d66b376333acdc7702cb393bbc8d379510138807c6e\": container with ID starting with aa42a9a9427a60c0a64b1d66b376333acdc7702cb393bbc8d379510138807c6e not found: ID does not exist" Apr 20 19:12:14.555881 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.555835 2571 scope.go:117] "RemoveContainer" containerID="6d180dadf161a7a173285cafe6868275927c9908ada902bc0a0f8548871763b6" Apr 20 19:12:14.556012 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.555997 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d180dadf161a7a173285cafe6868275927c9908ada902bc0a0f8548871763b6"} err="failed to get container status \"6d180dadf161a7a173285cafe6868275927c9908ada902bc0a0f8548871763b6\": rpc error: code = NotFound desc = could not find container \"6d180dadf161a7a173285cafe6868275927c9908ada902bc0a0f8548871763b6\": container with ID starting with 6d180dadf161a7a173285cafe6868275927c9908ada902bc0a0f8548871763b6 not found: ID does not exist" Apr 20 19:12:14.556053 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.556013 2571 scope.go:117] "RemoveContainer" containerID="656b1082a72e3f67efc0005f545f7c4260ad7e27899c605213c46726905cc3dd" Apr 20 19:12:14.556192 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.556174 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"656b1082a72e3f67efc0005f545f7c4260ad7e27899c605213c46726905cc3dd"} err="failed to get container status \"656b1082a72e3f67efc0005f545f7c4260ad7e27899c605213c46726905cc3dd\": rpc error: code = NotFound desc = could not find container \"656b1082a72e3f67efc0005f545f7c4260ad7e27899c605213c46726905cc3dd\": container with ID starting with 656b1082a72e3f67efc0005f545f7c4260ad7e27899c605213c46726905cc3dd not found: ID does not exist" Apr 20 19:12:14.556260 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.556193 2571 scope.go:117] "RemoveContainer" 
containerID="2db25dea45022b0fcdb2f6a750bc688d9f2963253782d8af7b6817f62d9ec9c0" Apr 20 19:12:14.556400 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.556382 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2db25dea45022b0fcdb2f6a750bc688d9f2963253782d8af7b6817f62d9ec9c0"} err="failed to get container status \"2db25dea45022b0fcdb2f6a750bc688d9f2963253782d8af7b6817f62d9ec9c0\": rpc error: code = NotFound desc = could not find container \"2db25dea45022b0fcdb2f6a750bc688d9f2963253782d8af7b6817f62d9ec9c0\": container with ID starting with 2db25dea45022b0fcdb2f6a750bc688d9f2963253782d8af7b6817f62d9ec9c0 not found: ID does not exist" Apr 20 19:12:14.556436 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.556401 2571 scope.go:117] "RemoveContainer" containerID="d2cf63c5012af4f8d4fd37b661893ef7bbf0bc789f67d32769572e255732de3c" Apr 20 19:12:14.556575 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.556558 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2cf63c5012af4f8d4fd37b661893ef7bbf0bc789f67d32769572e255732de3c"} err="failed to get container status \"d2cf63c5012af4f8d4fd37b661893ef7bbf0bc789f67d32769572e255732de3c\": rpc error: code = NotFound desc = could not find container \"d2cf63c5012af4f8d4fd37b661893ef7bbf0bc789f67d32769572e255732de3c\": container with ID starting with d2cf63c5012af4f8d4fd37b661893ef7bbf0bc789f67d32769572e255732de3c not found: ID does not exist" Apr 20 19:12:14.556630 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.556576 2571 scope.go:117] "RemoveContainer" containerID="58ec7304d65b890271bf9dae076330d63ff066083a7ff70810c2fcc3946da7e2" Apr 20 19:12:14.556767 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.556749 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58ec7304d65b890271bf9dae076330d63ff066083a7ff70810c2fcc3946da7e2"} err="failed to get container status 
\"58ec7304d65b890271bf9dae076330d63ff066083a7ff70810c2fcc3946da7e2\": rpc error: code = NotFound desc = could not find container \"58ec7304d65b890271bf9dae076330d63ff066083a7ff70810c2fcc3946da7e2\": container with ID starting with 58ec7304d65b890271bf9dae076330d63ff066083a7ff70810c2fcc3946da7e2 not found: ID does not exist" Apr 20 19:12:14.556811 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.556769 2571 scope.go:117] "RemoveContainer" containerID="0094f16c3e2d48bc15321e068a8dc30b1be53665c096f59b6e66578d0c025b4f" Apr 20 19:12:14.556955 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.556938 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0094f16c3e2d48bc15321e068a8dc30b1be53665c096f59b6e66578d0c025b4f"} err="failed to get container status \"0094f16c3e2d48bc15321e068a8dc30b1be53665c096f59b6e66578d0c025b4f\": rpc error: code = NotFound desc = could not find container \"0094f16c3e2d48bc15321e068a8dc30b1be53665c096f59b6e66578d0c025b4f\": container with ID starting with 0094f16c3e2d48bc15321e068a8dc30b1be53665c096f59b6e66578d0c025b4f not found: ID does not exist" Apr 20 19:12:14.556996 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.556956 2571 scope.go:117] "RemoveContainer" containerID="aa42a9a9427a60c0a64b1d66b376333acdc7702cb393bbc8d379510138807c6e" Apr 20 19:12:14.557192 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.557177 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa42a9a9427a60c0a64b1d66b376333acdc7702cb393bbc8d379510138807c6e"} err="failed to get container status \"aa42a9a9427a60c0a64b1d66b376333acdc7702cb393bbc8d379510138807c6e\": rpc error: code = NotFound desc = could not find container \"aa42a9a9427a60c0a64b1d66b376333acdc7702cb393bbc8d379510138807c6e\": container with ID starting with aa42a9a9427a60c0a64b1d66b376333acdc7702cb393bbc8d379510138807c6e not found: ID does not exist" Apr 20 19:12:14.557238 ip-10-0-134-63 
kubenswrapper[2571]: I0420 19:12:14.557192 2571 scope.go:117] "RemoveContainer" containerID="6d180dadf161a7a173285cafe6868275927c9908ada902bc0a0f8548871763b6" Apr 20 19:12:14.557352 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.557332 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d180dadf161a7a173285cafe6868275927c9908ada902bc0a0f8548871763b6"} err="failed to get container status \"6d180dadf161a7a173285cafe6868275927c9908ada902bc0a0f8548871763b6\": rpc error: code = NotFound desc = could not find container \"6d180dadf161a7a173285cafe6868275927c9908ada902bc0a0f8548871763b6\": container with ID starting with 6d180dadf161a7a173285cafe6868275927c9908ada902bc0a0f8548871763b6 not found: ID does not exist" Apr 20 19:12:14.557404 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.557354 2571 scope.go:117] "RemoveContainer" containerID="656b1082a72e3f67efc0005f545f7c4260ad7e27899c605213c46726905cc3dd" Apr 20 19:12:14.557592 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.557576 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"656b1082a72e3f67efc0005f545f7c4260ad7e27899c605213c46726905cc3dd"} err="failed to get container status \"656b1082a72e3f67efc0005f545f7c4260ad7e27899c605213c46726905cc3dd\": rpc error: code = NotFound desc = could not find container \"656b1082a72e3f67efc0005f545f7c4260ad7e27899c605213c46726905cc3dd\": container with ID starting with 656b1082a72e3f67efc0005f545f7c4260ad7e27899c605213c46726905cc3dd not found: ID does not exist" Apr 20 19:12:14.557592 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.557591 2571 scope.go:117] "RemoveContainer" containerID="2db25dea45022b0fcdb2f6a750bc688d9f2963253782d8af7b6817f62d9ec9c0" Apr 20 19:12:14.557773 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.557755 2571 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2db25dea45022b0fcdb2f6a750bc688d9f2963253782d8af7b6817f62d9ec9c0"} err="failed to get container status \"2db25dea45022b0fcdb2f6a750bc688d9f2963253782d8af7b6817f62d9ec9c0\": rpc error: code = NotFound desc = could not find container \"2db25dea45022b0fcdb2f6a750bc688d9f2963253782d8af7b6817f62d9ec9c0\": container with ID starting with 2db25dea45022b0fcdb2f6a750bc688d9f2963253782d8af7b6817f62d9ec9c0 not found: ID does not exist" Apr 20 19:12:14.557819 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.557773 2571 scope.go:117] "RemoveContainer" containerID="d2cf63c5012af4f8d4fd37b661893ef7bbf0bc789f67d32769572e255732de3c" Apr 20 19:12:14.557973 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.557957 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2cf63c5012af4f8d4fd37b661893ef7bbf0bc789f67d32769572e255732de3c"} err="failed to get container status \"d2cf63c5012af4f8d4fd37b661893ef7bbf0bc789f67d32769572e255732de3c\": rpc error: code = NotFound desc = could not find container \"d2cf63c5012af4f8d4fd37b661893ef7bbf0bc789f67d32769572e255732de3c\": container with ID starting with d2cf63c5012af4f8d4fd37b661893ef7bbf0bc789f67d32769572e255732de3c not found: ID does not exist" Apr 20 19:12:14.558023 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.557974 2571 scope.go:117] "RemoveContainer" containerID="58ec7304d65b890271bf9dae076330d63ff066083a7ff70810c2fcc3946da7e2" Apr 20 19:12:14.558177 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.558157 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58ec7304d65b890271bf9dae076330d63ff066083a7ff70810c2fcc3946da7e2"} err="failed to get container status \"58ec7304d65b890271bf9dae076330d63ff066083a7ff70810c2fcc3946da7e2\": rpc error: code = NotFound desc = could not find container \"58ec7304d65b890271bf9dae076330d63ff066083a7ff70810c2fcc3946da7e2\": container with ID starting with 
58ec7304d65b890271bf9dae076330d63ff066083a7ff70810c2fcc3946da7e2 not found: ID does not exist" Apr 20 19:12:14.558228 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.558178 2571 scope.go:117] "RemoveContainer" containerID="0094f16c3e2d48bc15321e068a8dc30b1be53665c096f59b6e66578d0c025b4f" Apr 20 19:12:14.558394 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.558379 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0094f16c3e2d48bc15321e068a8dc30b1be53665c096f59b6e66578d0c025b4f"} err="failed to get container status \"0094f16c3e2d48bc15321e068a8dc30b1be53665c096f59b6e66578d0c025b4f\": rpc error: code = NotFound desc = could not find container \"0094f16c3e2d48bc15321e068a8dc30b1be53665c096f59b6e66578d0c025b4f\": container with ID starting with 0094f16c3e2d48bc15321e068a8dc30b1be53665c096f59b6e66578d0c025b4f not found: ID does not exist" Apr 20 19:12:14.558442 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.558394 2571 scope.go:117] "RemoveContainer" containerID="aa42a9a9427a60c0a64b1d66b376333acdc7702cb393bbc8d379510138807c6e" Apr 20 19:12:14.558563 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.558549 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa42a9a9427a60c0a64b1d66b376333acdc7702cb393bbc8d379510138807c6e"} err="failed to get container status \"aa42a9a9427a60c0a64b1d66b376333acdc7702cb393bbc8d379510138807c6e\": rpc error: code = NotFound desc = could not find container \"aa42a9a9427a60c0a64b1d66b376333acdc7702cb393bbc8d379510138807c6e\": container with ID starting with aa42a9a9427a60c0a64b1d66b376333acdc7702cb393bbc8d379510138807c6e not found: ID does not exist" Apr 20 19:12:14.558615 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.558562 2571 scope.go:117] "RemoveContainer" containerID="6d180dadf161a7a173285cafe6868275927c9908ada902bc0a0f8548871763b6" Apr 20 19:12:14.558736 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.558722 2571 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d180dadf161a7a173285cafe6868275927c9908ada902bc0a0f8548871763b6"} err="failed to get container status \"6d180dadf161a7a173285cafe6868275927c9908ada902bc0a0f8548871763b6\": rpc error: code = NotFound desc = could not find container \"6d180dadf161a7a173285cafe6868275927c9908ada902bc0a0f8548871763b6\": container with ID starting with 6d180dadf161a7a173285cafe6868275927c9908ada902bc0a0f8548871763b6 not found: ID does not exist" Apr 20 19:12:14.558779 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.558736 2571 scope.go:117] "RemoveContainer" containerID="656b1082a72e3f67efc0005f545f7c4260ad7e27899c605213c46726905cc3dd" Apr 20 19:12:14.558944 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.558918 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"656b1082a72e3f67efc0005f545f7c4260ad7e27899c605213c46726905cc3dd"} err="failed to get container status \"656b1082a72e3f67efc0005f545f7c4260ad7e27899c605213c46726905cc3dd\": rpc error: code = NotFound desc = could not find container \"656b1082a72e3f67efc0005f545f7c4260ad7e27899c605213c46726905cc3dd\": container with ID starting with 656b1082a72e3f67efc0005f545f7c4260ad7e27899c605213c46726905cc3dd not found: ID does not exist" Apr 20 19:12:14.558992 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.558944 2571 scope.go:117] "RemoveContainer" containerID="2db25dea45022b0fcdb2f6a750bc688d9f2963253782d8af7b6817f62d9ec9c0" Apr 20 19:12:14.559153 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.559125 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2db25dea45022b0fcdb2f6a750bc688d9f2963253782d8af7b6817f62d9ec9c0"} err="failed to get container status \"2db25dea45022b0fcdb2f6a750bc688d9f2963253782d8af7b6817f62d9ec9c0\": rpc error: code = NotFound desc = could not find container 
\"2db25dea45022b0fcdb2f6a750bc688d9f2963253782d8af7b6817f62d9ec9c0\": container with ID starting with 2db25dea45022b0fcdb2f6a750bc688d9f2963253782d8af7b6817f62d9ec9c0 not found: ID does not exist" Apr 20 19:12:14.559153 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.559148 2571 scope.go:117] "RemoveContainer" containerID="d2cf63c5012af4f8d4fd37b661893ef7bbf0bc789f67d32769572e255732de3c" Apr 20 19:12:14.559347 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.559332 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2cf63c5012af4f8d4fd37b661893ef7bbf0bc789f67d32769572e255732de3c"} err="failed to get container status \"d2cf63c5012af4f8d4fd37b661893ef7bbf0bc789f67d32769572e255732de3c\": rpc error: code = NotFound desc = could not find container \"d2cf63c5012af4f8d4fd37b661893ef7bbf0bc789f67d32769572e255732de3c\": container with ID starting with d2cf63c5012af4f8d4fd37b661893ef7bbf0bc789f67d32769572e255732de3c not found: ID does not exist" Apr 20 19:12:14.559396 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.559347 2571 scope.go:117] "RemoveContainer" containerID="58ec7304d65b890271bf9dae076330d63ff066083a7ff70810c2fcc3946da7e2" Apr 20 19:12:14.559556 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.559539 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58ec7304d65b890271bf9dae076330d63ff066083a7ff70810c2fcc3946da7e2"} err="failed to get container status \"58ec7304d65b890271bf9dae076330d63ff066083a7ff70810c2fcc3946da7e2\": rpc error: code = NotFound desc = could not find container \"58ec7304d65b890271bf9dae076330d63ff066083a7ff70810c2fcc3946da7e2\": container with ID starting with 58ec7304d65b890271bf9dae076330d63ff066083a7ff70810c2fcc3946da7e2 not found: ID does not exist" Apr 20 19:12:14.559556 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.559555 2571 scope.go:117] "RemoveContainer" 
containerID="0094f16c3e2d48bc15321e068a8dc30b1be53665c096f59b6e66578d0c025b4f" Apr 20 19:12:14.559728 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.559711 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0094f16c3e2d48bc15321e068a8dc30b1be53665c096f59b6e66578d0c025b4f"} err="failed to get container status \"0094f16c3e2d48bc15321e068a8dc30b1be53665c096f59b6e66578d0c025b4f\": rpc error: code = NotFound desc = could not find container \"0094f16c3e2d48bc15321e068a8dc30b1be53665c096f59b6e66578d0c025b4f\": container with ID starting with 0094f16c3e2d48bc15321e068a8dc30b1be53665c096f59b6e66578d0c025b4f not found: ID does not exist" Apr 20 19:12:14.559799 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.559730 2571 scope.go:117] "RemoveContainer" containerID="aa42a9a9427a60c0a64b1d66b376333acdc7702cb393bbc8d379510138807c6e" Apr 20 19:12:14.559908 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.559891 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa42a9a9427a60c0a64b1d66b376333acdc7702cb393bbc8d379510138807c6e"} err="failed to get container status \"aa42a9a9427a60c0a64b1d66b376333acdc7702cb393bbc8d379510138807c6e\": rpc error: code = NotFound desc = could not find container \"aa42a9a9427a60c0a64b1d66b376333acdc7702cb393bbc8d379510138807c6e\": container with ID starting with aa42a9a9427a60c0a64b1d66b376333acdc7702cb393bbc8d379510138807c6e not found: ID does not exist" Apr 20 19:12:14.559967 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.559909 2571 scope.go:117] "RemoveContainer" containerID="6d180dadf161a7a173285cafe6868275927c9908ada902bc0a0f8548871763b6" Apr 20 19:12:14.560139 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.560122 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d180dadf161a7a173285cafe6868275927c9908ada902bc0a0f8548871763b6"} err="failed to get container status 
\"6d180dadf161a7a173285cafe6868275927c9908ada902bc0a0f8548871763b6\": rpc error: code = NotFound desc = could not find container \"6d180dadf161a7a173285cafe6868275927c9908ada902bc0a0f8548871763b6\": container with ID starting with 6d180dadf161a7a173285cafe6868275927c9908ada902bc0a0f8548871763b6 not found: ID does not exist" Apr 20 19:12:14.560209 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.560140 2571 scope.go:117] "RemoveContainer" containerID="656b1082a72e3f67efc0005f545f7c4260ad7e27899c605213c46726905cc3dd" Apr 20 19:12:14.560365 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.560348 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"656b1082a72e3f67efc0005f545f7c4260ad7e27899c605213c46726905cc3dd"} err="failed to get container status \"656b1082a72e3f67efc0005f545f7c4260ad7e27899c605213c46726905cc3dd\": rpc error: code = NotFound desc = could not find container \"656b1082a72e3f67efc0005f545f7c4260ad7e27899c605213c46726905cc3dd\": container with ID starting with 656b1082a72e3f67efc0005f545f7c4260ad7e27899c605213c46726905cc3dd not found: ID does not exist" Apr 20 19:12:14.560405 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.560366 2571 scope.go:117] "RemoveContainer" containerID="2db25dea45022b0fcdb2f6a750bc688d9f2963253782d8af7b6817f62d9ec9c0" Apr 20 19:12:14.560566 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.560547 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2db25dea45022b0fcdb2f6a750bc688d9f2963253782d8af7b6817f62d9ec9c0"} err="failed to get container status \"2db25dea45022b0fcdb2f6a750bc688d9f2963253782d8af7b6817f62d9ec9c0\": rpc error: code = NotFound desc = could not find container \"2db25dea45022b0fcdb2f6a750bc688d9f2963253782d8af7b6817f62d9ec9c0\": container with ID starting with 2db25dea45022b0fcdb2f6a750bc688d9f2963253782d8af7b6817f62d9ec9c0 not found: ID does not exist" Apr 20 19:12:14.564900 ip-10-0-134-63 
kubenswrapper[2571]: I0420 19:12:14.564880 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 19:12:14.565198 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.565183 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb88d695-a7a4-44c7-a5d1-1399aa607235" containerName="kube-rbac-proxy-web" Apr 20 19:12:14.565264 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.565201 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb88d695-a7a4-44c7-a5d1-1399aa607235" containerName="kube-rbac-proxy-web" Apr 20 19:12:14.565264 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.565215 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb88d695-a7a4-44c7-a5d1-1399aa607235" containerName="alertmanager" Apr 20 19:12:14.565264 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.565221 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb88d695-a7a4-44c7-a5d1-1399aa607235" containerName="alertmanager" Apr 20 19:12:14.565264 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.565230 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb88d695-a7a4-44c7-a5d1-1399aa607235" containerName="kube-rbac-proxy" Apr 20 19:12:14.565264 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.565235 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb88d695-a7a4-44c7-a5d1-1399aa607235" containerName="kube-rbac-proxy" Apr 20 19:12:14.565264 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.565243 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb88d695-a7a4-44c7-a5d1-1399aa607235" containerName="config-reloader" Apr 20 19:12:14.565264 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.565248 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb88d695-a7a4-44c7-a5d1-1399aa607235" containerName="config-reloader" Apr 20 19:12:14.565264 ip-10-0-134-63 kubenswrapper[2571]: I0420 
19:12:14.565257 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb88d695-a7a4-44c7-a5d1-1399aa607235" containerName="kube-rbac-proxy-metric" Apr 20 19:12:14.565264 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.565262 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb88d695-a7a4-44c7-a5d1-1399aa607235" containerName="kube-rbac-proxy-metric" Apr 20 19:12:14.565264 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.565269 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb88d695-a7a4-44c7-a5d1-1399aa607235" containerName="prom-label-proxy" Apr 20 19:12:14.565610 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.565274 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb88d695-a7a4-44c7-a5d1-1399aa607235" containerName="prom-label-proxy" Apr 20 19:12:14.565610 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.565287 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19" containerName="registry" Apr 20 19:12:14.565610 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.565293 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19" containerName="registry" Apr 20 19:12:14.565610 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.565300 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb88d695-a7a4-44c7-a5d1-1399aa607235" containerName="init-config-reloader" Apr 20 19:12:14.565610 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.565306 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb88d695-a7a4-44c7-a5d1-1399aa607235" containerName="init-config-reloader" Apr 20 19:12:14.565610 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.565360 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="bb88d695-a7a4-44c7-a5d1-1399aa607235" containerName="prom-label-proxy" Apr 20 19:12:14.565610 ip-10-0-134-63 
kubenswrapper[2571]: I0420 19:12:14.565372 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="bb88d695-a7a4-44c7-a5d1-1399aa607235" containerName="kube-rbac-proxy-web" Apr 20 19:12:14.565610 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.565379 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="bb88d695-a7a4-44c7-a5d1-1399aa607235" containerName="kube-rbac-proxy" Apr 20 19:12:14.565610 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.565386 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="fd54e0b0-d5ef-4e48-92e1-1f6a9c491f19" containerName="registry" Apr 20 19:12:14.565610 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.565393 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="bb88d695-a7a4-44c7-a5d1-1399aa607235" containerName="config-reloader" Apr 20 19:12:14.565610 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.565398 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="bb88d695-a7a4-44c7-a5d1-1399aa607235" containerName="kube-rbac-proxy-metric" Apr 20 19:12:14.565610 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.565405 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="bb88d695-a7a4-44c7-a5d1-1399aa607235" containerName="alertmanager" Apr 20 19:12:14.568716 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.568700 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.571770 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.571601 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 20 19:12:14.571770 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.571601 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 20 19:12:14.571770 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.571633 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 20 19:12:14.571770 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.571660 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-2z7mp\"" Apr 20 19:12:14.571770 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.571669 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 20 19:12:14.571770 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.571699 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 20 19:12:14.571770 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.571633 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 20 19:12:14.571770 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.571756 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 20 19:12:14.572228 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.571828 2571 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 20 19:12:14.576709 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.576691 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 20 19:12:14.583610 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.582821 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmktm\" (UniqueName: \"kubernetes.io/projected/ca542e87-c56f-4eb0-b002-010f69fe987c-kube-api-access-zmktm\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.583610 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.582875 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ca542e87-c56f-4eb0-b002-010f69fe987c-web-config\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.583610 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.582911 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ca542e87-c56f-4eb0-b002-010f69fe987c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.583610 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.582938 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ca542e87-c56f-4eb0-b002-010f69fe987c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.583610 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.582965 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ca542e87-c56f-4eb0-b002-010f69fe987c-config-out\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.583610 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.583014 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ca542e87-c56f-4eb0-b002-010f69fe987c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.583610 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.583038 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ca542e87-c56f-4eb0-b002-010f69fe987c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.583610 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.583061 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca542e87-c56f-4eb0-b002-010f69fe987c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.583610 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.583094 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/ca542e87-c56f-4eb0-b002-010f69fe987c-config-volume\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.583610 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.583142 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ca542e87-c56f-4eb0-b002-010f69fe987c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.583610 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.583168 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ca542e87-c56f-4eb0-b002-010f69fe987c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.583610 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.583199 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ca542e87-c56f-4eb0-b002-010f69fe987c-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.583610 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.583254 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ca542e87-c56f-4eb0-b002-010f69fe987c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 
19:12:14.583610 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.583305 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-134-63.ec2.internal\" DevicePath \"\"" Apr 20 19:12:14.583610 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.583320 2571 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-config-volume\") on node \"ip-10-0-134-63.ec2.internal\" DevicePath \"\"" Apr 20 19:12:14.583610 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.583333 2571 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-cluster-tls-config\") on node \"ip-10-0-134-63.ec2.internal\" DevicePath \"\"" Apr 20 19:12:14.583610 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.583347 2571 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bb88d695-a7a4-44c7-a5d1-1399aa607235-alertmanager-main-db\") on node \"ip-10-0-134-63.ec2.internal\" DevicePath \"\"" Apr 20 19:12:14.585207 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.583360 2571 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb88d695-a7a4-44c7-a5d1-1399aa607235-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-134-63.ec2.internal\" DevicePath \"\"" Apr 20 19:12:14.585207 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.583375 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-134-63.ec2.internal\" DevicePath \"\"" Apr 20 
19:12:14.585207 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.583389 2571 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-web-config\") on node \"ip-10-0-134-63.ec2.internal\" DevicePath \"\"" Apr 20 19:12:14.585207 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.583401 2571 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bb88d695-a7a4-44c7-a5d1-1399aa607235-config-out\") on node \"ip-10-0-134-63.ec2.internal\" DevicePath \"\"" Apr 20 19:12:14.585207 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.583413 2571 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bb88d695-a7a4-44c7-a5d1-1399aa607235-tls-assets\") on node \"ip-10-0-134-63.ec2.internal\" DevicePath \"\"" Apr 20 19:12:14.585207 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.583425 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kv5x8\" (UniqueName: \"kubernetes.io/projected/bb88d695-a7a4-44c7-a5d1-1399aa607235-kube-api-access-kv5x8\") on node \"ip-10-0-134-63.ec2.internal\" DevicePath \"\"" Apr 20 19:12:14.585207 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.583439 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-secret-alertmanager-main-tls\") on node \"ip-10-0-134-63.ec2.internal\" DevicePath \"\"" Apr 20 19:12:14.585207 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.583453 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bb88d695-a7a4-44c7-a5d1-1399aa607235-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-134-63.ec2.internal\" DevicePath \"\"" Apr 20 19:12:14.585207 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.584833 
2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 19:12:14.684231 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.684194 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ca542e87-c56f-4eb0-b002-010f69fe987c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.684231 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.684234 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ca542e87-c56f-4eb0-b002-010f69fe987c-config-out\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.684484 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.684380 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ca542e87-c56f-4eb0-b002-010f69fe987c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.684484 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.684407 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ca542e87-c56f-4eb0-b002-010f69fe987c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.684484 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.684428 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/ca542e87-c56f-4eb0-b002-010f69fe987c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.684484 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.684453 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ca542e87-c56f-4eb0-b002-010f69fe987c-config-volume\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.684484 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.684473 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ca542e87-c56f-4eb0-b002-010f69fe987c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.684745 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.684602 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ca542e87-c56f-4eb0-b002-010f69fe987c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.684745 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.684662 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ca542e87-c56f-4eb0-b002-010f69fe987c-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.684745 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.684724 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ca542e87-c56f-4eb0-b002-010f69fe987c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.684888 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.684765 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zmktm\" (UniqueName: \"kubernetes.io/projected/ca542e87-c56f-4eb0-b002-010f69fe987c-kube-api-access-zmktm\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.684888 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.684804 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ca542e87-c56f-4eb0-b002-010f69fe987c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.685410 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.685195 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ca542e87-c56f-4eb0-b002-010f69fe987c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.685410 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.684808 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ca542e87-c56f-4eb0-b002-010f69fe987c-web-config\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.685410 ip-10-0-134-63 kubenswrapper[2571]: I0420 
19:12:14.685273 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ca542e87-c56f-4eb0-b002-010f69fe987c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.685410 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.685315 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca542e87-c56f-4eb0-b002-010f69fe987c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.687014 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.686982 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ca542e87-c56f-4eb0-b002-010f69fe987c-config-out\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.687474 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.687268 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ca542e87-c56f-4eb0-b002-010f69fe987c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.687474 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.687403 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ca542e87-c56f-4eb0-b002-010f69fe987c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 
19:12:14.687687 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.687663 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ca542e87-c56f-4eb0-b002-010f69fe987c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.687778 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.687760 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ca542e87-c56f-4eb0-b002-010f69fe987c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.687949 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.687931 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ca542e87-c56f-4eb0-b002-010f69fe987c-web-config\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.688018 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.687996 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ca542e87-c56f-4eb0-b002-010f69fe987c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.688050 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.688030 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ca542e87-c56f-4eb0-b002-010f69fe987c-config-volume\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.688959 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.688943 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ca542e87-c56f-4eb0-b002-010f69fe987c-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.697670 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.697652 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmktm\" (UniqueName: \"kubernetes.io/projected/ca542e87-c56f-4eb0-b002-010f69fe987c-kube-api-access-zmktm\") pod \"alertmanager-main-0\" (UID: \"ca542e87-c56f-4eb0-b002-010f69fe987c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:14.878265 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:14.878173 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 19:12:15.006185 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:15.006160 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 19:12:15.007805 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:12:15.007779 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca542e87_c56f_4eb0_b002_010f69fe987c.slice/crio-481f94de950dbc7490bae237f9fd0632f7048ba831b73afdffc5421265da396a WatchSource:0}: Error finding container 481f94de950dbc7490bae237f9fd0632f7048ba831b73afdffc5421265da396a: Status 404 returned error can't find the container with id 481f94de950dbc7490bae237f9fd0632f7048ba831b73afdffc5421265da396a Apr 20 19:12:15.508298 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:15.508209 2571 generic.go:358] "Generic (PLEG): container finished" podID="ca542e87-c56f-4eb0-b002-010f69fe987c" 
containerID="a0b811ad810cae2fbf7febbecdd08f6f080fdd31b781658145c8b7edac8ad10f" exitCode=0 Apr 20 19:12:15.508724 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:15.508301 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ca542e87-c56f-4eb0-b002-010f69fe987c","Type":"ContainerDied","Data":"a0b811ad810cae2fbf7febbecdd08f6f080fdd31b781658145c8b7edac8ad10f"} Apr 20 19:12:15.508724 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:15.508337 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ca542e87-c56f-4eb0-b002-010f69fe987c","Type":"ContainerStarted","Data":"481f94de950dbc7490bae237f9fd0632f7048ba831b73afdffc5421265da396a"} Apr 20 19:12:15.691819 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:15.691796 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb88d695-a7a4-44c7-a5d1-1399aa607235" path="/var/lib/kubelet/pods/bb88d695-a7a4-44c7-a5d1-1399aa607235/volumes" Apr 20 19:12:16.515937 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:16.515899 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ca542e87-c56f-4eb0-b002-010f69fe987c","Type":"ContainerStarted","Data":"6a04f5dbb9f5740780e7502fc222d30264bf5813b562f5d10f884fc118a81b1f"} Apr 20 19:12:16.515937 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:16.515942 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ca542e87-c56f-4eb0-b002-010f69fe987c","Type":"ContainerStarted","Data":"cf5e859e9ca0d7aa18a859ebe1946503d42414bad299682333c26c192dd2aadd"} Apr 20 19:12:16.516358 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:16.515955 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"ca542e87-c56f-4eb0-b002-010f69fe987c","Type":"ContainerStarted","Data":"0b5ce9746c89d2a0007aa58404b323e7c3c41c926624df2a34b11da1b9968c7d"} Apr 20 19:12:16.516358 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:16.515965 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ca542e87-c56f-4eb0-b002-010f69fe987c","Type":"ContainerStarted","Data":"e8018f42fe232c49491ed0c76eb740991a6815949ac5c7e60c6351bdf296d954"} Apr 20 19:12:16.516358 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:16.515976 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ca542e87-c56f-4eb0-b002-010f69fe987c","Type":"ContainerStarted","Data":"b454855ad46992e108e81cff3d6169382d6639878ca60140ec580b48b1730e84"} Apr 20 19:12:16.516358 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:16.515987 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ca542e87-c56f-4eb0-b002-010f69fe987c","Type":"ContainerStarted","Data":"2592918a90e2672fae94631732a519ca288872545d897478d94a275ecd9d3a3a"} Apr 20 19:12:16.545877 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:16.545830 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.545791505 podStartE2EDuration="2.545791505s" podCreationTimestamp="2026-04-20 19:12:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:12:16.543789384 +0000 UTC m=+261.457285435" watchObservedRunningTime="2026-04-20 19:12:16.545791505 +0000 UTC m=+261.459287558" Apr 20 19:12:18.229962 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:18.229926 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-647b786c68-6sfx4"] Apr 20 19:12:18.232486 ip-10-0-134-63 kubenswrapper[2571]: 
I0420 19:12:18.232465 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-647b786c68-6sfx4" Apr 20 19:12:18.235184 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:18.235166 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 20 19:12:18.235501 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:18.235473 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-dpggv\"" Apr 20 19:12:18.235610 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:18.235525 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 20 19:12:18.235868 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:18.235850 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 20 19:12:18.235915 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:18.235860 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 20 19:12:18.237028 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:18.237004 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 20 19:12:18.248257 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:18.248227 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-647b786c68-6sfx4"] Apr 20 19:12:18.248611 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:18.248588 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 20 19:12:18.315608 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:18.315573 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa-serving-certs-ca-bundle\") pod \"telemeter-client-647b786c68-6sfx4\" (UID: \"b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa\") " pod="openshift-monitoring/telemeter-client-647b786c68-6sfx4" Apr 20 19:12:18.315791 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:18.315624 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa-metrics-client-ca\") pod \"telemeter-client-647b786c68-6sfx4\" (UID: \"b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa\") " pod="openshift-monitoring/telemeter-client-647b786c68-6sfx4" Apr 20 19:12:18.315791 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:18.315683 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa-secret-telemeter-client\") pod \"telemeter-client-647b786c68-6sfx4\" (UID: \"b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa\") " pod="openshift-monitoring/telemeter-client-647b786c68-6sfx4" Apr 20 19:12:18.315791 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:18.315737 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa-federate-client-tls\") pod \"telemeter-client-647b786c68-6sfx4\" (UID: \"b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa\") " pod="openshift-monitoring/telemeter-client-647b786c68-6sfx4" Apr 20 19:12:18.315791 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:18.315767 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa-telemeter-trusted-ca-bundle\") pod \"telemeter-client-647b786c68-6sfx4\" (UID: \"b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa\") " pod="openshift-monitoring/telemeter-client-647b786c68-6sfx4" Apr 20 19:12:18.315919 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:18.315817 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4b29\" (UniqueName: \"kubernetes.io/projected/b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa-kube-api-access-p4b29\") pod \"telemeter-client-647b786c68-6sfx4\" (UID: \"b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa\") " pod="openshift-monitoring/telemeter-client-647b786c68-6sfx4" Apr 20 19:12:18.315919 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:18.315845 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa-telemeter-client-tls\") pod \"telemeter-client-647b786c68-6sfx4\" (UID: \"b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa\") " pod="openshift-monitoring/telemeter-client-647b786c68-6sfx4" Apr 20 19:12:18.315919 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:18.315890 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-647b786c68-6sfx4\" (UID: \"b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa\") " pod="openshift-monitoring/telemeter-client-647b786c68-6sfx4" Apr 20 19:12:18.416666 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:18.416611 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa-serving-certs-ca-bundle\") pod 
\"telemeter-client-647b786c68-6sfx4\" (UID: \"b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa\") " pod="openshift-monitoring/telemeter-client-647b786c68-6sfx4" Apr 20 19:12:18.416861 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:18.416693 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa-metrics-client-ca\") pod \"telemeter-client-647b786c68-6sfx4\" (UID: \"b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa\") " pod="openshift-monitoring/telemeter-client-647b786c68-6sfx4" Apr 20 19:12:18.416861 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:18.416730 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa-secret-telemeter-client\") pod \"telemeter-client-647b786c68-6sfx4\" (UID: \"b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa\") " pod="openshift-monitoring/telemeter-client-647b786c68-6sfx4" Apr 20 19:12:18.416861 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:18.416778 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa-federate-client-tls\") pod \"telemeter-client-647b786c68-6sfx4\" (UID: \"b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa\") " pod="openshift-monitoring/telemeter-client-647b786c68-6sfx4" Apr 20 19:12:18.416861 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:18.416819 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa-telemeter-trusted-ca-bundle\") pod \"telemeter-client-647b786c68-6sfx4\" (UID: \"b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa\") " pod="openshift-monitoring/telemeter-client-647b786c68-6sfx4" Apr 20 19:12:18.416861 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:18.416847 
2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4b29\" (UniqueName: \"kubernetes.io/projected/b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa-kube-api-access-p4b29\") pod \"telemeter-client-647b786c68-6sfx4\" (UID: \"b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa\") " pod="openshift-monitoring/telemeter-client-647b786c68-6sfx4" Apr 20 19:12:18.417093 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:18.416871 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa-telemeter-client-tls\") pod \"telemeter-client-647b786c68-6sfx4\" (UID: \"b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa\") " pod="openshift-monitoring/telemeter-client-647b786c68-6sfx4" Apr 20 19:12:18.417093 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:18.416920 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-647b786c68-6sfx4\" (UID: \"b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa\") " pod="openshift-monitoring/telemeter-client-647b786c68-6sfx4" Apr 20 19:12:18.417951 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:18.417921 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa-serving-certs-ca-bundle\") pod \"telemeter-client-647b786c68-6sfx4\" (UID: \"b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa\") " pod="openshift-monitoring/telemeter-client-647b786c68-6sfx4" Apr 20 19:12:18.418072 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:18.417944 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa-metrics-client-ca\") 
pod \"telemeter-client-647b786c68-6sfx4\" (UID: \"b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa\") " pod="openshift-monitoring/telemeter-client-647b786c68-6sfx4" Apr 20 19:12:18.418428 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:18.418398 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa-telemeter-trusted-ca-bundle\") pod \"telemeter-client-647b786c68-6sfx4\" (UID: \"b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa\") " pod="openshift-monitoring/telemeter-client-647b786c68-6sfx4" Apr 20 19:12:18.419891 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:18.419863 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa-federate-client-tls\") pod \"telemeter-client-647b786c68-6sfx4\" (UID: \"b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa\") " pod="openshift-monitoring/telemeter-client-647b786c68-6sfx4" Apr 20 19:12:18.420089 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:18.420066 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-647b786c68-6sfx4\" (UID: \"b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa\") " pod="openshift-monitoring/telemeter-client-647b786c68-6sfx4" Apr 20 19:12:18.420089 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:18.420079 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa-secret-telemeter-client\") pod \"telemeter-client-647b786c68-6sfx4\" (UID: \"b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa\") " pod="openshift-monitoring/telemeter-client-647b786c68-6sfx4" Apr 20 19:12:18.420216 ip-10-0-134-63 
kubenswrapper[2571]: I0420 19:12:18.420091 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa-telemeter-client-tls\") pod \"telemeter-client-647b786c68-6sfx4\" (UID: \"b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa\") " pod="openshift-monitoring/telemeter-client-647b786c68-6sfx4" Apr 20 19:12:18.426785 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:18.426759 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4b29\" (UniqueName: \"kubernetes.io/projected/b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa-kube-api-access-p4b29\") pod \"telemeter-client-647b786c68-6sfx4\" (UID: \"b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa\") " pod="openshift-monitoring/telemeter-client-647b786c68-6sfx4" Apr 20 19:12:18.548819 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:18.548729 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-647b786c68-6sfx4" Apr 20 19:12:18.691449 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:18.691414 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-647b786c68-6sfx4"] Apr 20 19:12:18.695907 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:12:18.695879 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1f0d9d8_e1cc_48b2_9889_5e8f539cf4fa.slice/crio-15a320a20178769636b64f0e393a5fe25d299a3b9a12a39fc4b4b768e0c552c0 WatchSource:0}: Error finding container 15a320a20178769636b64f0e393a5fe25d299a3b9a12a39fc4b4b768e0c552c0: Status 404 returned error can't find the container with id 15a320a20178769636b64f0e393a5fe25d299a3b9a12a39fc4b4b768e0c552c0 Apr 20 19:12:19.527084 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:19.527046 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-647b786c68-6sfx4" 
event={"ID":"b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa","Type":"ContainerStarted","Data":"15a320a20178769636b64f0e393a5fe25d299a3b9a12a39fc4b4b768e0c552c0"} Apr 20 19:12:21.535093 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:21.535055 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-647b786c68-6sfx4" event={"ID":"b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa","Type":"ContainerStarted","Data":"4df0db26ad75b903353fcc79476b4c91df739e76dbc9edf6801717f481261934"} Apr 20 19:12:21.535093 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:21.535094 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-647b786c68-6sfx4" event={"ID":"b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa","Type":"ContainerStarted","Data":"6ae62276653d8135c1e1c1180e5f7b47fbc92beac995083f91cc947588fa99c9"} Apr 20 19:12:21.535093 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:21.535115 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-647b786c68-6sfx4" event={"ID":"b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa","Type":"ContainerStarted","Data":"ac5e8f63947471061348bd79b16e354e1d0f9fe115dccee8b7426bfb6fdeb77d"} Apr 20 19:12:21.558677 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:21.558626 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-647b786c68-6sfx4" podStartSLOduration=1.7213071260000001 podStartE2EDuration="3.558609276s" podCreationTimestamp="2026-04-20 19:12:18 +0000 UTC" firstStartedPulling="2026-04-20 19:12:18.697956693 +0000 UTC m=+263.611452724" lastFinishedPulling="2026-04-20 19:12:20.535258841 +0000 UTC m=+265.448754874" observedRunningTime="2026-04-20 19:12:21.556558575 +0000 UTC m=+266.470054638" watchObservedRunningTime="2026-04-20 19:12:21.558609276 +0000 UTC m=+266.472105328" Apr 20 19:12:38.994468 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:38.994433 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d4955879-53ea-4f74-96dd-eae67dbbe030-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4hs99\" (UID: \"d4955879-53ea-4f74-96dd-eae67dbbe030\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4hs99" Apr 20 19:12:38.996891 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:38.996870 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d4955879-53ea-4f74-96dd-eae67dbbe030-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4hs99\" (UID: \"d4955879-53ea-4f74-96dd-eae67dbbe030\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4hs99" Apr 20 19:12:39.090470 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:39.090441 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-8k5pc\"" Apr 20 19:12:39.099488 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:39.099454 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-4hs99" Apr 20 19:12:39.220601 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:39.220526 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-4hs99"] Apr 20 19:12:39.222799 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:12:39.222770 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4955879_53ea_4f74_96dd_eae67dbbe030.slice/crio-4d76e32e8190dbf56ba75290ba60ada49a9dd3c9fd0812b7c3f71aa3d06f86bf WatchSource:0}: Error finding container 4d76e32e8190dbf56ba75290ba60ada49a9dd3c9fd0812b7c3f71aa3d06f86bf: Status 404 returned error can't find the container with id 4d76e32e8190dbf56ba75290ba60ada49a9dd3c9fd0812b7c3f71aa3d06f86bf Apr 20 19:12:39.588254 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:39.588214 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-4hs99" event={"ID":"d4955879-53ea-4f74-96dd-eae67dbbe030","Type":"ContainerStarted","Data":"4d76e32e8190dbf56ba75290ba60ada49a9dd3c9fd0812b7c3f71aa3d06f86bf"} Apr 20 19:12:40.592355 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:40.592262 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-4hs99" event={"ID":"d4955879-53ea-4f74-96dd-eae67dbbe030","Type":"ContainerStarted","Data":"b88b1951b90fcd4d29d704aed57daa5c73f7a78c8957554491675000ec08c87c"} Apr 20 19:12:40.609164 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:40.608954 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-4hs99" podStartSLOduration=273.666644958 podStartE2EDuration="4m34.60893593s" podCreationTimestamp="2026-04-20 19:08:06 +0000 UTC" firstStartedPulling="2026-04-20 19:12:39.224707903 +0000 
UTC m=+284.138203933" lastFinishedPulling="2026-04-20 19:12:40.166998875 +0000 UTC m=+285.080494905" observedRunningTime="2026-04-20 19:12:40.608652863 +0000 UTC m=+285.522148917" watchObservedRunningTime="2026-04-20 19:12:40.60893593 +0000 UTC m=+285.522432036" Apr 20 19:12:55.564708 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:55.564671 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8mkl_6d80f226-e162-44b2-8b76-4d5f89d97859/ovn-acl-logging/0.log" Apr 20 19:12:55.565476 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:55.565453 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8mkl_6d80f226-e162-44b2-8b76-4d5f89d97859/ovn-acl-logging/0.log" Apr 20 19:12:55.572352 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:12:55.572330 2571 kubelet.go:1628] "Image garbage collection succeeded" Apr 20 19:15:30.739996 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:30.739925 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-7bh4l"] Apr 20 19:15:30.742880 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:30.742860 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-7bh4l" Apr 20 19:15:30.745658 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:30.745636 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 20 19:15:30.746829 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:30.746812 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 20 19:15:30.746899 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:30.746834 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-pxtbx\"" Apr 20 19:15:30.753425 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:30.753403 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-7bh4l"] Apr 20 19:15:30.780406 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:30.780360 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpzcd\" (UniqueName: \"kubernetes.io/projected/73a213cf-2a36-41e7-9af5-a3fb08269ad3-kube-api-access-bpzcd\") pod \"cert-manager-cainjector-68b757865b-7bh4l\" (UID: \"73a213cf-2a36-41e7-9af5-a3fb08269ad3\") " pod="cert-manager/cert-manager-cainjector-68b757865b-7bh4l" Apr 20 19:15:30.780540 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:30.780440 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/73a213cf-2a36-41e7-9af5-a3fb08269ad3-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-7bh4l\" (UID: \"73a213cf-2a36-41e7-9af5-a3fb08269ad3\") " pod="cert-manager/cert-manager-cainjector-68b757865b-7bh4l" Apr 20 19:15:30.881701 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:30.881649 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-bpzcd\" (UniqueName: \"kubernetes.io/projected/73a213cf-2a36-41e7-9af5-a3fb08269ad3-kube-api-access-bpzcd\") pod \"cert-manager-cainjector-68b757865b-7bh4l\" (UID: \"73a213cf-2a36-41e7-9af5-a3fb08269ad3\") " pod="cert-manager/cert-manager-cainjector-68b757865b-7bh4l" Apr 20 19:15:30.881891 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:30.881715 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/73a213cf-2a36-41e7-9af5-a3fb08269ad3-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-7bh4l\" (UID: \"73a213cf-2a36-41e7-9af5-a3fb08269ad3\") " pod="cert-manager/cert-manager-cainjector-68b757865b-7bh4l" Apr 20 19:15:30.889788 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:30.889754 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/73a213cf-2a36-41e7-9af5-a3fb08269ad3-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-7bh4l\" (UID: \"73a213cf-2a36-41e7-9af5-a3fb08269ad3\") " pod="cert-manager/cert-manager-cainjector-68b757865b-7bh4l" Apr 20 19:15:30.889902 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:30.889834 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpzcd\" (UniqueName: \"kubernetes.io/projected/73a213cf-2a36-41e7-9af5-a3fb08269ad3-kube-api-access-bpzcd\") pod \"cert-manager-cainjector-68b757865b-7bh4l\" (UID: \"73a213cf-2a36-41e7-9af5-a3fb08269ad3\") " pod="cert-manager/cert-manager-cainjector-68b757865b-7bh4l" Apr 20 19:15:31.062598 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:31.062521 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-7bh4l" Apr 20 19:15:31.189852 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:31.189826 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-7bh4l"] Apr 20 19:15:31.192395 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:15:31.192369 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73a213cf_2a36_41e7_9af5_a3fb08269ad3.slice/crio-80c2d8cbc63b3ec58da34c2984e18b0e523294fa97008474c0402871680ddde0 WatchSource:0}: Error finding container 80c2d8cbc63b3ec58da34c2984e18b0e523294fa97008474c0402871680ddde0: Status 404 returned error can't find the container with id 80c2d8cbc63b3ec58da34c2984e18b0e523294fa97008474c0402871680ddde0 Apr 20 19:15:31.194237 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:31.194217 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 19:15:32.079619 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:32.079575 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-7bh4l" event={"ID":"73a213cf-2a36-41e7-9af5-a3fb08269ad3","Type":"ContainerStarted","Data":"80c2d8cbc63b3ec58da34c2984e18b0e523294fa97008474c0402871680ddde0"} Apr 20 19:15:34.087534 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:34.087501 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-7bh4l" event={"ID":"73a213cf-2a36-41e7-9af5-a3fb08269ad3","Type":"ContainerStarted","Data":"91c13befa02a5fc2288b46cc5b8ff5a21e35317535be461437e437db7e0d2ee4"} Apr 20 19:15:34.104779 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:34.104721 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-7bh4l" podStartSLOduration=1.340015596 podStartE2EDuration="4.104706951s" 
podCreationTimestamp="2026-04-20 19:15:30 +0000 UTC" firstStartedPulling="2026-04-20 19:15:31.194405401 +0000 UTC m=+456.107901438" lastFinishedPulling="2026-04-20 19:15:33.959096762 +0000 UTC m=+458.872592793" observedRunningTime="2026-04-20 19:15:34.10330567 +0000 UTC m=+459.016801723" watchObservedRunningTime="2026-04-20 19:15:34.104706951 +0000 UTC m=+459.018203003" Apr 20 19:15:39.810759 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:39.810724 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-t48mn"] Apr 20 19:15:39.812886 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:39.812869 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-t48mn" Apr 20 19:15:39.815529 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:39.815507 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-vfr27\"" Apr 20 19:15:39.821413 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:39.821385 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-t48mn"] Apr 20 19:15:39.868840 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:39.868810 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eebd6fb2-9b39-47f5-a3a2-cbe2919e9526-bound-sa-token\") pod \"cert-manager-79c8d999ff-t48mn\" (UID: \"eebd6fb2-9b39-47f5-a3a2-cbe2919e9526\") " pod="cert-manager/cert-manager-79c8d999ff-t48mn" Apr 20 19:15:39.868961 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:39.868845 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh75w\" (UniqueName: \"kubernetes.io/projected/eebd6fb2-9b39-47f5-a3a2-cbe2919e9526-kube-api-access-fh75w\") pod \"cert-manager-79c8d999ff-t48mn\" (UID: \"eebd6fb2-9b39-47f5-a3a2-cbe2919e9526\") " 
pod="cert-manager/cert-manager-79c8d999ff-t48mn" Apr 20 19:15:39.970236 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:39.970203 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eebd6fb2-9b39-47f5-a3a2-cbe2919e9526-bound-sa-token\") pod \"cert-manager-79c8d999ff-t48mn\" (UID: \"eebd6fb2-9b39-47f5-a3a2-cbe2919e9526\") " pod="cert-manager/cert-manager-79c8d999ff-t48mn" Apr 20 19:15:39.970236 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:39.970238 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fh75w\" (UniqueName: \"kubernetes.io/projected/eebd6fb2-9b39-47f5-a3a2-cbe2919e9526-kube-api-access-fh75w\") pod \"cert-manager-79c8d999ff-t48mn\" (UID: \"eebd6fb2-9b39-47f5-a3a2-cbe2919e9526\") " pod="cert-manager/cert-manager-79c8d999ff-t48mn" Apr 20 19:15:39.978781 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:39.978754 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eebd6fb2-9b39-47f5-a3a2-cbe2919e9526-bound-sa-token\") pod \"cert-manager-79c8d999ff-t48mn\" (UID: \"eebd6fb2-9b39-47f5-a3a2-cbe2919e9526\") " pod="cert-manager/cert-manager-79c8d999ff-t48mn" Apr 20 19:15:39.978897 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:39.978817 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh75w\" (UniqueName: \"kubernetes.io/projected/eebd6fb2-9b39-47f5-a3a2-cbe2919e9526-kube-api-access-fh75w\") pod \"cert-manager-79c8d999ff-t48mn\" (UID: \"eebd6fb2-9b39-47f5-a3a2-cbe2919e9526\") " pod="cert-manager/cert-manager-79c8d999ff-t48mn" Apr 20 19:15:40.123091 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:40.123006 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-t48mn" Apr 20 19:15:40.240776 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:40.240751 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-t48mn"] Apr 20 19:15:40.243162 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:15:40.243136 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeebd6fb2_9b39_47f5_a3a2_cbe2919e9526.slice/crio-7fa035eefe33d81e0f1f47e7ecdd35519865075f1806de152669d166debb80f6 WatchSource:0}: Error finding container 7fa035eefe33d81e0f1f47e7ecdd35519865075f1806de152669d166debb80f6: Status 404 returned error can't find the container with id 7fa035eefe33d81e0f1f47e7ecdd35519865075f1806de152669d166debb80f6 Apr 20 19:15:41.108640 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:41.108598 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-t48mn" event={"ID":"eebd6fb2-9b39-47f5-a3a2-cbe2919e9526","Type":"ContainerStarted","Data":"9cf788f69125ad95e4093bdc288c2b9902a8478213dfd7e725c0758a84489d71"} Apr 20 19:15:41.108640 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:41.108641 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-t48mn" event={"ID":"eebd6fb2-9b39-47f5-a3a2-cbe2919e9526","Type":"ContainerStarted","Data":"7fa035eefe33d81e0f1f47e7ecdd35519865075f1806de152669d166debb80f6"} Apr 20 19:15:41.125070 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:41.125016 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-t48mn" podStartSLOduration=2.125004085 podStartE2EDuration="2.125004085s" podCreationTimestamp="2026-04-20 19:15:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:15:41.123325859 +0000 UTC m=+466.036821913" 
watchObservedRunningTime="2026-04-20 19:15:41.125004085 +0000 UTC m=+466.038500137" Apr 20 19:15:59.938123 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:59.938086 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6c77764cd6-tbjxq"] Apr 20 19:15:59.945545 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:59.945524 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-tbjxq" Apr 20 19:15:59.955459 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:59.955440 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 20 19:15:59.955581 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:59.955543 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 20 19:15:59.955827 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:59.955813 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-kdxcc\"" Apr 20 19:15:59.956470 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:59.956455 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 20 19:15:59.956622 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:59.956604 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 20 19:15:59.981856 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:15:59.981831 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6c77764cd6-tbjxq"] Apr 20 19:16:00.040477 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:00.040441 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4b3c7a16-430f-49f5-8a58-d40620e44a47-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6c77764cd6-tbjxq\" (UID: \"4b3c7a16-430f-49f5-8a58-d40620e44a47\") " pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-tbjxq" Apr 20 19:16:00.040645 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:00.040489 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p27b6\" (UniqueName: \"kubernetes.io/projected/4b3c7a16-430f-49f5-8a58-d40620e44a47-kube-api-access-p27b6\") pod \"opendatahub-operator-controller-manager-6c77764cd6-tbjxq\" (UID: \"4b3c7a16-430f-49f5-8a58-d40620e44a47\") " pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-tbjxq" Apr 20 19:16:00.040645 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:00.040511 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4b3c7a16-430f-49f5-8a58-d40620e44a47-webhook-cert\") pod \"opendatahub-operator-controller-manager-6c77764cd6-tbjxq\" (UID: \"4b3c7a16-430f-49f5-8a58-d40620e44a47\") " pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-tbjxq" Apr 20 19:16:00.141547 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:00.141509 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4b3c7a16-430f-49f5-8a58-d40620e44a47-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6c77764cd6-tbjxq\" (UID: \"4b3c7a16-430f-49f5-8a58-d40620e44a47\") " pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-tbjxq" Apr 20 19:16:00.141721 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:00.141564 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p27b6\" 
(UniqueName: \"kubernetes.io/projected/4b3c7a16-430f-49f5-8a58-d40620e44a47-kube-api-access-p27b6\") pod \"opendatahub-operator-controller-manager-6c77764cd6-tbjxq\" (UID: \"4b3c7a16-430f-49f5-8a58-d40620e44a47\") " pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-tbjxq" Apr 20 19:16:00.141721 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:00.141592 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4b3c7a16-430f-49f5-8a58-d40620e44a47-webhook-cert\") pod \"opendatahub-operator-controller-manager-6c77764cd6-tbjxq\" (UID: \"4b3c7a16-430f-49f5-8a58-d40620e44a47\") " pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-tbjxq" Apr 20 19:16:00.143909 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:00.143886 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4b3c7a16-430f-49f5-8a58-d40620e44a47-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6c77764cd6-tbjxq\" (UID: \"4b3c7a16-430f-49f5-8a58-d40620e44a47\") " pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-tbjxq" Apr 20 19:16:00.144022 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:00.143905 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4b3c7a16-430f-49f5-8a58-d40620e44a47-webhook-cert\") pod \"opendatahub-operator-controller-manager-6c77764cd6-tbjxq\" (UID: \"4b3c7a16-430f-49f5-8a58-d40620e44a47\") " pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-tbjxq" Apr 20 19:16:00.161325 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:00.161288 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p27b6\" (UniqueName: \"kubernetes.io/projected/4b3c7a16-430f-49f5-8a58-d40620e44a47-kube-api-access-p27b6\") pod 
\"opendatahub-operator-controller-manager-6c77764cd6-tbjxq\" (UID: \"4b3c7a16-430f-49f5-8a58-d40620e44a47\") " pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-tbjxq" Apr 20 19:16:00.256011 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:00.255928 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-tbjxq" Apr 20 19:16:00.393428 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:00.393394 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6c77764cd6-tbjxq"] Apr 20 19:16:00.396795 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:16:00.396772 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b3c7a16_430f_49f5_8a58_d40620e44a47.slice/crio-4e5988820ed300a0e4fecacc0183d9bbf6f5f908aabef8768e721b6ffd03d60f WatchSource:0}: Error finding container 4e5988820ed300a0e4fecacc0183d9bbf6f5f908aabef8768e721b6ffd03d60f: Status 404 returned error can't find the container with id 4e5988820ed300a0e4fecacc0183d9bbf6f5f908aabef8768e721b6ffd03d60f Apr 20 19:16:00.853057 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:00.853013 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-fcf468c68-dtxpz"] Apr 20 19:16:00.856756 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:00.856727 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-fcf468c68-dtxpz" Apr 20 19:16:00.860156 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:00.860133 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 20 19:16:00.861530 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:00.861501 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 20 19:16:00.861651 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:00.861581 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 20 19:16:00.861651 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:00.861601 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 20 19:16:00.861831 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:00.861505 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 20 19:16:00.861831 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:00.861503 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-pjvft\"" Apr 20 19:16:00.870387 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:00.870365 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-fcf468c68-dtxpz"] Apr 20 19:16:00.948438 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:00.948402 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f9ab233-6ae1-4487-bd11-4cd0174f8e78-cert\") pod \"lws-controller-manager-fcf468c68-dtxpz\" (UID: \"6f9ab233-6ae1-4487-bd11-4cd0174f8e78\") " 
pod="openshift-lws-operator/lws-controller-manager-fcf468c68-dtxpz" Apr 20 19:16:00.948893 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:00.948470 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m5q4\" (UniqueName: \"kubernetes.io/projected/6f9ab233-6ae1-4487-bd11-4cd0174f8e78-kube-api-access-4m5q4\") pod \"lws-controller-manager-fcf468c68-dtxpz\" (UID: \"6f9ab233-6ae1-4487-bd11-4cd0174f8e78\") " pod="openshift-lws-operator/lws-controller-manager-fcf468c68-dtxpz" Apr 20 19:16:00.948893 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:00.948525 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/6f9ab233-6ae1-4487-bd11-4cd0174f8e78-manager-config\") pod \"lws-controller-manager-fcf468c68-dtxpz\" (UID: \"6f9ab233-6ae1-4487-bd11-4cd0174f8e78\") " pod="openshift-lws-operator/lws-controller-manager-fcf468c68-dtxpz" Apr 20 19:16:00.948893 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:00.948585 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/6f9ab233-6ae1-4487-bd11-4cd0174f8e78-metrics-cert\") pod \"lws-controller-manager-fcf468c68-dtxpz\" (UID: \"6f9ab233-6ae1-4487-bd11-4cd0174f8e78\") " pod="openshift-lws-operator/lws-controller-manager-fcf468c68-dtxpz" Apr 20 19:16:01.050069 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:01.050032 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/6f9ab233-6ae1-4487-bd11-4cd0174f8e78-manager-config\") pod \"lws-controller-manager-fcf468c68-dtxpz\" (UID: \"6f9ab233-6ae1-4487-bd11-4cd0174f8e78\") " pod="openshift-lws-operator/lws-controller-manager-fcf468c68-dtxpz" Apr 20 19:16:01.050286 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:01.050079 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/6f9ab233-6ae1-4487-bd11-4cd0174f8e78-metrics-cert\") pod \"lws-controller-manager-fcf468c68-dtxpz\" (UID: \"6f9ab233-6ae1-4487-bd11-4cd0174f8e78\") " pod="openshift-lws-operator/lws-controller-manager-fcf468c68-dtxpz" Apr 20 19:16:01.050286 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:01.050156 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f9ab233-6ae1-4487-bd11-4cd0174f8e78-cert\") pod \"lws-controller-manager-fcf468c68-dtxpz\" (UID: \"6f9ab233-6ae1-4487-bd11-4cd0174f8e78\") " pod="openshift-lws-operator/lws-controller-manager-fcf468c68-dtxpz" Apr 20 19:16:01.050286 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:01.050200 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4m5q4\" (UniqueName: \"kubernetes.io/projected/6f9ab233-6ae1-4487-bd11-4cd0174f8e78-kube-api-access-4m5q4\") pod \"lws-controller-manager-fcf468c68-dtxpz\" (UID: \"6f9ab233-6ae1-4487-bd11-4cd0174f8e78\") " pod="openshift-lws-operator/lws-controller-manager-fcf468c68-dtxpz" Apr 20 19:16:01.050900 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:01.050868 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/6f9ab233-6ae1-4487-bd11-4cd0174f8e78-manager-config\") pod \"lws-controller-manager-fcf468c68-dtxpz\" (UID: \"6f9ab233-6ae1-4487-bd11-4cd0174f8e78\") " pod="openshift-lws-operator/lws-controller-manager-fcf468c68-dtxpz" Apr 20 19:16:01.052927 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:01.052905 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f9ab233-6ae1-4487-bd11-4cd0174f8e78-cert\") pod \"lws-controller-manager-fcf468c68-dtxpz\" (UID: \"6f9ab233-6ae1-4487-bd11-4cd0174f8e78\") " 
pod="openshift-lws-operator/lws-controller-manager-fcf468c68-dtxpz" Apr 20 19:16:01.053094 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:01.053068 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/6f9ab233-6ae1-4487-bd11-4cd0174f8e78-metrics-cert\") pod \"lws-controller-manager-fcf468c68-dtxpz\" (UID: \"6f9ab233-6ae1-4487-bd11-4cd0174f8e78\") " pod="openshift-lws-operator/lws-controller-manager-fcf468c68-dtxpz" Apr 20 19:16:01.062476 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:01.062447 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m5q4\" (UniqueName: \"kubernetes.io/projected/6f9ab233-6ae1-4487-bd11-4cd0174f8e78-kube-api-access-4m5q4\") pod \"lws-controller-manager-fcf468c68-dtxpz\" (UID: \"6f9ab233-6ae1-4487-bd11-4cd0174f8e78\") " pod="openshift-lws-operator/lws-controller-manager-fcf468c68-dtxpz" Apr 20 19:16:01.168479 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:01.168429 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-tbjxq" event={"ID":"4b3c7a16-430f-49f5-8a58-d40620e44a47","Type":"ContainerStarted","Data":"4e5988820ed300a0e4fecacc0183d9bbf6f5f908aabef8768e721b6ffd03d60f"} Apr 20 19:16:01.170673 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:01.170648 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-fcf468c68-dtxpz" Apr 20 19:16:01.362820 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:01.362790 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-fcf468c68-dtxpz"] Apr 20 19:16:01.365753 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:16:01.365720 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f9ab233_6ae1_4487_bd11_4cd0174f8e78.slice/crio-458b04830b2c8533e167d22d4ecee53544ab12213abfa3ff17a8d9d7ed6755c3 WatchSource:0}: Error finding container 458b04830b2c8533e167d22d4ecee53544ab12213abfa3ff17a8d9d7ed6755c3: Status 404 returned error can't find the container with id 458b04830b2c8533e167d22d4ecee53544ab12213abfa3ff17a8d9d7ed6755c3 Apr 20 19:16:02.173306 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:02.173254 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-fcf468c68-dtxpz" event={"ID":"6f9ab233-6ae1-4487-bd11-4cd0174f8e78","Type":"ContainerStarted","Data":"458b04830b2c8533e167d22d4ecee53544ab12213abfa3ff17a8d9d7ed6755c3"} Apr 20 19:16:03.178517 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:03.178483 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-tbjxq" event={"ID":"4b3c7a16-430f-49f5-8a58-d40620e44a47","Type":"ContainerStarted","Data":"0b5712612ef799827da29ffa37a67d20be41062fdddc4b7d34695b77cbc67e17"} Apr 20 19:16:03.178946 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:03.178642 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-tbjxq" Apr 20 19:16:03.209123 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:03.208745 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-tbjxq" podStartSLOduration=1.701260967 podStartE2EDuration="4.20872344s" podCreationTimestamp="2026-04-20 19:15:59 +0000 UTC" firstStartedPulling="2026-04-20 19:16:00.398526921 +0000 UTC m=+485.312022951" lastFinishedPulling="2026-04-20 19:16:02.905989391 +0000 UTC m=+487.819485424" observedRunningTime="2026-04-20 19:16:03.205521672 +0000 UTC m=+488.119017725" watchObservedRunningTime="2026-04-20 19:16:03.20872344 +0000 UTC m=+488.122219492" Apr 20 19:16:07.193653 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:07.193613 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-fcf468c68-dtxpz" event={"ID":"6f9ab233-6ae1-4487-bd11-4cd0174f8e78","Type":"ContainerStarted","Data":"8873a9eb6b8d2a7eefcd5ba0591e862285662eefee5e21d3512ebe7485387e93"} Apr 20 19:16:07.194030 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:07.193684 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-fcf468c68-dtxpz" Apr 20 19:16:07.211924 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:07.211869 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-fcf468c68-dtxpz" podStartSLOduration=2.239810222 podStartE2EDuration="7.211852992s" podCreationTimestamp="2026-04-20 19:16:00 +0000 UTC" firstStartedPulling="2026-04-20 19:16:01.368389762 +0000 UTC m=+486.281885809" lastFinishedPulling="2026-04-20 19:16:06.340432547 +0000 UTC m=+491.253928579" observedRunningTime="2026-04-20 19:16:07.211020816 +0000 UTC m=+492.124516868" watchObservedRunningTime="2026-04-20 19:16:07.211852992 +0000 UTC m=+492.125349046" Apr 20 19:16:14.184507 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:14.184476 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6c77764cd6-tbjxq" Apr 20 
19:16:18.199013 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:18.198975 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-fcf468c68-dtxpz" Apr 20 19:16:28.449987 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:28.449944 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-5489467c57-vmrcr"] Apr 20 19:16:28.452602 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:28.452577 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-5489467c57-vmrcr" Apr 20 19:16:28.455347 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:28.455324 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 20 19:16:28.456696 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:28.456674 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 20 19:16:28.456818 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:28.456673 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 20 19:16:28.456818 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:28.456677 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 20 19:16:28.456818 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:28.456723 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-hdxhc\"" Apr 20 19:16:28.461604 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:28.461579 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-5489467c57-vmrcr"] Apr 20 19:16:28.473646 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:28.473623 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/72b34b2e-2656-42a6-9aa4-883ab46ef18a-tmp\") pod \"kube-auth-proxy-5489467c57-vmrcr\" (UID: \"72b34b2e-2656-42a6-9aa4-883ab46ef18a\") " pod="openshift-ingress/kube-auth-proxy-5489467c57-vmrcr" Apr 20 19:16:28.473778 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:28.473683 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/72b34b2e-2656-42a6-9aa4-883ab46ef18a-tls-certs\") pod \"kube-auth-proxy-5489467c57-vmrcr\" (UID: \"72b34b2e-2656-42a6-9aa4-883ab46ef18a\") " pod="openshift-ingress/kube-auth-proxy-5489467c57-vmrcr" Apr 20 19:16:28.473778 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:28.473745 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqshb\" (UniqueName: \"kubernetes.io/projected/72b34b2e-2656-42a6-9aa4-883ab46ef18a-kube-api-access-sqshb\") pod \"kube-auth-proxy-5489467c57-vmrcr\" (UID: \"72b34b2e-2656-42a6-9aa4-883ab46ef18a\") " pod="openshift-ingress/kube-auth-proxy-5489467c57-vmrcr" Apr 20 19:16:28.574923 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:28.574882 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sqshb\" (UniqueName: \"kubernetes.io/projected/72b34b2e-2656-42a6-9aa4-883ab46ef18a-kube-api-access-sqshb\") pod \"kube-auth-proxy-5489467c57-vmrcr\" (UID: \"72b34b2e-2656-42a6-9aa4-883ab46ef18a\") " pod="openshift-ingress/kube-auth-proxy-5489467c57-vmrcr" Apr 20 19:16:28.575165 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:28.574969 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/72b34b2e-2656-42a6-9aa4-883ab46ef18a-tmp\") pod \"kube-auth-proxy-5489467c57-vmrcr\" (UID: \"72b34b2e-2656-42a6-9aa4-883ab46ef18a\") " 
pod="openshift-ingress/kube-auth-proxy-5489467c57-vmrcr" Apr 20 19:16:28.575165 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:28.575036 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/72b34b2e-2656-42a6-9aa4-883ab46ef18a-tls-certs\") pod \"kube-auth-proxy-5489467c57-vmrcr\" (UID: \"72b34b2e-2656-42a6-9aa4-883ab46ef18a\") " pod="openshift-ingress/kube-auth-proxy-5489467c57-vmrcr" Apr 20 19:16:28.577371 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:28.577348 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/72b34b2e-2656-42a6-9aa4-883ab46ef18a-tmp\") pod \"kube-auth-proxy-5489467c57-vmrcr\" (UID: \"72b34b2e-2656-42a6-9aa4-883ab46ef18a\") " pod="openshift-ingress/kube-auth-proxy-5489467c57-vmrcr" Apr 20 19:16:28.577573 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:28.577555 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/72b34b2e-2656-42a6-9aa4-883ab46ef18a-tls-certs\") pod \"kube-auth-proxy-5489467c57-vmrcr\" (UID: \"72b34b2e-2656-42a6-9aa4-883ab46ef18a\") " pod="openshift-ingress/kube-auth-proxy-5489467c57-vmrcr" Apr 20 19:16:28.583795 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:28.583769 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqshb\" (UniqueName: \"kubernetes.io/projected/72b34b2e-2656-42a6-9aa4-883ab46ef18a-kube-api-access-sqshb\") pod \"kube-auth-proxy-5489467c57-vmrcr\" (UID: \"72b34b2e-2656-42a6-9aa4-883ab46ef18a\") " pod="openshift-ingress/kube-auth-proxy-5489467c57-vmrcr" Apr 20 19:16:28.763883 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:28.763782 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-5489467c57-vmrcr" Apr 20 19:16:28.885424 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:28.885400 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-5489467c57-vmrcr"] Apr 20 19:16:28.888644 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:16:28.888611 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72b34b2e_2656_42a6_9aa4_883ab46ef18a.slice/crio-612807c243fc865a7d27216e7f8fdff15d65c9bbc00b29dd963d698d60ffc8bd WatchSource:0}: Error finding container 612807c243fc865a7d27216e7f8fdff15d65c9bbc00b29dd963d698d60ffc8bd: Status 404 returned error can't find the container with id 612807c243fc865a7d27216e7f8fdff15d65c9bbc00b29dd963d698d60ffc8bd Apr 20 19:16:29.280187 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:29.280147 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-5489467c57-vmrcr" event={"ID":"72b34b2e-2656-42a6-9aa4-883ab46ef18a","Type":"ContainerStarted","Data":"612807c243fc865a7d27216e7f8fdff15d65c9bbc00b29dd963d698d60ffc8bd"} Apr 20 19:16:33.294905 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:33.294867 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-5489467c57-vmrcr" event={"ID":"72b34b2e-2656-42a6-9aa4-883ab46ef18a","Type":"ContainerStarted","Data":"7eba123e14f784546b0a50354ded8605d61b21f6d4a05bd59bf6cf65c5be29d5"} Apr 20 19:16:33.317730 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:16:33.317685 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-5489467c57-vmrcr" podStartSLOduration=1.8769040000000001 podStartE2EDuration="5.31767124s" podCreationTimestamp="2026-04-20 19:16:28 +0000 UTC" firstStartedPulling="2026-04-20 19:16:28.890335279 +0000 UTC m=+513.803831309" lastFinishedPulling="2026-04-20 19:16:32.331102516 +0000 UTC 
m=+517.244598549" observedRunningTime="2026-04-20 19:16:33.315779347 +0000 UTC m=+518.229275411" watchObservedRunningTime="2026-04-20 19:16:33.31767124 +0000 UTC m=+518.231167292" Apr 20 19:17:55.589612 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:17:55.589579 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8mkl_6d80f226-e162-44b2-8b76-4d5f89d97859/ovn-acl-logging/0.log" Apr 20 19:17:55.590373 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:17:55.590239 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8mkl_6d80f226-e162-44b2-8b76-4d5f89d97859/ovn-acl-logging/0.log" Apr 20 19:17:58.318397 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:17:58.318365 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ttnkf"] Apr 20 19:17:58.320685 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:17:58.320669 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ttnkf"
Apr 20 19:17:58.328264 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:17:58.328241 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\""
Apr 20 19:17:58.329257 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:17:58.329235 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-ggnhw\""
Apr 20 19:17:58.329330 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:17:58.329300 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 20 19:17:58.329389 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:17:58.329301 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\""
Apr 20 19:17:58.329389 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:17:58.329356 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 20 19:17:58.336293 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:17:58.336273 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ttnkf"]
Apr 20 19:17:58.362417 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:17:58.362383 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc7mg\" (UniqueName: \"kubernetes.io/projected/855d7667-735f-48b5-bbf6-96a225c77a1b-kube-api-access-wc7mg\") pod \"kuadrant-console-plugin-6cb54b5c86-ttnkf\" (UID: \"855d7667-735f-48b5-bbf6-96a225c77a1b\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ttnkf"
Apr 20 19:17:58.362597 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:17:58.362439 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/855d7667-735f-48b5-bbf6-96a225c77a1b-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-ttnkf\" (UID: \"855d7667-735f-48b5-bbf6-96a225c77a1b\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ttnkf"
Apr 20 19:17:58.362597 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:17:58.362469 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/855d7667-735f-48b5-bbf6-96a225c77a1b-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-ttnkf\" (UID: \"855d7667-735f-48b5-bbf6-96a225c77a1b\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ttnkf"
Apr 20 19:17:58.463590 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:17:58.463555 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wc7mg\" (UniqueName: \"kubernetes.io/projected/855d7667-735f-48b5-bbf6-96a225c77a1b-kube-api-access-wc7mg\") pod \"kuadrant-console-plugin-6cb54b5c86-ttnkf\" (UID: \"855d7667-735f-48b5-bbf6-96a225c77a1b\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ttnkf"
Apr 20 19:17:58.463801 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:17:58.463598 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/855d7667-735f-48b5-bbf6-96a225c77a1b-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-ttnkf\" (UID: \"855d7667-735f-48b5-bbf6-96a225c77a1b\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ttnkf"
Apr 20 19:17:58.463801 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:17:58.463627 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/855d7667-735f-48b5-bbf6-96a225c77a1b-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-ttnkf\" (UID: \"855d7667-735f-48b5-bbf6-96a225c77a1b\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ttnkf"
Apr 20 19:17:58.463993 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:17:58.463814 2571 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found
Apr 20 19:17:58.463993 ip-10-0-134-63 kubenswrapper[2571]: E0420 19:17:58.463887 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/855d7667-735f-48b5-bbf6-96a225c77a1b-plugin-serving-cert podName:855d7667-735f-48b5-bbf6-96a225c77a1b nodeName:}" failed. No retries permitted until 2026-04-20 19:17:58.963863127 +0000 UTC m=+603.877359162 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/855d7667-735f-48b5-bbf6-96a225c77a1b-plugin-serving-cert") pod "kuadrant-console-plugin-6cb54b5c86-ttnkf" (UID: "855d7667-735f-48b5-bbf6-96a225c77a1b") : secret "plugin-serving-cert" not found
Apr 20 19:17:58.464255 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:17:58.464236 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/855d7667-735f-48b5-bbf6-96a225c77a1b-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-ttnkf\" (UID: \"855d7667-735f-48b5-bbf6-96a225c77a1b\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ttnkf"
Apr 20 19:17:58.478250 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:17:58.478228 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc7mg\" (UniqueName: \"kubernetes.io/projected/855d7667-735f-48b5-bbf6-96a225c77a1b-kube-api-access-wc7mg\") pod \"kuadrant-console-plugin-6cb54b5c86-ttnkf\" (UID: \"855d7667-735f-48b5-bbf6-96a225c77a1b\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ttnkf"
Apr 20 19:17:58.966508 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:17:58.966473 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/855d7667-735f-48b5-bbf6-96a225c77a1b-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-ttnkf\" (UID: \"855d7667-735f-48b5-bbf6-96a225c77a1b\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ttnkf"
Apr 20 19:17:58.968833 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:17:58.968814 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/855d7667-735f-48b5-bbf6-96a225c77a1b-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-ttnkf\" (UID: \"855d7667-735f-48b5-bbf6-96a225c77a1b\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ttnkf"
Apr 20 19:17:59.230712 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:17:59.230621 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ttnkf"
Apr 20 19:17:59.354156 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:17:59.354133 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ttnkf"]
Apr 20 19:17:59.356407 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:17:59.356379 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod855d7667_735f_48b5_bbf6_96a225c77a1b.slice/crio-0adee1cc0565aa6cd0cf4205f37e2d23c6baae9b1c1f44094a43cddf55768c05 WatchSource:0}: Error finding container 0adee1cc0565aa6cd0cf4205f37e2d23c6baae9b1c1f44094a43cddf55768c05: Status 404 returned error can't find the container with id 0adee1cc0565aa6cd0cf4205f37e2d23c6baae9b1c1f44094a43cddf55768c05
Apr 20 19:17:59.583387 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:17:59.583304 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ttnkf" event={"ID":"855d7667-735f-48b5-bbf6-96a225c77a1b","Type":"ContainerStarted","Data":"0adee1cc0565aa6cd0cf4205f37e2d23c6baae9b1c1f44094a43cddf55768c05"}
Apr 20 19:18:24.679005 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:18:24.678965 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ttnkf" event={"ID":"855d7667-735f-48b5-bbf6-96a225c77a1b","Type":"ContainerStarted","Data":"04b8a1360e47bbab1fe2da13d4132f0b81251bfd5e01d841195f555763fb126d"}
Apr 20 19:18:24.696889 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:18:24.696830 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-ttnkf" podStartSLOduration=1.557025299 podStartE2EDuration="26.696811752s" podCreationTimestamp="2026-04-20 19:17:58 +0000 UTC" firstStartedPulling="2026-04-20 19:17:59.357669042 +0000 UTC m=+604.271165075" lastFinishedPulling="2026-04-20 19:18:24.497455494 +0000 UTC m=+629.410951528" observedRunningTime="2026-04-20 19:18:24.696479166 +0000 UTC m=+629.609975217" watchObservedRunningTime="2026-04-20 19:18:24.696811752 +0000 UTC m=+629.610307806"
Apr 20 19:18:51.715927 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:18:51.715894 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"]
Apr 20 19:18:51.897820 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:18:51.897774 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"]
Apr 20 19:18:51.897991 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:18:51.897839 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"]
Apr 20 19:18:51.902162 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:18:51.902135 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-kc5m2"
Apr 20 19:18:51.904912 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:18:51.904889 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 20 19:18:51.971956 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:18:51.971880 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/b143d874-7df0-4f39-a42c-5087bed2ec68-config-file\") pod \"limitador-limitador-78c99df468-kc5m2\" (UID: \"b143d874-7df0-4f39-a42c-5087bed2ec68\") " pod="kuadrant-system/limitador-limitador-78c99df468-kc5m2"
Apr 20 19:18:51.971956 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:18:51.971912 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92zt4\" (UniqueName: \"kubernetes.io/projected/b143d874-7df0-4f39-a42c-5087bed2ec68-kube-api-access-92zt4\") pod \"limitador-limitador-78c99df468-kc5m2\" (UID: \"b143d874-7df0-4f39-a42c-5087bed2ec68\") " pod="kuadrant-system/limitador-limitador-78c99df468-kc5m2"
Apr 20 19:18:52.072822 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:18:52.072786 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/b143d874-7df0-4f39-a42c-5087bed2ec68-config-file\") pod \"limitador-limitador-78c99df468-kc5m2\" (UID: \"b143d874-7df0-4f39-a42c-5087bed2ec68\") " pod="kuadrant-system/limitador-limitador-78c99df468-kc5m2"
Apr 20 19:18:52.072822 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:18:52.072833 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-92zt4\" (UniqueName: \"kubernetes.io/projected/b143d874-7df0-4f39-a42c-5087bed2ec68-kube-api-access-92zt4\") pod \"limitador-limitador-78c99df468-kc5m2\" (UID: \"b143d874-7df0-4f39-a42c-5087bed2ec68\") " pod="kuadrant-system/limitador-limitador-78c99df468-kc5m2"
Apr 20 19:18:52.073343 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:18:52.073316 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/b143d874-7df0-4f39-a42c-5087bed2ec68-config-file\") pod \"limitador-limitador-78c99df468-kc5m2\" (UID: \"b143d874-7df0-4f39-a42c-5087bed2ec68\") " pod="kuadrant-system/limitador-limitador-78c99df468-kc5m2"
Apr 20 19:18:52.081420 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:18:52.081393 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-92zt4\" (UniqueName: \"kubernetes.io/projected/b143d874-7df0-4f39-a42c-5087bed2ec68-kube-api-access-92zt4\") pod \"limitador-limitador-78c99df468-kc5m2\" (UID: \"b143d874-7df0-4f39-a42c-5087bed2ec68\") " pod="kuadrant-system/limitador-limitador-78c99df468-kc5m2"
Apr 20 19:18:52.212783 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:18:52.212741 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-kc5m2"
Apr 20 19:18:52.318435 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:18:52.318403 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-2p8jl"]
Apr 20 19:18:52.332936 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:18:52.332901 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-2p8jl"]
Apr 20 19:18:52.333120 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:18:52.332984 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-2p8jl"
Apr 20 19:18:52.336427 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:18:52.336401 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-nb9rv\""
Apr 20 19:18:52.344124 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:18:52.338902 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"]
Apr 20 19:18:52.376188 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:18:52.376157 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tstcr\" (UniqueName: \"kubernetes.io/projected/046079cc-b7df-48ed-859b-c8d01935a587-kube-api-access-tstcr\") pod \"authorino-7498df8756-2p8jl\" (UID: \"046079cc-b7df-48ed-859b-c8d01935a587\") " pod="kuadrant-system/authorino-7498df8756-2p8jl"
Apr 20 19:18:52.477675 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:18:52.477622 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tstcr\" (UniqueName: \"kubernetes.io/projected/046079cc-b7df-48ed-859b-c8d01935a587-kube-api-access-tstcr\") pod \"authorino-7498df8756-2p8jl\" (UID: \"046079cc-b7df-48ed-859b-c8d01935a587\") " pod="kuadrant-system/authorino-7498df8756-2p8jl"
Apr 20 19:18:52.486752 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:18:52.486723 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tstcr\" (UniqueName: \"kubernetes.io/projected/046079cc-b7df-48ed-859b-c8d01935a587-kube-api-access-tstcr\") pod \"authorino-7498df8756-2p8jl\" (UID: \"046079cc-b7df-48ed-859b-c8d01935a587\") " pod="kuadrant-system/authorino-7498df8756-2p8jl"
Apr 20 19:18:52.656352 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:18:52.656269 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-2p8jl"
Apr 20 19:18:52.769542 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:18:52.769489 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-kc5m2" event={"ID":"b143d874-7df0-4f39-a42c-5087bed2ec68","Type":"ContainerStarted","Data":"25fdcb05d8711c235398afaeaaac5a84dc5adf59ab8e57e72efa9a341b2e6542"}
Apr 20 19:18:52.778071 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:18:52.778043 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-2p8jl"]
Apr 20 19:18:52.781237 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:18:52.781209 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod046079cc_b7df_48ed_859b_c8d01935a587.slice/crio-3e021dacfd22bdd4d6ca15d3c120a3bee4e6d7c9574bf648b63abfe85a329aed WatchSource:0}: Error finding container 3e021dacfd22bdd4d6ca15d3c120a3bee4e6d7c9574bf648b63abfe85a329aed: Status 404 returned error can't find the container with id 3e021dacfd22bdd4d6ca15d3c120a3bee4e6d7c9574bf648b63abfe85a329aed
Apr 20 19:18:53.775992 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:18:53.775949 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-2p8jl" event={"ID":"046079cc-b7df-48ed-859b-c8d01935a587","Type":"ContainerStarted","Data":"3e021dacfd22bdd4d6ca15d3c120a3bee4e6d7c9574bf648b63abfe85a329aed"}
Apr 20 19:18:57.794401 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:18:57.794354 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-kc5m2" event={"ID":"b143d874-7df0-4f39-a42c-5087bed2ec68","Type":"ContainerStarted","Data":"a8f232b92a4e936831097ddd93b7ee2c711d1efb5c249c7103e1dd6b1b5b70c8"}
Apr 20 19:18:57.794854 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:18:57.794463 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-kc5m2"
Apr 20 19:18:57.795849 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:18:57.795828 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-2p8jl" event={"ID":"046079cc-b7df-48ed-859b-c8d01935a587","Type":"ContainerStarted","Data":"eca4a8fa1930bac59b6759a4ff08322c517ebc86cfc0b8abc85879c66423205b"}
Apr 20 19:18:57.813853 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:18:57.813808 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-kc5m2" podStartSLOduration=2.397108819 podStartE2EDuration="6.813796525s" podCreationTimestamp="2026-04-20 19:18:51 +0000 UTC" firstStartedPulling="2026-04-20 19:18:52.347695021 +0000 UTC m=+657.261191067" lastFinishedPulling="2026-04-20 19:18:56.764382743 +0000 UTC m=+661.677878773" observedRunningTime="2026-04-20 19:18:57.811720887 +0000 UTC m=+662.725216938" watchObservedRunningTime="2026-04-20 19:18:57.813796525 +0000 UTC m=+662.727292577"
Apr 20 19:18:57.827666 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:18:57.827620 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-2p8jl" podStartSLOduration=1.844690284 podStartE2EDuration="5.827608759s" podCreationTimestamp="2026-04-20 19:18:52 +0000 UTC" firstStartedPulling="2026-04-20 19:18:52.782414424 +0000 UTC m=+657.695910454" lastFinishedPulling="2026-04-20 19:18:56.765332895 +0000 UTC m=+661.678828929" observedRunningTime="2026-04-20 19:18:57.82578196 +0000 UTC m=+662.739278059" watchObservedRunningTime="2026-04-20 19:18:57.827608759 +0000 UTC m=+662.741104810"
Apr 20 19:19:08.800042 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:19:08.800010 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-kc5m2"
Apr 20 19:19:27.736864 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:19:27.736831 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-2p8jl"]
Apr 20 19:19:27.737277 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:19:27.737024 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-2p8jl" podUID="046079cc-b7df-48ed-859b-c8d01935a587" containerName="authorino" containerID="cri-o://eca4a8fa1930bac59b6759a4ff08322c517ebc86cfc0b8abc85879c66423205b" gracePeriod=30
Apr 20 19:19:27.894337 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:19:27.894305 2571 generic.go:358] "Generic (PLEG): container finished" podID="046079cc-b7df-48ed-859b-c8d01935a587" containerID="eca4a8fa1930bac59b6759a4ff08322c517ebc86cfc0b8abc85879c66423205b" exitCode=0
Apr 20 19:19:27.894491 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:19:27.894383 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-2p8jl" event={"ID":"046079cc-b7df-48ed-859b-c8d01935a587","Type":"ContainerDied","Data":"eca4a8fa1930bac59b6759a4ff08322c517ebc86cfc0b8abc85879c66423205b"}
Apr 20 19:19:27.978887 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:19:27.978862 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-2p8jl"
Apr 20 19:19:27.979287 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:19:27.979268 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tstcr\" (UniqueName: \"kubernetes.io/projected/046079cc-b7df-48ed-859b-c8d01935a587-kube-api-access-tstcr\") pod \"046079cc-b7df-48ed-859b-c8d01935a587\" (UID: \"046079cc-b7df-48ed-859b-c8d01935a587\") "
Apr 20 19:19:27.981309 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:19:27.981277 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/046079cc-b7df-48ed-859b-c8d01935a587-kube-api-access-tstcr" (OuterVolumeSpecName: "kube-api-access-tstcr") pod "046079cc-b7df-48ed-859b-c8d01935a587" (UID: "046079cc-b7df-48ed-859b-c8d01935a587"). InnerVolumeSpecName "kube-api-access-tstcr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:19:28.080083 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:19:28.079979 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tstcr\" (UniqueName: \"kubernetes.io/projected/046079cc-b7df-48ed-859b-c8d01935a587-kube-api-access-tstcr\") on node \"ip-10-0-134-63.ec2.internal\" DevicePath \"\""
Apr 20 19:19:28.899799 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:19:28.899767 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-2p8jl"
Apr 20 19:19:28.900218 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:19:28.899762 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-2p8jl" event={"ID":"046079cc-b7df-48ed-859b-c8d01935a587","Type":"ContainerDied","Data":"3e021dacfd22bdd4d6ca15d3c120a3bee4e6d7c9574bf648b63abfe85a329aed"}
Apr 20 19:19:28.900218 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:19:28.899891 2571 scope.go:117] "RemoveContainer" containerID="eca4a8fa1930bac59b6759a4ff08322c517ebc86cfc0b8abc85879c66423205b"
Apr 20 19:19:28.932454 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:19:28.932425 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-2p8jl"]
Apr 20 19:19:28.944026 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:19:28.940737 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-2p8jl"]
Apr 20 19:19:29.694387 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:19:29.694351 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="046079cc-b7df-48ed-859b-c8d01935a587" path="/var/lib/kubelet/pods/046079cc-b7df-48ed-859b-c8d01935a587/volumes"
Apr 20 19:19:34.835792 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:19:34.835757 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"]
Apr 20 19:20:05.210183 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:20:05.210145 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"]
Apr 20 19:20:17.465964 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:20:17.465930 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"]
Apr 20 19:20:20.680935 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:20:20.680895 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-575fd8c6ff-xbgd2"]
Apr 20 19:20:20.681470 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:20:20.681451 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="046079cc-b7df-48ed-859b-c8d01935a587" containerName="authorino"
Apr 20 19:20:20.681553 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:20:20.681473 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="046079cc-b7df-48ed-859b-c8d01935a587" containerName="authorino"
Apr 20 19:20:20.681603 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:20:20.681562 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="046079cc-b7df-48ed-859b-c8d01935a587" containerName="authorino"
Apr 20 19:20:20.684498 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:20:20.684477 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-575fd8c6ff-xbgd2"
Apr 20 19:20:20.688495 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:20:20.688474 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-qbkhb\""
Apr 20 19:20:20.688596 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:20:20.688474 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\""
Apr 20 19:20:20.688596 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:20:20.688476 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\""
Apr 20 19:20:20.691717 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:20:20.691694 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-575fd8c6ff-xbgd2"]
Apr 20 19:20:20.826318 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:20:20.826278 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q5db\" (UniqueName: \"kubernetes.io/projected/94b56cd8-00ca-4fea-a35d-ce6cf1c36380-kube-api-access-7q5db\") pod \"maas-api-575fd8c6ff-xbgd2\" (UID: \"94b56cd8-00ca-4fea-a35d-ce6cf1c36380\") " pod="opendatahub/maas-api-575fd8c6ff-xbgd2"
Apr 20 19:20:20.826501 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:20:20.826340 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/94b56cd8-00ca-4fea-a35d-ce6cf1c36380-maas-api-tls\") pod \"maas-api-575fd8c6ff-xbgd2\" (UID: \"94b56cd8-00ca-4fea-a35d-ce6cf1c36380\") " pod="opendatahub/maas-api-575fd8c6ff-xbgd2"
Apr 20 19:20:20.927519 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:20:20.927467 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7q5db\" (UniqueName: \"kubernetes.io/projected/94b56cd8-00ca-4fea-a35d-ce6cf1c36380-kube-api-access-7q5db\") pod \"maas-api-575fd8c6ff-xbgd2\" (UID: \"94b56cd8-00ca-4fea-a35d-ce6cf1c36380\") " pod="opendatahub/maas-api-575fd8c6ff-xbgd2"
Apr 20 19:20:20.927725 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:20:20.927540 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/94b56cd8-00ca-4fea-a35d-ce6cf1c36380-maas-api-tls\") pod \"maas-api-575fd8c6ff-xbgd2\" (UID: \"94b56cd8-00ca-4fea-a35d-ce6cf1c36380\") " pod="opendatahub/maas-api-575fd8c6ff-xbgd2"
Apr 20 19:20:20.930129 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:20:20.930088 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/94b56cd8-00ca-4fea-a35d-ce6cf1c36380-maas-api-tls\") pod \"maas-api-575fd8c6ff-xbgd2\" (UID: \"94b56cd8-00ca-4fea-a35d-ce6cf1c36380\") " pod="opendatahub/maas-api-575fd8c6ff-xbgd2"
Apr 20 19:20:20.935786 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:20:20.935732 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q5db\" (UniqueName: \"kubernetes.io/projected/94b56cd8-00ca-4fea-a35d-ce6cf1c36380-kube-api-access-7q5db\") pod \"maas-api-575fd8c6ff-xbgd2\" (UID: \"94b56cd8-00ca-4fea-a35d-ce6cf1c36380\") " pod="opendatahub/maas-api-575fd8c6ff-xbgd2"
Apr 20 19:20:20.995987 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:20:20.995956 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-575fd8c6ff-xbgd2"
Apr 20 19:20:21.119020 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:20:21.118996 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-575fd8c6ff-xbgd2"]
Apr 20 19:20:21.121182 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:20:21.121154 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94b56cd8_00ca_4fea_a35d_ce6cf1c36380.slice/crio-c8a67d9931cace32009d173d9ac746a4f87998ba2708f23f95e2646a5dfcec94 WatchSource:0}: Error finding container c8a67d9931cace32009d173d9ac746a4f87998ba2708f23f95e2646a5dfcec94: Status 404 returned error can't find the container with id c8a67d9931cace32009d173d9ac746a4f87998ba2708f23f95e2646a5dfcec94
Apr 20 19:20:22.084315 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:20:22.084273 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-575fd8c6ff-xbgd2" event={"ID":"94b56cd8-00ca-4fea-a35d-ce6cf1c36380","Type":"ContainerStarted","Data":"c8a67d9931cace32009d173d9ac746a4f87998ba2708f23f95e2646a5dfcec94"}
Apr 20 19:20:24.093443 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:20:24.093407 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-575fd8c6ff-xbgd2" event={"ID":"94b56cd8-00ca-4fea-a35d-ce6cf1c36380","Type":"ContainerStarted","Data":"7dfd5695efc1c667e625a100360f4c39b779c962166cdaad63882821ec37ac7e"}
Apr 20 19:20:24.093810 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:20:24.093520 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-575fd8c6ff-xbgd2"
Apr 20 19:20:24.111721 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:20:24.111672 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-575fd8c6ff-xbgd2" podStartSLOduration=1.945562416 podStartE2EDuration="4.111657189s" podCreationTimestamp="2026-04-20 19:20:20 +0000 UTC" firstStartedPulling="2026-04-20 19:20:21.122292511 +0000 UTC m=+746.035788542" lastFinishedPulling="2026-04-20 19:20:23.288387286 +0000 UTC m=+748.201883315" observedRunningTime="2026-04-20 19:20:24.110357839 +0000 UTC m=+749.023853903" watchObservedRunningTime="2026-04-20 19:20:24.111657189 +0000 UTC m=+749.025153240"
Apr 20 19:20:30.105619 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:20:30.105589 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-575fd8c6ff-xbgd2"
Apr 20 19:20:30.582771 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:20:30.582735 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"]
Apr 20 19:20:41.261023 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:20:41.260985 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"]
Apr 20 19:21:10.873508 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:21:10.873472 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"]
Apr 20 19:21:16.360056 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:21:16.359973 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"]
Apr 20 19:21:38.240225 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:21:38.240192 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-546d6cc97f-jlkfd"]
Apr 20 19:21:38.242583 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:21:38.242564 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-546d6cc97f-jlkfd"
Apr 20 19:21:38.246537 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:21:38.246518 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Apr 20 19:21:38.246652 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:21:38.246553 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-nb9rv\""
Apr 20 19:21:38.252159 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:21:38.252132 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-546d6cc97f-jlkfd"]
Apr 20 19:21:38.292572 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:21:38.292545 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/db0e10bb-5519-428f-93d1-31b239f1b7fb-tls-cert\") pod \"authorino-546d6cc97f-jlkfd\" (UID: \"db0e10bb-5519-428f-93d1-31b239f1b7fb\") " pod="kuadrant-system/authorino-546d6cc97f-jlkfd"
Apr 20 19:21:38.292740 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:21:38.292608 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnsqw\" (UniqueName: \"kubernetes.io/projected/db0e10bb-5519-428f-93d1-31b239f1b7fb-kube-api-access-xnsqw\") pod \"authorino-546d6cc97f-jlkfd\" (UID: \"db0e10bb-5519-428f-93d1-31b239f1b7fb\") " pod="kuadrant-system/authorino-546d6cc97f-jlkfd"
Apr 20 19:21:38.393816 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:21:38.393783 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xnsqw\" (UniqueName: \"kubernetes.io/projected/db0e10bb-5519-428f-93d1-31b239f1b7fb-kube-api-access-xnsqw\") pod \"authorino-546d6cc97f-jlkfd\" (UID: \"db0e10bb-5519-428f-93d1-31b239f1b7fb\") " pod="kuadrant-system/authorino-546d6cc97f-jlkfd"
Apr 20 19:21:38.394018 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:21:38.393881 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/db0e10bb-5519-428f-93d1-31b239f1b7fb-tls-cert\") pod \"authorino-546d6cc97f-jlkfd\" (UID: \"db0e10bb-5519-428f-93d1-31b239f1b7fb\") " pod="kuadrant-system/authorino-546d6cc97f-jlkfd"
Apr 20 19:21:38.396392 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:21:38.396372 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/db0e10bb-5519-428f-93d1-31b239f1b7fb-tls-cert\") pod \"authorino-546d6cc97f-jlkfd\" (UID: \"db0e10bb-5519-428f-93d1-31b239f1b7fb\") " pod="kuadrant-system/authorino-546d6cc97f-jlkfd"
Apr 20 19:21:38.402229 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:21:38.402207 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnsqw\" (UniqueName: \"kubernetes.io/projected/db0e10bb-5519-428f-93d1-31b239f1b7fb-kube-api-access-xnsqw\") pod \"authorino-546d6cc97f-jlkfd\" (UID: \"db0e10bb-5519-428f-93d1-31b239f1b7fb\") " pod="kuadrant-system/authorino-546d6cc97f-jlkfd"
Apr 20 19:21:38.552766 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:21:38.552684 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-546d6cc97f-jlkfd"
Apr 20 19:21:38.687377 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:21:38.687350 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-546d6cc97f-jlkfd"]
Apr 20 19:21:38.689492 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:21:38.689468 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb0e10bb_5519_428f_93d1_31b239f1b7fb.slice/crio-7aaa2ae2ef8547cce12ceed99d40a5af2e6ac094e56d1932ca57460fde0e3f1b WatchSource:0}: Error finding container 7aaa2ae2ef8547cce12ceed99d40a5af2e6ac094e56d1932ca57460fde0e3f1b: Status 404 returned error can't find the container with id 7aaa2ae2ef8547cce12ceed99d40a5af2e6ac094e56d1932ca57460fde0e3f1b
Apr 20 19:21:38.690645 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:21:38.690630 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 19:21:39.359916 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:21:39.359888 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-546d6cc97f-jlkfd" event={"ID":"db0e10bb-5519-428f-93d1-31b239f1b7fb","Type":"ContainerStarted","Data":"7aaa2ae2ef8547cce12ceed99d40a5af2e6ac094e56d1932ca57460fde0e3f1b"}
Apr 20 19:21:40.364842 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:21:40.364808 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-546d6cc97f-jlkfd" event={"ID":"db0e10bb-5519-428f-93d1-31b239f1b7fb","Type":"ContainerStarted","Data":"3f038b5186b9e4921bac066f3aeaeb44fc35b69c64f5e5425ac26fcced2d2a50"}
Apr 20 19:21:40.386040 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:21:40.385988 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-546d6cc97f-jlkfd" podStartSLOduration=1.749894418 podStartE2EDuration="2.385975123s" podCreationTimestamp="2026-04-20 19:21:38 +0000 UTC" firstStartedPulling="2026-04-20 19:21:38.690755804 +0000 UTC m=+823.604251833" lastFinishedPulling="2026-04-20 19:21:39.326836509 +0000 UTC m=+824.240332538" observedRunningTime="2026-04-20 19:21:40.383817165 +0000 UTC m=+825.297313217" watchObservedRunningTime="2026-04-20 19:21:40.385975123 +0000 UTC m=+825.299471169"
Apr 20 19:22:05.759281 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:22:05.759245 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"]
Apr 20 19:22:16.560277 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:22:16.560240 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"]
Apr 20 19:22:25.574666 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:22:25.574621 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"]
Apr 20 19:22:36.465879 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:22:36.465840 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"]
Apr 20 19:22:44.670612 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:22:44.670528 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"]
Apr 20 19:22:55.612774 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:22:55.612747 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8mkl_6d80f226-e162-44b2-8b76-4d5f89d97859/ovn-acl-logging/0.log"
Apr 20 19:22:55.614262 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:22:55.614242 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8mkl_6d80f226-e162-44b2-8b76-4d5f89d97859/ovn-acl-logging/0.log"
Apr 20 19:22:55.857199 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:22:55.857162 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"]
Apr
20 19:23:57.374695 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:23:57.374661 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:24:12.360715 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:24:12.360635 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:24:51.968855 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:24:51.968822 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:25:10.063305 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:25:10.063272 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:25:22.463051 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:25:22.463013 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:25:38.973745 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:25:38.973705 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:26:33.369657 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:26:33.369622 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:26:41.973887 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:26:41.973851 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:26:58.762535 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:26:58.759944 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:27:07.265625 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:27:07.265577 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:27:24.070808 ip-10-0-134-63 
kubenswrapper[2571]: I0420 19:27:24.070712 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:27:32.062998 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:27:32.062959 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:27:55.637477 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:27:55.637445 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8mkl_6d80f226-e162-44b2-8b76-4d5f89d97859/ovn-acl-logging/0.log" Apr 20 19:27:55.639645 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:27:55.639620 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8mkl_6d80f226-e162-44b2-8b76-4d5f89d97859/ovn-acl-logging/0.log" Apr 20 19:28:04.563964 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:28:04.563927 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:28:13.460895 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:28:13.460861 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:28:22.361973 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:28:22.361939 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:28:30.963094 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:28:30.963048 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:28:38.469882 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:28:38.469848 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:28:56.158348 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:28:56.158264 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:29:06.360369 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:29:06.360331 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:29:53.067734 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:29:53.067698 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:30:01.559177 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:30:01.559134 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:30:11.072851 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:30:11.072816 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:30:18.765630 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:30:18.765549 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:30:28.064202 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:30:28.064161 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:30:36.559091 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:30:36.559054 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:30:45.762389 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:30:45.762354 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:30:54.263123 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:30:54.263072 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:31:03.371336 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:31:03.371304 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:31:11.860887 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:31:11.860854 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:31:20.763984 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:31:20.763944 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:31:29.870122 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:31:29.870072 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:31:38.873062 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:31:38.873026 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:31:47.166558 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:31:47.166343 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:31:56.064797 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:31:56.064765 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:32:04.566511 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:32:04.566469 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:32:14.276658 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:32:14.276624 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:32:21.160839 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:32:21.160804 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:32:55.662379 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:32:55.662342 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8mkl_6d80f226-e162-44b2-8b76-4d5f89d97859/ovn-acl-logging/0.log" Apr 20 19:32:55.668950 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:32:55.668929 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8mkl_6d80f226-e162-44b2-8b76-4d5f89d97859/ovn-acl-logging/0.log" Apr 20 19:34:39.372695 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:34:39.372661 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:34:44.363620 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:34:44.363538 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:35:11.167955 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:35:11.167921 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:35:17.970149 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:35:17.970102 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:35:26.666360 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:35:26.666325 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:35:37.457480 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:35:37.457438 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:35:46.766873 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:35:46.766837 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:35:56.962034 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:35:56.961987 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:36:06.860492 ip-10-0-134-63 kubenswrapper[2571]: 
I0420 19:36:06.860455 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:36:15.859594 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:36:15.859509 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:36:25.058824 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:36:25.058783 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:36:35.087820 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:36:35.087784 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:36:44.062173 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:36:44.062133 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:36:49.759474 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:36:49.759442 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:37:17.671793 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:37:17.671755 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:37:55.699603 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:37:55.699572 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8mkl_6d80f226-e162-44b2-8b76-4d5f89d97859/ovn-acl-logging/0.log" Apr 20 19:37:55.704849 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:37:55.704827 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8mkl_6d80f226-e162-44b2-8b76-4d5f89d97859/ovn-acl-logging/0.log" Apr 20 19:38:06.775093 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:38:06.775057 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:38:16.163415 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:38:16.163375 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:38:24.268913 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:38:24.268875 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:38:32.868770 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:38:32.868730 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:38:41.877210 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:38:41.877167 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:38:54.875132 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:38:54.875078 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:39:03.494977 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:39:03.494937 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:39:10.166020 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:39:10.165985 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:39:19.459391 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:39:19.459306 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:39:28.765925 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:39:28.765885 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:39:36.676040 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:39:36.676006 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:39:47.767590 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:39:47.767552 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:40:04.874860 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:40:04.874822 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:40:13.166664 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:40:13.166626 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:40:22.366902 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:40:22.366865 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:40:29.973769 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:40:29.973730 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:40:47.176920 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:40:47.176839 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:40:55.468993 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:40:55.468945 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:41:04.962303 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:41:04.962266 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:41:13.478226 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:41:13.478188 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:41:21.371443 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:41:21.371402 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:41:30.174165 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:41:30.174129 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:41:39.370016 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:41:39.369978 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:41:52.368137 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:41:52.368087 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:42:01.676782 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:42:01.676745 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:42:13.068677 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:42:13.068590 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:42:22.272752 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:42:22.272709 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:42:28.962298 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:42:28.962262 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:42:38.376177 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:42:38.376141 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:42:45.565980 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:42:45.565937 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:42:55.726885 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:42:55.726858 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8mkl_6d80f226-e162-44b2-8b76-4d5f89d97859/ovn-acl-logging/0.log" Apr 20 19:42:55.736596 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:42:55.736568 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8mkl_6d80f226-e162-44b2-8b76-4d5f89d97859/ovn-acl-logging/0.log" Apr 20 19:43:02.467467 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:43:02.467428 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:43:12.477447 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:43:12.477400 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:43:20.479568 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:43:20.479529 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:43:28.965834 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:43:28.965798 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:43:52.658086 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:43:52.657993 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:44:04.861834 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:04.861795 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-kc5m2"] Apr 20 19:44:06.588498 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:06.588462 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-546d6cc97f-jlkfd_db0e10bb-5519-428f-93d1-31b239f1b7fb/authorino/0.log" Apr 20 19:44:10.571538 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:10.571510 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/opendatahub_maas-api-575fd8c6ff-xbgd2_94b56cd8-00ca-4fea-a35d-ce6cf1c36380/maas-api/0.log" Apr 20 19:44:11.188732 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:11.188701 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6c77764cd6-tbjxq_4b3c7a16-430f-49f5-8a58-d40620e44a47/manager/0.log" Apr 20 19:44:12.554074 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:12.554042 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-546d6cc97f-jlkfd_db0e10bb-5519-428f-93d1-31b239f1b7fb/authorino/0.log" Apr 20 19:44:12.893473 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:12.893396 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-ttnkf_855d7667-735f-48b5-bbf6-96a225c77a1b/kuadrant-console-plugin/0.log" Apr 20 19:44:13.248172 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:13.248144 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-kc5m2_b143d874-7df0-4f39-a42c-5087bed2ec68/limitador/0.log" Apr 20 19:44:14.060741 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:14.060713 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-5489467c57-vmrcr_72b34b2e-2656-42a6-9aa4-883ab46ef18a/kube-auth-proxy/0.log" Apr 20 19:44:19.550372 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:19.550340 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xgkc7/must-gather-rr9s9"] Apr 20 19:44:19.554430 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:19.554408 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xgkc7/must-gather-rr9s9" Apr 20 19:44:19.557295 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:19.557274 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xgkc7\"/\"openshift-service-ca.crt\"" Apr 20 19:44:19.558658 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:19.558642 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-xgkc7\"/\"default-dockercfg-7j246\"" Apr 20 19:44:19.558717 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:19.558649 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xgkc7\"/\"kube-root-ca.crt\"" Apr 20 19:44:19.570013 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:19.569988 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xgkc7/must-gather-rr9s9"] Apr 20 19:44:19.629728 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:19.629695 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f01b0a72-c55f-47da-9f41-9e41140446af-must-gather-output\") pod \"must-gather-rr9s9\" (UID: \"f01b0a72-c55f-47da-9f41-9e41140446af\") " pod="openshift-must-gather-xgkc7/must-gather-rr9s9" Apr 20 19:44:19.629892 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:19.629737 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v7vp\" (UniqueName: \"kubernetes.io/projected/f01b0a72-c55f-47da-9f41-9e41140446af-kube-api-access-2v7vp\") pod \"must-gather-rr9s9\" (UID: \"f01b0a72-c55f-47da-9f41-9e41140446af\") " pod="openshift-must-gather-xgkc7/must-gather-rr9s9" Apr 20 19:44:19.730085 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:19.730046 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/f01b0a72-c55f-47da-9f41-9e41140446af-must-gather-output\") pod \"must-gather-rr9s9\" (UID: \"f01b0a72-c55f-47da-9f41-9e41140446af\") " pod="openshift-must-gather-xgkc7/must-gather-rr9s9" Apr 20 19:44:19.730276 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:19.730130 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2v7vp\" (UniqueName: \"kubernetes.io/projected/f01b0a72-c55f-47da-9f41-9e41140446af-kube-api-access-2v7vp\") pod \"must-gather-rr9s9\" (UID: \"f01b0a72-c55f-47da-9f41-9e41140446af\") " pod="openshift-must-gather-xgkc7/must-gather-rr9s9" Apr 20 19:44:19.730488 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:19.730459 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f01b0a72-c55f-47da-9f41-9e41140446af-must-gather-output\") pod \"must-gather-rr9s9\" (UID: \"f01b0a72-c55f-47da-9f41-9e41140446af\") " pod="openshift-must-gather-xgkc7/must-gather-rr9s9" Apr 20 19:44:19.737860 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:19.737837 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v7vp\" (UniqueName: \"kubernetes.io/projected/f01b0a72-c55f-47da-9f41-9e41140446af-kube-api-access-2v7vp\") pod \"must-gather-rr9s9\" (UID: \"f01b0a72-c55f-47da-9f41-9e41140446af\") " pod="openshift-must-gather-xgkc7/must-gather-rr9s9" Apr 20 19:44:19.863607 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:19.863530 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xgkc7/must-gather-rr9s9" Apr 20 19:44:19.984801 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:19.984779 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xgkc7/must-gather-rr9s9"] Apr 20 19:44:19.987557 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:44:19.987530 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf01b0a72_c55f_47da_9f41_9e41140446af.slice/crio-3c9de08aa17d5422e9f78351e924d3bc1bc1f449ed58cb6b7017802732b651de WatchSource:0}: Error finding container 3c9de08aa17d5422e9f78351e924d3bc1bc1f449ed58cb6b7017802732b651de: Status 404 returned error can't find the container with id 3c9de08aa17d5422e9f78351e924d3bc1bc1f449ed58cb6b7017802732b651de Apr 20 19:44:19.989235 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:19.989216 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 19:44:20.982170 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:20.982143 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xgkc7/must-gather-rr9s9" event={"ID":"f01b0a72-c55f-47da-9f41-9e41140446af","Type":"ContainerStarted","Data":"3c9de08aa17d5422e9f78351e924d3bc1bc1f449ed58cb6b7017802732b651de"} Apr 20 19:44:21.987810 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:21.987779 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xgkc7/must-gather-rr9s9" event={"ID":"f01b0a72-c55f-47da-9f41-9e41140446af","Type":"ContainerStarted","Data":"dd925117b42c41f75d4d3f6e82229afe5eb82ed2ebc9f7b013b4d53ba83977c2"} Apr 20 19:44:21.988278 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:21.987816 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xgkc7/must-gather-rr9s9" 
event={"ID":"f01b0a72-c55f-47da-9f41-9e41140446af","Type":"ContainerStarted","Data":"801569f0e2a2950a75a347b7ce1adeb287bb76ecae62456b2d7f070d620687a5"}
Apr 20 19:44:22.005596 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:22.005541 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xgkc7/must-gather-rr9s9" podStartSLOduration=2.126731589 podStartE2EDuration="3.005523915s" podCreationTimestamp="2026-04-20 19:44:19 +0000 UTC" firstStartedPulling="2026-04-20 19:44:19.989416352 +0000 UTC m=+2184.902912387" lastFinishedPulling="2026-04-20 19:44:20.868208665 +0000 UTC m=+2185.781704713" observedRunningTime="2026-04-20 19:44:22.002981482 +0000 UTC m=+2186.916477535" watchObservedRunningTime="2026-04-20 19:44:22.005523915 +0000 UTC m=+2186.919019967"
Apr 20 19:44:22.446245 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:22.446217 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-7dbl5_67f46846-12c3-4c76-a68a-b367e050cf51/global-pull-secret-syncer/0.log"
Apr 20 19:44:22.617071 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:22.617030 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-s26tp_b3d91ffb-3895-4e12-a9f2-4d614bd77c3e/konnectivity-agent/0.log"
Apr 20 19:44:22.677889 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:22.677848 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-63.ec2.internal_500ebaba3a201fbd9b46f7798d1de76f/haproxy/0.log"
Apr 20 19:44:26.334097 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:26.334009 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-546d6cc97f-jlkfd_db0e10bb-5519-428f-93d1-31b239f1b7fb/authorino/0.log"
Apr 20 19:44:26.416396 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:26.416226 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-ttnkf_855d7667-735f-48b5-bbf6-96a225c77a1b/kuadrant-console-plugin/0.log"
Apr 20 19:44:26.556291 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:26.556199 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-kc5m2_b143d874-7df0-4f39-a42c-5087bed2ec68/limitador/0.log"
Apr 20 19:44:27.858168 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:27.858143 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ca542e87-c56f-4eb0-b002-010f69fe987c/alertmanager/0.log"
Apr 20 19:44:27.877345 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:27.876859 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ca542e87-c56f-4eb0-b002-010f69fe987c/config-reloader/0.log"
Apr 20 19:44:27.898739 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:27.898703 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ca542e87-c56f-4eb0-b002-010f69fe987c/kube-rbac-proxy-web/0.log"
Apr 20 19:44:27.916968 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:27.916937 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ca542e87-c56f-4eb0-b002-010f69fe987c/kube-rbac-proxy/0.log"
Apr 20 19:44:27.936162 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:27.936127 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ca542e87-c56f-4eb0-b002-010f69fe987c/kube-rbac-proxy-metric/0.log"
Apr 20 19:44:27.954183 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:27.954153 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ca542e87-c56f-4eb0-b002-010f69fe987c/prom-label-proxy/0.log"
Apr 20 19:44:27.971313 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:27.971284 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ca542e87-c56f-4eb0-b002-010f69fe987c/init-config-reloader/0.log"
Apr 20 19:44:28.009468 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:28.009425 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-vnjmw_89999f13-84c4-4b08-a865-c986f4298fcb/cluster-monitoring-operator/0.log"
Apr 20 19:44:28.033423 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:28.033388 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-2sxrw_1bf31385-2f55-475c-849b-f4ceb3ac894b/kube-state-metrics/0.log"
Apr 20 19:44:28.055146 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:28.055100 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-2sxrw_1bf31385-2f55-475c-849b-f4ceb3ac894b/kube-rbac-proxy-main/0.log"
Apr 20 19:44:28.090449 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:28.090415 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-2sxrw_1bf31385-2f55-475c-849b-f4ceb3ac894b/kube-rbac-proxy-self/0.log"
Apr 20 19:44:28.126221 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:28.126093 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-8659f8b99f-8q6qm_4e7f8208-d67f-45c0-92a0-ee2371caa82c/metrics-server/0.log"
Apr 20 19:44:28.147560 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:28.147533 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-hffb4_5065e954-b3de-4d40-9814-b493ad51776a/monitoring-plugin/0.log"
Apr 20 19:44:28.291477 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:28.291442 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zglz9_817e7306-d6d2-47c0-895a-14dc5408d1d6/node-exporter/0.log"
Apr 20 19:44:28.313318 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:28.313293 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zglz9_817e7306-d6d2-47c0-895a-14dc5408d1d6/kube-rbac-proxy/0.log"
Apr 20 19:44:28.330825 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:28.330799 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zglz9_817e7306-d6d2-47c0-895a-14dc5408d1d6/init-textfile/0.log"
Apr 20 19:44:28.581852 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:28.581818 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-9nwvw_c1364ffd-2a9a-4f02-941b-d6c600aaebd2/prometheus-operator/0.log"
Apr 20 19:44:28.595189 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:28.595160 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-9nwvw_c1364ffd-2a9a-4f02-941b-d6c600aaebd2/kube-rbac-proxy/0.log"
Apr 20 19:44:28.616560 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:28.616530 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-99t2q_c3efa91a-e01d-4412-96f9-53621efefdd3/prometheus-operator-admission-webhook/0.log"
Apr 20 19:44:28.666468 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:28.666440 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-647b786c68-6sfx4_b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa/telemeter-client/0.log"
Apr 20 19:44:28.685779 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:28.685752 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-647b786c68-6sfx4_b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa/reload/0.log"
Apr 20 19:44:28.712521 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:28.712482 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-647b786c68-6sfx4_b1f0d9d8-e1cc-48b2-9889-5e8f539cf4fa/kube-rbac-proxy/0.log"
Apr 20 19:44:30.122572 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:30.122543 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-4hs99_d4955879-53ea-4f74-96dd-eae67dbbe030/networking-console-plugin/0.log"
Apr 20 19:44:31.150209 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:31.150177 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-6p5l6_b945cabb-3adb-4bb8-868d-60d1d730cf72/download-server/0.log"
Apr 20 19:44:31.308126 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:31.308025 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xgkc7/perf-node-gather-daemonset-dd97c"]
Apr 20 19:44:31.313925 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:31.313902 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-dd97c"
Apr 20 19:44:31.318032 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:31.318009 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xgkc7/perf-node-gather-daemonset-dd97c"]
Apr 20 19:44:31.451977 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:31.451940 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6fed2868-a063-4035-9b6e-558c98ba03e1-sys\") pod \"perf-node-gather-daemonset-dd97c\" (UID: \"6fed2868-a063-4035-9b6e-558c98ba03e1\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-dd97c"
Apr 20 19:44:31.452331 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:31.452302 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6fed2868-a063-4035-9b6e-558c98ba03e1-lib-modules\") pod \"perf-node-gather-daemonset-dd97c\" (UID: \"6fed2868-a063-4035-9b6e-558c98ba03e1\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-dd97c"
Apr 20 19:44:31.452508 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:31.452492 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6fed2868-a063-4035-9b6e-558c98ba03e1-podres\") pod \"perf-node-gather-daemonset-dd97c\" (UID: \"6fed2868-a063-4035-9b6e-558c98ba03e1\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-dd97c"
Apr 20 19:44:31.452688 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:31.452672 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22hsz\" (UniqueName: \"kubernetes.io/projected/6fed2868-a063-4035-9b6e-558c98ba03e1-kube-api-access-22hsz\") pod \"perf-node-gather-daemonset-dd97c\" (UID: \"6fed2868-a063-4035-9b6e-558c98ba03e1\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-dd97c"
Apr 20 19:44:31.452893 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:31.452876 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6fed2868-a063-4035-9b6e-558c98ba03e1-proc\") pod \"perf-node-gather-daemonset-dd97c\" (UID: \"6fed2868-a063-4035-9b6e-558c98ba03e1\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-dd97c"
Apr 20 19:44:31.554520 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:31.554480 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6fed2868-a063-4035-9b6e-558c98ba03e1-lib-modules\") pod \"perf-node-gather-daemonset-dd97c\" (UID: \"6fed2868-a063-4035-9b6e-558c98ba03e1\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-dd97c"
Apr 20 19:44:31.554782 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:31.554759 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6fed2868-a063-4035-9b6e-558c98ba03e1-podres\") pod \"perf-node-gather-daemonset-dd97c\" (UID: \"6fed2868-a063-4035-9b6e-558c98ba03e1\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-dd97c"
Apr 20 19:44:31.555285 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:31.554889 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6fed2868-a063-4035-9b6e-558c98ba03e1-podres\") pod \"perf-node-gather-daemonset-dd97c\" (UID: \"6fed2868-a063-4035-9b6e-558c98ba03e1\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-dd97c"
Apr 20 19:44:31.555285 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:31.554911 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-22hsz\" (UniqueName: \"kubernetes.io/projected/6fed2868-a063-4035-9b6e-558c98ba03e1-kube-api-access-22hsz\") pod \"perf-node-gather-daemonset-dd97c\" (UID: \"6fed2868-a063-4035-9b6e-558c98ba03e1\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-dd97c"
Apr 20 19:44:31.555285 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:31.554699 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6fed2868-a063-4035-9b6e-558c98ba03e1-lib-modules\") pod \"perf-node-gather-daemonset-dd97c\" (UID: \"6fed2868-a063-4035-9b6e-558c98ba03e1\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-dd97c"
Apr 20 19:44:31.555285 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:31.555002 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6fed2868-a063-4035-9b6e-558c98ba03e1-proc\") pod \"perf-node-gather-daemonset-dd97c\" (UID: \"6fed2868-a063-4035-9b6e-558c98ba03e1\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-dd97c"
Apr 20 19:44:31.555285 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:31.555078 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6fed2868-a063-4035-9b6e-558c98ba03e1-sys\") pod \"perf-node-gather-daemonset-dd97c\" (UID: \"6fed2868-a063-4035-9b6e-558c98ba03e1\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-dd97c"
Apr 20 19:44:31.555285 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:31.555208 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6fed2868-a063-4035-9b6e-558c98ba03e1-sys\") pod \"perf-node-gather-daemonset-dd97c\" (UID: \"6fed2868-a063-4035-9b6e-558c98ba03e1\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-dd97c"
Apr 20 19:44:31.555285 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:31.555264 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6fed2868-a063-4035-9b6e-558c98ba03e1-proc\") pod \"perf-node-gather-daemonset-dd97c\" (UID: \"6fed2868-a063-4035-9b6e-558c98ba03e1\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-dd97c"
Apr 20 19:44:31.563348 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:31.563297 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-22hsz\" (UniqueName: \"kubernetes.io/projected/6fed2868-a063-4035-9b6e-558c98ba03e1-kube-api-access-22hsz\") pod \"perf-node-gather-daemonset-dd97c\" (UID: \"6fed2868-a063-4035-9b6e-558c98ba03e1\") " pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-dd97c"
Apr 20 19:44:31.625999 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:31.625964 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-dd97c"
Apr 20 19:44:31.637686 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:31.637609 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-ppjpz_f12f4cad-52fc-4e03-8ed1-7fe6e9596b7c/volume-data-source-validator/0.log"
Apr 20 19:44:31.794807 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:31.794593 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xgkc7/perf-node-gather-daemonset-dd97c"]
Apr 20 19:44:31.797907 ip-10-0-134-63 kubenswrapper[2571]: W0420 19:44:31.797877 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6fed2868_a063_4035_9b6e_558c98ba03e1.slice/crio-da7211be831a905795f9c20528df3b6f49b231b068b58bfcadd182e51e2539a0 WatchSource:0}: Error finding container da7211be831a905795f9c20528df3b6f49b231b068b58bfcadd182e51e2539a0: Status 404 returned error can't find the container with id da7211be831a905795f9c20528df3b6f49b231b068b58bfcadd182e51e2539a0
Apr 20 19:44:32.060935 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:32.060846 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-dd97c" event={"ID":"6fed2868-a063-4035-9b6e-558c98ba03e1","Type":"ContainerStarted","Data":"9648fdf5b42f43046bff33bb37ba0d8a7eff1a0ccb625a7d93b75c8db50f47f2"}
Apr 20 19:44:32.060935 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:32.060888 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-dd97c" event={"ID":"6fed2868-a063-4035-9b6e-558c98ba03e1","Type":"ContainerStarted","Data":"da7211be831a905795f9c20528df3b6f49b231b068b58bfcadd182e51e2539a0"}
Apr 20 19:44:32.061182 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:32.060965 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-dd97c"
Apr 20 19:44:32.076989 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:32.076942 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-dd97c" podStartSLOduration=1.076926983 podStartE2EDuration="1.076926983s" podCreationTimestamp="2026-04-20 19:44:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:44:32.075130252 +0000 UTC m=+2196.988626305" watchObservedRunningTime="2026-04-20 19:44:32.076926983 +0000 UTC m=+2196.990423073"
Apr 20 19:44:32.425906 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:32.425881 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7kxxk_4937ec4e-cf01-4c28-aa08-5a8a7d722ff9/dns/0.log"
Apr 20 19:44:32.448928 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:32.448900 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7kxxk_4937ec4e-cf01-4c28-aa08-5a8a7d722ff9/kube-rbac-proxy/0.log"
Apr 20 19:44:32.550882 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:32.550847 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-k97mn_79e9d9f2-39b0-4a1d-9e82-98ceca85b745/dns-node-resolver/0.log"
Apr 20 19:44:33.125012 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:33.124956 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jq7q4_6afa7914-d2d0-4077-b293-73873dd1cb3e/node-ca/0.log"
Apr 20 19:44:34.070442 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:34.070414 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-5489467c57-vmrcr_72b34b2e-2656-42a6-9aa4-883ab46ef18a/kube-auth-proxy/0.log"
Apr 20 19:44:34.620953 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:34.620920 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-mwrz2_f543f92c-946e-4d67-ad5f-48be19c49af7/serve-healthcheck-canary/0.log"
Apr 20 19:44:35.058855 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:35.058811 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-slmw8_4cf6648c-0d3c-45e5-877a-2ef099dd0653/insights-operator/0.log"
Apr 20 19:44:35.060167 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:35.060146 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-slmw8_4cf6648c-0d3c-45e5-877a-2ef099dd0653/insights-operator/1.log"
Apr 20 19:44:35.124689 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:35.124662 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wzddr_53834f86-30e8-4fea-bc1c-05405718e03a/kube-rbac-proxy/0.log"
Apr 20 19:44:35.140041 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:35.140015 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wzddr_53834f86-30e8-4fea-bc1c-05405718e03a/exporter/0.log"
Apr 20 19:44:35.156972 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:35.156945 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wzddr_53834f86-30e8-4fea-bc1c-05405718e03a/extractor/0.log"
Apr 20 19:44:37.061737 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:37.061712 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-575fd8c6ff-xbgd2_94b56cd8-00ca-4fea-a35d-ce6cf1c36380/maas-api/0.log"
Apr 20 19:44:37.258051 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:37.258023 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6c77764cd6-tbjxq_4b3c7a16-430f-49f5-8a58-d40620e44a47/manager/0.log"
Apr 20 19:44:38.075845 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:38.075812 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-xgkc7/perf-node-gather-daemonset-dd97c"
Apr 20 19:44:38.311888 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:38.311855 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-fcf468c68-dtxpz_6f9ab233-6ae1-4487-bd11-4cd0174f8e78/manager/0.log"
Apr 20 19:44:42.548242 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:42.548214 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-44chr_37fe3341-d630-483c-b4c5-f4ce6bedf386/migrator/0.log"
Apr 20 19:44:42.563534 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:42.563507 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-44chr_37fe3341-d630-483c-b4c5-f4ce6bedf386/graceful-termination/0.log"
Apr 20 19:44:43.738599 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:43.738566 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kj7t2_b0d252f8-fb8c-486c-aaa7-1197a96b6cfd/kube-multus-additional-cni-plugins/0.log"
Apr 20 19:44:43.755322 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:43.755300 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kj7t2_b0d252f8-fb8c-486c-aaa7-1197a96b6cfd/egress-router-binary-copy/0.log"
Apr 20 19:44:43.771388 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:43.771361 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kj7t2_b0d252f8-fb8c-486c-aaa7-1197a96b6cfd/cni-plugins/0.log"
Apr 20 19:44:43.786813 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:43.786793 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kj7t2_b0d252f8-fb8c-486c-aaa7-1197a96b6cfd/bond-cni-plugin/0.log"
Apr 20 19:44:43.801916 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:43.801896 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kj7t2_b0d252f8-fb8c-486c-aaa7-1197a96b6cfd/routeoverride-cni/0.log"
Apr 20 19:44:43.817922 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:43.817902 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kj7t2_b0d252f8-fb8c-486c-aaa7-1197a96b6cfd/whereabouts-cni-bincopy/0.log"
Apr 20 19:44:43.832968 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:43.832941 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kj7t2_b0d252f8-fb8c-486c-aaa7-1197a96b6cfd/whereabouts-cni/0.log"
Apr 20 19:44:44.181582 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:44.181554 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pzns4_125940c7-e0e5-43b5-a864-11cb9ced899b/kube-multus/0.log"
Apr 20 19:44:44.229647 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:44.229619 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-ff5pq_cc31ab16-2946-4d9a-baee-c02a00b73aae/network-metrics-daemon/0.log"
Apr 20 19:44:44.245398 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:44.245376 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-ff5pq_cc31ab16-2946-4d9a-baee-c02a00b73aae/kube-rbac-proxy/0.log"
Apr 20 19:44:45.546013 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:45.545987 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8mkl_6d80f226-e162-44b2-8b76-4d5f89d97859/ovn-controller/0.log"
Apr 20 19:44:45.558733 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:45.558697 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8mkl_6d80f226-e162-44b2-8b76-4d5f89d97859/ovn-acl-logging/0.log"
Apr 20 19:44:45.568347 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:45.568325 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8mkl_6d80f226-e162-44b2-8b76-4d5f89d97859/ovn-acl-logging/1.log"
Apr 20 19:44:45.582898 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:45.582877 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8mkl_6d80f226-e162-44b2-8b76-4d5f89d97859/kube-rbac-proxy-node/0.log"
Apr 20 19:44:45.598571 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:45.598546 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8mkl_6d80f226-e162-44b2-8b76-4d5f89d97859/kube-rbac-proxy-ovn-metrics/0.log"
Apr 20 19:44:45.611188 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:45.611164 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8mkl_6d80f226-e162-44b2-8b76-4d5f89d97859/northd/0.log"
Apr 20 19:44:45.626534 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:45.626511 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8mkl_6d80f226-e162-44b2-8b76-4d5f89d97859/nbdb/0.log"
Apr 20 19:44:45.642934 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:45.642914 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8mkl_6d80f226-e162-44b2-8b76-4d5f89d97859/sbdb/0.log"
Apr 20 19:44:45.748483 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:45.748460 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j8mkl_6d80f226-e162-44b2-8b76-4d5f89d97859/ovnkube-controller/0.log"
Apr 20 19:44:46.811219 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:46.811182 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-2qjll_838f1a66-5c9b-4d0f-90b0-35a81df852d0/network-check-target-container/0.log"
Apr 20 19:44:47.742486 ip-10-0-134-63 kubenswrapper[2571]: I0420 19:44:47.742418 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-4d89g_2fb0b1f7-00e8-4e8a-bab0-49d08606cf30/iptables-alerter/0.log"