Apr 17 11:13:46.701636 ip-10-0-135-81 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 11:13:46.701647 ip-10-0-135-81 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 11:13:46.701656 ip-10-0-135-81 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 11:13:46.701959 ip-10-0-135-81 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 11:13:56.843107 ip-10-0-135-81 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 11:13:56.843124 ip-10-0-135-81 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot b4c4bb4a00624121b10df1b2b8a79ece --
Apr 17 11:16:10.714700 ip-10-0-135-81 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 11:16:11.105502 ip-10-0-135-81 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 11:16:11.105502 ip-10-0-135-81 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 11:16:11.105502 ip-10-0-135-81 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 11:16:11.105502 ip-10-0-135-81 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 11:16:11.105502 ip-10-0-135-81 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 11:16:11.107547 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.107409 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 11:16:11.112422 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112405 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:16:11.112422 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112421 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:16:11.112490 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112425 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:16:11.112490 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112428 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:16:11.112490 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112431 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:16:11.112490 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112434 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:16:11.112490 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112437 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:16:11.112490 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112440 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:16:11.112490 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112442 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:16:11.112490 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112445 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:16:11.112490 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112448 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:16:11.112490 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112451 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:16:11.112490 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112461 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:16:11.112490 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112465 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:16:11.112490 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112468 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:16:11.112490 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112471 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:16:11.112490 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112473 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:16:11.112490 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112476 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:16:11.112490 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112479 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:16:11.112490 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112481 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:16:11.112490 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112483 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:16:11.112490 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112486 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:16:11.112963 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112489 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:16:11.112963 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112491 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:16:11.112963 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112494 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:16:11.112963 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112497 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:16:11.112963 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112500 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:16:11.112963 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112503 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:16:11.112963 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112506 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:16:11.112963 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112508 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:16:11.112963 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112511 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:16:11.112963 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112513 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:16:11.112963 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112516 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:16:11.112963 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112518 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:16:11.112963 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112520 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:16:11.112963 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112525 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:16:11.112963 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112528 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:16:11.112963 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112531 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:16:11.112963 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112534 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:16:11.112963 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112537 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:16:11.112963 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112540 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:16:11.113474 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112542 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:16:11.113474 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112545 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:16:11.113474 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112547 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:16:11.113474 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112550 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:16:11.113474 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112552 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:16:11.113474 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112555 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:16:11.113474 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112557 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:16:11.113474 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112560 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:16:11.113474 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112562 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:16:11.113474 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112565 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:16:11.113474 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112567 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:16:11.113474 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112570 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:16:11.113474 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112572 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:16:11.113474 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112575 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:16:11.113474 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112578 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:16:11.113474 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112581 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:16:11.113474 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112584 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:16:11.113474 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112587 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:16:11.113474 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112590 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:16:11.113474 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112593 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:16:11.113968 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112595 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:16:11.113968 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112598 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:16:11.113968 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112601 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:16:11.113968 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112603 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:16:11.113968 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112606 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:16:11.113968 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112608 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:16:11.113968 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112610 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:16:11.113968 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112613 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:16:11.113968 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112616 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:16:11.113968 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112619 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:16:11.113968 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112622 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:16:11.113968 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112625 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:16:11.113968 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112629 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:16:11.113968 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112631 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:16:11.113968 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112634 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:16:11.113968 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112637 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:16:11.113968 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112639 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:16:11.113968 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112643 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:16:11.113968 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112646 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:16:11.114523 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112648 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:16:11.114523 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112652 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:16:11.114523 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112656 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:16:11.114523 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112659 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:16:11.114523 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112662 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:16:11.114523 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.112665 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:16:11.114523 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113659 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:16:11.114523 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113667 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:16:11.114523 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113670 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:16:11.114523 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113673 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:16:11.114523 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113676 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:16:11.114523 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113679 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:16:11.114523 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113682 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:16:11.114523 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113684 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:16:11.114523 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113687 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:16:11.114523 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113690 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:16:11.114523 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113693 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:16:11.114523 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113695 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:16:11.114523 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113698 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:16:11.114986 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113701 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:16:11.114986 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113704 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:16:11.114986 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113706 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:16:11.114986 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113708 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:16:11.114986 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113711 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:16:11.114986 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113714 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:16:11.114986 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113717 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:16:11.114986 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113720 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:16:11.114986 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113722 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:16:11.114986 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113725 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:16:11.114986 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113727 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:16:11.114986 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113730 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:16:11.114986 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113732 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:16:11.114986 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113735 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:16:11.114986 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113737 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:16:11.114986 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113739 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:16:11.114986 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113742 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:16:11.114986 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113745 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:16:11.114986 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113747 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:16:11.115480 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113750 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:16:11.115480 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113752 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:16:11.115480 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113755 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:16:11.115480 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113757 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:16:11.115480 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113760 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:16:11.115480 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113763 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:16:11.115480 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113765 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:16:11.115480 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113768 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:16:11.115480 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113771 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:16:11.115480 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113774 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:16:11.115480 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113777 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:16:11.115480 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113779 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:16:11.115480 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113781 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:16:11.115480 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113784 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:16:11.115480 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113787 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:16:11.115480 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113789 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:16:11.115480 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113792 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:16:11.115480 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113794 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:16:11.115480 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113797 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:16:11.115480 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113805 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:16:11.116011 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113808 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:16:11.116011 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113810 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:16:11.116011 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113812 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:16:11.116011 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113815 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:16:11.116011 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113818 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:16:11.116011 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113820 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:16:11.116011 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113823 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:16:11.116011 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113825 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:16:11.116011 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113828 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:16:11.116011 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113832 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:16:11.116011 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113836 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:16:11.116011 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113839 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:16:11.116011 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113841 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:16:11.116011 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113844 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:16:11.116011 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113846 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:16:11.116011 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113850 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:16:11.116011 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113854 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:16:11.116011 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113857 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:16:11.116011 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113862 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:16:11.116512 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113866 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:16:11.116512 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113871 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:16:11.116512 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113875 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:16:11.116512 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113878 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:16:11.116512 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113880 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:16:11.116512 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113884 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:16:11.116512 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113887 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:16:11.116512 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113889 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:16:11.116512 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113892 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:16:11.116512 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113895 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:16:11.116512 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113898 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:16:11.116512 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113901 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:16:11.116512 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113904 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:16:11.116512 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113912 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:16:11.116512 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.113915 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:16:11.116512 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114000 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 11:16:11.116512 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114017 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 11:16:11.116512 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114026 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 11:16:11.116512 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114033 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 11:16:11.116512 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114054 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 11:16:11.116512 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114059 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 11:16:11.117025 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114064 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 11:16:11.117025 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114069 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 11:16:11.117025 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114073 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 11:16:11.117025 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114076 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 11:16:11.117025 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114083 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 11:16:11.117025 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114087 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 11:16:11.117025 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114090 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 11:16:11.117025 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114093 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 17 11:16:11.117025 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114096 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 11:16:11.117025 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114099 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 17 11:16:11.117025 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114102 2575 flags.go:64] FLAG: --cloud-config=""
Apr 17 11:16:11.117025 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114105 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 11:16:11.117025 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114108 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 11:16:11.117025 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114115 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 17 11:16:11.117025 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114118 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 11:16:11.117025 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114123 2575 flags.go:64] FLAG: --config-dir=""
Apr 17 11:16:11.117025 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114126 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 11:16:11.117025 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114130 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 11:16:11.117025 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114134 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 11:16:11.117025 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114137 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 11:16:11.117025 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114140 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 11:16:11.117025 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114143 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 11:16:11.117025 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114146 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 11:16:11.117025 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114149 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 11:16:11.117622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114152 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 11:16:11.117622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114161 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 11:16:11.117622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114165 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 11:16:11.117622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114170 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 11:16:11.117622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114173 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 11:16:11.117622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114176 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 11:16:11.117622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114179 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 11:16:11.117622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114182 2575 flags.go:64] FLAG: --enable-server="true"
Apr 17 11:16:11.117622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114185 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 11:16:11.117622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114192 2575 flags.go:64] FLAG: --event-burst="100"
Apr 17 11:16:11.117622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114195 2575 flags.go:64] FLAG: --event-qps="50"
Apr 17 11:16:11.117622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114198 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 11:16:11.117622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114203 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 11:16:11.117622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114206 2575 flags.go:64] FLAG: --eviction-hard=""
Apr 17 11:16:11.117622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114210 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 11:16:11.117622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114213 2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 11:16:11.117622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114216 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 11:16:11.117622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114219 2575 flags.go:64] FLAG: --eviction-soft=""
Apr 17 11:16:11.117622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114222 2575 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 11:16:11.117622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114225 2575 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 11:16:11.117622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114228 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 11:16:11.117622 ip-10-0-135-81 kubenswrapper[2575]:
I0417 11:16:11.114231 2575 flags.go:64] FLAG: --experimental-mounter-path="" Apr 17 11:16:11.117622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114234 2575 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 11:16:11.117622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114237 2575 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 11:16:11.117622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114240 2575 flags.go:64] FLAG: --feature-gates="" Apr 17 11:16:11.118235 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114243 2575 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 11:16:11.118235 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114246 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 11:16:11.118235 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114249 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 11:16:11.118235 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114253 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 11:16:11.118235 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114256 2575 flags.go:64] FLAG: --healthz-port="10248" Apr 17 11:16:11.118235 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114259 2575 flags.go:64] FLAG: --help="false" Apr 17 11:16:11.118235 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114261 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-135-81.ec2.internal" Apr 17 11:16:11.118235 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114268 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 11:16:11.118235 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114271 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 11:16:11.118235 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114275 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 11:16:11.118235 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114278 2575 flags.go:64] FLAG: 
--image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 11:16:11.118235 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114282 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 11:16:11.118235 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114285 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 11:16:11.118235 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114287 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 11:16:11.118235 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114290 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 11:16:11.118235 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114293 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 11:16:11.118235 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114296 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 11:16:11.118235 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114299 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 11:16:11.118235 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114302 2575 flags.go:64] FLAG: --kube-reserved="" Apr 17 11:16:11.118235 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114306 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 11:16:11.118235 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114309 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 11:16:11.118235 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114312 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 11:16:11.118235 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114315 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 11:16:11.118235 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114318 2575 flags.go:64] FLAG: --lock-file="" Apr 17 11:16:11.118837 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114321 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" 
Apr 17 11:16:11.118837 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114324 2575 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 17 11:16:11.118837 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114328 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 17 11:16:11.118837 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114334 2575 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 17 11:16:11.118837 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114348 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 17 11:16:11.118837 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114351 2575 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 17 11:16:11.118837 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114354 2575 flags.go:64] FLAG: --logging-format="text"
Apr 17 11:16:11.118837 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114357 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 17 11:16:11.118837 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114361 2575 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 17 11:16:11.118837 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114363 2575 flags.go:64] FLAG: --manifest-url=""
Apr 17 11:16:11.118837 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114367 2575 flags.go:64] FLAG: --manifest-url-header=""
Apr 17 11:16:11.118837 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114371 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 17 11:16:11.118837 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114374 2575 flags.go:64] FLAG: --max-open-files="1000000"
Apr 17 11:16:11.118837 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114378 2575 flags.go:64] FLAG: --max-pods="110"
Apr 17 11:16:11.118837 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114382 2575 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 17 11:16:11.118837 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114386 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 17 11:16:11.118837 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114389 2575 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 17 11:16:11.118837 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114392 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 17 11:16:11.118837 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114396 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 17 11:16:11.118837 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114399 2575 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 17 11:16:11.118837 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114402 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 17 11:16:11.118837 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114411 2575 flags.go:64] FLAG: --node-status-max-images="50"
Apr 17 11:16:11.118837 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114414 2575 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 17 11:16:11.118837 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114417 2575 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 17 11:16:11.118837 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114420 2575 flags.go:64] FLAG: --pod-cidr=""
Apr 17 11:16:11.119445 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114423 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 17 11:16:11.119445 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114430 2575 flags.go:64] FLAG: --pod-manifest-path=""
Apr 17 11:16:11.119445 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114437 2575 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 17 11:16:11.119445 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114441 2575 flags.go:64] FLAG: --pods-per-core="0"
Apr 17 11:16:11.119445 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114444 2575 flags.go:64] FLAG: --port="10250"
Apr 17 11:16:11.119445 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114447 2575 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 17 11:16:11.119445 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114450 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0940adcf5f8462394"
Apr 17 11:16:11.119445 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114454 2575 flags.go:64] FLAG: --qos-reserved=""
Apr 17 11:16:11.119445 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114457 2575 flags.go:64] FLAG: --read-only-port="10255"
Apr 17 11:16:11.119445 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114461 2575 flags.go:64] FLAG: --register-node="true"
Apr 17 11:16:11.119445 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114463 2575 flags.go:64] FLAG: --register-schedulable="true"
Apr 17 11:16:11.119445 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114467 2575 flags.go:64] FLAG: --register-with-taints=""
Apr 17 11:16:11.119445 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114470 2575 flags.go:64] FLAG: --registry-burst="10"
Apr 17 11:16:11.119445 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114473 2575 flags.go:64] FLAG: --registry-qps="5"
Apr 17 11:16:11.119445 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114476 2575 flags.go:64] FLAG: --reserved-cpus=""
Apr 17 11:16:11.119445 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114479 2575 flags.go:64] FLAG: --reserved-memory=""
Apr 17 11:16:11.119445 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114483 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 17 11:16:11.119445 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114486 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 17 11:16:11.119445 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114489 2575 flags.go:64] FLAG: --rotate-certificates="false"
Apr 17 11:16:11.119445 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114492 2575 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 17 11:16:11.119445 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114495 2575 flags.go:64] FLAG: --runonce="false"
Apr 17 11:16:11.119445 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114498 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 17 11:16:11.119445 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114501 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 17 11:16:11.119445 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114505 2575 flags.go:64] FLAG: --seccomp-default="false"
Apr 17 11:16:11.119445 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114508 2575 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 17 11:16:11.120052 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114511 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 17 11:16:11.120052 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114514 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 17 11:16:11.120052 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114518 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 17 11:16:11.120052 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114521 2575 flags.go:64] FLAG: --storage-driver-password="root"
Apr 17 11:16:11.120052 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114524 2575 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 17 11:16:11.120052 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114527 2575 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 17 11:16:11.120052 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114530 2575 flags.go:64] FLAG: --storage-driver-user="root"
Apr 17 11:16:11.120052 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114534 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 17 11:16:11.120052 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114537 2575 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 17 11:16:11.120052 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114541 2575 flags.go:64] FLAG: --system-cgroups=""
Apr 17 11:16:11.120052 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114544 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 17 11:16:11.120052 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114550 2575 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 17 11:16:11.120052 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114552 2575 flags.go:64] FLAG: --tls-cert-file=""
Apr 17 11:16:11.120052 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114556 2575 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 17 11:16:11.120052 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114560 2575 flags.go:64] FLAG: --tls-min-version=""
Apr 17 11:16:11.120052 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114563 2575 flags.go:64] FLAG: --tls-private-key-file=""
Apr 17 11:16:11.120052 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114565 2575 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 17 11:16:11.120052 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114568 2575 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 17 11:16:11.120052 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114571 2575 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 17 11:16:11.120052 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114574 2575 flags.go:64] FLAG: --v="2"
Apr 17 11:16:11.120052 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114579 2575 flags.go:64] FLAG: --version="false"
Apr 17 11:16:11.120052 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114583 2575 flags.go:64] FLAG: --vmodule=""
Apr 17 11:16:11.120052 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114588 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 17 11:16:11.120052 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.114591 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 17 11:16:11.120052 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114694 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:16:11.120711 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114698 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:16:11.120711 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114701 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:16:11.120711 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114704 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:16:11.120711 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114707 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:16:11.120711 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114710 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:16:11.120711 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114714 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:16:11.120711 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114717 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:16:11.120711 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114720 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:16:11.120711 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114723 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:16:11.120711 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114726 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:16:11.120711 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114729 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:16:11.120711 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114731 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:16:11.120711 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114734 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:16:11.120711 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114736 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:16:11.120711 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114739 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:16:11.120711 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114741 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:16:11.120711 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114745 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:16:11.120711 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114748 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:16:11.120711 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114751 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:16:11.120711 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114753 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:16:11.121285 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114756 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:16:11.121285 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114758 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:16:11.121285 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114761 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:16:11.121285 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114764 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:16:11.121285 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114766 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:16:11.121285 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114769 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:16:11.121285 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114771 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:16:11.121285 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114774 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:16:11.121285 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114776 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:16:11.121285 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114779 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:16:11.121285 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114781 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:16:11.121285 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114784 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:16:11.121285 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114786 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:16:11.121285 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114789 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:16:11.121285 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114792 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:16:11.121285 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114795 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:16:11.121285 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114797 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:16:11.121285 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114800 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:16:11.121285 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114802 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:16:11.121285 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114805 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:16:11.121802 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114808 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:16:11.121802 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114810 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:16:11.121802 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114814 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:16:11.121802 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114816 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:16:11.121802 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114819 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:16:11.121802 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114821 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:16:11.121802 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114824 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:16:11.121802 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114826 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:16:11.121802 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114830 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:16:11.121802 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114832 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:16:11.121802 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114835 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:16:11.121802 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114838 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:16:11.121802 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114840 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:16:11.121802 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114842 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:16:11.121802 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114845 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:16:11.121802 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114847 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:16:11.121802 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114850 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:16:11.121802 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114852 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:16:11.121802 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114855 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:16:11.121802 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114858 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:16:11.122306 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114860 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:16:11.122306 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114863 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:16:11.122306 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114865 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:16:11.122306 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114868 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:16:11.122306 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114871 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:16:11.122306 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114874 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:16:11.122306 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114876 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:16:11.122306 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114879 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:16:11.122306 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114882 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:16:11.122306 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114885 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:16:11.122306 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114887 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:16:11.122306 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114890 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:16:11.122306 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114892 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:16:11.122306 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114895 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:16:11.122306 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114897 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:16:11.122306 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114902 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:16:11.122306 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114905 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:16:11.122306 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114908 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:16:11.122306 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114912 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:16:11.122788 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114915 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:16:11.122788 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114921 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:16:11.122788 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114924 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:16:11.122788 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114926 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:16:11.122788 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114929 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:16:11.122788 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.114931 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:16:11.122788 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.115498 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 11:16:11.122788 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.122107 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 11:16:11.122788 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.122128 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 11:16:11.122788 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122178 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:16:11.122788 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122184 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:16:11.122788 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122187 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:16:11.122788 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122190 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:16:11.122788 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122192 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:16:11.122788 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122195 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:16:11.123174 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122198 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:16:11.123174 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122200 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:16:11.123174 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122203 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:16:11.123174 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122206 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:16:11.123174 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122209 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:16:11.123174 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122211 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:16:11.123174 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122214 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:16:11.123174 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122217 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:16:11.123174 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122219 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:16:11.123174 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122222 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:16:11.123174 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122224 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:16:11.123174 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122229 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:16:11.123174 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122238 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:16:11.123174 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122241 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:16:11.123174 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122244 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:16:11.123174 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122246 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:16:11.123174 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122249 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:16:11.123174 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122252 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:16:11.123174 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122255 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:16:11.123663 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122257 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:16:11.123663 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122260 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:16:11.123663 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122262 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:16:11.123663 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122265 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:16:11.123663 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122267 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:16:11.123663 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122270 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:16:11.123663 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122279 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:16:11.123663 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122282 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:16:11.123663 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122285 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:16:11.123663 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122288 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:16:11.123663 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122291 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:16:11.123663 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122293 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:16:11.123663 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122296 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:16:11.123663 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122298 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:16:11.123663 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122301 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:16:11.123663 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122303 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:16:11.123663 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122306 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:16:11.123663 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122309 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:16:11.123663 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122311 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:16:11.123663 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122314 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:16:11.124153 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122317 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:16:11.124153 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122320 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:16:11.124153 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122322 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:16:11.124153 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122325 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:16:11.124153 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122327 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:16:11.124153 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122331 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:16:11.124153 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122348 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:16:11.124153 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122358 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:16:11.124153 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122361 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:16:11.124153 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122364 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:16:11.124153 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122367 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:16:11.124153 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122369 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:16:11.124153 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122372 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:16:11.124153 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122375 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:16:11.124153 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122378 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:16:11.124153 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122380 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:16:11.124153 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122383 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:16:11.124153 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122385 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:16:11.124153 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122389 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:16:11.124719 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122397 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:16:11.124719 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122400 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:16:11.124719 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122402 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:16:11.124719 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122405 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:16:11.124719 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122408 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:16:11.124719 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122410 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:16:11.124719 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122413 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:16:11.124719 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122415 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:16:11.124719 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122418 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:16:11.124719 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122420 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:16:11.124719 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122422 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:16:11.124719 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122425 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:16:11.124719 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122427 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:16:11.124719 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122430 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:16:11.124719 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122432 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:16:11.124719 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122435 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:16:11.124719 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122437 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:16:11.124719 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122439 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:16:11.124719 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122442 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:16:11.124719 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122444 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:16:11.125236 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122447 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:16:11.125236 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122449 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:16:11.125236 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.122454 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 11:16:11.125236 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122578 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 11:16:11.125236 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122584 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 11:16:11.125236 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122587 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 11:16:11.125236 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122590 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 11:16:11.125236 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122592 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 11:16:11.125236 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122595 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 11:16:11.125236 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122598 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 11:16:11.125236 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122600 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 11:16:11.125236 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122603 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 11:16:11.125236 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122606 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 11:16:11.125236 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122615 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 11:16:11.125236 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122619 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 11:16:11.125236 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122621 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 11:16:11.125665 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122624 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 11:16:11.125665 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122626 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 11:16:11.125665 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122629 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 11:16:11.125665 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122631 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 11:16:11.125665 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122634 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 11:16:11.125665 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122637 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 11:16:11.125665 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122639 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 11:16:11.125665 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122641 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 11:16:11.125665 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122644 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 11:16:11.125665 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122646 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 11:16:11.125665 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122649 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 11:16:11.125665 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122651 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 11:16:11.125665 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122654 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 11:16:11.125665 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122656 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 11:16:11.125665 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122659 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 11:16:11.125665 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122662 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 11:16:11.125665 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122664 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 11:16:11.125665 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122667 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 11:16:11.125665 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122669 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 11:16:11.125665 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122671 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 11:16:11.126178 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122674 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 11:16:11.126178 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122676 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 11:16:11.126178 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122679 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 11:16:11.126178 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122682 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 11:16:11.126178 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122684 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 11:16:11.126178 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122687 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 11:16:11.126178 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122689 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 11:16:11.126178 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122692 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 11:16:11.126178 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122694 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 11:16:11.126178 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122697 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 11:16:11.126178 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122705 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 11:16:11.126178 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122707 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 11:16:11.126178 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122710 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 11:16:11.126178 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122712 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 11:16:11.126178 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122715 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 11:16:11.126178 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122717 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 11:16:11.126178 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122720 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 11:16:11.126178 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122722 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 11:16:11.126178 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122725 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 11:16:11.126736 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122727 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 11:16:11.126736 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122729 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 11:16:11.126736 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122733 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 11:16:11.126736 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122737 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 11:16:11.126736 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122739 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 11:16:11.126736 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122741 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 11:16:11.126736 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122744 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 11:16:11.126736 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122746 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 11:16:11.126736 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122749 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 11:16:11.126736 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122751 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 11:16:11.126736 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122755 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 11:16:11.126736 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122758 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 11:16:11.126736 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122761 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 11:16:11.126736 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122764 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 11:16:11.126736 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122766 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 11:16:11.126736 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122769 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 11:16:11.126736 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122772 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 11:16:11.126736 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122775 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 11:16:11.126736 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122777 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 11:16:11.127248 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122780 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 11:16:11.127248 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122782 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 11:16:11.127248 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122785 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 11:16:11.127248 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122787 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 11:16:11.127248 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122790 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 11:16:11.127248 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122800 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 11:16:11.127248 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122803 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 11:16:11.127248 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122806 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 11:16:11.127248 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122809 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 11:16:11.127248 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122811 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 11:16:11.127248 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122814 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 11:16:11.127248 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122816 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 11:16:11.127248 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122819 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 11:16:11.127248 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122821 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 11:16:11.127248 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:11.122824 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 11:16:11.127669 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.122829 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 11:16:11.127669 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.123543 2575 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 11:16:11.127669 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.125668 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 11:16:11.127669 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.126496 2575 server.go:1019] "Starting client certificate rotation"
Apr 17 11:16:11.127669 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.126610 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 11:16:11.127669 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.126673 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 11:16:11.147416 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.147393 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 11:16:11.150220 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.150187 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 11:16:11.161589 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.161567 2575 log.go:25] "Validated CRI v1 runtime API"
Apr 17 11:16:11.166720 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.166703 2575 log.go:25] "Validated CRI v1 image API"
Apr 17 11:16:11.168051 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.168032 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 11:16:11.172534 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.172507 2575 fs.go:135] Filesystem UUIDs: map[239cc1fa-15d3-4835-94bc-fdba56af3a93:/dev/nvme0n1p4 29b28f12-8ece-4248-ab9a-141b5099a640:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 17 11:16:11.172618 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.172534 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 11:16:11.178333 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.178199 2575 manager.go:217] Machine: {Timestamp:2026-04-17 11:16:11.176591517 +0000 UTC m=+0.346398573 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3128808 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2a669a138af4c7ffb7e3952193654a SystemUUID:ec2a669a-138a-f4c7-ffb7-e3952193654a BootID:b4c4bb4a-0062-4121-b10d-f1b2b8a79ece Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:2e:f9:b2:7d:95 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:2e:f9:b2:7d:95 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:6e:db:2b:f8:77:a6 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 11:16:11.178333 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.178326 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 11:16:11.178448 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.178425 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 17 11:16:11.179370 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.179329 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 17 11:16:11.179532 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.179371 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-81.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 11:16:11.179578 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.179540 2575 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 11:16:11.179578 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.179548 2575 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 11:16:11.179578 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.179561 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 11:16:11.180205 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.180195 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 11:16:11.181104 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.181094 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 17 11:16:11.181214 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.181206 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 11:16:11.183776 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.183766 2575 kubelet.go:491] "Attempting to sync node with API server" Apr 17 11:16:11.183814 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.183780 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 11:16:11.183814 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.183796 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 11:16:11.183814 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.183810 2575 kubelet.go:397] "Adding apiserver pod source" Apr 17 11:16:11.183905 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.183820 2575 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 17 11:16:11.184988 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.184975 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 11:16:11.185081 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.184995 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 11:16:11.187590 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.187556 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 11:16:11.188585 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.188566 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 11:16:11.189538 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.189524 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 11:16:11.190813 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.190800 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 11:16:11.190853 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.190823 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 11:16:11.190853 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.190831 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 11:16:11.190853 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.190836 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 11:16:11.190853 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.190842 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 11:16:11.190853 ip-10-0-135-81 kubenswrapper[2575]: 
I0417 11:16:11.190848 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 11:16:11.190853 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.190853 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 11:16:11.191017 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.190859 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 11:16:11.191017 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.190877 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 11:16:11.191017 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.190885 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 11:16:11.191017 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.190895 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 11:16:11.191017 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.190904 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 11:16:11.191597 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.191587 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 11:16:11.191627 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.191597 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 11:16:11.195204 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.195190 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 11:16:11.195265 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.195226 2575 server.go:1295] "Started kubelet" Apr 17 11:16:11.195396 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.195327 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 11:16:11.195396 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.195359 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 
11:16:11.195510 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.195427 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 11:16:11.195999 ip-10-0-135-81 systemd[1]: Started Kubernetes Kubelet. Apr 17 11:16:11.197406 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.197384 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 11:16:11.199848 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.199834 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 17 11:16:11.202692 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.202670 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-135-81.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 11:16:11.202835 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:11.202756 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 11:16:11.202939 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:11.202773 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-81.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 11:16:11.203694 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.203677 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 11:16:11.204166 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.204151 2575 fs_resource_analyzer.go:67] "Starting FS 
ResourceAnalyzer" Apr 17 11:16:11.205212 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.205190 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 11:16:11.205212 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.205191 2575 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 11:16:11.205389 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.205220 2575 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 11:16:11.205389 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:11.205246 2575 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 11:16:11.205487 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.205388 2575 reconstruct.go:97] "Volume reconstruction finished" Apr 17 11:16:11.205487 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.205402 2575 reconciler.go:26] "Reconciler: start to sync state" Apr 17 11:16:11.205487 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.205405 2575 factory.go:55] Registering systemd factory Apr 17 11:16:11.205487 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.205429 2575 factory.go:223] Registration of the systemd container factory successfully Apr 17 11:16:11.205696 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.205668 2575 factory.go:153] Registering CRI-O factory Apr 17 11:16:11.205696 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.205682 2575 factory.go:223] Registration of the crio container factory successfully Apr 17 11:16:11.205921 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.205727 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 11:16:11.205921 ip-10-0-135-81 kubenswrapper[2575]: I0417 
11:16:11.205749 2575 factory.go:103] Registering Raw factory Apr 17 11:16:11.205921 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.205760 2575 manager.go:1196] Started watching for new ooms in manager Apr 17 11:16:11.206124 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:11.205329 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-81.ec2.internal\" not found" Apr 17 11:16:11.206551 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.206535 2575 manager.go:319] Starting recovery of all containers Apr 17 11:16:11.213008 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:11.212984 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 11:16:11.213404 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:11.213379 2575 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-135-81.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 17 11:16:11.213485 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.213392 2575 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 17 11:16:11.213876 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:11.213002 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-81.ec2.internal.18a720bc505d9f2c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-81.ec2.internal,UID:ip-10-0-135-81.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-135-81.ec2.internal,},FirstTimestamp:2026-04-17 11:16:11.195203372 +0000 UTC m=+0.365010428,LastTimestamp:2026-04-17 11:16:11.195203372 +0000 UTC m=+0.365010428,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-81.ec2.internal,}" Apr 17 11:16:11.216528 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.216353 2575 manager.go:324] Recovery completed Apr 17 11:16:11.221494 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.221481 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:11.223876 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.223862 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-81.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:11.223934 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.223889 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-81.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:11.223934 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.223900 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-81.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:11.224401 ip-10-0-135-81 
kubenswrapper[2575]: I0417 11:16:11.224387 2575 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 11:16:11.224401 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.224400 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 11:16:11.224522 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.224418 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 17 11:16:11.227029 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:11.226962 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-81.ec2.internal.18a720bc521325e3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-81.ec2.internal,UID:ip-10-0-135-81.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-135-81.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-135-81.ec2.internal,},FirstTimestamp:2026-04-17 11:16:11.223877091 +0000 UTC m=+0.393684146,LastTimestamp:2026-04-17 11:16:11.223877091 +0000 UTC m=+0.393684146,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-81.ec2.internal,}" Apr 17 11:16:11.227695 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.227682 2575 policy_none.go:49] "None policy: Start" Apr 17 11:16:11.227758 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.227699 2575 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 11:16:11.227758 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.227709 2575 state_mem.go:35] "Initializing new in-memory state store" Apr 17 11:16:11.241982 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:11.241913 2575 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-81.ec2.internal.18a720bc521369af default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-81.ec2.internal,UID:ip-10-0-135-81.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-135-81.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-135-81.ec2.internal,},FirstTimestamp:2026-04-17 11:16:11.223894447 +0000 UTC m=+0.393701502,LastTimestamp:2026-04-17 11:16:11.223894447 +0000 UTC m=+0.393701502,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-81.ec2.internal,}" Apr 17 11:16:11.252993 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:11.252924 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-81.ec2.internal.18a720bc5213910b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-81.ec2.internal,UID:ip-10-0-135-81.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-135-81.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-135-81.ec2.internal,},FirstTimestamp:2026-04-17 11:16:11.223904523 +0000 UTC m=+0.393711579,LastTimestamp:2026-04-17 11:16:11.223904523 +0000 UTC m=+0.393711579,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-81.ec2.internal,}" Apr 17 11:16:11.266003 
ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.265985 2575 manager.go:341] "Starting Device Plugin manager" Apr 17 11:16:11.290683 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:11.266033 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 11:16:11.290683 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.266047 2575 server.go:85] "Starting device plugin registration server" Apr 17 11:16:11.290683 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.266300 2575 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 11:16:11.290683 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.266310 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 11:16:11.290683 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.266409 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 11:16:11.290683 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.266494 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 11:16:11.290683 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.266503 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 11:16:11.290683 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:11.267025 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 17 11:16:11.290683 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:11.267059 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-81.ec2.internal\" not found" Apr 17 11:16:11.290683 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.277781 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-nj9vg" Apr 17 11:16:11.290683 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:11.278655 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-81.ec2.internal.18a720bc54b8b828 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-81.ec2.internal,UID:ip-10-0-135-81.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:ip-10-0-135-81.ec2.internal,},FirstTimestamp:2026-04-17 11:16:11.268282408 +0000 UTC m=+0.438089450,LastTimestamp:2026-04-17 11:16:11.268282408 +0000 UTC m=+0.438089450,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-81.ec2.internal,}" Apr 17 11:16:11.290683 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.288256 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-nj9vg" Apr 17 11:16:11.337458 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.337428 2575 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 17 11:16:11.337458 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.337463 2575 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 11:16:11.337639 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.337484 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 17 11:16:11.337639 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.337491 2575 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 11:16:11.337639 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:11.337523 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 11:16:11.349376 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.349351 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:16:11.366761 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.366695 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:11.367733 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.367705 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-81.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:11.367857 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.367743 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-81.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:11.367857 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.367761 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-81.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:11.367857 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.367785 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-81.ec2.internal" Apr 17 11:16:11.377571 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.377548 2575 
kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-81.ec2.internal" Apr 17 11:16:11.377629 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:11.377575 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-81.ec2.internal\": node \"ip-10-0-135-81.ec2.internal\" not found" Apr 17 11:16:11.403451 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:11.403422 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-81.ec2.internal\" not found" Apr 17 11:16:11.438248 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.438217 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-81.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-135-81.ec2.internal"] Apr 17 11:16:11.438401 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.438299 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:11.441902 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.441881 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-81.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:11.441998 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.441916 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-81.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:11.441998 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.441927 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-81.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:11.444256 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.444242 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:11.444415 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.444400 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-81.ec2.internal" Apr 17 11:16:11.444459 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.444433 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 11:16:11.445129 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.445112 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-81.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:11.445209 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.445145 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-81.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:11.445209 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.445155 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-81.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:11.445209 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.445116 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-81.ec2.internal" event="NodeHasSufficientMemory" Apr 17 11:16:11.445296 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.445219 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-81.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 11:16:11.445296 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.445232 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-81.ec2.internal" event="NodeHasSufficientPID" Apr 17 11:16:11.447467 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.447451 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-81.ec2.internal"
Apr 17 11:16:11.447575 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.447474 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 11:16:11.448204 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.448188 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-81.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 11:16:11.448277 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.448215 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-81.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 11:16:11.448277 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.448227 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-81.ec2.internal" event="NodeHasSufficientPID"
Apr 17 11:16:11.472179 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:11.472154 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-81.ec2.internal\" not found" node="ip-10-0-135-81.ec2.internal"
Apr 17 11:16:11.476621 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:11.476604 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-81.ec2.internal\" not found" node="ip-10-0-135-81.ec2.internal"
Apr 17 11:16:11.503778 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:11.503753 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-81.ec2.internal\" not found"
Apr 17 11:16:11.507068 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.507050 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/912cde7e6552ffe80bcec67e0b4a9d7a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-81.ec2.internal\" (UID: \"912cde7e6552ffe80bcec67e0b4a9d7a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-81.ec2.internal"
Apr 17 11:16:11.507115 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.507078 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/912cde7e6552ffe80bcec67e0b4a9d7a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-81.ec2.internal\" (UID: \"912cde7e6552ffe80bcec67e0b4a9d7a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-81.ec2.internal"
Apr 17 11:16:11.507115 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.507097 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/94b45018193276143fc473b8ac9f9152-config\") pod \"kube-apiserver-proxy-ip-10-0-135-81.ec2.internal\" (UID: \"94b45018193276143fc473b8ac9f9152\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-81.ec2.internal"
Apr 17 11:16:11.604451 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:11.604418 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-81.ec2.internal\" not found"
Apr 17 11:16:11.607707 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.607687 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/912cde7e6552ffe80bcec67e0b4a9d7a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-81.ec2.internal\" (UID: \"912cde7e6552ffe80bcec67e0b4a9d7a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-81.ec2.internal"
Apr 17 11:16:11.607751 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.607720 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/912cde7e6552ffe80bcec67e0b4a9d7a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-81.ec2.internal\" (UID: \"912cde7e6552ffe80bcec67e0b4a9d7a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-81.ec2.internal"
Apr 17 11:16:11.607751 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.607744 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/94b45018193276143fc473b8ac9f9152-config\") pod \"kube-apiserver-proxy-ip-10-0-135-81.ec2.internal\" (UID: \"94b45018193276143fc473b8ac9f9152\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-81.ec2.internal"
Apr 17 11:16:11.607814 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.607800 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/94b45018193276143fc473b8ac9f9152-config\") pod \"kube-apiserver-proxy-ip-10-0-135-81.ec2.internal\" (UID: \"94b45018193276143fc473b8ac9f9152\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-81.ec2.internal"
Apr 17 11:16:11.607814 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.607806 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/912cde7e6552ffe80bcec67e0b4a9d7a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-81.ec2.internal\" (UID: \"912cde7e6552ffe80bcec67e0b4a9d7a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-81.ec2.internal"
Apr 17 11:16:11.607871 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.607800 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/912cde7e6552ffe80bcec67e0b4a9d7a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-81.ec2.internal\" (UID: \"912cde7e6552ffe80bcec67e0b4a9d7a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-81.ec2.internal"
Apr 17 11:16:11.705105 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:11.705035 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-81.ec2.internal\" not found"
Apr 17 11:16:11.774482 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.774440 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-81.ec2.internal"
Apr 17 11:16:11.779253 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:11.779232 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-81.ec2.internal"
Apr 17 11:16:11.805928 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:11.805889 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-81.ec2.internal\" not found"
Apr 17 11:16:11.906399 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:11.906356 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-81.ec2.internal\" not found"
Apr 17 11:16:12.006794 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:12.006702 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-81.ec2.internal\" not found"
Apr 17 11:16:12.107258 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:12.107227 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-81.ec2.internal\" not found"
Apr 17 11:16:12.120857 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:12.120837 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 11:16:12.126813 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:12.126787 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 11:16:12.126934 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:12.126897 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 11:16:12.126934 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:12.126903 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 11:16:12.203951 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:12.203921 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 11:16:12.207463 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:12.207431 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-81.ec2.internal\" not found"
Apr 17 11:16:12.220887 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:12.220863 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 11:16:12.226660 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:12.226642 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 11:16:12.255207 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:12.255179 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-qn8mc"
Apr 17 11:16:12.264442 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:12.264355 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-qn8mc"
Apr 17 11:16:12.290500 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:12.290459 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 11:11:11 +0000 UTC" deadline="2027-12-21 18:01:44.608748328 +0000 UTC"
Apr 17 11:16:12.290500 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:12.290495 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14718h45m32.318256768s"
Apr 17 11:16:12.307793 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:12.307767 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-81.ec2.internal\" not found"
Apr 17 11:16:12.370469 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:12.370431 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod912cde7e6552ffe80bcec67e0b4a9d7a.slice/crio-33aa7d32766dfbd0f88ba23763408dcf4791fd30d58ebe2936e848adb83447fa WatchSource:0}: Error finding container 33aa7d32766dfbd0f88ba23763408dcf4791fd30d58ebe2936e848adb83447fa: Status 404 returned error can't find the container with id 33aa7d32766dfbd0f88ba23763408dcf4791fd30d58ebe2936e848adb83447fa
Apr 17 11:16:12.370811 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:12.370788 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94b45018193276143fc473b8ac9f9152.slice/crio-c40a0ce33cb4b77f7cd4a2da04fba7a352e98785eab81e54703cf0cf5dd91f27 WatchSource:0}: Error finding container c40a0ce33cb4b77f7cd4a2da04fba7a352e98785eab81e54703cf0cf5dd91f27: Status 404 returned error can't find the container with id c40a0ce33cb4b77f7cd4a2da04fba7a352e98785eab81e54703cf0cf5dd91f27
Apr 17 11:16:12.374883 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:12.374863 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 11:16:12.408873 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:12.408841 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-81.ec2.internal\" not found"
Apr 17 11:16:12.509366 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:12.509321 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-81.ec2.internal\" not found"
Apr 17 11:16:12.609827 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:12.609799 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-81.ec2.internal\" not found"
Apr 17 11:16:12.710594 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:12.710552 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-81.ec2.internal\" not found"
Apr 17 11:16:12.780573 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:12.780542 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 11:16:12.805643 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:12.805606 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-81.ec2.internal"
Apr 17 11:16:12.819253 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:12.818965 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 11:16:12.819253 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:12.819121 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-81.ec2.internal"
Apr 17 11:16:12.829224 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:12.829188 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 11:16:13.185317 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.185275 2575 apiserver.go:52] "Watching apiserver"
Apr 17 11:16:13.191083 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.191057 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 11:16:13.191493 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.191466 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-tvn9d","openshift-network-operator/iptables-alerter-s9dkv","kube-system/konnectivity-agent-t7kcp","kube-system/kube-apiserver-proxy-ip-10-0-135-81.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8htm9","openshift-cluster-node-tuning-operator/tuned-f28wr","openshift-image-registry/node-ca-pms2x","openshift-multus/multus-xcwz5","openshift-network-diagnostics/network-check-target-cvt8g","openshift-ovn-kubernetes/ovnkube-node-bvdth","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-81.ec2.internal","openshift-multus/multus-additional-cni-plugins-4w8lp"]
Apr 17 11:16:13.196214 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.196184 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvn9d"
Apr 17 11:16:13.196353 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:13.196268 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvn9d" podUID="d129dd20-5a5b-4718-8eca-2f10184defe9"
Apr 17 11:16:13.198251 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.198218 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-s9dkv"
Apr 17 11:16:13.198487 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.198469 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-t7kcp"
Apr 17 11:16:13.200482 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.200459 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8htm9"
Apr 17 11:16:13.201457 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.201440 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-ct2lr\""
Apr 17 11:16:13.201738 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.201722 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 11:16:13.201940 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.201916 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 11:16:13.202264 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.202244 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 11:16:13.202264 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.202276 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-gxl5d\""
Apr 17 11:16:13.202264 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.202290 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 11:16:13.202512 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.202322 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 11:16:13.202606 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.202573 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 11:16:13.202768 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.202739 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 11:16:13.202768 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.202751 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-h22lq\""
Apr 17 11:16:13.203222 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.203205 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 11:16:13.204933 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.204903 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-f28wr"
Apr 17 11:16:13.205022 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.204991 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-pms2x"
Apr 17 11:16:13.206919 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.206903 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 11:16:13.207249 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.207229 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xcwz5"
Apr 17 11:16:13.207962 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.207944 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 11:16:13.208081 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.208054 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 11:16:13.208081 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.208065 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 11:16:13.208222 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.208133 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-pb47b\""
Apr 17 11:16:13.208494 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.208474 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-8pfgg\""
Apr 17 11:16:13.208594 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.208540 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 11:16:13.209110 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.209094 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 11:16:13.209461 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.209443 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 11:16:13.209549 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.209472 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cvt8g"
Apr 17 11:16:13.209605 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:13.209547 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cvt8g" podUID="055f933b-358b-4058-aa0d-4808293e4549"
Apr 17 11:16:13.209660 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.209632 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-jx9lb\""
Apr 17 11:16:13.211183 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.211154 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 11:16:13.211271 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.211255 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 11:16:13.211937 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.211916 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bvdth"
Apr 17 11:16:13.213944 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.213914 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 17 11:16:13.214347 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.214321 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 17 11:16:13.214455 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.214397 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4w8lp"
Apr 17 11:16:13.215404 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.215148 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/12cc2cb3-5799-482e-9110-985521fc52eb-var-lib-kubelet\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr"
Apr 17 11:16:13.215404 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.215196 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-hostroot\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5"
Apr 17 11:16:13.215404 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.215256 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 17 11:16:13.215404 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.215303 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6c231c4f-2413-4ea2-8e5e-1935448131ad-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8htm9\" (UID: \"6c231c4f-2413-4ea2-8e5e-1935448131ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8htm9"
Apr 17 11:16:13.215404 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.215352 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6c231c4f-2413-4ea2-8e5e-1935448131ad-socket-dir\") pod \"aws-ebs-csi-driver-node-8htm9\" (UID: \"6c231c4f-2413-4ea2-8e5e-1935448131ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8htm9"
Apr 17 11:16:13.215404 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.215384 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6c231c4f-2413-4ea2-8e5e-1935448131ad-device-dir\") pod \"aws-ebs-csi-driver-node-8htm9\" (UID: \"6c231c4f-2413-4ea2-8e5e-1935448131ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8htm9"
Apr 17 11:16:13.215404 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.215405 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-4fnqs\""
Apr 17 11:16:13.215762 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.215413 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/12cc2cb3-5799-482e-9110-985521fc52eb-etc-systemd\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr"
Apr 17 11:16:13.215762 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.215454 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 17 11:16:13.215762 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.215603 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/12cc2cb3-5799-482e-9110-985521fc52eb-etc-tuned\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr"
Apr 17 11:16:13.215762 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.215636 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/12cc2cb3-5799-482e-9110-985521fc52eb-tmp\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr"
Apr 17 11:16:13.215940 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.215814 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-cnibin\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5"
Apr 17 11:16:13.215940 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.215865 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-host-run-k8s-cni-cncf-io\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5"
Apr 17 11:16:13.215940 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.215900 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9ffw\" (UniqueName: \"kubernetes.io/projected/541df6c8-cc79-40aa-9b07-2084d74abdbd-kube-api-access-l9ffw\") pod \"iptables-alerter-s9dkv\" (UID: \"541df6c8-cc79-40aa-9b07-2084d74abdbd\") " pod="openshift-network-operator/iptables-alerter-s9dkv"
Apr 17 11:16:13.215940 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.215929 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 17 11:16:13.216108 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.216083 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6c231c4f-2413-4ea2-8e5e-1935448131ad-registration-dir\") pod \"aws-ebs-csi-driver-node-8htm9\" (UID: \"6c231c4f-2413-4ea2-8e5e-1935448131ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8htm9"
Apr 17 11:16:13.216159 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.216122 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/12cc2cb3-5799-482e-9110-985521fc52eb-run\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr"
Apr 17 11:16:13.216159 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.216148 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/12cc2cb3-5799-482e-9110-985521fc52eb-sys\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr"
Apr 17 11:16:13.216249 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.216178 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 17 11:16:13.216366 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.216183 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6zkp\" (UniqueName: \"kubernetes.io/projected/12cc2cb3-5799-482e-9110-985521fc52eb-kube-api-access-v6zkp\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr"
Apr 17 11:16:13.216434 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.216411 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-os-release\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5"
Apr 17 11:16:13.216668 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.216484 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-host-run-netns\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5"
Apr 17 11:16:13.216770 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.216679 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8bc04c84-5033-4522-bdb4-8ff714571072-host\") pod \"node-ca-pms2x\" (UID: \"8bc04c84-5033-4522-bdb4-8ff714571072\") " pod="openshift-image-registry/node-ca-pms2x"
Apr 17 11:16:13.216770 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.216727 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/12cc2cb3-5799-482e-9110-985521fc52eb-etc-modprobe-d\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr"
Apr 17 11:16:13.216882 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.216765 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/12cc2cb3-5799-482e-9110-985521fc52eb-etc-kubernetes\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr"
Apr 17 11:16:13.216935 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.216880 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-cni-binary-copy\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5"
Apr 17 11:16:13.216935 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.216889 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 11:16:13.217028 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.216932 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-multus-conf-dir\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5"
Apr 17 11:16:13.217028 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.216956 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-etc-kubernetes\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5"
Apr 17 11:16:13.217028 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.216961 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 11:16:13.217028 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.216984 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gd9t\" (UniqueName: \"kubernetes.io/projected/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-kube-api-access-9gd9t\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5"
Apr 17 11:16:13.217204 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.217049 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/541df6c8-cc79-40aa-9b07-2084d74abdbd-iptables-alerter-script\") pod \"iptables-alerter-s9dkv\" (UID: \"541df6c8-cc79-40aa-9b07-2084d74abdbd\") " pod="openshift-network-operator/iptables-alerter-s9dkv"
Apr 17 11:16:13.217204 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.217076 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/12cc2cb3-5799-482e-9110-985521fc52eb-host\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr"
Apr 17 11:16:13.217204 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.217107 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-system-cni-dir\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5"
Apr 17 11:16:13.217204 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.217179 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-host-run-multus-certs\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5"
Apr 17 11:16:13.217394 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.217228 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d129dd20-5a5b-4718-8eca-2f10184defe9-metrics-certs\") pod \"network-metrics-daemon-tvn9d\" (UID: \"d129dd20-5a5b-4718-8eca-2f10184defe9\") " pod="openshift-multus/network-metrics-daemon-tvn9d"
Apr 17 11:16:13.217394 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.217278 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f6877aa1-7b7b-4025-b8e9-f96f15bfab82-konnectivity-ca\") pod \"konnectivity-agent-t7kcp\" (UID: \"f6877aa1-7b7b-4025-b8e9-f96f15bfab82\") " pod="kube-system/konnectivity-agent-t7kcp"
Apr 17 11:16:13.217394 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.217311 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6c231c4f-2413-4ea2-8e5e-1935448131ad-etc-selinux\") pod \"aws-ebs-csi-driver-node-8htm9\" (UID: \"6c231c4f-2413-4ea2-8e5e-1935448131ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8htm9"
Apr 17 11:16:13.217394 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.217376 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/12cc2cb3-5799-482e-9110-985521fc52eb-etc-sysctl-conf\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr"
Apr 17 11:16:13.217563 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.217439 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/12cc2cb3-5799-482e-9110-985521fc52eb-lib-modules\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr"
Apr 17 11:16:13.217563 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.217473 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-multus-socket-dir-parent\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5"
Apr 17 11:16:13.217563 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.217507 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-host-var-lib-cni-bin\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5"
Apr 17 11:16:13.217563 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.217538 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-multus-daemon-config\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5"
Apr 17 11:16:13.217738 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.217612 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8bc04c84-5033-4522-bdb4-8ff714571072-serviceca\") pod \"node-ca-pms2x\" (UID: \"8bc04c84-5033-4522-bdb4-8ff714571072\") " pod="openshift-image-registry/node-ca-pms2x"
Apr 17 11:16:13.217738 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.217663 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/541df6c8-cc79-40aa-9b07-2084d74abdbd-host-slash\") pod \"iptables-alerter-s9dkv\" (UID: \"541df6c8-cc79-40aa-9b07-2084d74abdbd\") " pod="openshift-network-operator/iptables-alerter-s9dkv"
Apr 17 11:16:13.217738 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.217709 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2qfl\" (UniqueName: \"kubernetes.io/projected/6c231c4f-2413-4ea2-8e5e-1935448131ad-kube-api-access-d2qfl\") pod \"aws-ebs-csi-driver-node-8htm9\" (UID: \"6c231c4f-2413-4ea2-8e5e-1935448131ad\") "
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8htm9" Apr 17 11:16:13.217873 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.217753 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/12cc2cb3-5799-482e-9110-985521fc52eb-etc-sysconfig\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr" Apr 17 11:16:13.217873 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.217783 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/12cc2cb3-5799-482e-9110-985521fc52eb-etc-sysctl-d\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr" Apr 17 11:16:13.217873 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.217824 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-host-var-lib-cni-multus\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5" Apr 17 11:16:13.217873 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.217858 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f6877aa1-7b7b-4025-b8e9-f96f15bfab82-agent-certs\") pod \"konnectivity-agent-t7kcp\" (UID: \"f6877aa1-7b7b-4025-b8e9-f96f15bfab82\") " pod="kube-system/konnectivity-agent-t7kcp" Apr 17 11:16:13.218053 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.217887 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/6c231c4f-2413-4ea2-8e5e-1935448131ad-sys-fs\") pod \"aws-ebs-csi-driver-node-8htm9\" (UID: \"6c231c4f-2413-4ea2-8e5e-1935448131ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8htm9" Apr 17 11:16:13.218053 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.217917 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-multus-cni-dir\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5" Apr 17 11:16:13.218053 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.217943 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-host-var-lib-kubelet\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5" Apr 17 11:16:13.218053 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.217967 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-x5tfn\"" Apr 17 11:16:13.218053 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.217974 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg7kp\" (UniqueName: \"kubernetes.io/projected/d129dd20-5a5b-4718-8eca-2f10184defe9-kube-api-access-dg7kp\") pod \"network-metrics-daemon-tvn9d\" (UID: \"d129dd20-5a5b-4718-8eca-2f10184defe9\") " pod="openshift-multus/network-metrics-daemon-tvn9d" Apr 17 11:16:13.218053 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.218014 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbpt6\" (UniqueName: 
\"kubernetes.io/projected/8bc04c84-5033-4522-bdb4-8ff714571072-kube-api-access-kbpt6\") pod \"node-ca-pms2x\" (UID: \"8bc04c84-5033-4522-bdb4-8ff714571072\") " pod="openshift-image-registry/node-ca-pms2x" Apr 17 11:16:13.265284 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.265248 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 11:11:12 +0000 UTC" deadline="2027-11-27 08:44:49.65875059 +0000 UTC" Apr 17 11:16:13.265284 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.265282 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14133h28m36.393471858s" Apr 17 11:16:13.306902 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.306875 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 11:16:13.318554 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.318524 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l9ffw\" (UniqueName: \"kubernetes.io/projected/541df6c8-cc79-40aa-9b07-2084d74abdbd-kube-api-access-l9ffw\") pod \"iptables-alerter-s9dkv\" (UID: \"541df6c8-cc79-40aa-9b07-2084d74abdbd\") " pod="openshift-network-operator/iptables-alerter-s9dkv" Apr 17 11:16:13.318698 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.318565 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v6zkp\" (UniqueName: \"kubernetes.io/projected/12cc2cb3-5799-482e-9110-985521fc52eb-kube-api-access-v6zkp\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr" Apr 17 11:16:13.318698 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.318599 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9gd9t\" (UniqueName: 
\"kubernetes.io/projected/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-kube-api-access-9gd9t\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5" Apr 17 11:16:13.318698 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.318631 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccq4g\" (UniqueName: \"kubernetes.io/projected/652320e1-a7a1-4b18-a16c-59420fde1a03-kube-api-access-ccq4g\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.318698 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.318660 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8bc04c84-5033-4522-bdb4-8ff714571072-host\") pod \"node-ca-pms2x\" (UID: \"8bc04c84-5033-4522-bdb4-8ff714571072\") " pod="openshift-image-registry/node-ca-pms2x" Apr 17 11:16:13.318904 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.318719 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/12cc2cb3-5799-482e-9110-985521fc52eb-etc-modprobe-d\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr" Apr 17 11:16:13.318904 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.318722 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8bc04c84-5033-4522-bdb4-8ff714571072-host\") pod \"node-ca-pms2x\" (UID: \"8bc04c84-5033-4522-bdb4-8ff714571072\") " pod="openshift-image-registry/node-ca-pms2x" Apr 17 11:16:13.318904 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.318752 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-cni-binary-copy\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5" Apr 17 11:16:13.318904 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.318781 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-multus-conf-dir\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5" Apr 17 11:16:13.318904 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.318828 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-multus-conf-dir\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5" Apr 17 11:16:13.318904 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.318862 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/12cc2cb3-5799-482e-9110-985521fc52eb-etc-modprobe-d\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr" Apr 17 11:16:13.318904 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.318863 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b1d77920-3c64-40cf-82ce-24b1244a48e0-system-cni-dir\") pod \"multus-additional-cni-plugins-4w8lp\" (UID: \"b1d77920-3c64-40cf-82ce-24b1244a48e0\") " pod="openshift-multus/multus-additional-cni-plugins-4w8lp" Apr 17 11:16:13.318904 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.318905 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gft4k\" 
(UniqueName: \"kubernetes.io/projected/055f933b-358b-4058-aa0d-4808293e4549-kube-api-access-gft4k\") pod \"network-check-target-cvt8g\" (UID: \"055f933b-358b-4058-aa0d-4808293e4549\") " pod="openshift-network-diagnostics/network-check-target-cvt8g" Apr 17 11:16:13.319201 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.318952 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/541df6c8-cc79-40aa-9b07-2084d74abdbd-iptables-alerter-script\") pod \"iptables-alerter-s9dkv\" (UID: \"541df6c8-cc79-40aa-9b07-2084d74abdbd\") " pod="openshift-network-operator/iptables-alerter-s9dkv" Apr 17 11:16:13.319201 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.318978 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/12cc2cb3-5799-482e-9110-985521fc52eb-host\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr" Apr 17 11:16:13.319201 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.319008 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-system-cni-dir\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5" Apr 17 11:16:13.319201 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.319075 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-host-run-multus-certs\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5" Apr 17 11:16:13.319201 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.319096 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host\" (UniqueName: \"kubernetes.io/host-path/12cc2cb3-5799-482e-9110-985521fc52eb-host\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr" Apr 17 11:16:13.319201 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.319103 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-host-slash\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.319201 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.319141 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-etc-openvswitch\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.319201 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.319166 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-node-log\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.319201 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.319190 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-run-systemd\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.319570 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.319222 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d129dd20-5a5b-4718-8eca-2f10184defe9-metrics-certs\") pod \"network-metrics-daemon-tvn9d\" (UID: \"d129dd20-5a5b-4718-8eca-2f10184defe9\") " pod="openshift-multus/network-metrics-daemon-tvn9d" Apr 17 11:16:13.319570 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.319231 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-host-run-multus-certs\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5" Apr 17 11:16:13.319570 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.319249 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f6877aa1-7b7b-4025-b8e9-f96f15bfab82-konnectivity-ca\") pod \"konnectivity-agent-t7kcp\" (UID: \"f6877aa1-7b7b-4025-b8e9-f96f15bfab82\") " pod="kube-system/konnectivity-agent-t7kcp" Apr 17 11:16:13.319570 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.319297 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/12cc2cb3-5799-482e-9110-985521fc52eb-etc-sysctl-conf\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr" Apr 17 11:16:13.319570 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.319322 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/12cc2cb3-5799-482e-9110-985521fc52eb-lib-modules\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr" Apr 17 11:16:13.319570 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.319378 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-cni-binary-copy\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5" Apr 17 11:16:13.319570 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.319400 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-run-openvswitch\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.319570 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:13.319393 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:13.319570 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.319184 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-system-cni-dir\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5" Apr 17 11:16:13.319570 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:13.319514 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d129dd20-5a5b-4718-8eca-2f10184defe9-metrics-certs podName:d129dd20-5a5b-4718-8eca-2f10184defe9 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:13.819462291 +0000 UTC m=+2.989269334 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d129dd20-5a5b-4718-8eca-2f10184defe9-metrics-certs") pod "network-metrics-daemon-tvn9d" (UID: "d129dd20-5a5b-4718-8eca-2f10184defe9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:13.319570 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.319532 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/12cc2cb3-5799-482e-9110-985521fc52eb-etc-sysctl-conf\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr" Apr 17 11:16:13.320055 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.319560 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/12cc2cb3-5799-482e-9110-985521fc52eb-lib-modules\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr" Apr 17 11:16:13.320055 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.319611 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/541df6c8-cc79-40aa-9b07-2084d74abdbd-iptables-alerter-script\") pod \"iptables-alerter-s9dkv\" (UID: \"541df6c8-cc79-40aa-9b07-2084d74abdbd\") " pod="openshift-network-operator/iptables-alerter-s9dkv" Apr 17 11:16:13.320055 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.319683 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b1d77920-3c64-40cf-82ce-24b1244a48e0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4w8lp\" (UID: \"b1d77920-3c64-40cf-82ce-24b1244a48e0\") " pod="openshift-multus/multus-additional-cni-plugins-4w8lp" Apr 17 
11:16:13.320055 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.319729 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8bc04c84-5033-4522-bdb4-8ff714571072-serviceca\") pod \"node-ca-pms2x\" (UID: \"8bc04c84-5033-4522-bdb4-8ff714571072\") " pod="openshift-image-registry/node-ca-pms2x" Apr 17 11:16:13.320055 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.319760 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d2qfl\" (UniqueName: \"kubernetes.io/projected/6c231c4f-2413-4ea2-8e5e-1935448131ad-kube-api-access-d2qfl\") pod \"aws-ebs-csi-driver-node-8htm9\" (UID: \"6c231c4f-2413-4ea2-8e5e-1935448131ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8htm9" Apr 17 11:16:13.320055 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.319797 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/12cc2cb3-5799-482e-9110-985521fc52eb-etc-sysconfig\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr" Apr 17 11:16:13.320055 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.319867 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/12cc2cb3-5799-482e-9110-985521fc52eb-etc-sysconfig\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr" Apr 17 11:16:13.320055 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.319932 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/12cc2cb3-5799-482e-9110-985521fc52eb-etc-sysctl-d\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " 
pod="openshift-cluster-node-tuning-operator/tuned-f28wr" Apr 17 11:16:13.320055 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.319976 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.320055 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.320014 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f6877aa1-7b7b-4025-b8e9-f96f15bfab82-agent-certs\") pod \"konnectivity-agent-t7kcp\" (UID: \"f6877aa1-7b7b-4025-b8e9-f96f15bfab82\") " pod="kube-system/konnectivity-agent-t7kcp" Apr 17 11:16:13.320055 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.320041 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6c231c4f-2413-4ea2-8e5e-1935448131ad-sys-fs\") pod \"aws-ebs-csi-driver-node-8htm9\" (UID: \"6c231c4f-2413-4ea2-8e5e-1935448131ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8htm9" Apr 17 11:16:13.320622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.320067 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-multus-cni-dir\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5" Apr 17 11:16:13.320622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.320072 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f6877aa1-7b7b-4025-b8e9-f96f15bfab82-konnectivity-ca\") pod 
\"konnectivity-agent-t7kcp\" (UID: \"f6877aa1-7b7b-4025-b8e9-f96f15bfab82\") " pod="kube-system/konnectivity-agent-t7kcp" Apr 17 11:16:13.320622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.320094 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-host-var-lib-kubelet\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5" Apr 17 11:16:13.320622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.320156 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-host-run-ovn-kubernetes\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.320622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.320171 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8bc04c84-5033-4522-bdb4-8ff714571072-serviceca\") pod \"node-ca-pms2x\" (UID: \"8bc04c84-5033-4522-bdb4-8ff714571072\") " pod="openshift-image-registry/node-ca-pms2x" Apr 17 11:16:13.320622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.320191 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/652320e1-a7a1-4b18-a16c-59420fde1a03-ovnkube-script-lib\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.320622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.320094 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/12cc2cb3-5799-482e-9110-985521fc52eb-etc-sysctl-d\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr" Apr 17 11:16:13.320622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.320225 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbpt6\" (UniqueName: \"kubernetes.io/projected/8bc04c84-5033-4522-bdb4-8ff714571072-kube-api-access-kbpt6\") pod \"node-ca-pms2x\" (UID: \"8bc04c84-5033-4522-bdb4-8ff714571072\") " pod="openshift-image-registry/node-ca-pms2x" Apr 17 11:16:13.320622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.320258 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-host-var-lib-kubelet\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5" Apr 17 11:16:13.320622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.320267 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/12cc2cb3-5799-482e-9110-985521fc52eb-var-lib-kubelet\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr" Apr 17 11:16:13.320622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.320318 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/652320e1-a7a1-4b18-a16c-59420fde1a03-ovn-node-metrics-cert\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.320622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.320359 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6c231c4f-2413-4ea2-8e5e-1935448131ad-socket-dir\") pod \"aws-ebs-csi-driver-node-8htm9\" (UID: \"6c231c4f-2413-4ea2-8e5e-1935448131ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8htm9" Apr 17 11:16:13.320622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.320386 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6c231c4f-2413-4ea2-8e5e-1935448131ad-device-dir\") pod \"aws-ebs-csi-driver-node-8htm9\" (UID: \"6c231c4f-2413-4ea2-8e5e-1935448131ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8htm9" Apr 17 11:16:13.320622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.320422 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-host-run-k8s-cni-cncf-io\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5" Apr 17 11:16:13.320622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.320450 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-host-cni-netd\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.320622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.320478 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6c231c4f-2413-4ea2-8e5e-1935448131ad-registration-dir\") pod \"aws-ebs-csi-driver-node-8htm9\" (UID: \"6c231c4f-2413-4ea2-8e5e-1935448131ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8htm9" Apr 17 11:16:13.320622 ip-10-0-135-81 kubenswrapper[2575]: I0417 
11:16:13.320500 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/12cc2cb3-5799-482e-9110-985521fc52eb-run\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr" Apr 17 11:16:13.321364 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.320504 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6c231c4f-2413-4ea2-8e5e-1935448131ad-device-dir\") pod \"aws-ebs-csi-driver-node-8htm9\" (UID: \"6c231c4f-2413-4ea2-8e5e-1935448131ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8htm9" Apr 17 11:16:13.321364 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.320530 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 11:16:13.321364 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.320533 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/12cc2cb3-5799-482e-9110-985521fc52eb-sys\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr" Apr 17 11:16:13.321364 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.320544 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6c231c4f-2413-4ea2-8e5e-1935448131ad-sys-fs\") pod \"aws-ebs-csi-driver-node-8htm9\" (UID: \"6c231c4f-2413-4ea2-8e5e-1935448131ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8htm9" Apr 17 11:16:13.321364 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.320563 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/12cc2cb3-5799-482e-9110-985521fc52eb-var-lib-kubelet\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr" Apr 17 11:16:13.321364 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.320568 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-os-release\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5" Apr 17 11:16:13.321364 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.320425 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-multus-cni-dir\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5" Apr 17 11:16:13.321364 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.320599 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-etc-kubernetes\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5" Apr 17 11:16:13.321364 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.320620 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/12cc2cb3-5799-482e-9110-985521fc52eb-run\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr" Apr 17 11:16:13.321364 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.320640 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/12cc2cb3-5799-482e-9110-985521fc52eb-sys\") pod \"tuned-f28wr\" (UID: 
\"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr" Apr 17 11:16:13.321364 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.320662 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-host-run-k8s-cni-cncf-io\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5" Apr 17 11:16:13.321364 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.320664 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-os-release\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5" Apr 17 11:16:13.321364 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.320691 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-etc-kubernetes\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5" Apr 17 11:16:13.321364 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.321143 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-run-ovn\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.321364 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.320712 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6c231c4f-2413-4ea2-8e5e-1935448131ad-registration-dir\") pod \"aws-ebs-csi-driver-node-8htm9\" (UID: \"6c231c4f-2413-4ea2-8e5e-1935448131ad\") 
" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8htm9" Apr 17 11:16:13.321364 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.321173 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b1d77920-3c64-40cf-82ce-24b1244a48e0-cnibin\") pod \"multus-additional-cni-plugins-4w8lp\" (UID: \"b1d77920-3c64-40cf-82ce-24b1244a48e0\") " pod="openshift-multus/multus-additional-cni-plugins-4w8lp" Apr 17 11:16:13.321364 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.321255 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b1d77920-3c64-40cf-82ce-24b1244a48e0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4w8lp\" (UID: \"b1d77920-3c64-40cf-82ce-24b1244a48e0\") " pod="openshift-multus/multus-additional-cni-plugins-4w8lp" Apr 17 11:16:13.321364 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.321289 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/12cc2cb3-5799-482e-9110-985521fc52eb-etc-kubernetes\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr" Apr 17 11:16:13.322151 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.321319 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-host-kubelet\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.322151 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.321361 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-systemd-units\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.322151 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.321387 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-host-run-netns\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.322151 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.321419 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b1d77920-3c64-40cf-82ce-24b1244a48e0-cni-binary-copy\") pod \"multus-additional-cni-plugins-4w8lp\" (UID: \"b1d77920-3c64-40cf-82ce-24b1244a48e0\") " pod="openshift-multus/multus-additional-cni-plugins-4w8lp" Apr 17 11:16:13.322151 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.321424 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/12cc2cb3-5799-482e-9110-985521fc52eb-etc-kubernetes\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr" Apr 17 11:16:13.322151 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.321450 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6c231c4f-2413-4ea2-8e5e-1935448131ad-etc-selinux\") pod \"aws-ebs-csi-driver-node-8htm9\" (UID: \"6c231c4f-2413-4ea2-8e5e-1935448131ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8htm9" Apr 17 11:16:13.322151 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.321439 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6c231c4f-2413-4ea2-8e5e-1935448131ad-socket-dir\") pod \"aws-ebs-csi-driver-node-8htm9\" (UID: \"6c231c4f-2413-4ea2-8e5e-1935448131ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8htm9" Apr 17 11:16:13.322151 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.321484 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-multus-socket-dir-parent\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5" Apr 17 11:16:13.322151 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.321509 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-host-var-lib-cni-bin\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5" Apr 17 11:16:13.322151 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.321542 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6c231c4f-2413-4ea2-8e5e-1935448131ad-etc-selinux\") pod \"aws-ebs-csi-driver-node-8htm9\" (UID: \"6c231c4f-2413-4ea2-8e5e-1935448131ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8htm9" Apr 17 11:16:13.322151 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.321544 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-multus-daemon-config\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5" Apr 17 11:16:13.322151 ip-10-0-135-81 kubenswrapper[2575]: I0417 
11:16:13.321541 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-multus-socket-dir-parent\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5" Apr 17 11:16:13.322151 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.321594 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-log-socket\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.322151 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.321612 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-host-var-lib-cni-bin\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5" Apr 17 11:16:13.322151 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.321622 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/541df6c8-cc79-40aa-9b07-2084d74abdbd-host-slash\") pod \"iptables-alerter-s9dkv\" (UID: \"541df6c8-cc79-40aa-9b07-2084d74abdbd\") " pod="openshift-network-operator/iptables-alerter-s9dkv" Apr 17 11:16:13.322151 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.321651 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-host-var-lib-cni-multus\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5" Apr 17 11:16:13.322151 ip-10-0-135-81 kubenswrapper[2575]: I0417 
11:16:13.321676 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-host-cni-bin\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.322776 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.321667 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/541df6c8-cc79-40aa-9b07-2084d74abdbd-host-slash\") pod \"iptables-alerter-s9dkv\" (UID: \"541df6c8-cc79-40aa-9b07-2084d74abdbd\") " pod="openshift-network-operator/iptables-alerter-s9dkv" Apr 17 11:16:13.322776 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.321706 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b1d77920-3c64-40cf-82ce-24b1244a48e0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4w8lp\" (UID: \"b1d77920-3c64-40cf-82ce-24b1244a48e0\") " pod="openshift-multus/multus-additional-cni-plugins-4w8lp" Apr 17 11:16:13.322776 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.321772 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-host-var-lib-cni-multus\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5" Apr 17 11:16:13.322776 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.321791 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dg7kp\" (UniqueName: \"kubernetes.io/projected/d129dd20-5a5b-4718-8eca-2f10184defe9-kube-api-access-dg7kp\") pod \"network-metrics-daemon-tvn9d\" (UID: \"d129dd20-5a5b-4718-8eca-2f10184defe9\") " 
pod="openshift-multus/network-metrics-daemon-tvn9d" Apr 17 11:16:13.322776 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.321818 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-hostroot\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5" Apr 17 11:16:13.322776 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.321891 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-hostroot\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5" Apr 17 11:16:13.322776 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.321883 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-var-lib-openvswitch\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.322776 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.321948 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/652320e1-a7a1-4b18-a16c-59420fde1a03-ovnkube-config\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.322776 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.322006 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/652320e1-a7a1-4b18-a16c-59420fde1a03-env-overrides\") pod \"ovnkube-node-bvdth\" (UID: 
\"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.322776 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.322034 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b1d77920-3c64-40cf-82ce-24b1244a48e0-os-release\") pod \"multus-additional-cni-plugins-4w8lp\" (UID: \"b1d77920-3c64-40cf-82ce-24b1244a48e0\") " pod="openshift-multus/multus-additional-cni-plugins-4w8lp" Apr 17 11:16:13.322776 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.322043 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-multus-daemon-config\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5" Apr 17 11:16:13.323224 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.323000 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6c231c4f-2413-4ea2-8e5e-1935448131ad-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8htm9\" (UID: \"6c231c4f-2413-4ea2-8e5e-1935448131ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8htm9" Apr 17 11:16:13.323224 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.323058 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6c231c4f-2413-4ea2-8e5e-1935448131ad-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8htm9\" (UID: \"6c231c4f-2413-4ea2-8e5e-1935448131ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8htm9" Apr 17 11:16:13.323224 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.323123 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/12cc2cb3-5799-482e-9110-985521fc52eb-etc-systemd\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr" Apr 17 11:16:13.323224 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.323176 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/12cc2cb3-5799-482e-9110-985521fc52eb-etc-tuned\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr" Apr 17 11:16:13.323808 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.323725 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/12cc2cb3-5799-482e-9110-985521fc52eb-tmp\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr" Apr 17 11:16:13.323918 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.323805 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-cnibin\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5" Apr 17 11:16:13.323918 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.323844 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-host-run-netns\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5" Apr 17 11:16:13.323918 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.323854 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/12cc2cb3-5799-482e-9110-985521fc52eb-etc-systemd\") pod 
\"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr" Apr 17 11:16:13.323918 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.323882 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2fzz\" (UniqueName: \"kubernetes.io/projected/b1d77920-3c64-40cf-82ce-24b1244a48e0-kube-api-access-x2fzz\") pod \"multus-additional-cni-plugins-4w8lp\" (UID: \"b1d77920-3c64-40cf-82ce-24b1244a48e0\") " pod="openshift-multus/multus-additional-cni-plugins-4w8lp" Apr 17 11:16:13.324113 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.324038 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-cnibin\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5" Apr 17 11:16:13.324113 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.324097 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-host-run-netns\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5" Apr 17 11:16:13.324378 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.324333 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f6877aa1-7b7b-4025-b8e9-f96f15bfab82-agent-certs\") pod \"konnectivity-agent-t7kcp\" (UID: \"f6877aa1-7b7b-4025-b8e9-f96f15bfab82\") " pod="kube-system/konnectivity-agent-t7kcp" Apr 17 11:16:13.326581 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.326125 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/12cc2cb3-5799-482e-9110-985521fc52eb-tmp\") pod \"tuned-f28wr\" (UID: 
\"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr" Apr 17 11:16:13.328109 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.327754 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/12cc2cb3-5799-482e-9110-985521fc52eb-etc-tuned\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr" Apr 17 11:16:13.329604 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.329551 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6zkp\" (UniqueName: \"kubernetes.io/projected/12cc2cb3-5799-482e-9110-985521fc52eb-kube-api-access-v6zkp\") pod \"tuned-f28wr\" (UID: \"12cc2cb3-5799-482e-9110-985521fc52eb\") " pod="openshift-cluster-node-tuning-operator/tuned-f28wr" Apr 17 11:16:13.330112 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.330071 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9ffw\" (UniqueName: \"kubernetes.io/projected/541df6c8-cc79-40aa-9b07-2084d74abdbd-kube-api-access-l9ffw\") pod \"iptables-alerter-s9dkv\" (UID: \"541df6c8-cc79-40aa-9b07-2084d74abdbd\") " pod="openshift-network-operator/iptables-alerter-s9dkv" Apr 17 11:16:13.331013 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.330854 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gd9t\" (UniqueName: \"kubernetes.io/projected/7f0f3aa1-28b4-49b7-8498-04ccddc9bacf-kube-api-access-9gd9t\") pod \"multus-xcwz5\" (UID: \"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf\") " pod="openshift-multus/multus-xcwz5" Apr 17 11:16:13.331643 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.331317 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2qfl\" (UniqueName: \"kubernetes.io/projected/6c231c4f-2413-4ea2-8e5e-1935448131ad-kube-api-access-d2qfl\") pod 
\"aws-ebs-csi-driver-node-8htm9\" (UID: \"6c231c4f-2413-4ea2-8e5e-1935448131ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8htm9" Apr 17 11:16:13.331643 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.331606 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg7kp\" (UniqueName: \"kubernetes.io/projected/d129dd20-5a5b-4718-8eca-2f10184defe9-kube-api-access-dg7kp\") pod \"network-metrics-daemon-tvn9d\" (UID: \"d129dd20-5a5b-4718-8eca-2f10184defe9\") " pod="openshift-multus/network-metrics-daemon-tvn9d" Apr 17 11:16:13.331643 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.331629 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbpt6\" (UniqueName: \"kubernetes.io/projected/8bc04c84-5033-4522-bdb4-8ff714571072-kube-api-access-kbpt6\") pod \"node-ca-pms2x\" (UID: \"8bc04c84-5033-4522-bdb4-8ff714571072\") " pod="openshift-image-registry/node-ca-pms2x" Apr 17 11:16:13.342778 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.342731 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-81.ec2.internal" event={"ID":"94b45018193276143fc473b8ac9f9152","Type":"ContainerStarted","Data":"c40a0ce33cb4b77f7cd4a2da04fba7a352e98785eab81e54703cf0cf5dd91f27"} Apr 17 11:16:13.343711 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.343686 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-81.ec2.internal" event={"ID":"912cde7e6552ffe80bcec67e0b4a9d7a","Type":"ContainerStarted","Data":"33aa7d32766dfbd0f88ba23763408dcf4791fd30d58ebe2936e848adb83447fa"} Apr 17 11:16:13.424467 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.424434 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-log-socket\") pod \"ovnkube-node-bvdth\" (UID: 
\"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.424467 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.424477 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-host-cni-bin\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.424694 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.424535 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-host-cni-bin\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.424694 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.424541 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-log-socket\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.424694 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.424571 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b1d77920-3c64-40cf-82ce-24b1244a48e0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4w8lp\" (UID: \"b1d77920-3c64-40cf-82ce-24b1244a48e0\") " pod="openshift-multus/multus-additional-cni-plugins-4w8lp" Apr 17 11:16:13.424694 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.424596 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-var-lib-openvswitch\") pod 
\"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.424694 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.424621 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/652320e1-a7a1-4b18-a16c-59420fde1a03-ovnkube-config\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.424694 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.424643 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/652320e1-a7a1-4b18-a16c-59420fde1a03-env-overrides\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.424694 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.424665 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b1d77920-3c64-40cf-82ce-24b1244a48e0-os-release\") pod \"multus-additional-cni-plugins-4w8lp\" (UID: \"b1d77920-3c64-40cf-82ce-24b1244a48e0\") " pod="openshift-multus/multus-additional-cni-plugins-4w8lp" Apr 17 11:16:13.424694 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.424683 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-var-lib-openvswitch\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.425061 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.424691 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x2fzz\" (UniqueName: 
\"kubernetes.io/projected/b1d77920-3c64-40cf-82ce-24b1244a48e0-kube-api-access-x2fzz\") pod \"multus-additional-cni-plugins-4w8lp\" (UID: \"b1d77920-3c64-40cf-82ce-24b1244a48e0\") " pod="openshift-multus/multus-additional-cni-plugins-4w8lp" Apr 17 11:16:13.425061 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.424749 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ccq4g\" (UniqueName: \"kubernetes.io/projected/652320e1-a7a1-4b18-a16c-59420fde1a03-kube-api-access-ccq4g\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.425061 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.424771 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b1d77920-3c64-40cf-82ce-24b1244a48e0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4w8lp\" (UID: \"b1d77920-3c64-40cf-82ce-24b1244a48e0\") " pod="openshift-multus/multus-additional-cni-plugins-4w8lp" Apr 17 11:16:13.425061 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.424799 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b1d77920-3c64-40cf-82ce-24b1244a48e0-system-cni-dir\") pod \"multus-additional-cni-plugins-4w8lp\" (UID: \"b1d77920-3c64-40cf-82ce-24b1244a48e0\") " pod="openshift-multus/multus-additional-cni-plugins-4w8lp" Apr 17 11:16:13.425061 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.424823 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gft4k\" (UniqueName: \"kubernetes.io/projected/055f933b-358b-4058-aa0d-4808293e4549-kube-api-access-gft4k\") pod \"network-check-target-cvt8g\" (UID: \"055f933b-358b-4058-aa0d-4808293e4549\") " pod="openshift-network-diagnostics/network-check-target-cvt8g" Apr 17 11:16:13.425061 ip-10-0-135-81 
kubenswrapper[2575]: I0417 11:16:13.424867 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-host-slash\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.425061 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.424888 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-etc-openvswitch\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.425061 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.424913 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-node-log\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.425061 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.424937 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-run-systemd\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.425061 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.424976 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-run-openvswitch\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.425061 ip-10-0-135-81 
kubenswrapper[2575]: I0417 11:16:13.425000 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b1d77920-3c64-40cf-82ce-24b1244a48e0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4w8lp\" (UID: \"b1d77920-3c64-40cf-82ce-24b1244a48e0\") " pod="openshift-multus/multus-additional-cni-plugins-4w8lp" Apr 17 11:16:13.425061 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.425030 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.425061 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.425058 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-host-run-ovn-kubernetes\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.425667 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.425081 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/652320e1-a7a1-4b18-a16c-59420fde1a03-ovnkube-script-lib\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.425667 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.425087 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b1d77920-3c64-40cf-82ce-24b1244a48e0-os-release\") pod 
\"multus-additional-cni-plugins-4w8lp\" (UID: \"b1d77920-3c64-40cf-82ce-24b1244a48e0\") " pod="openshift-multus/multus-additional-cni-plugins-4w8lp" Apr 17 11:16:13.425667 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.425130 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-run-systemd\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.425667 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.425165 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/652320e1-a7a1-4b18-a16c-59420fde1a03-ovn-node-metrics-cert\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.425667 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.425198 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-host-cni-netd\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.425667 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.425225 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-run-ovn\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.425667 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.425248 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b1d77920-3c64-40cf-82ce-24b1244a48e0-cnibin\") 
pod \"multus-additional-cni-plugins-4w8lp\" (UID: \"b1d77920-3c64-40cf-82ce-24b1244a48e0\") " pod="openshift-multus/multus-additional-cni-plugins-4w8lp" Apr 17 11:16:13.425667 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.425272 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b1d77920-3c64-40cf-82ce-24b1244a48e0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4w8lp\" (UID: \"b1d77920-3c64-40cf-82ce-24b1244a48e0\") " pod="openshift-multus/multus-additional-cni-plugins-4w8lp" Apr 17 11:16:13.425667 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.425302 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-host-kubelet\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.425667 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.425331 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-systemd-units\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.425667 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.425387 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-host-run-netns\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.425667 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.425412 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/b1d77920-3c64-40cf-82ce-24b1244a48e0-cni-binary-copy\") pod \"multus-additional-cni-plugins-4w8lp\" (UID: \"b1d77920-3c64-40cf-82ce-24b1244a48e0\") " pod="openshift-multus/multus-additional-cni-plugins-4w8lp" Apr 17 11:16:13.425667 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.425531 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/652320e1-a7a1-4b18-a16c-59420fde1a03-env-overrides\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.425667 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.425536 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/652320e1-a7a1-4b18-a16c-59420fde1a03-ovnkube-config\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.425667 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.425590 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-run-openvswitch\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.425667 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.425662 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.426388 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.425701 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-host-run-ovn-kubernetes\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.426388 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.425850 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-host-slash\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.426388 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.425897 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-etc-openvswitch\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.426388 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.425915 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b1d77920-3c64-40cf-82ce-24b1244a48e0-cni-binary-copy\") pod \"multus-additional-cni-plugins-4w8lp\" (UID: \"b1d77920-3c64-40cf-82ce-24b1244a48e0\") " pod="openshift-multus/multus-additional-cni-plugins-4w8lp" Apr 17 11:16:13.426388 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.425978 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b1d77920-3c64-40cf-82ce-24b1244a48e0-cnibin\") pod \"multus-additional-cni-plugins-4w8lp\" (UID: \"b1d77920-3c64-40cf-82ce-24b1244a48e0\") " pod="openshift-multus/multus-additional-cni-plugins-4w8lp" Apr 17 11:16:13.426388 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.426313 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/652320e1-a7a1-4b18-a16c-59420fde1a03-ovnkube-script-lib\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.426388 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.425605 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-host-cni-netd\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.426712 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.426384 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-run-ovn\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.426712 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.426433 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-host-run-netns\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.426712 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.426458 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b1d77920-3c64-40cf-82ce-24b1244a48e0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4w8lp\" (UID: \"b1d77920-3c64-40cf-82ce-24b1244a48e0\") " pod="openshift-multus/multus-additional-cni-plugins-4w8lp" Apr 17 11:16:13.426712 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.426498 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-host-kubelet\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.426712 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.426538 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-node-log\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.426712 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.426584 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b1d77920-3c64-40cf-82ce-24b1244a48e0-system-cni-dir\") pod \"multus-additional-cni-plugins-4w8lp\" (UID: \"b1d77920-3c64-40cf-82ce-24b1244a48e0\") " pod="openshift-multus/multus-additional-cni-plugins-4w8lp" Apr 17 11:16:13.426712 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.426622 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/652320e1-a7a1-4b18-a16c-59420fde1a03-systemd-units\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.427296 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.427270 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b1d77920-3c64-40cf-82ce-24b1244a48e0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4w8lp\" (UID: \"b1d77920-3c64-40cf-82ce-24b1244a48e0\") " pod="openshift-multus/multus-additional-cni-plugins-4w8lp" Apr 17 11:16:13.428666 ip-10-0-135-81 
kubenswrapper[2575]: I0417 11:16:13.428645 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/652320e1-a7a1-4b18-a16c-59420fde1a03-ovn-node-metrics-cert\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.433311 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:13.433291 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:13.433441 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:13.433315 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:13.433441 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:13.433328 2575 projected.go:194] Error preparing data for projected volume kube-api-access-gft4k for pod openshift-network-diagnostics/network-check-target-cvt8g: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:13.433441 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:13.433409 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/055f933b-358b-4058-aa0d-4808293e4549-kube-api-access-gft4k podName:055f933b-358b-4058-aa0d-4808293e4549 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:13.933391441 +0000 UTC m=+3.103198494 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-gft4k" (UniqueName: "kubernetes.io/projected/055f933b-358b-4058-aa0d-4808293e4549-kube-api-access-gft4k") pod "network-check-target-cvt8g" (UID: "055f933b-358b-4058-aa0d-4808293e4549") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:13.435461 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.435416 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccq4g\" (UniqueName: \"kubernetes.io/projected/652320e1-a7a1-4b18-a16c-59420fde1a03-kube-api-access-ccq4g\") pod \"ovnkube-node-bvdth\" (UID: \"652320e1-a7a1-4b18-a16c-59420fde1a03\") " pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.435617 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.435598 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2fzz\" (UniqueName: \"kubernetes.io/projected/b1d77920-3c64-40cf-82ce-24b1244a48e0-kube-api-access-x2fzz\") pod \"multus-additional-cni-plugins-4w8lp\" (UID: \"b1d77920-3c64-40cf-82ce-24b1244a48e0\") " pod="openshift-multus/multus-additional-cni-plugins-4w8lp" Apr 17 11:16:13.478807 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.478767 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:16:13.515886 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.515839 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-s9dkv" Apr 17 11:16:13.523807 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.523782 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-t7kcp" Apr 17 11:16:13.532005 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.531973 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8htm9" Apr 17 11:16:13.536700 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.536678 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-f28wr" Apr 17 11:16:13.543294 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.543273 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-pms2x" Apr 17 11:16:13.548920 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.548901 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xcwz5" Apr 17 11:16:13.555560 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.555541 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" Apr 17 11:16:13.561114 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.561094 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4w8lp" Apr 17 11:16:13.828006 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:13.827919 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d129dd20-5a5b-4718-8eca-2f10184defe9-metrics-certs\") pod \"network-metrics-daemon-tvn9d\" (UID: \"d129dd20-5a5b-4718-8eca-2f10184defe9\") " pod="openshift-multus/network-metrics-daemon-tvn9d" Apr 17 11:16:13.828169 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:13.828065 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:13.828169 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:13.828142 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d129dd20-5a5b-4718-8eca-2f10184defe9-metrics-certs podName:d129dd20-5a5b-4718-8eca-2f10184defe9 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:14.828123574 +0000 UTC m=+3.997930625 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d129dd20-5a5b-4718-8eca-2f10184defe9-metrics-certs") pod "network-metrics-daemon-tvn9d" (UID: "d129dd20-5a5b-4718-8eca-2f10184defe9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:14.020462 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:14.020436 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod652320e1_a7a1_4b18_a16c_59420fde1a03.slice/crio-c6241934ae59059268e749269952ec6886c611d39f48efd9ebd2f66ce242f70f WatchSource:0}: Error finding container c6241934ae59059268e749269952ec6886c611d39f48efd9ebd2f66ce242f70f: Status 404 returned error can't find the container with id c6241934ae59059268e749269952ec6886c611d39f48efd9ebd2f66ce242f70f Apr 17 11:16:14.021194 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:14.021159 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod541df6c8_cc79_40aa_9b07_2084d74abdbd.slice/crio-ad4abdd8319c239fd26a8b4063d1e24fbaacd687a0a3b08e53738942845fa0cb WatchSource:0}: Error finding container ad4abdd8319c239fd26a8b4063d1e24fbaacd687a0a3b08e53738942845fa0cb: Status 404 returned error can't find the container with id ad4abdd8319c239fd26a8b4063d1e24fbaacd687a0a3b08e53738942845fa0cb Apr 17 11:16:14.022493 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:14.022388 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1d77920_3c64_40cf_82ce_24b1244a48e0.slice/crio-d8c61f53dda5811b6a6bce64f32de09e448418997cf59ab7bb933e36c9d61394 WatchSource:0}: Error finding container d8c61f53dda5811b6a6bce64f32de09e448418997cf59ab7bb933e36c9d61394: Status 404 returned error can't find the container with id d8c61f53dda5811b6a6bce64f32de09e448418997cf59ab7bb933e36c9d61394 Apr 17 11:16:14.023592 
ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:14.023560 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c231c4f_2413_4ea2_8e5e_1935448131ad.slice/crio-1950d8ab3867ae2802a52afda06dcceb20c35119454ef2602c7b06082fe60bc9 WatchSource:0}: Error finding container 1950d8ab3867ae2802a52afda06dcceb20c35119454ef2602c7b06082fe60bc9: Status 404 returned error can't find the container with id 1950d8ab3867ae2802a52afda06dcceb20c35119454ef2602c7b06082fe60bc9 Apr 17 11:16:14.026456 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:14.026312 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6877aa1_7b7b_4025_b8e9_f96f15bfab82.slice/crio-69d7f7eb9935d9b4ef22164376c68df15e281ad02a8b9e2d48b2838518f59c5d WatchSource:0}: Error finding container 69d7f7eb9935d9b4ef22164376c68df15e281ad02a8b9e2d48b2838518f59c5d: Status 404 returned error can't find the container with id 69d7f7eb9935d9b4ef22164376c68df15e281ad02a8b9e2d48b2838518f59c5d Apr 17 11:16:14.027370 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:14.027325 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f0f3aa1_28b4_49b7_8498_04ccddc9bacf.slice/crio-164c527d18d22b5cdfc7a20a84f6849c78dffac624d85260ecefd41b3d38bd05 WatchSource:0}: Error finding container 164c527d18d22b5cdfc7a20a84f6849c78dffac624d85260ecefd41b3d38bd05: Status 404 returned error can't find the container with id 164c527d18d22b5cdfc7a20a84f6849c78dffac624d85260ecefd41b3d38bd05 Apr 17 11:16:14.028170 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:14.028147 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bc04c84_5033_4522_bdb4_8ff714571072.slice/crio-5e1091309922380521df4cb80dbcf0369a0bc8e3287cbf279fd31adef4a19abf WatchSource:0}: Error 
finding container 5e1091309922380521df4cb80dbcf0369a0bc8e3287cbf279fd31adef4a19abf: Status 404 returned error can't find the container with id 5e1091309922380521df4cb80dbcf0369a0bc8e3287cbf279fd31adef4a19abf Apr 17 11:16:14.029025 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:14.028996 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gft4k\" (UniqueName: \"kubernetes.io/projected/055f933b-358b-4058-aa0d-4808293e4549-kube-api-access-gft4k\") pod \"network-check-target-cvt8g\" (UID: \"055f933b-358b-4058-aa0d-4808293e4549\") " pod="openshift-network-diagnostics/network-check-target-cvt8g" Apr 17 11:16:14.029201 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:14.029181 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:14.029270 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:14.029208 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:14.029270 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:14.029220 2575 projected.go:194] Error preparing data for projected volume kube-api-access-gft4k for pod openshift-network-diagnostics/network-check-target-cvt8g: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:14.029377 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:14.029274 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/055f933b-358b-4058-aa0d-4808293e4549-kube-api-access-gft4k podName:055f933b-358b-4058-aa0d-4808293e4549 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:15.029256535 +0000 UTC m=+4.199063592 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-gft4k" (UniqueName: "kubernetes.io/projected/055f933b-358b-4058-aa0d-4808293e4549-kube-api-access-gft4k") pod "network-check-target-cvt8g" (UID: "055f933b-358b-4058-aa0d-4808293e4549") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:14.029796 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:14.029761 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12cc2cb3_5799_482e_9110_985521fc52eb.slice/crio-c94c211d83accf6d4a24a400e496d72d547328860f2bdd3134e3fba0e4325789 WatchSource:0}: Error finding container c94c211d83accf6d4a24a400e496d72d547328860f2bdd3134e3fba0e4325789: Status 404 returned error can't find the container with id c94c211d83accf6d4a24a400e496d72d547328860f2bdd3134e3fba0e4325789 Apr 17 11:16:14.257542 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:14.256547 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 11:16:14.265935 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:14.265898 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 11:11:12 +0000 UTC" deadline="2027-12-04 07:34:24.657439422 +0000 UTC" Apr 17 11:16:14.265935 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:14.265929 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14300h18m10.391513163s" Apr 17 11:16:14.349571 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:14.349438 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8htm9" 
event={"ID":"6c231c4f-2413-4ea2-8e5e-1935448131ad","Type":"ContainerStarted","Data":"1950d8ab3867ae2802a52afda06dcceb20c35119454ef2602c7b06082fe60bc9"} Apr 17 11:16:14.351984 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:14.351954 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" event={"ID":"652320e1-a7a1-4b18-a16c-59420fde1a03","Type":"ContainerStarted","Data":"c6241934ae59059268e749269952ec6886c611d39f48efd9ebd2f66ce242f70f"} Apr 17 11:16:14.353811 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:14.353783 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xcwz5" event={"ID":"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf","Type":"ContainerStarted","Data":"164c527d18d22b5cdfc7a20a84f6849c78dffac624d85260ecefd41b3d38bd05"} Apr 17 11:16:14.356792 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:14.356751 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-f28wr" event={"ID":"12cc2cb3-5799-482e-9110-985521fc52eb","Type":"ContainerStarted","Data":"c94c211d83accf6d4a24a400e496d72d547328860f2bdd3134e3fba0e4325789"} Apr 17 11:16:14.358948 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:14.358212 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-t7kcp" event={"ID":"f6877aa1-7b7b-4025-b8e9-f96f15bfab82","Type":"ContainerStarted","Data":"69d7f7eb9935d9b4ef22164376c68df15e281ad02a8b9e2d48b2838518f59c5d"} Apr 17 11:16:14.360047 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:14.360025 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4w8lp" event={"ID":"b1d77920-3c64-40cf-82ce-24b1244a48e0","Type":"ContainerStarted","Data":"d8c61f53dda5811b6a6bce64f32de09e448418997cf59ab7bb933e36c9d61394"} Apr 17 11:16:14.363258 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:14.363124 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-135-81.ec2.internal" event={"ID":"94b45018193276143fc473b8ac9f9152","Type":"ContainerStarted","Data":"4e16a62f103eb02466da5ebd79f44dc19976fe5f0b3bb17c0f2d4028858d1076"} Apr 17 11:16:14.364575 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:14.364527 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pms2x" event={"ID":"8bc04c84-5033-4522-bdb4-8ff714571072","Type":"ContainerStarted","Data":"5e1091309922380521df4cb80dbcf0369a0bc8e3287cbf279fd31adef4a19abf"} Apr 17 11:16:14.366289 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:14.366262 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-s9dkv" event={"ID":"541df6c8-cc79-40aa-9b07-2084d74abdbd","Type":"ContainerStarted","Data":"ad4abdd8319c239fd26a8b4063d1e24fbaacd687a0a3b08e53738942845fa0cb"} Apr 17 11:16:14.381825 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:14.381633 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-81.ec2.internal" podStartSLOduration=2.38161785 podStartE2EDuration="2.38161785s" podCreationTimestamp="2026-04-17 11:16:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:16:14.380861395 +0000 UTC m=+3.550668467" watchObservedRunningTime="2026-04-17 11:16:14.38161785 +0000 UTC m=+3.551424915" Apr 17 11:16:14.835576 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:14.835490 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d129dd20-5a5b-4718-8eca-2f10184defe9-metrics-certs\") pod \"network-metrics-daemon-tvn9d\" (UID: \"d129dd20-5a5b-4718-8eca-2f10184defe9\") " pod="openshift-multus/network-metrics-daemon-tvn9d" Apr 17 11:16:14.835740 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:14.835656 2575 
secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:14.835740 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:14.835722 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d129dd20-5a5b-4718-8eca-2f10184defe9-metrics-certs podName:d129dd20-5a5b-4718-8eca-2f10184defe9 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:16.835702751 +0000 UTC m=+6.005509808 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d129dd20-5a5b-4718-8eca-2f10184defe9-metrics-certs") pod "network-metrics-daemon-tvn9d" (UID: "d129dd20-5a5b-4718-8eca-2f10184defe9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:14.872530 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:14.871789 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-4r4xj"] Apr 17 11:16:14.875971 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:14.874705 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-4r4xj" Apr 17 11:16:14.878722 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:14.877781 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-2sgfg\"" Apr 17 11:16:14.878722 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:14.878008 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 11:16:14.878722 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:14.878192 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 11:16:14.936491 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:14.936239 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrshh\" (UniqueName: \"kubernetes.io/projected/ec79fc9e-eb92-4d6d-9ea6-2a309575b035-kube-api-access-nrshh\") pod \"node-resolver-4r4xj\" (UID: \"ec79fc9e-eb92-4d6d-9ea6-2a309575b035\") " pod="openshift-dns/node-resolver-4r4xj" Apr 17 11:16:14.936491 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:14.936294 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ec79fc9e-eb92-4d6d-9ea6-2a309575b035-tmp-dir\") pod \"node-resolver-4r4xj\" (UID: \"ec79fc9e-eb92-4d6d-9ea6-2a309575b035\") " pod="openshift-dns/node-resolver-4r4xj" Apr 17 11:16:14.936491 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:14.936402 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ec79fc9e-eb92-4d6d-9ea6-2a309575b035-hosts-file\") pod \"node-resolver-4r4xj\" (UID: \"ec79fc9e-eb92-4d6d-9ea6-2a309575b035\") " pod="openshift-dns/node-resolver-4r4xj" Apr 17 11:16:15.039842 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:15.037659 
2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nrshh\" (UniqueName: \"kubernetes.io/projected/ec79fc9e-eb92-4d6d-9ea6-2a309575b035-kube-api-access-nrshh\") pod \"node-resolver-4r4xj\" (UID: \"ec79fc9e-eb92-4d6d-9ea6-2a309575b035\") " pod="openshift-dns/node-resolver-4r4xj" Apr 17 11:16:15.039842 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:15.037718 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ec79fc9e-eb92-4d6d-9ea6-2a309575b035-tmp-dir\") pod \"node-resolver-4r4xj\" (UID: \"ec79fc9e-eb92-4d6d-9ea6-2a309575b035\") " pod="openshift-dns/node-resolver-4r4xj" Apr 17 11:16:15.039842 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:15.037770 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gft4k\" (UniqueName: \"kubernetes.io/projected/055f933b-358b-4058-aa0d-4808293e4549-kube-api-access-gft4k\") pod \"network-check-target-cvt8g\" (UID: \"055f933b-358b-4058-aa0d-4808293e4549\") " pod="openshift-network-diagnostics/network-check-target-cvt8g" Apr 17 11:16:15.039842 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:15.037825 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ec79fc9e-eb92-4d6d-9ea6-2a309575b035-hosts-file\") pod \"node-resolver-4r4xj\" (UID: \"ec79fc9e-eb92-4d6d-9ea6-2a309575b035\") " pod="openshift-dns/node-resolver-4r4xj" Apr 17 11:16:15.039842 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:15.037912 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ec79fc9e-eb92-4d6d-9ea6-2a309575b035-hosts-file\") pod \"node-resolver-4r4xj\" (UID: \"ec79fc9e-eb92-4d6d-9ea6-2a309575b035\") " pod="openshift-dns/node-resolver-4r4xj" Apr 17 11:16:15.039842 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:15.039416 2575 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:15.039842 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:15.039439 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:15.039842 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:15.039453 2575 projected.go:194] Error preparing data for projected volume kube-api-access-gft4k for pod openshift-network-diagnostics/network-check-target-cvt8g: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:15.039842 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:15.039520 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/055f933b-358b-4058-aa0d-4808293e4549-kube-api-access-gft4k podName:055f933b-358b-4058-aa0d-4808293e4549 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:17.039501056 +0000 UTC m=+6.209308117 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-gft4k" (UniqueName: "kubernetes.io/projected/055f933b-358b-4058-aa0d-4808293e4549-kube-api-access-gft4k") pod "network-check-target-cvt8g" (UID: "055f933b-358b-4058-aa0d-4808293e4549") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:15.040823 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:15.040800 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ec79fc9e-eb92-4d6d-9ea6-2a309575b035-tmp-dir\") pod \"node-resolver-4r4xj\" (UID: \"ec79fc9e-eb92-4d6d-9ea6-2a309575b035\") " pod="openshift-dns/node-resolver-4r4xj" Apr 17 11:16:15.061030 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:15.060996 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrshh\" (UniqueName: \"kubernetes.io/projected/ec79fc9e-eb92-4d6d-9ea6-2a309575b035-kube-api-access-nrshh\") pod \"node-resolver-4r4xj\" (UID: \"ec79fc9e-eb92-4d6d-9ea6-2a309575b035\") " pod="openshift-dns/node-resolver-4r4xj" Apr 17 11:16:15.188113 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:15.188077 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4r4xj" Apr 17 11:16:15.340998 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:15.340028 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvn9d" Apr 17 11:16:15.340998 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:15.340162 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tvn9d" podUID="d129dd20-5a5b-4718-8eca-2f10184defe9" Apr 17 11:16:15.340998 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:15.340603 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cvt8g" Apr 17 11:16:15.340998 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:15.340696 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cvt8g" podUID="055f933b-358b-4058-aa0d-4808293e4549" Apr 17 11:16:15.379899 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:15.379760 2575 generic.go:358] "Generic (PLEG): container finished" podID="912cde7e6552ffe80bcec67e0b4a9d7a" containerID="ed851469eb6c560fa9cc71b45e41185347cc48e47e1568f4e1e62961cfaf606c" exitCode=0 Apr 17 11:16:15.379899 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:15.379832 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-81.ec2.internal" event={"ID":"912cde7e6552ffe80bcec67e0b4a9d7a","Type":"ContainerDied","Data":"ed851469eb6c560fa9cc71b45e41185347cc48e47e1568f4e1e62961cfaf606c"} Apr 17 11:16:15.386688 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:15.386658 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4r4xj" event={"ID":"ec79fc9e-eb92-4d6d-9ea6-2a309575b035","Type":"ContainerStarted","Data":"ea9658306ae654b144726641eec90e893df8ca77eb79fdb0b6135a7a118c6b3d"} Apr 17 11:16:16.395358 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:16.395275 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-81.ec2.internal" 
event={"ID":"912cde7e6552ffe80bcec67e0b4a9d7a","Type":"ContainerStarted","Data":"9d863b85f3aba9b492133a257c20c75913b208c3a3beb5f474f8a4758dbacced"} Apr 17 11:16:16.852354 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:16.852297 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d129dd20-5a5b-4718-8eca-2f10184defe9-metrics-certs\") pod \"network-metrics-daemon-tvn9d\" (UID: \"d129dd20-5a5b-4718-8eca-2f10184defe9\") " pod="openshift-multus/network-metrics-daemon-tvn9d" Apr 17 11:16:16.852606 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:16.852459 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:16.852606 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:16.852536 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d129dd20-5a5b-4718-8eca-2f10184defe9-metrics-certs podName:d129dd20-5a5b-4718-8eca-2f10184defe9 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:20.852514083 +0000 UTC m=+10.022321129 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d129dd20-5a5b-4718-8eca-2f10184defe9-metrics-certs") pod "network-metrics-daemon-tvn9d" (UID: "d129dd20-5a5b-4718-8eca-2f10184defe9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:17.053541 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:17.053501 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gft4k\" (UniqueName: \"kubernetes.io/projected/055f933b-358b-4058-aa0d-4808293e4549-kube-api-access-gft4k\") pod \"network-check-target-cvt8g\" (UID: \"055f933b-358b-4058-aa0d-4808293e4549\") " pod="openshift-network-diagnostics/network-check-target-cvt8g" Apr 17 11:16:17.053746 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:17.053725 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:17.053818 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:17.053752 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:17.053818 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:17.053767 2575 projected.go:194] Error preparing data for projected volume kube-api-access-gft4k for pod openshift-network-diagnostics/network-check-target-cvt8g: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:17.053894 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:17.053836 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/055f933b-358b-4058-aa0d-4808293e4549-kube-api-access-gft4k podName:055f933b-358b-4058-aa0d-4808293e4549 nodeName:}" failed. 
No retries permitted until 2026-04-17 11:16:21.053817508 +0000 UTC m=+10.223624558 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-gft4k" (UniqueName: "kubernetes.io/projected/055f933b-358b-4058-aa0d-4808293e4549-kube-api-access-gft4k") pod "network-check-target-cvt8g" (UID: "055f933b-358b-4058-aa0d-4808293e4549") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:17.339563 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:17.339526 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvn9d" Apr 17 11:16:17.339563 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:17.339561 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cvt8g" Apr 17 11:16:17.339802 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:17.339682 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvn9d" podUID="d129dd20-5a5b-4718-8eca-2f10184defe9" Apr 17 11:16:17.339872 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:17.339821 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cvt8g" podUID="055f933b-358b-4058-aa0d-4808293e4549" Apr 17 11:16:19.338517 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:19.338460 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cvt8g" Apr 17 11:16:19.338945 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:19.338597 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cvt8g" podUID="055f933b-358b-4058-aa0d-4808293e4549" Apr 17 11:16:19.338945 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:19.338460 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvn9d" Apr 17 11:16:19.338945 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:19.338722 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tvn9d" podUID="d129dd20-5a5b-4718-8eca-2f10184defe9" Apr 17 11:16:20.887132 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:20.887089 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d129dd20-5a5b-4718-8eca-2f10184defe9-metrics-certs\") pod \"network-metrics-daemon-tvn9d\" (UID: \"d129dd20-5a5b-4718-8eca-2f10184defe9\") " pod="openshift-multus/network-metrics-daemon-tvn9d" Apr 17 11:16:20.887627 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:20.887270 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:20.887627 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:20.887357 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d129dd20-5a5b-4718-8eca-2f10184defe9-metrics-certs podName:d129dd20-5a5b-4718-8eca-2f10184defe9 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:28.887321876 +0000 UTC m=+18.057128919 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d129dd20-5a5b-4718-8eca-2f10184defe9-metrics-certs") pod "network-metrics-daemon-tvn9d" (UID: "d129dd20-5a5b-4718-8eca-2f10184defe9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:21.088689 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:21.088586 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gft4k\" (UniqueName: \"kubernetes.io/projected/055f933b-358b-4058-aa0d-4808293e4549-kube-api-access-gft4k\") pod \"network-check-target-cvt8g\" (UID: \"055f933b-358b-4058-aa0d-4808293e4549\") " pod="openshift-network-diagnostics/network-check-target-cvt8g" Apr 17 11:16:21.088862 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:21.088786 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:21.088862 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:21.088817 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:21.088862 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:21.088831 2575 projected.go:194] Error preparing data for projected volume kube-api-access-gft4k for pod openshift-network-diagnostics/network-check-target-cvt8g: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:21.089034 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:21.088898 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/055f933b-358b-4058-aa0d-4808293e4549-kube-api-access-gft4k podName:055f933b-358b-4058-aa0d-4808293e4549 nodeName:}" failed. 
No retries permitted until 2026-04-17 11:16:29.088879869 +0000 UTC m=+18.258686918 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-gft4k" (UniqueName: "kubernetes.io/projected/055f933b-358b-4058-aa0d-4808293e4549-kube-api-access-gft4k") pod "network-check-target-cvt8g" (UID: "055f933b-358b-4058-aa0d-4808293e4549") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:21.339028 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:21.338991 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvn9d" Apr 17 11:16:21.339197 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:21.339112 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvn9d" podUID="d129dd20-5a5b-4718-8eca-2f10184defe9" Apr 17 11:16:21.339197 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:21.339131 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cvt8g" Apr 17 11:16:21.339316 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:21.339247 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cvt8g" podUID="055f933b-358b-4058-aa0d-4808293e4549" Apr 17 11:16:23.337909 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:23.337867 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvn9d" Apr 17 11:16:23.338397 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:23.338027 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvn9d" podUID="d129dd20-5a5b-4718-8eca-2f10184defe9" Apr 17 11:16:23.338397 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:23.338081 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cvt8g" Apr 17 11:16:23.338397 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:23.338189 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cvt8g" podUID="055f933b-358b-4058-aa0d-4808293e4549" Apr 17 11:16:25.338483 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:25.338442 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvn9d" Apr 17 11:16:25.338483 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:25.338476 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cvt8g" Apr 17 11:16:25.338955 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:25.338587 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvn9d" podUID="d129dd20-5a5b-4718-8eca-2f10184defe9" Apr 17 11:16:25.338955 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:25.338700 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cvt8g" podUID="055f933b-358b-4058-aa0d-4808293e4549" Apr 17 11:16:27.338719 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:27.338686 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvn9d" Apr 17 11:16:27.339182 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:27.338686 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cvt8g" Apr 17 11:16:27.339182 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:27.338831 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tvn9d" podUID="d129dd20-5a5b-4718-8eca-2f10184defe9" Apr 17 11:16:27.339182 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:27.338925 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cvt8g" podUID="055f933b-358b-4058-aa0d-4808293e4549" Apr 17 11:16:28.940606 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:28.940556 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d129dd20-5a5b-4718-8eca-2f10184defe9-metrics-certs\") pod \"network-metrics-daemon-tvn9d\" (UID: \"d129dd20-5a5b-4718-8eca-2f10184defe9\") " pod="openshift-multus/network-metrics-daemon-tvn9d" Apr 17 11:16:28.941190 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:28.940716 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:28.941190 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:28.940797 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d129dd20-5a5b-4718-8eca-2f10184defe9-metrics-certs podName:d129dd20-5a5b-4718-8eca-2f10184defe9 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:44.940778431 +0000 UTC m=+34.110585485 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d129dd20-5a5b-4718-8eca-2f10184defe9-metrics-certs") pod "network-metrics-daemon-tvn9d" (UID: "d129dd20-5a5b-4718-8eca-2f10184defe9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:29.142241 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:29.142198 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gft4k\" (UniqueName: \"kubernetes.io/projected/055f933b-358b-4058-aa0d-4808293e4549-kube-api-access-gft4k\") pod \"network-check-target-cvt8g\" (UID: \"055f933b-358b-4058-aa0d-4808293e4549\") " pod="openshift-network-diagnostics/network-check-target-cvt8g" Apr 17 11:16:29.142425 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:29.142404 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 11:16:29.142472 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:29.142431 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 11:16:29.142472 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:29.142445 2575 projected.go:194] Error preparing data for projected volume kube-api-access-gft4k for pod openshift-network-diagnostics/network-check-target-cvt8g: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:29.142551 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:29.142513 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/055f933b-358b-4058-aa0d-4808293e4549-kube-api-access-gft4k podName:055f933b-358b-4058-aa0d-4808293e4549 nodeName:}" failed. 
No retries permitted until 2026-04-17 11:16:45.142493222 +0000 UTC m=+34.312300267 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-gft4k" (UniqueName: "kubernetes.io/projected/055f933b-358b-4058-aa0d-4808293e4549-kube-api-access-gft4k") pod "network-check-target-cvt8g" (UID: "055f933b-358b-4058-aa0d-4808293e4549") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 11:16:29.338483 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:29.338407 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvn9d" Apr 17 11:16:29.338483 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:29.338453 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cvt8g" Apr 17 11:16:29.338733 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:29.338553 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvn9d" podUID="d129dd20-5a5b-4718-8eca-2f10184defe9" Apr 17 11:16:29.338733 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:29.338672 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cvt8g" podUID="055f933b-358b-4058-aa0d-4808293e4549"
Apr 17 11:16:31.339963 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:31.339464 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cvt8g"
Apr 17 11:16:31.339963 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:31.339583 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cvt8g" podUID="055f933b-358b-4058-aa0d-4808293e4549"
Apr 17 11:16:31.339963 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:31.339695 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvn9d"
Apr 17 11:16:31.339963 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:31.339815 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvn9d" podUID="d129dd20-5a5b-4718-8eca-2f10184defe9"
Apr 17 11:16:31.971839 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:31.971535 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-81.ec2.internal" podStartSLOduration=19.971518022 podStartE2EDuration="19.971518022s" podCreationTimestamp="2026-04-17 11:16:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:16:16.408324598 +0000 UTC m=+5.578131667" watchObservedRunningTime="2026-04-17 11:16:31.971518022 +0000 UTC m=+21.141325087"
Apr 17 11:16:31.972007 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:31.971994 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-hddsl"]
Apr 17 11:16:31.974584 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:31.974563 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hddsl"
Apr 17 11:16:31.974790 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:31.974770 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hddsl" podUID="8adcdab8-9195-4dbb-838d-3ac5065f81ed"
Apr 17 11:16:32.062332 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:32.062298 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8adcdab8-9195-4dbb-838d-3ac5065f81ed-dbus\") pod \"global-pull-secret-syncer-hddsl\" (UID: \"8adcdab8-9195-4dbb-838d-3ac5065f81ed\") " pod="kube-system/global-pull-secret-syncer-hddsl"
Apr 17 11:16:32.062504 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:32.062438 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8adcdab8-9195-4dbb-838d-3ac5065f81ed-kubelet-config\") pod \"global-pull-secret-syncer-hddsl\" (UID: \"8adcdab8-9195-4dbb-838d-3ac5065f81ed\") " pod="kube-system/global-pull-secret-syncer-hddsl"
Apr 17 11:16:32.062504 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:32.062476 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8adcdab8-9195-4dbb-838d-3ac5065f81ed-original-pull-secret\") pod \"global-pull-secret-syncer-hddsl\" (UID: \"8adcdab8-9195-4dbb-838d-3ac5065f81ed\") " pod="kube-system/global-pull-secret-syncer-hddsl"
Apr 17 11:16:32.163813 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:32.163777 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8adcdab8-9195-4dbb-838d-3ac5065f81ed-dbus\") pod \"global-pull-secret-syncer-hddsl\" (UID: \"8adcdab8-9195-4dbb-838d-3ac5065f81ed\") " pod="kube-system/global-pull-secret-syncer-hddsl"
Apr 17 11:16:32.163965 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:32.163931 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8adcdab8-9195-4dbb-838d-3ac5065f81ed-kubelet-config\") pod \"global-pull-secret-syncer-hddsl\" (UID: \"8adcdab8-9195-4dbb-838d-3ac5065f81ed\") " pod="kube-system/global-pull-secret-syncer-hddsl"
Apr 17 11:16:32.163965 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:32.163957 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8adcdab8-9195-4dbb-838d-3ac5065f81ed-original-pull-secret\") pod \"global-pull-secret-syncer-hddsl\" (UID: \"8adcdab8-9195-4dbb-838d-3ac5065f81ed\") " pod="kube-system/global-pull-secret-syncer-hddsl"
Apr 17 11:16:32.164037 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:32.163957 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8adcdab8-9195-4dbb-838d-3ac5065f81ed-dbus\") pod \"global-pull-secret-syncer-hddsl\" (UID: \"8adcdab8-9195-4dbb-838d-3ac5065f81ed\") " pod="kube-system/global-pull-secret-syncer-hddsl"
Apr 17 11:16:32.164037 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:32.164030 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8adcdab8-9195-4dbb-838d-3ac5065f81ed-kubelet-config\") pod \"global-pull-secret-syncer-hddsl\" (UID: \"8adcdab8-9195-4dbb-838d-3ac5065f81ed\") " pod="kube-system/global-pull-secret-syncer-hddsl"
Apr 17 11:16:32.164093 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:32.164057 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 11:16:32.164122 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:32.164112 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8adcdab8-9195-4dbb-838d-3ac5065f81ed-original-pull-secret podName:8adcdab8-9195-4dbb-838d-3ac5065f81ed nodeName:}" failed. No retries permitted until 2026-04-17 11:16:32.664094004 +0000 UTC m=+21.833901047 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8adcdab8-9195-4dbb-838d-3ac5065f81ed-original-pull-secret") pod "global-pull-secret-syncer-hddsl" (UID: "8adcdab8-9195-4dbb-838d-3ac5065f81ed") : object "kube-system"/"original-pull-secret" not registered
Apr 17 11:16:32.424259 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:32.424224 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xcwz5" event={"ID":"7f0f3aa1-28b4-49b7-8498-04ccddc9bacf","Type":"ContainerStarted","Data":"8a9401f3aa7a37e8487f0b6d52310cfbb913637a83651785b9c9f56aa8c408c4"}
Apr 17 11:16:32.426028 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:32.426000 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-f28wr" event={"ID":"12cc2cb3-5799-482e-9110-985521fc52eb","Type":"ContainerStarted","Data":"b7e8db6d13674c7784ee66b8dadd315ff82a0c7755f1ca526cabd6e343fee30f"}
Apr 17 11:16:32.427724 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:32.427674 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-t7kcp" event={"ID":"f6877aa1-7b7b-4025-b8e9-f96f15bfab82","Type":"ContainerStarted","Data":"adf8c4d73b73acf13e445653f5df37ba34dfc9e8419e5428fd9e3952cb14fe3d"}
Apr 17 11:16:32.429192 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:32.429170 2575 generic.go:358] "Generic (PLEG): container finished" podID="b1d77920-3c64-40cf-82ce-24b1244a48e0" containerID="ed094100683d203884a980fde362c9a10cc6247580536b676f786bb19405e65f" exitCode=0
Apr 17 11:16:32.429316 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:32.429250 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4w8lp" event={"ID":"b1d77920-3c64-40cf-82ce-24b1244a48e0","Type":"ContainerDied","Data":"ed094100683d203884a980fde362c9a10cc6247580536b676f786bb19405e65f"}
Apr 17 11:16:32.430826 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:32.430741 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pms2x" event={"ID":"8bc04c84-5033-4522-bdb4-8ff714571072","Type":"ContainerStarted","Data":"6546c1112246e8c85ddd2eedb4f2ef49fc70a3acdde5f6f2dc8d458fd2e6b094"}
Apr 17 11:16:32.432035 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:32.432012 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8htm9" event={"ID":"6c231c4f-2413-4ea2-8e5e-1935448131ad","Type":"ContainerStarted","Data":"d91c6dc4eeb6f917380ad2c7eec8208d14532280d2993b39b61eb7e76cac48e1"}
Apr 17 11:16:32.434606 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:32.434587 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvdth_652320e1-a7a1-4b18-a16c-59420fde1a03/ovn-acl-logging/0.log"
Apr 17 11:16:32.434865 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:32.434846 2575 generic.go:358] "Generic (PLEG): container finished" podID="652320e1-a7a1-4b18-a16c-59420fde1a03" containerID="317e70d72b132d77558af669d1fe7369ed4eda478da1d6a33f70e3ebe4d72baa" exitCode=1
Apr 17 11:16:32.434936 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:32.434899 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" event={"ID":"652320e1-a7a1-4b18-a16c-59420fde1a03","Type":"ContainerStarted","Data":"0987c96f935193830d787b63bb7b0403353bae934a33d7892aabab7f658c9d76"}
Apr 17 11:16:32.434936 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:32.434918 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" event={"ID":"652320e1-a7a1-4b18-a16c-59420fde1a03","Type":"ContainerStarted","Data":"8b4f9d10c80dcc9fe0b9865cda381c5ce844a67b00b64cdb862e5b9c812a31f6"}
Apr 17 11:16:32.434936 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:32.434932 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" event={"ID":"652320e1-a7a1-4b18-a16c-59420fde1a03","Type":"ContainerStarted","Data":"e099c1f1406bb0d1a7baada264e588f2e040e3d3d9b08f71f2eb05bffbac0550"}
Apr 17 11:16:32.435104 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:32.434945 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" event={"ID":"652320e1-a7a1-4b18-a16c-59420fde1a03","Type":"ContainerStarted","Data":"725298cd8650b63b3af1864782e7cd473725aabd006b5e74e517f73729201c12"}
Apr 17 11:16:32.435104 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:32.434957 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" event={"ID":"652320e1-a7a1-4b18-a16c-59420fde1a03","Type":"ContainerDied","Data":"317e70d72b132d77558af669d1fe7369ed4eda478da1d6a33f70e3ebe4d72baa"}
Apr 17 11:16:32.435104 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:32.434972 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" event={"ID":"652320e1-a7a1-4b18-a16c-59420fde1a03","Type":"ContainerStarted","Data":"b96a2423ec86900f3a6b2a341f6eb53e4693aee8143350eab1728648f90fcfcb"}
Apr 17 11:16:32.436895 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:32.436872 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4r4xj" event={"ID":"ec79fc9e-eb92-4d6d-9ea6-2a309575b035","Type":"ContainerStarted","Data":"d6e588092771236c9a022a6dfbc1037525bcbd96ade7b120615caed893d1c333"}
Apr 17 11:16:32.440200 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:32.440165 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-xcwz5" podStartSLOduration=3.9341820739999998 podStartE2EDuration="21.440154571s" podCreationTimestamp="2026-04-17 11:16:11 +0000 UTC" firstStartedPulling="2026-04-17 11:16:14.029264217 +0000 UTC m=+3.199071264" lastFinishedPulling="2026-04-17 11:16:31.535236714 +0000 UTC m=+20.705043761" observedRunningTime="2026-04-17 11:16:32.439607703 +0000 UTC m=+21.609414768" watchObservedRunningTime="2026-04-17 11:16:32.440154571 +0000 UTC m=+21.609961635"
Apr 17 11:16:32.470580 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:32.470528 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-4r4xj" podStartSLOduration=2.361367421 podStartE2EDuration="18.470513391s" podCreationTimestamp="2026-04-17 11:16:14 +0000 UTC" firstStartedPulling="2026-04-17 11:16:15.230761235 +0000 UTC m=+4.400568289" lastFinishedPulling="2026-04-17 11:16:31.339907204 +0000 UTC m=+20.509714259" observedRunningTime="2026-04-17 11:16:32.470287478 +0000 UTC m=+21.640094788" watchObservedRunningTime="2026-04-17 11:16:32.470513391 +0000 UTC m=+21.640320456"
Apr 17 11:16:32.484652 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:32.484615 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-t7kcp" podStartSLOduration=12.488110421 podStartE2EDuration="21.484601414s" podCreationTimestamp="2026-04-17 11:16:11 +0000 UTC" firstStartedPulling="2026-04-17 11:16:14.028797718 +0000 UTC m=+3.198604766" lastFinishedPulling="2026-04-17 11:16:23.025288713 +0000 UTC m=+12.195095759" observedRunningTime="2026-04-17 11:16:32.484331392 +0000 UTC m=+21.654138456" watchObservedRunningTime="2026-04-17 11:16:32.484601414 +0000 UTC m=+21.654408479"
Apr 17 11:16:32.498398 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:32.498357 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-f28wr" podStartSLOduration=4.188613083 podStartE2EDuration="21.498324368s" podCreationTimestamp="2026-04-17 11:16:11 +0000 UTC" firstStartedPulling="2026-04-17 11:16:14.032015861 +0000 UTC m=+3.201822910" lastFinishedPulling="2026-04-17 11:16:31.341727149 +0000 UTC m=+20.511534195" observedRunningTime="2026-04-17 11:16:32.497870041 +0000 UTC m=+21.667677107" watchObservedRunningTime="2026-04-17 11:16:32.498324368 +0000 UTC m=+21.668131433"
Apr 17 11:16:32.522926 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:32.522901 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 17 11:16:32.668935 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:32.668855 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8adcdab8-9195-4dbb-838d-3ac5065f81ed-original-pull-secret\") pod \"global-pull-secret-syncer-hddsl\" (UID: \"8adcdab8-9195-4dbb-838d-3ac5065f81ed\") " pod="kube-system/global-pull-secret-syncer-hddsl"
Apr 17 11:16:32.669063 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:32.668965 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 11:16:32.669063 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:32.669014 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8adcdab8-9195-4dbb-838d-3ac5065f81ed-original-pull-secret podName:8adcdab8-9195-4dbb-838d-3ac5065f81ed nodeName:}" failed. No retries permitted until 2026-04-17 11:16:33.669002699 +0000 UTC m=+22.838809742 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8adcdab8-9195-4dbb-838d-3ac5065f81ed-original-pull-secret") pod "global-pull-secret-syncer-hddsl" (UID: "8adcdab8-9195-4dbb-838d-3ac5065f81ed") : object "kube-system"/"original-pull-secret" not registered
Apr 17 11:16:33.278407 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:33.278269 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T11:16:32.522920118Z","UUID":"39cb11b4-415d-4cfa-8488-ef9a94966a58","Handler":null,"Name":"","Endpoint":""}
Apr 17 11:16:33.280037 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:33.280016 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 17 11:16:33.280037 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:33.280046 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 17 11:16:33.338086 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:33.338055 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvn9d"
Apr 17 11:16:33.338239 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:33.338055 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hddsl"
Apr 17 11:16:33.338239 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:33.338188 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvn9d" podUID="d129dd20-5a5b-4718-8eca-2f10184defe9"
Apr 17 11:16:33.338369 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:33.338276 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hddsl" podUID="8adcdab8-9195-4dbb-838d-3ac5065f81ed"
Apr 17 11:16:33.338369 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:33.338056 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cvt8g"
Apr 17 11:16:33.338482 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:33.338386 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cvt8g" podUID="055f933b-358b-4058-aa0d-4808293e4549"
Apr 17 11:16:33.439674 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:33.439635 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-s9dkv" event={"ID":"541df6c8-cc79-40aa-9b07-2084d74abdbd","Type":"ContainerStarted","Data":"b1e49438515232f326bd86454d4a3dc5848bb5f15db094d2e57a2e5cdd4a3fd2"}
Apr 17 11:16:33.441775 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:33.441743 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8htm9" event={"ID":"6c231c4f-2413-4ea2-8e5e-1935448131ad","Type":"ContainerStarted","Data":"e71c21becc145561bc6baa4945fde6efaa4013eafdbe1368708490709764beb1"}
Apr 17 11:16:33.441901 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:33.441781 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8htm9" event={"ID":"6c231c4f-2413-4ea2-8e5e-1935448131ad","Type":"ContainerStarted","Data":"36d79096132068a486798dc1e353f8c2889ba486d2d24e6655272f49f6d415a6"}
Apr 17 11:16:33.466477 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:33.466427 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-pms2x" podStartSLOduration=5.156774118 podStartE2EDuration="22.466410493s" podCreationTimestamp="2026-04-17 11:16:11 +0000 UTC" firstStartedPulling="2026-04-17 11:16:14.030624484 +0000 UTC m=+3.200431528" lastFinishedPulling="2026-04-17 11:16:31.340260847 +0000 UTC m=+20.510067903" observedRunningTime="2026-04-17 11:16:32.514332119 +0000 UTC m=+21.684139184" watchObservedRunningTime="2026-04-17 11:16:33.466410493 +0000 UTC m=+22.636217558"
Apr 17 11:16:33.466976 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:33.466939 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-s9dkv" podStartSLOduration=5.150987254 podStartE2EDuration="22.466929628s" podCreationTimestamp="2026-04-17 11:16:11 +0000 UTC" firstStartedPulling="2026-04-17 11:16:14.024286517 +0000 UTC m=+3.194093575" lastFinishedPulling="2026-04-17 11:16:31.340228899 +0000 UTC m=+20.510035949" observedRunningTime="2026-04-17 11:16:33.46615774 +0000 UTC m=+22.635964809" watchObservedRunningTime="2026-04-17 11:16:33.466929628 +0000 UTC m=+22.636736692"
Apr 17 11:16:33.482200 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:33.482157 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8htm9" podStartSLOduration=3.214319576 podStartE2EDuration="22.482143568s" podCreationTimestamp="2026-04-17 11:16:11 +0000 UTC" firstStartedPulling="2026-04-17 11:16:14.026594304 +0000 UTC m=+3.196401350" lastFinishedPulling="2026-04-17 11:16:33.294418282 +0000 UTC m=+22.464225342" observedRunningTime="2026-04-17 11:16:33.482056752 +0000 UTC m=+22.651863818" watchObservedRunningTime="2026-04-17 11:16:33.482143568 +0000 UTC m=+22.651950632"
Apr 17 11:16:33.677070 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:33.677020 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8adcdab8-9195-4dbb-838d-3ac5065f81ed-original-pull-secret\") pod \"global-pull-secret-syncer-hddsl\" (UID: \"8adcdab8-9195-4dbb-838d-3ac5065f81ed\") " pod="kube-system/global-pull-secret-syncer-hddsl"
Apr 17 11:16:33.677236 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:33.677189 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 11:16:33.677309 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:33.677268 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8adcdab8-9195-4dbb-838d-3ac5065f81ed-original-pull-secret podName:8adcdab8-9195-4dbb-838d-3ac5065f81ed nodeName:}" failed. No retries permitted until 2026-04-17 11:16:35.677247409 +0000 UTC m=+24.847054467 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8adcdab8-9195-4dbb-838d-3ac5065f81ed-original-pull-secret") pod "global-pull-secret-syncer-hddsl" (UID: "8adcdab8-9195-4dbb-838d-3ac5065f81ed") : object "kube-system"/"original-pull-secret" not registered
Apr 17 11:16:34.448330 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:34.448297 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvdth_652320e1-a7a1-4b18-a16c-59420fde1a03/ovn-acl-logging/0.log"
Apr 17 11:16:34.448792 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:34.448758 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" event={"ID":"652320e1-a7a1-4b18-a16c-59420fde1a03","Type":"ContainerStarted","Data":"e86d299c829641f288ef9e7acabd1721ac5cbf48cc30bbb20abb26cf65756e88"}
Apr 17 11:16:35.338625 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:35.338435 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cvt8g"
Apr 17 11:16:35.338792 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:35.338482 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvn9d"
Apr 17 11:16:35.338792 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:35.338732 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cvt8g" podUID="055f933b-358b-4058-aa0d-4808293e4549"
Apr 17 11:16:35.338792 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:35.338499 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hddsl"
Apr 17 11:16:35.338994 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:35.338815 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvn9d" podUID="d129dd20-5a5b-4718-8eca-2f10184defe9"
Apr 17 11:16:35.338994 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:35.338875 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hddsl" podUID="8adcdab8-9195-4dbb-838d-3ac5065f81ed"
Apr 17 11:16:35.691031 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:35.690946 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8adcdab8-9195-4dbb-838d-3ac5065f81ed-original-pull-secret\") pod \"global-pull-secret-syncer-hddsl\" (UID: \"8adcdab8-9195-4dbb-838d-3ac5065f81ed\") " pod="kube-system/global-pull-secret-syncer-hddsl"
Apr 17 11:16:35.691520 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:35.691115 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 11:16:35.691520 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:35.691180 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8adcdab8-9195-4dbb-838d-3ac5065f81ed-original-pull-secret podName:8adcdab8-9195-4dbb-838d-3ac5065f81ed nodeName:}" failed. No retries permitted until 2026-04-17 11:16:39.69116639 +0000 UTC m=+28.860973438 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8adcdab8-9195-4dbb-838d-3ac5065f81ed-original-pull-secret") pod "global-pull-secret-syncer-hddsl" (UID: "8adcdab8-9195-4dbb-838d-3ac5065f81ed") : object "kube-system"/"original-pull-secret" not registered
Apr 17 11:16:36.178109 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:36.178060 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-t7kcp"
Apr 17 11:16:36.178802 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:36.178771 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-t7kcp"
Apr 17 11:16:36.457311 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:36.457262 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvdth_652320e1-a7a1-4b18-a16c-59420fde1a03/ovn-acl-logging/0.log"
Apr 17 11:16:36.457749 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:36.457719 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" event={"ID":"652320e1-a7a1-4b18-a16c-59420fde1a03","Type":"ContainerStarted","Data":"7bc1aff84a766670fb49fdbe8cc94eeb65a18aba25a6d10fa0c91b6fa756cba7"}
Apr 17 11:16:36.458055 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:36.458008 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bvdth"
Apr 17 11:16:36.458055 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:36.458040 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bvdth"
Apr 17 11:16:36.458213 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:36.458188 2575 scope.go:117] "RemoveContainer" containerID="317e70d72b132d77558af669d1fe7369ed4eda478da1d6a33f70e3ebe4d72baa"
Apr 17 11:16:36.477556 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:36.477527 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bvdth"
Apr 17 11:16:36.477665 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:36.477627 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bvdth"
Apr 17 11:16:37.338856 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:37.338675 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hddsl"
Apr 17 11:16:37.339500 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:37.338744 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvn9d"
Apr 17 11:16:37.339500 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:37.338939 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hddsl" podUID="8adcdab8-9195-4dbb-838d-3ac5065f81ed"
Apr 17 11:16:37.339500 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:37.338764 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cvt8g"
Apr 17 11:16:37.339500 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:37.339043 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvn9d" podUID="d129dd20-5a5b-4718-8eca-2f10184defe9"
Apr 17 11:16:37.339500 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:37.339096 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cvt8g" podUID="055f933b-358b-4058-aa0d-4808293e4549"
Apr 17 11:16:37.462932 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:37.462907 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvdth_652320e1-a7a1-4b18-a16c-59420fde1a03/ovn-acl-logging/0.log"
Apr 17 11:16:37.463231 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:37.463207 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" event={"ID":"652320e1-a7a1-4b18-a16c-59420fde1a03","Type":"ContainerStarted","Data":"714086b8ff929acbd9b31395067d8bce3650eacfc9d8f832d60cbe1c0e6d68c6"}
Apr 17 11:16:37.463311 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:37.463294 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 17 11:16:37.464701 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:37.464673 2575 generic.go:358] "Generic (PLEG): container finished" podID="b1d77920-3c64-40cf-82ce-24b1244a48e0" containerID="c9e95b2e6aae3096ed4ddc65b75a63a74de7a1fdf626c6ecaa48a9b95a88e82e" exitCode=0
Apr 17 11:16:37.464812 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:37.464728 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4w8lp" event={"ID":"b1d77920-3c64-40cf-82ce-24b1244a48e0","Type":"ContainerDied","Data":"c9e95b2e6aae3096ed4ddc65b75a63a74de7a1fdf626c6ecaa48a9b95a88e82e"}
Apr 17 11:16:37.499090 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:37.497886 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bvdth" podStartSLOduration=9.14105104 podStartE2EDuration="26.49786766s" podCreationTimestamp="2026-04-17 11:16:11 +0000 UTC" firstStartedPulling="2026-04-17 11:16:14.022940236 +0000 UTC m=+3.192747282" lastFinishedPulling="2026-04-17 11:16:31.379756854 +0000 UTC m=+20.549563902" observedRunningTime="2026-04-17 11:16:37.496939453 +0000 UTC m=+26.666746519" watchObservedRunningTime="2026-04-17 11:16:37.49786766 +0000 UTC m=+26.667674726"
Apr 17 11:16:38.303374 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:38.303326 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hddsl"]
Apr 17 11:16:38.303513 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:38.303466 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hddsl"
Apr 17 11:16:38.303570 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:38.303551 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hddsl" podUID="8adcdab8-9195-4dbb-838d-3ac5065f81ed"
Apr 17 11:16:38.308424 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:38.308399 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tvn9d"]
Apr 17 11:16:38.308536 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:38.308506 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvn9d"
Apr 17 11:16:38.308604 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:38.308588 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvn9d" podUID="d129dd20-5a5b-4718-8eca-2f10184defe9"
Apr 17 11:16:38.317277 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:38.317243 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-cvt8g"]
Apr 17 11:16:38.317424 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:38.317378 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cvt8g"
Apr 17 11:16:38.317499 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:38.317474 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cvt8g" podUID="055f933b-358b-4058-aa0d-4808293e4549"
Apr 17 11:16:38.468787 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:38.468697 2575 generic.go:358] "Generic (PLEG): container finished" podID="b1d77920-3c64-40cf-82ce-24b1244a48e0" containerID="5cf74e2a06081034aab9ec33948df34bc2d7271762f62aca74f7a694bb8f2385" exitCode=0
Apr 17 11:16:38.468787 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:38.468770 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4w8lp" event={"ID":"b1d77920-3c64-40cf-82ce-24b1244a48e0","Type":"ContainerDied","Data":"5cf74e2a06081034aab9ec33948df34bc2d7271762f62aca74f7a694bb8f2385"}
Apr 17 11:16:38.469188 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:38.468951 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 17 11:16:38.791094 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:38.791006 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bvdth"
Apr 17 11:16:39.338749 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:39.338665 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvn9d"
Apr 17 11:16:39.338906 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:39.338788 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvn9d" podUID="d129dd20-5a5b-4718-8eca-2f10184defe9"
Apr 17 11:16:39.338906 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:39.338848 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hddsl" Apr 17 11:16:39.339015 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:39.338951 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hddsl" podUID="8adcdab8-9195-4dbb-838d-3ac5065f81ed" Apr 17 11:16:39.472313 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:39.472280 2575 generic.go:358] "Generic (PLEG): container finished" podID="b1d77920-3c64-40cf-82ce-24b1244a48e0" containerID="3ff2944fcfcc511ff315662ecfacafb7bb8392877d6df5ac601e76a38cb3df78" exitCode=0 Apr 17 11:16:39.472789 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:39.472374 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4w8lp" event={"ID":"b1d77920-3c64-40cf-82ce-24b1244a48e0","Type":"ContainerDied","Data":"3ff2944fcfcc511ff315662ecfacafb7bb8392877d6df5ac601e76a38cb3df78"} Apr 17 11:16:39.720844 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:39.720753 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8adcdab8-9195-4dbb-838d-3ac5065f81ed-original-pull-secret\") pod \"global-pull-secret-syncer-hddsl\" (UID: \"8adcdab8-9195-4dbb-838d-3ac5065f81ed\") " pod="kube-system/global-pull-secret-syncer-hddsl" Apr 17 11:16:39.720997 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:39.720898 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:39.720997 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:39.720965 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/8adcdab8-9195-4dbb-838d-3ac5065f81ed-original-pull-secret podName:8adcdab8-9195-4dbb-838d-3ac5065f81ed nodeName:}" failed. No retries permitted until 2026-04-17 11:16:47.72094939 +0000 UTC m=+36.890756440 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8adcdab8-9195-4dbb-838d-3ac5065f81ed-original-pull-secret") pod "global-pull-secret-syncer-hddsl" (UID: "8adcdab8-9195-4dbb-838d-3ac5065f81ed") : object "kube-system"/"original-pull-secret" not registered Apr 17 11:16:39.984260 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:39.984175 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-t7kcp" Apr 17 11:16:39.984454 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:39.984355 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 11:16:39.984953 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:39.984922 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-t7kcp" Apr 17 11:16:40.338551 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:40.338522 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cvt8g" Apr 17 11:16:40.338725 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:40.338626 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-cvt8g" podUID="055f933b-358b-4058-aa0d-4808293e4549" Apr 17 11:16:41.339634 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:41.339595 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hddsl" Apr 17 11:16:41.340079 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:41.339731 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hddsl" podUID="8adcdab8-9195-4dbb-838d-3ac5065f81ed" Apr 17 11:16:41.340079 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:41.339786 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvn9d" Apr 17 11:16:41.340079 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:41.339916 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvn9d" podUID="d129dd20-5a5b-4718-8eca-2f10184defe9" Apr 17 11:16:42.338781 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:42.338580 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cvt8g" Apr 17 11:16:42.338964 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:42.338874 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-cvt8g" podUID="055f933b-358b-4058-aa0d-4808293e4549" Apr 17 11:16:43.338140 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:43.338100 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvn9d" Apr 17 11:16:43.338140 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:43.338130 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hddsl" Apr 17 11:16:43.338688 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:43.338244 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tvn9d" podUID="d129dd20-5a5b-4718-8eca-2f10184defe9" Apr 17 11:16:43.338688 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:43.338392 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-hddsl" podUID="8adcdab8-9195-4dbb-838d-3ac5065f81ed" Apr 17 11:16:44.105564 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.105537 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-81.ec2.internal" event="NodeReady" Apr 17 11:16:44.105769 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.105690 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 11:16:44.144715 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.144678 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-78b7768865-zkfjk"] Apr 17 11:16:44.172088 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.172054 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-ngx5g"] Apr 17 11:16:44.172260 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.172222 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-78b7768865-zkfjk" Apr 17 11:16:44.175442 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.175243 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 11:16:44.175442 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.175312 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 11:16:44.177540 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.177365 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 11:16:44.177540 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.177393 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-qf8qh\"" Apr 17 11:16:44.181965 ip-10-0-135-81 kubenswrapper[2575]: I0417 
11:16:44.181770 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 11:16:44.184912 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.184890 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-78b7768865-zkfjk"] Apr 17 11:16:44.185026 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.184921 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-7dssr"] Apr 17 11:16:44.186854 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.185302 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ngx5g" Apr 17 11:16:44.188441 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.188417 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 11:16:44.189263 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.189058 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 11:16:44.189396 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.189353 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-v5zg9\"" Apr 17 11:16:44.203172 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.203144 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ngx5g"] Apr 17 11:16:44.203172 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.203177 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7dssr"] Apr 17 11:16:44.203382 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.203301 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7dssr" Apr 17 11:16:44.205664 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.205642 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 11:16:44.206747 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.206723 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-5xzqf\"" Apr 17 11:16:44.208230 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.207826 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 11:16:44.208230 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.207960 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 11:16:44.256274 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.256243 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-ca-trust-extracted\") pod \"image-registry-78b7768865-zkfjk\" (UID: \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\") " pod="openshift-image-registry/image-registry-78b7768865-zkfjk" Apr 17 11:16:44.256465 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.256284 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-image-registry-private-configuration\") pod \"image-registry-78b7768865-zkfjk\" (UID: \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\") " pod="openshift-image-registry/image-registry-78b7768865-zkfjk" Apr 17 11:16:44.256465 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.256309 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-trusted-ca\") pod \"image-registry-78b7768865-zkfjk\" (UID: \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\") " pod="openshift-image-registry/image-registry-78b7768865-zkfjk" Apr 17 11:16:44.256465 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.256370 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-metrics-tls\") pod \"dns-default-ngx5g\" (UID: \"d4bb7b6c-7fd2-4a72-be8b-724128cbea39\") " pod="openshift-dns/dns-default-ngx5g" Apr 17 11:16:44.256465 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.256392 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-tmp-dir\") pod \"dns-default-ngx5g\" (UID: \"d4bb7b6c-7fd2-4a72-be8b-724128cbea39\") " pod="openshift-dns/dns-default-ngx5g" Apr 17 11:16:44.256465 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.256422 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-config-volume\") pod \"dns-default-ngx5g\" (UID: \"d4bb7b6c-7fd2-4a72-be8b-724128cbea39\") " pod="openshift-dns/dns-default-ngx5g" Apr 17 11:16:44.256744 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.256504 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztbn5\" (UniqueName: \"kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-kube-api-access-ztbn5\") pod \"image-registry-78b7768865-zkfjk\" (UID: \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\") " pod="openshift-image-registry/image-registry-78b7768865-zkfjk" Apr 17 
11:16:44.256744 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.256600 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzk2f\" (UniqueName: \"kubernetes.io/projected/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-kube-api-access-xzk2f\") pod \"dns-default-ngx5g\" (UID: \"d4bb7b6c-7fd2-4a72-be8b-724128cbea39\") " pod="openshift-dns/dns-default-ngx5g" Apr 17 11:16:44.256744 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.256637 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-registry-tls\") pod \"image-registry-78b7768865-zkfjk\" (UID: \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\") " pod="openshift-image-registry/image-registry-78b7768865-zkfjk" Apr 17 11:16:44.256744 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.256676 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-bound-sa-token\") pod \"image-registry-78b7768865-zkfjk\" (UID: \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\") " pod="openshift-image-registry/image-registry-78b7768865-zkfjk" Apr 17 11:16:44.256744 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.256716 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-installation-pull-secrets\") pod \"image-registry-78b7768865-zkfjk\" (UID: \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\") " pod="openshift-image-registry/image-registry-78b7768865-zkfjk" Apr 17 11:16:44.256976 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.256768 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-registry-certificates\") pod \"image-registry-78b7768865-zkfjk\" (UID: \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\") " pod="openshift-image-registry/image-registry-78b7768865-zkfjk" Apr 17 11:16:44.337708 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.337675 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cvt8g" Apr 17 11:16:44.342419 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.342396 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-gg44j\"" Apr 17 11:16:44.342959 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.342425 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 11:16:44.342959 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.342494 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 11:16:44.357785 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.357751 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-installation-pull-secrets\") pod \"image-registry-78b7768865-zkfjk\" (UID: \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\") " pod="openshift-image-registry/image-registry-78b7768865-zkfjk" Apr 17 11:16:44.357906 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.357814 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-registry-certificates\") pod \"image-registry-78b7768865-zkfjk\" (UID: \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\") " 
pod="openshift-image-registry/image-registry-78b7768865-zkfjk" Apr 17 11:16:44.357906 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.357861 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-ca-trust-extracted\") pod \"image-registry-78b7768865-zkfjk\" (UID: \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\") " pod="openshift-image-registry/image-registry-78b7768865-zkfjk" Apr 17 11:16:44.357906 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.357894 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d82faa5-4dae-416a-8e2c-337d3966cdd0-cert\") pod \"ingress-canary-7dssr\" (UID: \"3d82faa5-4dae-416a-8e2c-337d3966cdd0\") " pod="openshift-ingress-canary/ingress-canary-7dssr" Apr 17 11:16:44.358085 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.357927 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-image-registry-private-configuration\") pod \"image-registry-78b7768865-zkfjk\" (UID: \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\") " pod="openshift-image-registry/image-registry-78b7768865-zkfjk" Apr 17 11:16:44.358085 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.357955 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-trusted-ca\") pod \"image-registry-78b7768865-zkfjk\" (UID: \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\") " pod="openshift-image-registry/image-registry-78b7768865-zkfjk" Apr 17 11:16:44.358085 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.357990 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-metrics-tls\") pod \"dns-default-ngx5g\" (UID: \"d4bb7b6c-7fd2-4a72-be8b-724128cbea39\") " pod="openshift-dns/dns-default-ngx5g" Apr 17 11:16:44.358085 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.358017 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-tmp-dir\") pod \"dns-default-ngx5g\" (UID: \"d4bb7b6c-7fd2-4a72-be8b-724128cbea39\") " pod="openshift-dns/dns-default-ngx5g" Apr 17 11:16:44.358085 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.358043 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psmjb\" (UniqueName: \"kubernetes.io/projected/3d82faa5-4dae-416a-8e2c-337d3966cdd0-kube-api-access-psmjb\") pod \"ingress-canary-7dssr\" (UID: \"3d82faa5-4dae-416a-8e2c-337d3966cdd0\") " pod="openshift-ingress-canary/ingress-canary-7dssr" Apr 17 11:16:44.358289 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.358087 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-config-volume\") pod \"dns-default-ngx5g\" (UID: \"d4bb7b6c-7fd2-4a72-be8b-724128cbea39\") " pod="openshift-dns/dns-default-ngx5g" Apr 17 11:16:44.358289 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.358129 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztbn5\" (UniqueName: \"kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-kube-api-access-ztbn5\") pod \"image-registry-78b7768865-zkfjk\" (UID: \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\") " pod="openshift-image-registry/image-registry-78b7768865-zkfjk" Apr 17 11:16:44.358289 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.358165 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xzk2f\" (UniqueName: \"kubernetes.io/projected/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-kube-api-access-xzk2f\") pod \"dns-default-ngx5g\" (UID: \"d4bb7b6c-7fd2-4a72-be8b-724128cbea39\") " pod="openshift-dns/dns-default-ngx5g" Apr 17 11:16:44.358289 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.358190 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-registry-tls\") pod \"image-registry-78b7768865-zkfjk\" (UID: \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\") " pod="openshift-image-registry/image-registry-78b7768865-zkfjk" Apr 17 11:16:44.358289 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.358218 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-bound-sa-token\") pod \"image-registry-78b7768865-zkfjk\" (UID: \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\") " pod="openshift-image-registry/image-registry-78b7768865-zkfjk" Apr 17 11:16:44.358556 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.358388 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-ca-trust-extracted\") pod \"image-registry-78b7768865-zkfjk\" (UID: \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\") " pod="openshift-image-registry/image-registry-78b7768865-zkfjk" Apr 17 11:16:44.358556 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.358515 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-registry-certificates\") pod \"image-registry-78b7768865-zkfjk\" (UID: \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\") " pod="openshift-image-registry/image-registry-78b7768865-zkfjk" Apr 17 11:16:44.358720 ip-10-0-135-81 
kubenswrapper[2575]: E0417 11:16:44.358625 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:16:44.358720 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:44.358643 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-78b7768865-zkfjk: secret "image-registry-tls" not found Apr 17 11:16:44.358720 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:44.358712 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-registry-tls podName:45ae37dd-21dd-4ab9-bf36-4f2523ed8410 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:44.858693469 +0000 UTC m=+34.028500513 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-registry-tls") pod "image-registry-78b7768865-zkfjk" (UID: "45ae37dd-21dd-4ab9-bf36-4f2523ed8410") : secret "image-registry-tls" not found Apr 17 11:16:44.358885 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:44.358826 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:44.358885 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:44.358880 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-metrics-tls podName:d4bb7b6c-7fd2-4a72-be8b-724128cbea39 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:44.858866611 +0000 UTC m=+34.028673658 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-metrics-tls") pod "dns-default-ngx5g" (UID: "d4bb7b6c-7fd2-4a72-be8b-724128cbea39") : secret "dns-default-metrics-tls" not found Apr 17 11:16:44.358992 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.358917 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-tmp-dir\") pod \"dns-default-ngx5g\" (UID: \"d4bb7b6c-7fd2-4a72-be8b-724128cbea39\") " pod="openshift-dns/dns-default-ngx5g" Apr 17 11:16:44.358992 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.358943 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-trusted-ca\") pod \"image-registry-78b7768865-zkfjk\" (UID: \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\") " pod="openshift-image-registry/image-registry-78b7768865-zkfjk" Apr 17 11:16:44.359098 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.358995 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-config-volume\") pod \"dns-default-ngx5g\" (UID: \"d4bb7b6c-7fd2-4a72-be8b-724128cbea39\") " pod="openshift-dns/dns-default-ngx5g" Apr 17 11:16:44.362660 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.362639 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-image-registry-private-configuration\") pod \"image-registry-78b7768865-zkfjk\" (UID: \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\") " pod="openshift-image-registry/image-registry-78b7768865-zkfjk" Apr 17 11:16:44.362660 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.362652 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-installation-pull-secrets\") pod \"image-registry-78b7768865-zkfjk\" (UID: \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\") " pod="openshift-image-registry/image-registry-78b7768865-zkfjk" Apr 17 11:16:44.371748 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.371727 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-bound-sa-token\") pod \"image-registry-78b7768865-zkfjk\" (UID: \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\") " pod="openshift-image-registry/image-registry-78b7768865-zkfjk" Apr 17 11:16:44.386209 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.386174 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztbn5\" (UniqueName: \"kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-kube-api-access-ztbn5\") pod \"image-registry-78b7768865-zkfjk\" (UID: \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\") " pod="openshift-image-registry/image-registry-78b7768865-zkfjk" Apr 17 11:16:44.388481 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.388456 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzk2f\" (UniqueName: \"kubernetes.io/projected/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-kube-api-access-xzk2f\") pod \"dns-default-ngx5g\" (UID: \"d4bb7b6c-7fd2-4a72-be8b-724128cbea39\") " pod="openshift-dns/dns-default-ngx5g" Apr 17 11:16:44.458621 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.458579 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-psmjb\" (UniqueName: \"kubernetes.io/projected/3d82faa5-4dae-416a-8e2c-337d3966cdd0-kube-api-access-psmjb\") pod \"ingress-canary-7dssr\" (UID: \"3d82faa5-4dae-416a-8e2c-337d3966cdd0\") " pod="openshift-ingress-canary/ingress-canary-7dssr" Apr 
17 11:16:44.458808 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.458715 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d82faa5-4dae-416a-8e2c-337d3966cdd0-cert\") pod \"ingress-canary-7dssr\" (UID: \"3d82faa5-4dae-416a-8e2c-337d3966cdd0\") " pod="openshift-ingress-canary/ingress-canary-7dssr" Apr 17 11:16:44.458925 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:44.458837 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:44.458925 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:44.458910 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d82faa5-4dae-416a-8e2c-337d3966cdd0-cert podName:3d82faa5-4dae-416a-8e2c-337d3966cdd0 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:44.958894154 +0000 UTC m=+34.128701202 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d82faa5-4dae-416a-8e2c-337d3966cdd0-cert") pod "ingress-canary-7dssr" (UID: "3d82faa5-4dae-416a-8e2c-337d3966cdd0") : secret "canary-serving-cert" not found Apr 17 11:16:44.471793 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.471766 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-psmjb\" (UniqueName: \"kubernetes.io/projected/3d82faa5-4dae-416a-8e2c-337d3966cdd0-kube-api-access-psmjb\") pod \"ingress-canary-7dssr\" (UID: \"3d82faa5-4dae-416a-8e2c-337d3966cdd0\") " pod="openshift-ingress-canary/ingress-canary-7dssr" Apr 17 11:16:44.861484 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.861445 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-registry-tls\") pod \"image-registry-78b7768865-zkfjk\" (UID: \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\") " 
pod="openshift-image-registry/image-registry-78b7768865-zkfjk" Apr 17 11:16:44.861733 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.861573 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-metrics-tls\") pod \"dns-default-ngx5g\" (UID: \"d4bb7b6c-7fd2-4a72-be8b-724128cbea39\") " pod="openshift-dns/dns-default-ngx5g" Apr 17 11:16:44.861733 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:44.861616 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:16:44.861733 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:44.861639 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-78b7768865-zkfjk: secret "image-registry-tls" not found Apr 17 11:16:44.861733 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:44.861683 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:44.861733 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:44.861698 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-registry-tls podName:45ae37dd-21dd-4ab9-bf36-4f2523ed8410 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:45.861680702 +0000 UTC m=+35.031487745 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-registry-tls") pod "image-registry-78b7768865-zkfjk" (UID: "45ae37dd-21dd-4ab9-bf36-4f2523ed8410") : secret "image-registry-tls" not found Apr 17 11:16:44.861919 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:44.861784 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-metrics-tls podName:d4bb7b6c-7fd2-4a72-be8b-724128cbea39 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:45.861768906 +0000 UTC m=+35.031575955 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-metrics-tls") pod "dns-default-ngx5g" (UID: "d4bb7b6c-7fd2-4a72-be8b-724128cbea39") : secret "dns-default-metrics-tls" not found Apr 17 11:16:44.961956 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.961914 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d129dd20-5a5b-4718-8eca-2f10184defe9-metrics-certs\") pod \"network-metrics-daemon-tvn9d\" (UID: \"d129dd20-5a5b-4718-8eca-2f10184defe9\") " pod="openshift-multus/network-metrics-daemon-tvn9d" Apr 17 11:16:44.961956 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:44.961963 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d82faa5-4dae-416a-8e2c-337d3966cdd0-cert\") pod \"ingress-canary-7dssr\" (UID: \"3d82faa5-4dae-416a-8e2c-337d3966cdd0\") " pod="openshift-ingress-canary/ingress-canary-7dssr" Apr 17 11:16:44.962169 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:44.962055 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:44.962169 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:44.962060 
2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:44.962169 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:44.962113 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d82faa5-4dae-416a-8e2c-337d3966cdd0-cert podName:3d82faa5-4dae-416a-8e2c-337d3966cdd0 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:45.962098286 +0000 UTC m=+35.131905330 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d82faa5-4dae-416a-8e2c-337d3966cdd0-cert") pod "ingress-canary-7dssr" (UID: "3d82faa5-4dae-416a-8e2c-337d3966cdd0") : secret "canary-serving-cert" not found Apr 17 11:16:44.962275 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:44.962181 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d129dd20-5a5b-4718-8eca-2f10184defe9-metrics-certs podName:d129dd20-5a5b-4718-8eca-2f10184defe9 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:16.962165042 +0000 UTC m=+66.131972090 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d129dd20-5a5b-4718-8eca-2f10184defe9-metrics-certs") pod "network-metrics-daemon-tvn9d" (UID: "d129dd20-5a5b-4718-8eca-2f10184defe9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 11:16:45.164050 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:45.163974 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gft4k\" (UniqueName: \"kubernetes.io/projected/055f933b-358b-4058-aa0d-4808293e4549-kube-api-access-gft4k\") pod \"network-check-target-cvt8g\" (UID: \"055f933b-358b-4058-aa0d-4808293e4549\") " pod="openshift-network-diagnostics/network-check-target-cvt8g" Apr 17 11:16:45.166369 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:45.166333 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gft4k\" (UniqueName: \"kubernetes.io/projected/055f933b-358b-4058-aa0d-4808293e4549-kube-api-access-gft4k\") pod \"network-check-target-cvt8g\" (UID: \"055f933b-358b-4058-aa0d-4808293e4549\") " pod="openshift-network-diagnostics/network-check-target-cvt8g" Apr 17 11:16:45.248450 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:45.248416 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-cvt8g" Apr 17 11:16:45.337932 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:45.337908 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hddsl" Apr 17 11:16:45.338111 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:45.338094 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvn9d" Apr 17 11:16:45.340583 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:45.340560 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 11:16:45.341045 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:45.341025 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 11:16:45.341150 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:45.341138 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ntppc\"" Apr 17 11:16:45.495646 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:45.495450 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-cvt8g"] Apr 17 11:16:45.498321 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:45.498291 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod055f933b_358b_4058_aa0d_4808293e4549.slice/crio-9b72026042ac2d76c4e0dd21fc19a6380c7788fec5d1deee79352fdee51da616 WatchSource:0}: Error finding container 9b72026042ac2d76c4e0dd21fc19a6380c7788fec5d1deee79352fdee51da616: Status 404 returned error can't find the container with id 9b72026042ac2d76c4e0dd21fc19a6380c7788fec5d1deee79352fdee51da616 Apr 17 11:16:45.869154 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:45.869111 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-registry-tls\") pod \"image-registry-78b7768865-zkfjk\" (UID: \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\") " pod="openshift-image-registry/image-registry-78b7768865-zkfjk" Apr 17 11:16:45.869373 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:45.869230 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-metrics-tls\") pod \"dns-default-ngx5g\" (UID: \"d4bb7b6c-7fd2-4a72-be8b-724128cbea39\") " pod="openshift-dns/dns-default-ngx5g" Apr 17 11:16:45.869373 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:45.869291 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:16:45.869373 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:45.869314 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-78b7768865-zkfjk: secret "image-registry-tls" not found Apr 17 11:16:45.869373 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:45.869358 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:45.869578 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:45.869399 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-registry-tls podName:45ae37dd-21dd-4ab9-bf36-4f2523ed8410 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:47.869376871 +0000 UTC m=+37.039183923 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-registry-tls") pod "image-registry-78b7768865-zkfjk" (UID: "45ae37dd-21dd-4ab9-bf36-4f2523ed8410") : secret "image-registry-tls" not found Apr 17 11:16:45.869578 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:45.869418 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-metrics-tls podName:d4bb7b6c-7fd2-4a72-be8b-724128cbea39 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:47.869408534 +0000 UTC m=+37.039215584 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-metrics-tls") pod "dns-default-ngx5g" (UID: "d4bb7b6c-7fd2-4a72-be8b-724128cbea39") : secret "dns-default-metrics-tls" not found Apr 17 11:16:45.970314 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:45.970277 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d82faa5-4dae-416a-8e2c-337d3966cdd0-cert\") pod \"ingress-canary-7dssr\" (UID: \"3d82faa5-4dae-416a-8e2c-337d3966cdd0\") " pod="openshift-ingress-canary/ingress-canary-7dssr" Apr 17 11:16:45.970500 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:45.970471 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:45.970557 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:45.970538 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d82faa5-4dae-416a-8e2c-337d3966cdd0-cert podName:3d82faa5-4dae-416a-8e2c-337d3966cdd0 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:47.97052238 +0000 UTC m=+37.140329424 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d82faa5-4dae-416a-8e2c-337d3966cdd0-cert") pod "ingress-canary-7dssr" (UID: "3d82faa5-4dae-416a-8e2c-337d3966cdd0") : secret "canary-serving-cert" not found Apr 17 11:16:46.486721 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:46.486684 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-cvt8g" event={"ID":"055f933b-358b-4058-aa0d-4808293e4549","Type":"ContainerStarted","Data":"9b72026042ac2d76c4e0dd21fc19a6380c7788fec5d1deee79352fdee51da616"} Apr 17 11:16:46.489528 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:46.489501 2575 generic.go:358] "Generic (PLEG): container finished" podID="b1d77920-3c64-40cf-82ce-24b1244a48e0" containerID="0fc5f3d371dc81e177bbc3769b6386aed9fd23a32c45cc272393b2df06665787" exitCode=0 Apr 17 11:16:46.489680 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:46.489550 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4w8lp" event={"ID":"b1d77920-3c64-40cf-82ce-24b1244a48e0","Type":"ContainerDied","Data":"0fc5f3d371dc81e177bbc3769b6386aed9fd23a32c45cc272393b2df06665787"} Apr 17 11:16:47.494714 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:47.494673 2575 generic.go:358] "Generic (PLEG): container finished" podID="b1d77920-3c64-40cf-82ce-24b1244a48e0" containerID="8f0d7e80e96b45d0c856bec337280f413d14d764689ba8de7389e16d74e97b20" exitCode=0 Apr 17 11:16:47.495122 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:47.494728 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4w8lp" event={"ID":"b1d77920-3c64-40cf-82ce-24b1244a48e0","Type":"ContainerDied","Data":"8f0d7e80e96b45d0c856bec337280f413d14d764689ba8de7389e16d74e97b20"} Apr 17 11:16:47.786644 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:47.786552 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8adcdab8-9195-4dbb-838d-3ac5065f81ed-original-pull-secret\") pod \"global-pull-secret-syncer-hddsl\" (UID: \"8adcdab8-9195-4dbb-838d-3ac5065f81ed\") " pod="kube-system/global-pull-secret-syncer-hddsl" Apr 17 11:16:47.790479 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:47.790451 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8adcdab8-9195-4dbb-838d-3ac5065f81ed-original-pull-secret\") pod \"global-pull-secret-syncer-hddsl\" (UID: \"8adcdab8-9195-4dbb-838d-3ac5065f81ed\") " pod="kube-system/global-pull-secret-syncer-hddsl" Apr 17 11:16:47.887410 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:47.887373 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-metrics-tls\") pod \"dns-default-ngx5g\" (UID: \"d4bb7b6c-7fd2-4a72-be8b-724128cbea39\") " pod="openshift-dns/dns-default-ngx5g" Apr 17 11:16:47.887581 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:47.887440 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-registry-tls\") pod \"image-registry-78b7768865-zkfjk\" (UID: \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\") " pod="openshift-image-registry/image-registry-78b7768865-zkfjk" Apr 17 11:16:47.887581 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:47.887558 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:16:47.887581 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:47.887580 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-78b7768865-zkfjk: secret "image-registry-tls" not found Apr 17 11:16:47.887734 ip-10-0-135-81 
kubenswrapper[2575]: E0417 11:16:47.887644 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-registry-tls podName:45ae37dd-21dd-4ab9-bf36-4f2523ed8410 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:51.887624547 +0000 UTC m=+41.057431608 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-registry-tls") pod "image-registry-78b7768865-zkfjk" (UID: "45ae37dd-21dd-4ab9-bf36-4f2523ed8410") : secret "image-registry-tls" not found Apr 17 11:16:47.887734 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:47.887557 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:47.887734 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:47.887716 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-metrics-tls podName:d4bb7b6c-7fd2-4a72-be8b-724128cbea39 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:51.887696835 +0000 UTC m=+41.057503881 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-metrics-tls") pod "dns-default-ngx5g" (UID: "d4bb7b6c-7fd2-4a72-be8b-724128cbea39") : secret "dns-default-metrics-tls" not found Apr 17 11:16:47.988264 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:47.988227 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d82faa5-4dae-416a-8e2c-337d3966cdd0-cert\") pod \"ingress-canary-7dssr\" (UID: \"3d82faa5-4dae-416a-8e2c-337d3966cdd0\") " pod="openshift-ingress-canary/ingress-canary-7dssr" Apr 17 11:16:47.988476 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:47.988417 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:47.988545 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:47.988507 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d82faa5-4dae-416a-8e2c-337d3966cdd0-cert podName:3d82faa5-4dae-416a-8e2c-337d3966cdd0 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:51.988488003 +0000 UTC m=+41.158295068 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d82faa5-4dae-416a-8e2c-337d3966cdd0-cert") pod "ingress-canary-7dssr" (UID: "3d82faa5-4dae-416a-8e2c-337d3966cdd0") : secret "canary-serving-cert" not found Apr 17 11:16:48.048452 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:48.048369 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hddsl" Apr 17 11:16:48.382273 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:48.382117 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hddsl"] Apr 17 11:16:48.386489 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:16:48.386450 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8adcdab8_9195_4dbb_838d_3ac5065f81ed.slice/crio-366278638b46fa0d02b6b8040deb00b9a73eb60ace75d6c0d172009e11c9c091 WatchSource:0}: Error finding container 366278638b46fa0d02b6b8040deb00b9a73eb60ace75d6c0d172009e11c9c091: Status 404 returned error can't find the container with id 366278638b46fa0d02b6b8040deb00b9a73eb60ace75d6c0d172009e11c9c091 Apr 17 11:16:48.500141 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:48.500110 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4w8lp" event={"ID":"b1d77920-3c64-40cf-82ce-24b1244a48e0","Type":"ContainerStarted","Data":"8e9d29dadbd59c0bb4b28157c1e6ce761dae55cb6f754654cb9fd94e70c848fe"} Apr 17 11:16:48.501102 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:48.501080 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hddsl" event={"ID":"8adcdab8-9195-4dbb-838d-3ac5065f81ed","Type":"ContainerStarted","Data":"366278638b46fa0d02b6b8040deb00b9a73eb60ace75d6c0d172009e11c9c091"} Apr 17 11:16:48.524737 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:48.524601 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-4w8lp" podStartSLOduration=6.227135748 podStartE2EDuration="37.524586532s" podCreationTimestamp="2026-04-17 11:16:11 +0000 UTC" firstStartedPulling="2026-04-17 11:16:14.024975737 +0000 UTC m=+3.194782780" lastFinishedPulling="2026-04-17 11:16:45.322426521 +0000 UTC m=+34.492233564" 
observedRunningTime="2026-04-17 11:16:48.52331888 +0000 UTC m=+37.693125945" watchObservedRunningTime="2026-04-17 11:16:48.524586532 +0000 UTC m=+37.694393796" Apr 17 11:16:49.504197 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:49.504158 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-cvt8g" event={"ID":"055f933b-358b-4058-aa0d-4808293e4549","Type":"ContainerStarted","Data":"abfe9aeea70f7a73357096b120ebc351fde83f9211b539e1790d02b04ebe9b60"} Apr 17 11:16:49.504966 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:49.504636 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-cvt8g" Apr 17 11:16:49.523829 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:49.523776 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-cvt8g" podStartSLOduration=35.45370036 podStartE2EDuration="38.523758556s" podCreationTimestamp="2026-04-17 11:16:11 +0000 UTC" firstStartedPulling="2026-04-17 11:16:45.500194324 +0000 UTC m=+34.670001368" lastFinishedPulling="2026-04-17 11:16:48.570252521 +0000 UTC m=+37.740059564" observedRunningTime="2026-04-17 11:16:49.523468651 +0000 UTC m=+38.693275716" watchObservedRunningTime="2026-04-17 11:16:49.523758556 +0000 UTC m=+38.693565621" Apr 17 11:16:51.920033 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:51.919983 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-registry-tls\") pod \"image-registry-78b7768865-zkfjk\" (UID: \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\") " pod="openshift-image-registry/image-registry-78b7768865-zkfjk" Apr 17 11:16:51.920497 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:51.920103 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-metrics-tls\") pod \"dns-default-ngx5g\" (UID: \"d4bb7b6c-7fd2-4a72-be8b-724128cbea39\") " pod="openshift-dns/dns-default-ngx5g" Apr 17 11:16:51.920497 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:51.920167 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 11:16:51.920497 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:51.920192 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-78b7768865-zkfjk: secret "image-registry-tls" not found Apr 17 11:16:51.920497 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:51.920227 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 11:16:51.920497 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:51.920264 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-registry-tls podName:45ae37dd-21dd-4ab9-bf36-4f2523ed8410 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:59.920248507 +0000 UTC m=+49.090055554 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-registry-tls") pod "image-registry-78b7768865-zkfjk" (UID: "45ae37dd-21dd-4ab9-bf36-4f2523ed8410") : secret "image-registry-tls" not found Apr 17 11:16:51.920497 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:51.920282 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-metrics-tls podName:d4bb7b6c-7fd2-4a72-be8b-724128cbea39 nodeName:}" failed. No retries permitted until 2026-04-17 11:16:59.920273889 +0000 UTC m=+49.090080934 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-metrics-tls") pod "dns-default-ngx5g" (UID: "d4bb7b6c-7fd2-4a72-be8b-724128cbea39") : secret "dns-default-metrics-tls" not found Apr 17 11:16:52.021326 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:52.021288 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d82faa5-4dae-416a-8e2c-337d3966cdd0-cert\") pod \"ingress-canary-7dssr\" (UID: \"3d82faa5-4dae-416a-8e2c-337d3966cdd0\") " pod="openshift-ingress-canary/ingress-canary-7dssr" Apr 17 11:16:52.021510 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:52.021445 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 11:16:52.021510 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:52.021504 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d82faa5-4dae-416a-8e2c-337d3966cdd0-cert podName:3d82faa5-4dae-416a-8e2c-337d3966cdd0 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:00.021487339 +0000 UTC m=+49.191294390 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d82faa5-4dae-416a-8e2c-337d3966cdd0-cert") pod "ingress-canary-7dssr" (UID: "3d82faa5-4dae-416a-8e2c-337d3966cdd0") : secret "canary-serving-cert" not found Apr 17 11:16:53.513108 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:53.513071 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hddsl" event={"ID":"8adcdab8-9195-4dbb-838d-3ac5065f81ed","Type":"ContainerStarted","Data":"b677643e7b15fb34a56d71b29446903946f88e6726563035103010991a23b775"} Apr 17 11:16:53.528602 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:53.528555 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-hddsl" podStartSLOduration=18.523389306 podStartE2EDuration="22.528538656s" podCreationTimestamp="2026-04-17 11:16:31 +0000 UTC" firstStartedPulling="2026-04-17 11:16:48.388854948 +0000 UTC m=+37.558661998" lastFinishedPulling="2026-04-17 11:16:52.394004298 +0000 UTC m=+41.563811348" observedRunningTime="2026-04-17 11:16:53.528300065 +0000 UTC m=+42.698107131" watchObservedRunningTime="2026-04-17 11:16:53.528538656 +0000 UTC m=+42.698345726" Apr 17 11:16:59.986481 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:59.986445 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-metrics-tls\") pod \"dns-default-ngx5g\" (UID: \"d4bb7b6c-7fd2-4a72-be8b-724128cbea39\") " pod="openshift-dns/dns-default-ngx5g" Apr 17 11:16:59.986837 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:16:59.986496 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-registry-tls\") pod \"image-registry-78b7768865-zkfjk\" (UID: \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\") " 
pod="openshift-image-registry/image-registry-78b7768865-zkfjk"
Apr 17 11:16:59.986837 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:59.986593 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 11:16:59.986837 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:59.986603 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-78b7768865-zkfjk: secret "image-registry-tls" not found
Apr 17 11:16:59.986837 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:59.986617 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 11:16:59.986837 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:59.986655 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-registry-tls podName:45ae37dd-21dd-4ab9-bf36-4f2523ed8410 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:15.986641655 +0000 UTC m=+65.156448698 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-registry-tls") pod "image-registry-78b7768865-zkfjk" (UID: "45ae37dd-21dd-4ab9-bf36-4f2523ed8410") : secret "image-registry-tls" not found
Apr 17 11:16:59.986837 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:16:59.986692 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-metrics-tls podName:d4bb7b6c-7fd2-4a72-be8b-724128cbea39 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:15.986671819 +0000 UTC m=+65.156478881 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-metrics-tls") pod "dns-default-ngx5g" (UID: "d4bb7b6c-7fd2-4a72-be8b-724128cbea39") : secret "dns-default-metrics-tls" not found
Apr 17 11:17:00.087000 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:17:00.086961 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d82faa5-4dae-416a-8e2c-337d3966cdd0-cert\") pod \"ingress-canary-7dssr\" (UID: \"3d82faa5-4dae-416a-8e2c-337d3966cdd0\") " pod="openshift-ingress-canary/ingress-canary-7dssr"
Apr 17 11:17:00.087178 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:17:00.087113 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 11:17:00.087227 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:17:00.087182 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d82faa5-4dae-416a-8e2c-337d3966cdd0-cert podName:3d82faa5-4dae-416a-8e2c-337d3966cdd0 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:16.087163618 +0000 UTC m=+65.256970662 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d82faa5-4dae-416a-8e2c-337d3966cdd0-cert") pod "ingress-canary-7dssr" (UID: "3d82faa5-4dae-416a-8e2c-337d3966cdd0") : secret "canary-serving-cert" not found
Apr 17 11:17:09.489933 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:17:09.489904 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bvdth"
Apr 17 11:17:15.998877 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:17:15.998830 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-registry-tls\") pod \"image-registry-78b7768865-zkfjk\" (UID: \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\") " pod="openshift-image-registry/image-registry-78b7768865-zkfjk"
Apr 17 11:17:15.999274 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:17:15.998917 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-metrics-tls\") pod \"dns-default-ngx5g\" (UID: \"d4bb7b6c-7fd2-4a72-be8b-724128cbea39\") " pod="openshift-dns/dns-default-ngx5g"
Apr 17 11:17:15.999274 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:17:15.998987 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 11:17:15.999274 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:17:15.999000 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 11:17:15.999274 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:17:15.999007 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-78b7768865-zkfjk: secret "image-registry-tls" not found
Apr 17 11:17:15.999274 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:17:15.999061 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-metrics-tls podName:d4bb7b6c-7fd2-4a72-be8b-724128cbea39 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:47.999046869 +0000 UTC m=+97.168853913 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-metrics-tls") pod "dns-default-ngx5g" (UID: "d4bb7b6c-7fd2-4a72-be8b-724128cbea39") : secret "dns-default-metrics-tls" not found
Apr 17 11:17:15.999274 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:17:15.999074 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-registry-tls podName:45ae37dd-21dd-4ab9-bf36-4f2523ed8410 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:47.999068109 +0000 UTC m=+97.168875152 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-registry-tls") pod "image-registry-78b7768865-zkfjk" (UID: "45ae37dd-21dd-4ab9-bf36-4f2523ed8410") : secret "image-registry-tls" not found
Apr 17 11:17:16.099813 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:17:16.099766 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d82faa5-4dae-416a-8e2c-337d3966cdd0-cert\") pod \"ingress-canary-7dssr\" (UID: \"3d82faa5-4dae-416a-8e2c-337d3966cdd0\") " pod="openshift-ingress-canary/ingress-canary-7dssr"
Apr 17 11:17:16.099973 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:17:16.099881 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 11:17:16.099973 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:17:16.099954 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d82faa5-4dae-416a-8e2c-337d3966cdd0-cert podName:3d82faa5-4dae-416a-8e2c-337d3966cdd0 nodeName:}" failed. No retries permitted until 2026-04-17 11:17:48.099930865 +0000 UTC m=+97.269737907 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d82faa5-4dae-416a-8e2c-337d3966cdd0-cert") pod "ingress-canary-7dssr" (UID: "3d82faa5-4dae-416a-8e2c-337d3966cdd0") : secret "canary-serving-cert" not found
Apr 17 11:17:17.006185 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:17:17.006143 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d129dd20-5a5b-4718-8eca-2f10184defe9-metrics-certs\") pod \"network-metrics-daemon-tvn9d\" (UID: \"d129dd20-5a5b-4718-8eca-2f10184defe9\") " pod="openshift-multus/network-metrics-daemon-tvn9d"
Apr 17 11:17:17.008573 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:17:17.008554 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 11:17:17.016613 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:17:17.016592 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 11:17:17.016702 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:17:17.016670 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d129dd20-5a5b-4718-8eca-2f10184defe9-metrics-certs podName:d129dd20-5a5b-4718-8eca-2f10184defe9 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:21.016652887 +0000 UTC m=+130.186459931 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d129dd20-5a5b-4718-8eca-2f10184defe9-metrics-certs") pod "network-metrics-daemon-tvn9d" (UID: "d129dd20-5a5b-4718-8eca-2f10184defe9") : secret "metrics-daemon-secret" not found
Apr 17 11:17:21.510352 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:17:21.510317 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-cvt8g"
Apr 17 11:17:48.041351 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:17:48.041279 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-metrics-tls\") pod \"dns-default-ngx5g\" (UID: \"d4bb7b6c-7fd2-4a72-be8b-724128cbea39\") " pod="openshift-dns/dns-default-ngx5g"
Apr 17 11:17:48.041351 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:17:48.041366 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-registry-tls\") pod \"image-registry-78b7768865-zkfjk\" (UID: \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\") " pod="openshift-image-registry/image-registry-78b7768865-zkfjk"
Apr 17 11:17:48.041768 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:17:48.041473 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 11:17:48.041768 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:17:48.041497 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-78b7768865-zkfjk: secret "image-registry-tls" not found
Apr 17 11:17:48.041768 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:17:48.041505 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 11:17:48.041768 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:17:48.041557 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-registry-tls podName:45ae37dd-21dd-4ab9-bf36-4f2523ed8410 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:52.041541191 +0000 UTC m=+161.211348250 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-registry-tls") pod "image-registry-78b7768865-zkfjk" (UID: "45ae37dd-21dd-4ab9-bf36-4f2523ed8410") : secret "image-registry-tls" not found
Apr 17 11:17:48.041768 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:17:48.041571 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-metrics-tls podName:d4bb7b6c-7fd2-4a72-be8b-724128cbea39 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:52.041565267 +0000 UTC m=+161.211372310 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-metrics-tls") pod "dns-default-ngx5g" (UID: "d4bb7b6c-7fd2-4a72-be8b-724128cbea39") : secret "dns-default-metrics-tls" not found
Apr 17 11:17:48.142186 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:17:48.142088 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d82faa5-4dae-416a-8e2c-337d3966cdd0-cert\") pod \"ingress-canary-7dssr\" (UID: \"3d82faa5-4dae-416a-8e2c-337d3966cdd0\") " pod="openshift-ingress-canary/ingress-canary-7dssr"
Apr 17 11:17:48.142316 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:17:48.142225 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 11:17:48.142316 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:17:48.142284 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d82faa5-4dae-416a-8e2c-337d3966cdd0-cert podName:3d82faa5-4dae-416a-8e2c-337d3966cdd0 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:52.14226719 +0000 UTC m=+161.312074233 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d82faa5-4dae-416a-8e2c-337d3966cdd0-cert") pod "ingress-canary-7dssr" (UID: "3d82faa5-4dae-416a-8e2c-337d3966cdd0") : secret "canary-serving-cert" not found
Apr 17 11:18:21.080078 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:21.080021 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d129dd20-5a5b-4718-8eca-2f10184defe9-metrics-certs\") pod \"network-metrics-daemon-tvn9d\" (UID: \"d129dd20-5a5b-4718-8eca-2f10184defe9\") " pod="openshift-multus/network-metrics-daemon-tvn9d"
Apr 17 11:18:21.080599 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:18:21.080195 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 11:18:21.080599 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:18:21.080292 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d129dd20-5a5b-4718-8eca-2f10184defe9-metrics-certs podName:d129dd20-5a5b-4718-8eca-2f10184defe9 nodeName:}" failed. No retries permitted until 2026-04-17 11:20:23.080275523 +0000 UTC m=+252.250082566 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d129dd20-5a5b-4718-8eca-2f10184defe9-metrics-certs") pod "network-metrics-daemon-tvn9d" (UID: "d129dd20-5a5b-4718-8eca-2f10184defe9") : secret "metrics-daemon-secret" not found
Apr 17 11:18:42.943779 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:42.943737 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-6mrs8"]
Apr 17 11:18:42.946583 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:42.946561 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-k76l5"]
Apr 17 11:18:42.946756 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:42.946733 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mrs8"
Apr 17 11:18:42.949321 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:42.949300 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 11:18:42.949468 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:42.949320 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 11:18:42.949468 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:42.949319 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 17 11:18:42.949468 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:42.949373 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-k76l5"
Apr 17 11:18:42.950532 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:42.950508 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 17 11:18:42.950630 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:42.950532 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-q4jkh\""
Apr 17 11:18:42.951367 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:42.951330 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 17 11:18:42.951456 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:42.951375 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 17 11:18:42.951456 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:42.951388 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-9kvfz\""
Apr 17 11:18:42.951734 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:42.951720 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 17 11:18:42.952087 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:42.952064 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 17 11:18:42.956452 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:42.956434 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 17 11:18:42.959890 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:42.959867 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-6mrs8"]
Apr 17 11:18:42.960914 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:42.960890 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-k76l5"]
Apr 17 11:18:43.030910 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.030873 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/357b4ca1-1680-47a2-96cf-40460312708f-serving-cert\") pod \"insights-operator-585dfdc468-k76l5\" (UID: \"357b4ca1-1680-47a2-96cf-40460312708f\") " pod="openshift-insights/insights-operator-585dfdc468-k76l5"
Apr 17 11:18:43.030910 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.030916 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94c7d\" (UniqueName: \"kubernetes.io/projected/ba013853-d529-46b6-84d3-0e259d87af73-kube-api-access-94c7d\") pod \"cluster-monitoring-operator-75587bd455-6mrs8\" (UID: \"ba013853-d529-46b6-84d3-0e259d87af73\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mrs8"
Apr 17 11:18:43.031106 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.030935 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/357b4ca1-1680-47a2-96cf-40460312708f-snapshots\") pod \"insights-operator-585dfdc468-k76l5\" (UID: \"357b4ca1-1680-47a2-96cf-40460312708f\") " pod="openshift-insights/insights-operator-585dfdc468-k76l5"
Apr 17 11:18:43.031106 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.030952 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/357b4ca1-1680-47a2-96cf-40460312708f-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-k76l5\" (UID: \"357b4ca1-1680-47a2-96cf-40460312708f\") " pod="openshift-insights/insights-operator-585dfdc468-k76l5"
Apr 17 11:18:43.031106 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.030982 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba013853-d529-46b6-84d3-0e259d87af73-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6mrs8\" (UID: \"ba013853-d529-46b6-84d3-0e259d87af73\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mrs8"
Apr 17 11:18:43.031106 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.031009 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ba013853-d529-46b6-84d3-0e259d87af73-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-6mrs8\" (UID: \"ba013853-d529-46b6-84d3-0e259d87af73\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mrs8"
Apr 17 11:18:43.031106 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.031033 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvkx9\" (UniqueName: \"kubernetes.io/projected/357b4ca1-1680-47a2-96cf-40460312708f-kube-api-access-tvkx9\") pod \"insights-operator-585dfdc468-k76l5\" (UID: \"357b4ca1-1680-47a2-96cf-40460312708f\") " pod="openshift-insights/insights-operator-585dfdc468-k76l5"
Apr 17 11:18:43.031106 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.031089 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/357b4ca1-1680-47a2-96cf-40460312708f-tmp\") pod \"insights-operator-585dfdc468-k76l5\" (UID: \"357b4ca1-1680-47a2-96cf-40460312708f\") " pod="openshift-insights/insights-operator-585dfdc468-k76l5"
Apr 17 11:18:43.031286 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.031110 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/357b4ca1-1680-47a2-96cf-40460312708f-service-ca-bundle\") pod \"insights-operator-585dfdc468-k76l5\" (UID: \"357b4ca1-1680-47a2-96cf-40460312708f\") " pod="openshift-insights/insights-operator-585dfdc468-k76l5"
Apr 17 11:18:43.051931 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.051893 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5m8z8"]
Apr 17 11:18:43.054835 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.054817 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5m8z8"
Apr 17 11:18:43.057101 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.057056 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-qgtmp\""
Apr 17 11:18:43.057216 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.057136 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 17 11:18:43.057291 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.057271 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 17 11:18:43.067509 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.067485 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5m8z8"]
Apr 17 11:18:43.132306 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.132264 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-94c7d\" (UniqueName: \"kubernetes.io/projected/ba013853-d529-46b6-84d3-0e259d87af73-kube-api-access-94c7d\") pod \"cluster-monitoring-operator-75587bd455-6mrs8\" (UID: \"ba013853-d529-46b6-84d3-0e259d87af73\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mrs8"
Apr 17 11:18:43.132306 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.132306 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/357b4ca1-1680-47a2-96cf-40460312708f-snapshots\") pod \"insights-operator-585dfdc468-k76l5\" (UID: \"357b4ca1-1680-47a2-96cf-40460312708f\") " pod="openshift-insights/insights-operator-585dfdc468-k76l5"
Apr 17 11:18:43.132580 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.132327 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/357b4ca1-1680-47a2-96cf-40460312708f-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-k76l5\" (UID: \"357b4ca1-1680-47a2-96cf-40460312708f\") " pod="openshift-insights/insights-operator-585dfdc468-k76l5"
Apr 17 11:18:43.132580 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.132368 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba013853-d529-46b6-84d3-0e259d87af73-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6mrs8\" (UID: \"ba013853-d529-46b6-84d3-0e259d87af73\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mrs8"
Apr 17 11:18:43.132580 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.132400 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ba013853-d529-46b6-84d3-0e259d87af73-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-6mrs8\" (UID: \"ba013853-d529-46b6-84d3-0e259d87af73\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mrs8"
Apr 17 11:18:43.132580 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.132424 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tvkx9\" (UniqueName: \"kubernetes.io/projected/357b4ca1-1680-47a2-96cf-40460312708f-kube-api-access-tvkx9\") pod \"insights-operator-585dfdc468-k76l5\" (UID: \"357b4ca1-1680-47a2-96cf-40460312708f\") " pod="openshift-insights/insights-operator-585dfdc468-k76l5"
Apr 17 11:18:43.132580 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.132447 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/357b4ca1-1680-47a2-96cf-40460312708f-tmp\") pod \"insights-operator-585dfdc468-k76l5\" (UID: \"357b4ca1-1680-47a2-96cf-40460312708f\") " pod="openshift-insights/insights-operator-585dfdc468-k76l5"
Apr 17 11:18:43.132580 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:18:43.132505 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 11:18:43.132844 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.132587 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/357b4ca1-1680-47a2-96cf-40460312708f-service-ca-bundle\") pod \"insights-operator-585dfdc468-k76l5\" (UID: \"357b4ca1-1680-47a2-96cf-40460312708f\") " pod="openshift-insights/insights-operator-585dfdc468-k76l5"
Apr 17 11:18:43.132844 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:18:43.132604 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba013853-d529-46b6-84d3-0e259d87af73-cluster-monitoring-operator-tls podName:ba013853-d529-46b6-84d3-0e259d87af73 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:43.632582549 +0000 UTC m=+152.802389607 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ba013853-d529-46b6-84d3-0e259d87af73-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-6mrs8" (UID: "ba013853-d529-46b6-84d3-0e259d87af73") : secret "cluster-monitoring-operator-tls" not found
Apr 17 11:18:43.132973 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.132947 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/357b4ca1-1680-47a2-96cf-40460312708f-serving-cert\") pod \"insights-operator-585dfdc468-k76l5\" (UID: \"357b4ca1-1680-47a2-96cf-40460312708f\") " pod="openshift-insights/insights-operator-585dfdc468-k76l5"
Apr 17 11:18:43.132973 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.132958 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/357b4ca1-1680-47a2-96cf-40460312708f-tmp\") pod \"insights-operator-585dfdc468-k76l5\" (UID: \"357b4ca1-1680-47a2-96cf-40460312708f\") " pod="openshift-insights/insights-operator-585dfdc468-k76l5"
Apr 17 11:18:43.133132 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.133003 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wq5gm\" (UniqueName: \"kubernetes.io/projected/a2090436-524c-47c4-ac0f-8b94ceec083d-kube-api-access-wq5gm\") pod \"volume-data-source-validator-7c6cbb6c87-5m8z8\" (UID: \"a2090436-524c-47c4-ac0f-8b94ceec083d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5m8z8"
Apr 17 11:18:43.133132 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.133036 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/357b4ca1-1680-47a2-96cf-40460312708f-snapshots\") pod \"insights-operator-585dfdc468-k76l5\" (UID: \"357b4ca1-1680-47a2-96cf-40460312708f\") " pod="openshift-insights/insights-operator-585dfdc468-k76l5"
Apr 17 11:18:43.133203 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.133176 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ba013853-d529-46b6-84d3-0e259d87af73-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-6mrs8\" (UID: \"ba013853-d529-46b6-84d3-0e259d87af73\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mrs8"
Apr 17 11:18:43.133275 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.133254 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/357b4ca1-1680-47a2-96cf-40460312708f-service-ca-bundle\") pod \"insights-operator-585dfdc468-k76l5\" (UID: \"357b4ca1-1680-47a2-96cf-40460312708f\") " pod="openshift-insights/insights-operator-585dfdc468-k76l5"
Apr 17 11:18:43.133548 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.133524 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/357b4ca1-1680-47a2-96cf-40460312708f-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-k76l5\" (UID: \"357b4ca1-1680-47a2-96cf-40460312708f\") " pod="openshift-insights/insights-operator-585dfdc468-k76l5"
Apr 17 11:18:43.136355 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.136303 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/357b4ca1-1680-47a2-96cf-40460312708f-serving-cert\") pod \"insights-operator-585dfdc468-k76l5\" (UID: \"357b4ca1-1680-47a2-96cf-40460312708f\") " pod="openshift-insights/insights-operator-585dfdc468-k76l5"
Apr 17 11:18:43.146321 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.146290 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-94c7d\" (UniqueName: \"kubernetes.io/projected/ba013853-d529-46b6-84d3-0e259d87af73-kube-api-access-94c7d\") pod \"cluster-monitoring-operator-75587bd455-6mrs8\" (UID: \"ba013853-d529-46b6-84d3-0e259d87af73\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mrs8"
Apr 17 11:18:43.146321 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.146303 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvkx9\" (UniqueName: \"kubernetes.io/projected/357b4ca1-1680-47a2-96cf-40460312708f-kube-api-access-tvkx9\") pod \"insights-operator-585dfdc468-k76l5\" (UID: \"357b4ca1-1680-47a2-96cf-40460312708f\") " pod="openshift-insights/insights-operator-585dfdc468-k76l5"
Apr 17 11:18:43.177049 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.177017 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-gcrf7"]
Apr 17 11:18:43.181123 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.181105 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-gcrf7"
Apr 17 11:18:43.205882 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.205805 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 17 11:18:43.206419 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.206402 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 17 11:18:43.206687 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.206672 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 17 11:18:43.234001 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.233966 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5k8v\" (UniqueName: \"kubernetes.io/projected/3932e45a-3ab6-40aa-8c2b-48214229c367-kube-api-access-q5k8v\") pod \"console-operator-9d4b6777b-gcrf7\" (UID: \"3932e45a-3ab6-40aa-8c2b-48214229c367\") " pod="openshift-console-operator/console-operator-9d4b6777b-gcrf7"
Apr 17 11:18:43.234079 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.234025 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3932e45a-3ab6-40aa-8c2b-48214229c367-trusted-ca\") pod \"console-operator-9d4b6777b-gcrf7\" (UID: \"3932e45a-3ab6-40aa-8c2b-48214229c367\") " pod="openshift-console-operator/console-operator-9d4b6777b-gcrf7"
Apr 17 11:18:43.234145 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.234122 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wq5gm\" (UniqueName: \"kubernetes.io/projected/a2090436-524c-47c4-ac0f-8b94ceec083d-kube-api-access-wq5gm\") pod \"volume-data-source-validator-7c6cbb6c87-5m8z8\" (UID: \"a2090436-524c-47c4-ac0f-8b94ceec083d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5m8z8"
Apr 17 11:18:43.234192 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.234177 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3932e45a-3ab6-40aa-8c2b-48214229c367-serving-cert\") pod \"console-operator-9d4b6777b-gcrf7\" (UID: \"3932e45a-3ab6-40aa-8c2b-48214229c367\") " pod="openshift-console-operator/console-operator-9d4b6777b-gcrf7"
Apr 17 11:18:43.234279 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.234265 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3932e45a-3ab6-40aa-8c2b-48214229c367-config\") pod \"console-operator-9d4b6777b-gcrf7\" (UID: \"3932e45a-3ab6-40aa-8c2b-48214229c367\") " pod="openshift-console-operator/console-operator-9d4b6777b-gcrf7"
Apr 17 11:18:43.247731 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.247702 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 17 11:18:43.248063 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.248048 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-lt8b2\""
Apr 17 11:18:43.261161 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.261137 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-gcrf7"]
Apr 17 11:18:43.263304 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.263276 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-k76l5"
Apr 17 11:18:43.283698 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.283669 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 17 11:18:43.295654 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.295625 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wq5gm\" (UniqueName: \"kubernetes.io/projected/a2090436-524c-47c4-ac0f-8b94ceec083d-kube-api-access-wq5gm\") pod \"volume-data-source-validator-7c6cbb6c87-5m8z8\" (UID: \"a2090436-524c-47c4-ac0f-8b94ceec083d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5m8z8"
Apr 17 11:18:43.309920 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.309889 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s55l8"]
Apr 17 11:18:43.314803 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.314752 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s55l8"
Apr 17 11:18:43.317808 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.317783 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 17 11:18:43.318310 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.318166 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 17 11:18:43.318310 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.318199 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 17 11:18:43.318729 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.318637 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-7hg8h\""
Apr 17 11:18:43.319305 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.319289 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 17 11:18:43.326959 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.326887 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s55l8"]
Apr 17 11:18:43.335438 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.335374 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06179da3-f0cd-4bd1-8c19-8e6e7e41a7be-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-s55l8\" (UID: \"06179da3-f0cd-4bd1-8c19-8e6e7e41a7be\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s55l8" Apr 17 11:18:43.335438 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.335411 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3932e45a-3ab6-40aa-8c2b-48214229c367-serving-cert\") pod \"console-operator-9d4b6777b-gcrf7\" (UID: \"3932e45a-3ab6-40aa-8c2b-48214229c367\") " pod="openshift-console-operator/console-operator-9d4b6777b-gcrf7" Apr 17 11:18:43.335632 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.335487 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3932e45a-3ab6-40aa-8c2b-48214229c367-config\") pod \"console-operator-9d4b6777b-gcrf7\" (UID: \"3932e45a-3ab6-40aa-8c2b-48214229c367\") " pod="openshift-console-operator/console-operator-9d4b6777b-gcrf7" Apr 17 11:18:43.335632 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.335514 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q5k8v\" (UniqueName: \"kubernetes.io/projected/3932e45a-3ab6-40aa-8c2b-48214229c367-kube-api-access-q5k8v\") pod \"console-operator-9d4b6777b-gcrf7\" (UID: \"3932e45a-3ab6-40aa-8c2b-48214229c367\") " pod="openshift-console-operator/console-operator-9d4b6777b-gcrf7" Apr 17 11:18:43.335632 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.335548 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06179da3-f0cd-4bd1-8c19-8e6e7e41a7be-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-s55l8\" (UID: \"06179da3-f0cd-4bd1-8c19-8e6e7e41a7be\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s55l8" Apr 17 11:18:43.335632 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.335575 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8bxk\" (UniqueName: \"kubernetes.io/projected/06179da3-f0cd-4bd1-8c19-8e6e7e41a7be-kube-api-access-c8bxk\") pod \"kube-storage-version-migrator-operator-6769c5d45-s55l8\" (UID: \"06179da3-f0cd-4bd1-8c19-8e6e7e41a7be\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s55l8" Apr 17 11:18:43.335632 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.335615 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3932e45a-3ab6-40aa-8c2b-48214229c367-trusted-ca\") pod \"console-operator-9d4b6777b-gcrf7\" (UID: \"3932e45a-3ab6-40aa-8c2b-48214229c367\") " pod="openshift-console-operator/console-operator-9d4b6777b-gcrf7" Apr 17 11:18:43.336495 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.336474 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3932e45a-3ab6-40aa-8c2b-48214229c367-config\") pod \"console-operator-9d4b6777b-gcrf7\" (UID: \"3932e45a-3ab6-40aa-8c2b-48214229c367\") " pod="openshift-console-operator/console-operator-9d4b6777b-gcrf7" Apr 17 11:18:43.336767 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.336749 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3932e45a-3ab6-40aa-8c2b-48214229c367-trusted-ca\") pod \"console-operator-9d4b6777b-gcrf7\" (UID: \"3932e45a-3ab6-40aa-8c2b-48214229c367\") " pod="openshift-console-operator/console-operator-9d4b6777b-gcrf7" Apr 17 11:18:43.337873 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.337858 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3932e45a-3ab6-40aa-8c2b-48214229c367-serving-cert\") pod \"console-operator-9d4b6777b-gcrf7\" (UID: 
\"3932e45a-3ab6-40aa-8c2b-48214229c367\") " pod="openshift-console-operator/console-operator-9d4b6777b-gcrf7" Apr 17 11:18:43.351194 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.351159 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5k8v\" (UniqueName: \"kubernetes.io/projected/3932e45a-3ab6-40aa-8c2b-48214229c367-kube-api-access-q5k8v\") pod \"console-operator-9d4b6777b-gcrf7\" (UID: \"3932e45a-3ab6-40aa-8c2b-48214229c367\") " pod="openshift-console-operator/console-operator-9d4b6777b-gcrf7" Apr 17 11:18:43.363124 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.362952 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-vqr58"] Apr 17 11:18:43.364038 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.364012 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5m8z8" Apr 17 11:18:43.366410 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.366380 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2kdcb"] Apr 17 11:18:43.366619 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.366437 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vqr58" Apr 17 11:18:43.368590 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.368563 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-d59jk\"" Apr 17 11:18:43.369596 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.369576 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2kdcb" Apr 17 11:18:43.373054 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.373031 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 17 11:18:43.373157 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.373114 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-tp9sf\"" Apr 17 11:18:43.373226 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.373175 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 17 11:18:43.373226 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.373031 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 17 11:18:43.373521 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.373443 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 17 11:18:43.376322 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.376303 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-vqr58"] Apr 17 11:18:43.379270 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.379240 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2kdcb"] Apr 17 11:18:43.397115 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.397090 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-k76l5"] Apr 17 11:18:43.399819 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:18:43.399790 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod357b4ca1_1680_47a2_96cf_40460312708f.slice/crio-ade1689a8bceb8ed330953ce1fdb9462db4af72861d9720d81c9e693fba820f7 WatchSource:0}: Error finding container ade1689a8bceb8ed330953ce1fdb9462db4af72861d9720d81c9e693fba820f7: Status 404 returned error can't find the container with id ade1689a8bceb8ed330953ce1fdb9462db4af72861d9720d81c9e693fba820f7 Apr 17 11:18:43.436470 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.436438 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06179da3-f0cd-4bd1-8c19-8e6e7e41a7be-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-s55l8\" (UID: \"06179da3-f0cd-4bd1-8c19-8e6e7e41a7be\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s55l8" Apr 17 11:18:43.436629 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.436486 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77ba66d5-aa2d-4111-aa6d-c1e26b8d296e-serving-cert\") pod \"service-ca-operator-d6fc45fc5-2kdcb\" (UID: \"77ba66d5-aa2d-4111-aa6d-c1e26b8d296e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2kdcb" Apr 17 11:18:43.436629 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.436604 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06179da3-f0cd-4bd1-8c19-8e6e7e41a7be-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-s55l8\" (UID: \"06179da3-f0cd-4bd1-8c19-8e6e7e41a7be\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s55l8" Apr 17 11:18:43.436768 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.436650 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-c8bxk\" (UniqueName: \"kubernetes.io/projected/06179da3-f0cd-4bd1-8c19-8e6e7e41a7be-kube-api-access-c8bxk\") pod \"kube-storage-version-migrator-operator-6769c5d45-s55l8\" (UID: \"06179da3-f0cd-4bd1-8c19-8e6e7e41a7be\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s55l8" Apr 17 11:18:43.436768 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.436691 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77ba66d5-aa2d-4111-aa6d-c1e26b8d296e-config\") pod \"service-ca-operator-d6fc45fc5-2kdcb\" (UID: \"77ba66d5-aa2d-4111-aa6d-c1e26b8d296e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2kdcb" Apr 17 11:18:43.436870 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.436774 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmgnp\" (UniqueName: \"kubernetes.io/projected/77ba66d5-aa2d-4111-aa6d-c1e26b8d296e-kube-api-access-wmgnp\") pod \"service-ca-operator-d6fc45fc5-2kdcb\" (UID: \"77ba66d5-aa2d-4111-aa6d-c1e26b8d296e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2kdcb" Apr 17 11:18:43.436870 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.436803 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjdsf\" (UniqueName: \"kubernetes.io/projected/5de58c93-9df1-4e96-8236-6bcce80c59c7-kube-api-access-qjdsf\") pod \"network-check-source-8894fc9bd-vqr58\" (UID: \"5de58c93-9df1-4e96-8236-6bcce80c59c7\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vqr58" Apr 17 11:18:43.437142 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.437119 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06179da3-f0cd-4bd1-8c19-8e6e7e41a7be-config\") pod 
\"kube-storage-version-migrator-operator-6769c5d45-s55l8\" (UID: \"06179da3-f0cd-4bd1-8c19-8e6e7e41a7be\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s55l8" Apr 17 11:18:43.438690 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.438675 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/06179da3-f0cd-4bd1-8c19-8e6e7e41a7be-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-s55l8\" (UID: \"06179da3-f0cd-4bd1-8c19-8e6e7e41a7be\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s55l8" Apr 17 11:18:43.447004 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.446973 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8bxk\" (UniqueName: \"kubernetes.io/projected/06179da3-f0cd-4bd1-8c19-8e6e7e41a7be-kube-api-access-c8bxk\") pod \"kube-storage-version-migrator-operator-6769c5d45-s55l8\" (UID: \"06179da3-f0cd-4bd1-8c19-8e6e7e41a7be\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s55l8" Apr 17 11:18:43.489629 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.489504 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-gcrf7" Apr 17 11:18:43.490725 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.490702 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5m8z8"] Apr 17 11:18:43.494650 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:18:43.494624 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2090436_524c_47c4_ac0f_8b94ceec083d.slice/crio-b8cfd61b07a05466d4ed69dda5373498178863c48d92d2cc9651b759169195fd WatchSource:0}: Error finding container b8cfd61b07a05466d4ed69dda5373498178863c48d92d2cc9651b759169195fd: Status 404 returned error can't find the container with id b8cfd61b07a05466d4ed69dda5373498178863c48d92d2cc9651b759169195fd Apr 17 11:18:43.537533 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.537484 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77ba66d5-aa2d-4111-aa6d-c1e26b8d296e-serving-cert\") pod \"service-ca-operator-d6fc45fc5-2kdcb\" (UID: \"77ba66d5-aa2d-4111-aa6d-c1e26b8d296e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2kdcb" Apr 17 11:18:43.537697 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.537583 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77ba66d5-aa2d-4111-aa6d-c1e26b8d296e-config\") pod \"service-ca-operator-d6fc45fc5-2kdcb\" (UID: \"77ba66d5-aa2d-4111-aa6d-c1e26b8d296e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2kdcb" Apr 17 11:18:43.537697 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.537655 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wmgnp\" (UniqueName: 
\"kubernetes.io/projected/77ba66d5-aa2d-4111-aa6d-c1e26b8d296e-kube-api-access-wmgnp\") pod \"service-ca-operator-d6fc45fc5-2kdcb\" (UID: \"77ba66d5-aa2d-4111-aa6d-c1e26b8d296e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2kdcb" Apr 17 11:18:43.537697 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.537684 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjdsf\" (UniqueName: \"kubernetes.io/projected/5de58c93-9df1-4e96-8236-6bcce80c59c7-kube-api-access-qjdsf\") pod \"network-check-source-8894fc9bd-vqr58\" (UID: \"5de58c93-9df1-4e96-8236-6bcce80c59c7\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vqr58" Apr 17 11:18:43.539023 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.538995 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77ba66d5-aa2d-4111-aa6d-c1e26b8d296e-config\") pod \"service-ca-operator-d6fc45fc5-2kdcb\" (UID: \"77ba66d5-aa2d-4111-aa6d-c1e26b8d296e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2kdcb" Apr 17 11:18:43.540214 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.540194 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77ba66d5-aa2d-4111-aa6d-c1e26b8d296e-serving-cert\") pod \"service-ca-operator-d6fc45fc5-2kdcb\" (UID: \"77ba66d5-aa2d-4111-aa6d-c1e26b8d296e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2kdcb" Apr 17 11:18:43.546549 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.546521 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmgnp\" (UniqueName: \"kubernetes.io/projected/77ba66d5-aa2d-4111-aa6d-c1e26b8d296e-kube-api-access-wmgnp\") pod \"service-ca-operator-d6fc45fc5-2kdcb\" (UID: \"77ba66d5-aa2d-4111-aa6d-c1e26b8d296e\") " 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2kdcb" Apr 17 11:18:43.546675 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.546657 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjdsf\" (UniqueName: \"kubernetes.io/projected/5de58c93-9df1-4e96-8236-6bcce80c59c7-kube-api-access-qjdsf\") pod \"network-check-source-8894fc9bd-vqr58\" (UID: \"5de58c93-9df1-4e96-8236-6bcce80c59c7\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vqr58" Apr 17 11:18:43.617069 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.617037 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-gcrf7"] Apr 17 11:18:43.619936 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:18:43.619908 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3932e45a_3ab6_40aa_8c2b_48214229c367.slice/crio-7cfbb068a0d607987389d0775da932e7a877aaaed16111bb689ee5f5343f9003 WatchSource:0}: Error finding container 7cfbb068a0d607987389d0775da932e7a877aaaed16111bb689ee5f5343f9003: Status 404 returned error can't find the container with id 7cfbb068a0d607987389d0775da932e7a877aaaed16111bb689ee5f5343f9003 Apr 17 11:18:43.628128 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.628102 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s55l8" Apr 17 11:18:43.638763 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.638729 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba013853-d529-46b6-84d3-0e259d87af73-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6mrs8\" (UID: \"ba013853-d529-46b6-84d3-0e259d87af73\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mrs8" Apr 17 11:18:43.638903 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:18:43.638882 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 11:18:43.638981 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:18:43.638969 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba013853-d529-46b6-84d3-0e259d87af73-cluster-monitoring-operator-tls podName:ba013853-d529-46b6-84d3-0e259d87af73 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:44.638947568 +0000 UTC m=+153.808754623 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ba013853-d529-46b6-84d3-0e259d87af73-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-6mrs8" (UID: "ba013853-d529-46b6-84d3-0e259d87af73") : secret "cluster-monitoring-operator-tls" not found Apr 17 11:18:43.677606 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.677559 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vqr58" Apr 17 11:18:43.683917 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.683884 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2kdcb" Apr 17 11:18:43.730827 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.730792 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5m8z8" event={"ID":"a2090436-524c-47c4-ac0f-8b94ceec083d","Type":"ContainerStarted","Data":"b8cfd61b07a05466d4ed69dda5373498178863c48d92d2cc9651b759169195fd"} Apr 17 11:18:43.732094 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.732031 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-k76l5" event={"ID":"357b4ca1-1680-47a2-96cf-40460312708f","Type":"ContainerStarted","Data":"ade1689a8bceb8ed330953ce1fdb9462db4af72861d9720d81c9e693fba820f7"} Apr 17 11:18:43.733179 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.733125 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-gcrf7" event={"ID":"3932e45a-3ab6-40aa-8c2b-48214229c367","Type":"ContainerStarted","Data":"7cfbb068a0d607987389d0775da932e7a877aaaed16111bb689ee5f5343f9003"} Apr 17 11:18:43.784868 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.784838 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s55l8"] Apr 17 11:18:43.796797 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:18:43.796750 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06179da3_f0cd_4bd1_8c19_8e6e7e41a7be.slice/crio-bded5c32d10c9dc9b4808d2d3fbc14abe28ea658555c3062d80186f5d2adce65 WatchSource:0}: Error finding container bded5c32d10c9dc9b4808d2d3fbc14abe28ea658555c3062d80186f5d2adce65: Status 404 returned error can't find the container with id bded5c32d10c9dc9b4808d2d3fbc14abe28ea658555c3062d80186f5d2adce65 Apr 17 
11:18:43.812027 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.810461 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-vqr58"] Apr 17 11:18:43.814844 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:18:43.814812 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5de58c93_9df1_4e96_8236_6bcce80c59c7.slice/crio-5b376f499d104070048511a6e8efde69a4cf9338d460fe48cdcfc58231758eed WatchSource:0}: Error finding container 5b376f499d104070048511a6e8efde69a4cf9338d460fe48cdcfc58231758eed: Status 404 returned error can't find the container with id 5b376f499d104070048511a6e8efde69a4cf9338d460fe48cdcfc58231758eed Apr 17 11:18:43.838613 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:43.838579 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2kdcb"] Apr 17 11:18:43.844964 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:18:43.844931 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77ba66d5_aa2d_4111_aa6d_c1e26b8d296e.slice/crio-121f08dc285f682e51af02f50b8c2351ae02149464573bfcf97347c444fc76e2 WatchSource:0}: Error finding container 121f08dc285f682e51af02f50b8c2351ae02149464573bfcf97347c444fc76e2: Status 404 returned error can't find the container with id 121f08dc285f682e51af02f50b8c2351ae02149464573bfcf97347c444fc76e2 Apr 17 11:18:44.649668 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:44.649077 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba013853-d529-46b6-84d3-0e259d87af73-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6mrs8\" (UID: \"ba013853-d529-46b6-84d3-0e259d87af73\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mrs8" Apr 17 11:18:44.649668 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:18:44.649228 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 11:18:44.649668 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:18:44.649293 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba013853-d529-46b6-84d3-0e259d87af73-cluster-monitoring-operator-tls podName:ba013853-d529-46b6-84d3-0e259d87af73 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:46.649273373 +0000 UTC m=+155.819080419 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ba013853-d529-46b6-84d3-0e259d87af73-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-6mrs8" (UID: "ba013853-d529-46b6-84d3-0e259d87af73") : secret "cluster-monitoring-operator-tls" not found Apr 17 11:18:44.739158 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:44.739099 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2kdcb" event={"ID":"77ba66d5-aa2d-4111-aa6d-c1e26b8d296e","Type":"ContainerStarted","Data":"121f08dc285f682e51af02f50b8c2351ae02149464573bfcf97347c444fc76e2"} Apr 17 11:18:44.741757 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:44.740963 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vqr58" event={"ID":"5de58c93-9df1-4e96-8236-6bcce80c59c7","Type":"ContainerStarted","Data":"180d3af67275c6aaa854bd22949042c276f95c64272acf03ba8bd5bc7428d7b6"} Apr 17 11:18:44.741757 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:44.741002 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vqr58" 
event={"ID":"5de58c93-9df1-4e96-8236-6bcce80c59c7","Type":"ContainerStarted","Data":"5b376f499d104070048511a6e8efde69a4cf9338d460fe48cdcfc58231758eed"}
Apr 17 11:18:44.743504 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:44.743452 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s55l8" event={"ID":"06179da3-f0cd-4bd1-8c19-8e6e7e41a7be","Type":"ContainerStarted","Data":"bded5c32d10c9dc9b4808d2d3fbc14abe28ea658555c3062d80186f5d2adce65"}
Apr 17 11:18:46.667982 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:46.667936 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba013853-d529-46b6-84d3-0e259d87af73-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6mrs8\" (UID: \"ba013853-d529-46b6-84d3-0e259d87af73\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mrs8"
Apr 17 11:18:46.668419 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:18:46.668109 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 11:18:46.668419 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:18:46.668196 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba013853-d529-46b6-84d3-0e259d87af73-cluster-monitoring-operator-tls podName:ba013853-d529-46b6-84d3-0e259d87af73 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:50.668178706 +0000 UTC m=+159.837985769 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ba013853-d529-46b6-84d3-0e259d87af73-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-6mrs8" (UID: "ba013853-d529-46b6-84d3-0e259d87af73") : secret "cluster-monitoring-operator-tls" not found
Apr 17 11:18:47.184902 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:18:47.184849 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-78b7768865-zkfjk" podUID="45ae37dd-21dd-4ab9-bf36-4f2523ed8410"
Apr 17 11:18:47.197169 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:18:47.197123 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-ngx5g" podUID="d4bb7b6c-7fd2-4a72-be8b-724128cbea39"
Apr 17 11:18:47.215034 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:18:47.214982 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-7dssr" podUID="3d82faa5-4dae-416a-8e2c-337d3966cdd0"
Apr 17 11:18:47.752192 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:47.752105 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gcrf7_3932e45a-3ab6-40aa-8c2b-48214229c367/console-operator/0.log"
Apr 17 11:18:47.752192 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:47.752157 2575 generic.go:358] "Generic (PLEG): container finished" podID="3932e45a-3ab6-40aa-8c2b-48214229c367" containerID="c3ceb5ce8c1e94030a2adcb93eea63961b0909b22301b796d19b33e679fcf836" exitCode=255
Apr 17 11:18:47.752701 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:47.752228 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-gcrf7" event={"ID":"3932e45a-3ab6-40aa-8c2b-48214229c367","Type":"ContainerDied","Data":"c3ceb5ce8c1e94030a2adcb93eea63961b0909b22301b796d19b33e679fcf836"}
Apr 17 11:18:47.752701 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:47.752536 2575 scope.go:117] "RemoveContainer" containerID="c3ceb5ce8c1e94030a2adcb93eea63961b0909b22301b796d19b33e679fcf836"
Apr 17 11:18:47.753808 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:47.753763 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2kdcb" event={"ID":"77ba66d5-aa2d-4111-aa6d-c1e26b8d296e","Type":"ContainerStarted","Data":"ede125af2806987021cfe5d7e6e12ce812d974f05a7538cc12c73b88dc3ad7d8"}
Apr 17 11:18:47.758283 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:47.758254 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5m8z8" event={"ID":"a2090436-524c-47c4-ac0f-8b94ceec083d","Type":"ContainerStarted","Data":"37c3e487b5b99b0c3b27ff7df07c688b3e56f573b925541c6a79b2cd567b1aa1"}
Apr 17 11:18:47.759801 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:47.759775 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s55l8" event={"ID":"06179da3-f0cd-4bd1-8c19-8e6e7e41a7be","Type":"ContainerStarted","Data":"45d63c02c840fd246f0ecfc2a4d36b57bdab2a942a0a803513effeb289e8d48b"}
Apr 17 11:18:47.761077 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:47.761036 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-k76l5" event={"ID":"357b4ca1-1680-47a2-96cf-40460312708f","Type":"ContainerStarted","Data":"d6fb164d6a559927c0adfa6cad500f0ec67204e61c93a59d1817ede3b76b4826"}
Apr 17 11:18:47.761169 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:47.761078 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ngx5g"
Apr 17 11:18:47.761169 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:47.761126 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-78b7768865-zkfjk"
Apr 17 11:18:47.771331 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:47.771120 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vqr58" podStartSLOduration=4.771103692 podStartE2EDuration="4.771103692s" podCreationTimestamp="2026-04-17 11:18:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:18:44.763440231 +0000 UTC m=+153.933247290" watchObservedRunningTime="2026-04-17 11:18:47.771103692 +0000 UTC m=+156.940910756"
Apr 17 11:18:47.789701 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:47.789585 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-k76l5" podStartSLOduration=1.861376514 podStartE2EDuration="5.78956254s" podCreationTimestamp="2026-04-17 11:18:42 +0000 UTC" firstStartedPulling="2026-04-17 11:18:43.40177651 +0000 UTC m=+152.571583554" lastFinishedPulling="2026-04-17 11:18:47.329962533 +0000 UTC m=+156.499769580" observedRunningTime="2026-04-17 11:18:47.787729654 +0000 UTC m=+156.957536718" watchObservedRunningTime="2026-04-17 11:18:47.78956254 +0000 UTC m=+156.959369607"
Apr 17 11:18:47.809074 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:47.809021 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5m8z8" podStartSLOduration=0.975342114 podStartE2EDuration="4.809002783s" podCreationTimestamp="2026-04-17 11:18:43 +0000 UTC" firstStartedPulling="2026-04-17 11:18:43.496557088 +0000 UTC m=+152.666364133" lastFinishedPulling="2026-04-17 11:18:47.330217752 +0000 UTC m=+156.500024802" observedRunningTime="2026-04-17 11:18:47.808500271 +0000 UTC m=+156.978307336" watchObservedRunningTime="2026-04-17 11:18:47.809002783 +0000 UTC m=+156.978809852"
Apr 17 11:18:47.827103 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:47.827037 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s55l8" podStartSLOduration=1.289047152 podStartE2EDuration="4.827015427s" podCreationTimestamp="2026-04-17 11:18:43 +0000 UTC" firstStartedPulling="2026-04-17 11:18:43.798687381 +0000 UTC m=+152.968494424" lastFinishedPulling="2026-04-17 11:18:47.336655653 +0000 UTC m=+156.506462699" observedRunningTime="2026-04-17 11:18:47.825911641 +0000 UTC m=+156.995718707" watchObservedRunningTime="2026-04-17 11:18:47.827015427 +0000 UTC m=+156.996822495"
Apr 17 11:18:48.354582 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:18:48.354542 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-tvn9d" podUID="d129dd20-5a5b-4718-8eca-2f10184defe9"
Apr 17 11:18:48.765247 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:48.765161 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gcrf7_3932e45a-3ab6-40aa-8c2b-48214229c367/console-operator/1.log"
Apr 17 11:18:48.765680 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:48.765619 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gcrf7_3932e45a-3ab6-40aa-8c2b-48214229c367/console-operator/0.log"
Apr 17 11:18:48.765680 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:48.765657 2575 generic.go:358] "Generic (PLEG): container finished" podID="3932e45a-3ab6-40aa-8c2b-48214229c367" containerID="f82acdb9670fe6f9e4bbbe05d5dc7398e29b451e2376a8a8971212aedd2d5bf6" exitCode=255
Apr 17 11:18:48.765783 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:48.765691 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-gcrf7" event={"ID":"3932e45a-3ab6-40aa-8c2b-48214229c367","Type":"ContainerDied","Data":"f82acdb9670fe6f9e4bbbe05d5dc7398e29b451e2376a8a8971212aedd2d5bf6"}
Apr 17 11:18:48.765783 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:48.765740 2575 scope.go:117] "RemoveContainer" containerID="c3ceb5ce8c1e94030a2adcb93eea63961b0909b22301b796d19b33e679fcf836"
Apr 17 11:18:48.766015 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:48.765996 2575 scope.go:117] "RemoveContainer" containerID="f82acdb9670fe6f9e4bbbe05d5dc7398e29b451e2376a8a8971212aedd2d5bf6"
Apr 17 11:18:48.766226 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:18:48.766207 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-gcrf7_openshift-console-operator(3932e45a-3ab6-40aa-8c2b-48214229c367)\"" pod="openshift-console-operator/console-operator-9d4b6777b-gcrf7" podUID="3932e45a-3ab6-40aa-8c2b-48214229c367"
Apr 17 11:18:48.785567 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:48.785517 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2kdcb" podStartSLOduration=2.3007379390000002 podStartE2EDuration="5.785501946s" podCreationTimestamp="2026-04-17 11:18:43 +0000 UTC" firstStartedPulling="2026-04-17 11:18:43.847195027 +0000 UTC m=+153.017002079" lastFinishedPulling="2026-04-17 11:18:47.331959042 +0000 UTC m=+156.501766086" observedRunningTime="2026-04-17 11:18:47.843305534 +0000 UTC m=+157.013112601" watchObservedRunningTime="2026-04-17 11:18:48.785501946 +0000 UTC m=+157.955309077"
Apr 17 11:18:49.769381 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:49.769354 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gcrf7_3932e45a-3ab6-40aa-8c2b-48214229c367/console-operator/1.log"
Apr 17 11:18:49.769762 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:49.769702 2575 scope.go:117] "RemoveContainer" containerID="f82acdb9670fe6f9e4bbbe05d5dc7398e29b451e2376a8a8971212aedd2d5bf6"
Apr 17 11:18:49.769899 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:18:49.769881 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-gcrf7_openshift-console-operator(3932e45a-3ab6-40aa-8c2b-48214229c367)\"" pod="openshift-console-operator/console-operator-9d4b6777b-gcrf7" podUID="3932e45a-3ab6-40aa-8c2b-48214229c367"
Apr 17 11:18:50.703835 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:50.703789 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba013853-d529-46b6-84d3-0e259d87af73-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6mrs8\" (UID: \"ba013853-d529-46b6-84d3-0e259d87af73\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mrs8"
Apr 17 11:18:50.704009 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:18:50.703947 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 11:18:50.704055 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:18:50.704018 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba013853-d529-46b6-84d3-0e259d87af73-cluster-monitoring-operator-tls podName:ba013853-d529-46b6-84d3-0e259d87af73 nodeName:}" failed. No retries permitted until 2026-04-17 11:18:58.704000542 +0000 UTC m=+167.873807586 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ba013853-d529-46b6-84d3-0e259d87af73-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-6mrs8" (UID: "ba013853-d529-46b6-84d3-0e259d87af73") : secret "cluster-monitoring-operator-tls" not found
Apr 17 11:18:51.170584 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:51.170523 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4r4xj_ec79fc9e-eb92-4d6d-9ea6-2a309575b035/dns-node-resolver/0.log"
Apr 17 11:18:51.427569 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:51.427479 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-vn4x5"]
Apr 17 11:18:51.431775 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:51.431751 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-vn4x5"
Apr 17 11:18:51.436278 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:51.436252 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 17 11:18:51.436278 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:51.436286 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 17 11:18:51.437056 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:51.437032 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-dnvvf\""
Apr 17 11:18:51.437181 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:51.437076 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 17 11:18:51.437181 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:51.437093 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 17 11:18:51.443059 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:51.443037 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-vn4x5"]
Apr 17 11:18:51.511519 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:51.511489 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/280b5b3c-3b2d-402c-82b7-fc685f29694f-signing-key\") pod \"service-ca-865cb79987-vn4x5\" (UID: \"280b5b3c-3b2d-402c-82b7-fc685f29694f\") " pod="openshift-service-ca/service-ca-865cb79987-vn4x5"
Apr 17 11:18:51.511694 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:51.511549 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxgdp\" (UniqueName: \"kubernetes.io/projected/280b5b3c-3b2d-402c-82b7-fc685f29694f-kube-api-access-vxgdp\") pod \"service-ca-865cb79987-vn4x5\" (UID: \"280b5b3c-3b2d-402c-82b7-fc685f29694f\") " pod="openshift-service-ca/service-ca-865cb79987-vn4x5"
Apr 17 11:18:51.511694 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:51.511595 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/280b5b3c-3b2d-402c-82b7-fc685f29694f-signing-cabundle\") pod \"service-ca-865cb79987-vn4x5\" (UID: \"280b5b3c-3b2d-402c-82b7-fc685f29694f\") " pod="openshift-service-ca/service-ca-865cb79987-vn4x5"
Apr 17 11:18:51.612119 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:51.612084 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/280b5b3c-3b2d-402c-82b7-fc685f29694f-signing-key\") pod \"service-ca-865cb79987-vn4x5\" (UID: \"280b5b3c-3b2d-402c-82b7-fc685f29694f\") " pod="openshift-service-ca/service-ca-865cb79987-vn4x5"
Apr 17 11:18:51.612315 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:51.612136 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vxgdp\" (UniqueName: \"kubernetes.io/projected/280b5b3c-3b2d-402c-82b7-fc685f29694f-kube-api-access-vxgdp\") pod \"service-ca-865cb79987-vn4x5\" (UID: \"280b5b3c-3b2d-402c-82b7-fc685f29694f\") " pod="openshift-service-ca/service-ca-865cb79987-vn4x5"
Apr 17 11:18:51.612422 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:51.612368 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/280b5b3c-3b2d-402c-82b7-fc685f29694f-signing-cabundle\") pod \"service-ca-865cb79987-vn4x5\" (UID: \"280b5b3c-3b2d-402c-82b7-fc685f29694f\") " pod="openshift-service-ca/service-ca-865cb79987-vn4x5"
Apr 17 11:18:51.612948 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:51.612927 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/280b5b3c-3b2d-402c-82b7-fc685f29694f-signing-cabundle\") pod \"service-ca-865cb79987-vn4x5\" (UID: \"280b5b3c-3b2d-402c-82b7-fc685f29694f\") " pod="openshift-service-ca/service-ca-865cb79987-vn4x5"
Apr 17 11:18:51.614566 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:51.614544 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/280b5b3c-3b2d-402c-82b7-fc685f29694f-signing-key\") pod \"service-ca-865cb79987-vn4x5\" (UID: \"280b5b3c-3b2d-402c-82b7-fc685f29694f\") " pod="openshift-service-ca/service-ca-865cb79987-vn4x5"
Apr 17 11:18:51.620818 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:51.620789 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxgdp\" (UniqueName: \"kubernetes.io/projected/280b5b3c-3b2d-402c-82b7-fc685f29694f-kube-api-access-vxgdp\") pod \"service-ca-865cb79987-vn4x5\" (UID: \"280b5b3c-3b2d-402c-82b7-fc685f29694f\") " pod="openshift-service-ca/service-ca-865cb79987-vn4x5"
Apr 17 11:18:51.740630 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:51.740543 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-vn4x5"
Apr 17 11:18:51.859515 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:51.859482 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-vn4x5"]
Apr 17 11:18:51.862700 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:18:51.862661 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod280b5b3c_3b2d_402c_82b7_fc685f29694f.slice/crio-698af6763d273ee2ff08e7c1d2d96c36458caa781b6d61e640a4a67de92a4e6c WatchSource:0}: Error finding container 698af6763d273ee2ff08e7c1d2d96c36458caa781b6d61e640a4a67de92a4e6c: Status 404 returned error can't find the container with id 698af6763d273ee2ff08e7c1d2d96c36458caa781b6d61e640a4a67de92a4e6c
Apr 17 11:18:52.117398 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:52.117364 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-metrics-tls\") pod \"dns-default-ngx5g\" (UID: \"d4bb7b6c-7fd2-4a72-be8b-724128cbea39\") " pod="openshift-dns/dns-default-ngx5g"
Apr 17 11:18:52.117574 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:52.117413 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-registry-tls\") pod \"image-registry-78b7768865-zkfjk\" (UID: \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\") " pod="openshift-image-registry/image-registry-78b7768865-zkfjk"
Apr 17 11:18:52.117574 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:18:52.117518 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 11:18:52.117574 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:18:52.117559 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 11:18:52.117574 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:18:52.117570 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-78b7768865-zkfjk: secret "image-registry-tls" not found
Apr 17 11:18:52.117702 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:18:52.117588 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-metrics-tls podName:d4bb7b6c-7fd2-4a72-be8b-724128cbea39 nodeName:}" failed. No retries permitted until 2026-04-17 11:20:54.117570726 +0000 UTC m=+283.287377769 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-metrics-tls") pod "dns-default-ngx5g" (UID: "d4bb7b6c-7fd2-4a72-be8b-724128cbea39") : secret "dns-default-metrics-tls" not found
Apr 17 11:18:52.117702 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:18:52.117608 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-registry-tls podName:45ae37dd-21dd-4ab9-bf36-4f2523ed8410 nodeName:}" failed. No retries permitted until 2026-04-17 11:20:54.117596623 +0000 UTC m=+283.287403666 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-registry-tls") pod "image-registry-78b7768865-zkfjk" (UID: "45ae37dd-21dd-4ab9-bf36-4f2523ed8410") : secret "image-registry-tls" not found
Apr 17 11:18:52.218520 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:52.218487 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d82faa5-4dae-416a-8e2c-337d3966cdd0-cert\") pod \"ingress-canary-7dssr\" (UID: \"3d82faa5-4dae-416a-8e2c-337d3966cdd0\") " pod="openshift-ingress-canary/ingress-canary-7dssr"
Apr 17 11:18:52.218883 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:18:52.218655 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 11:18:52.218883 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:18:52.218738 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d82faa5-4dae-416a-8e2c-337d3966cdd0-cert podName:3d82faa5-4dae-416a-8e2c-337d3966cdd0 nodeName:}" failed. No retries permitted until 2026-04-17 11:20:54.218714925 +0000 UTC m=+283.388521982 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3d82faa5-4dae-416a-8e2c-337d3966cdd0-cert") pod "ingress-canary-7dssr" (UID: "3d82faa5-4dae-416a-8e2c-337d3966cdd0") : secret "canary-serving-cert" not found
Apr 17 11:18:52.371854 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:52.371785 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-pms2x_8bc04c84-5033-4522-bdb4-8ff714571072/node-ca/0.log"
Apr 17 11:18:52.778279 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:52.778196 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-vn4x5" event={"ID":"280b5b3c-3b2d-402c-82b7-fc685f29694f","Type":"ContainerStarted","Data":"aa199d37b309c97e46866205e63c9e5e25b0a07acee61031bba98b7b530ea438"}
Apr 17 11:18:52.778279 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:52.778235 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-vn4x5" event={"ID":"280b5b3c-3b2d-402c-82b7-fc685f29694f","Type":"ContainerStarted","Data":"698af6763d273ee2ff08e7c1d2d96c36458caa781b6d61e640a4a67de92a4e6c"}
Apr 17 11:18:52.800483 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:52.800424 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-vn4x5" podStartSLOduration=1.800402682 podStartE2EDuration="1.800402682s" podCreationTimestamp="2026-04-17 11:18:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:18:52.799222679 +0000 UTC m=+161.969029743" watchObservedRunningTime="2026-04-17 11:18:52.800402682 +0000 UTC m=+161.970209749"
Apr 17 11:18:53.490385 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:53.490320 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-gcrf7"
Apr 17 11:18:53.490385 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:53.490391 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-gcrf7"
Apr 17 11:18:53.490912 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:53.490878 2575 scope.go:117] "RemoveContainer" containerID="f82acdb9670fe6f9e4bbbe05d5dc7398e29b451e2376a8a8971212aedd2d5bf6"
Apr 17 11:18:53.491117 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:18:53.491092 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-gcrf7_openshift-console-operator(3932e45a-3ab6-40aa-8c2b-48214229c367)\"" pod="openshift-console-operator/console-operator-9d4b6777b-gcrf7" podUID="3932e45a-3ab6-40aa-8c2b-48214229c367"
Apr 17 11:18:53.974445 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:53.974410 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-s55l8_06179da3-f0cd-4bd1-8c19-8e6e7e41a7be/kube-storage-version-migrator-operator/0.log"
Apr 17 11:18:58.337917 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:58.337877 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7dssr"
Apr 17 11:18:58.778024 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:18:58.777930 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba013853-d529-46b6-84d3-0e259d87af73-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6mrs8\" (UID: \"ba013853-d529-46b6-84d3-0e259d87af73\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mrs8"
Apr 17 11:18:58.778178 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:18:58.778098 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 17 11:18:58.778178 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:18:58.778170 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba013853-d529-46b6-84d3-0e259d87af73-cluster-monitoring-operator-tls podName:ba013853-d529-46b6-84d3-0e259d87af73 nodeName:}" failed. No retries permitted until 2026-04-17 11:19:14.778152969 +0000 UTC m=+183.947960012 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ba013853-d529-46b6-84d3-0e259d87af73-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-6mrs8" (UID: "ba013853-d529-46b6-84d3-0e259d87af73") : secret "cluster-monitoring-operator-tls" not found
Apr 17 11:19:03.338365 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:03.338242 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvn9d"
Apr 17 11:19:08.338026 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:08.337984 2575 scope.go:117] "RemoveContainer" containerID="f82acdb9670fe6f9e4bbbe05d5dc7398e29b451e2376a8a8971212aedd2d5bf6"
Apr 17 11:19:08.825618 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:08.825592 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gcrf7_3932e45a-3ab6-40aa-8c2b-48214229c367/console-operator/2.log"
Apr 17 11:19:08.825944 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:08.825931 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gcrf7_3932e45a-3ab6-40aa-8c2b-48214229c367/console-operator/1.log"
Apr 17 11:19:08.825993 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:08.825965 2575 generic.go:358] "Generic (PLEG): container finished" podID="3932e45a-3ab6-40aa-8c2b-48214229c367" containerID="7dfce0070821503ab68d25ea47d7b9a2a5aa498fb0f4c52375f04d9c0effb5e9" exitCode=255
Apr 17 11:19:08.826034 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:08.826018 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-gcrf7" event={"ID":"3932e45a-3ab6-40aa-8c2b-48214229c367","Type":"ContainerDied","Data":"7dfce0070821503ab68d25ea47d7b9a2a5aa498fb0f4c52375f04d9c0effb5e9"}
Apr 17 11:19:08.826074 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:08.826060 2575 scope.go:117] "RemoveContainer" containerID="f82acdb9670fe6f9e4bbbe05d5dc7398e29b451e2376a8a8971212aedd2d5bf6"
Apr 17 11:19:08.826416 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:08.826396 2575 scope.go:117] "RemoveContainer" containerID="7dfce0070821503ab68d25ea47d7b9a2a5aa498fb0f4c52375f04d9c0effb5e9"
Apr 17 11:19:08.826635 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:19:08.826614 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-gcrf7_openshift-console-operator(3932e45a-3ab6-40aa-8c2b-48214229c367)\"" pod="openshift-console-operator/console-operator-9d4b6777b-gcrf7" podUID="3932e45a-3ab6-40aa-8c2b-48214229c367"
Apr 17 11:19:09.830530 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:09.830497 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gcrf7_3932e45a-3ab6-40aa-8c2b-48214229c367/console-operator/2.log"
Apr 17 11:19:10.603856 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:10.603820 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-8gmpm"]
Apr 17 11:19:10.607249 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:10.607230 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8gmpm"
Apr 17 11:19:10.610032 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:10.610006 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-5zvgq\""
Apr 17 11:19:10.610554 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:10.610521 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 17 11:19:10.610664 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:10.610564 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 17 11:19:10.622620 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:10.622588 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8gmpm"]
Apr 17 11:19:10.781848 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:10.781810 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f9114e81-c2b0-41e4-9e6c-72b4f1198507-data-volume\") pod \"insights-runtime-extractor-8gmpm\" (UID: \"f9114e81-c2b0-41e4-9e6c-72b4f1198507\") " pod="openshift-insights/insights-runtime-extractor-8gmpm"
Apr 17 11:19:10.782018 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:10.781872 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5h5l\" (UniqueName: \"kubernetes.io/projected/f9114e81-c2b0-41e4-9e6c-72b4f1198507-kube-api-access-b5h5l\") pod \"insights-runtime-extractor-8gmpm\" (UID: \"f9114e81-c2b0-41e4-9e6c-72b4f1198507\") " pod="openshift-insights/insights-runtime-extractor-8gmpm"
Apr 17 11:19:10.782018 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:10.781897 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f9114e81-c2b0-41e4-9e6c-72b4f1198507-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8gmpm\" (UID: \"f9114e81-c2b0-41e4-9e6c-72b4f1198507\") " pod="openshift-insights/insights-runtime-extractor-8gmpm"
Apr 17 11:19:10.782018 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:10.781923 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f9114e81-c2b0-41e4-9e6c-72b4f1198507-crio-socket\") pod \"insights-runtime-extractor-8gmpm\" (UID: \"f9114e81-c2b0-41e4-9e6c-72b4f1198507\") " pod="openshift-insights/insights-runtime-extractor-8gmpm"
Apr 17 11:19:10.782018 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:10.781953 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f9114e81-c2b0-41e4-9e6c-72b4f1198507-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8gmpm\" (UID: \"f9114e81-c2b0-41e4-9e6c-72b4f1198507\") " pod="openshift-insights/insights-runtime-extractor-8gmpm"
Apr 17 11:19:10.883361 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:10.883249 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b5h5l\" (UniqueName: \"kubernetes.io/projected/f9114e81-c2b0-41e4-9e6c-72b4f1198507-kube-api-access-b5h5l\") pod \"insights-runtime-extractor-8gmpm\" (UID: \"f9114e81-c2b0-41e4-9e6c-72b4f1198507\") " pod="openshift-insights/insights-runtime-extractor-8gmpm"
Apr 17 11:19:10.883361 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:10.883296 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f9114e81-c2b0-41e4-9e6c-72b4f1198507-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8gmpm\" (UID: \"f9114e81-c2b0-41e4-9e6c-72b4f1198507\") " pod="openshift-insights/insights-runtime-extractor-8gmpm"
Apr 17 11:19:10.883824 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:10.883500 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f9114e81-c2b0-41e4-9e6c-72b4f1198507-crio-socket\") pod \"insights-runtime-extractor-8gmpm\" (UID: \"f9114e81-c2b0-41e4-9e6c-72b4f1198507\") " pod="openshift-insights/insights-runtime-extractor-8gmpm"
Apr 17 11:19:10.883824 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:10.883542 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f9114e81-c2b0-41e4-9e6c-72b4f1198507-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8gmpm\" (UID: \"f9114e81-c2b0-41e4-9e6c-72b4f1198507\") " pod="openshift-insights/insights-runtime-extractor-8gmpm"
Apr 17 11:19:10.883824 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:10.883608 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f9114e81-c2b0-41e4-9e6c-72b4f1198507-data-volume\") pod \"insights-runtime-extractor-8gmpm\" (UID: \"f9114e81-c2b0-41e4-9e6c-72b4f1198507\") " pod="openshift-insights/insights-runtime-extractor-8gmpm"
Apr 17 11:19:10.883824 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:10.883644 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f9114e81-c2b0-41e4-9e6c-72b4f1198507-crio-socket\") pod \"insights-runtime-extractor-8gmpm\" (UID: \"f9114e81-c2b0-41e4-9e6c-72b4f1198507\") " pod="openshift-insights/insights-runtime-extractor-8gmpm"
Apr 17 11:19:10.883996 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:10.883863 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f9114e81-c2b0-41e4-9e6c-72b4f1198507-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8gmpm\" (UID: \"f9114e81-c2b0-41e4-9e6c-72b4f1198507\") " pod="openshift-insights/insights-runtime-extractor-8gmpm"
Apr 17 11:19:10.883996 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:10.883958 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f9114e81-c2b0-41e4-9e6c-72b4f1198507-data-volume\") pod \"insights-runtime-extractor-8gmpm\" (UID: \"f9114e81-c2b0-41e4-9e6c-72b4f1198507\") " pod="openshift-insights/insights-runtime-extractor-8gmpm"
Apr 17 11:19:10.886011 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:10.885991 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f9114e81-c2b0-41e4-9e6c-72b4f1198507-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8gmpm\" (UID: \"f9114e81-c2b0-41e4-9e6c-72b4f1198507\") " pod="openshift-insights/insights-runtime-extractor-8gmpm"
Apr
17 11:19:10.890750 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:10.890720 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5h5l\" (UniqueName: \"kubernetes.io/projected/f9114e81-c2b0-41e4-9e6c-72b4f1198507-kube-api-access-b5h5l\") pod \"insights-runtime-extractor-8gmpm\" (UID: \"f9114e81-c2b0-41e4-9e6c-72b4f1198507\") " pod="openshift-insights/insights-runtime-extractor-8gmpm" Apr 17 11:19:10.916070 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:10.916040 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8gmpm" Apr 17 11:19:11.035115 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:11.035082 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8gmpm"] Apr 17 11:19:11.037921 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:19:11.037892 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9114e81_c2b0_41e4_9e6c_72b4f1198507.slice/crio-e4a4f45a09341c34b7f9aaa78ddabadbbaa3e133a83b3c7c58e220b4c2e3437f WatchSource:0}: Error finding container e4a4f45a09341c34b7f9aaa78ddabadbbaa3e133a83b3c7c58e220b4c2e3437f: Status 404 returned error can't find the container with id e4a4f45a09341c34b7f9aaa78ddabadbbaa3e133a83b3c7c58e220b4c2e3437f Apr 17 11:19:11.838529 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:11.838489 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8gmpm" event={"ID":"f9114e81-c2b0-41e4-9e6c-72b4f1198507","Type":"ContainerStarted","Data":"3b35249e8c55e47922b8eabb1b52bb28e4978733a3fc6c905d1cafd2d5a1b27f"} Apr 17 11:19:11.838529 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:11.838532 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8gmpm" 
event={"ID":"f9114e81-c2b0-41e4-9e6c-72b4f1198507","Type":"ContainerStarted","Data":"473d1f84edc72ba25e9fd6c4dc2b95734819575a9ae29b5c4dfb15e3f44f6d2e"} Apr 17 11:19:11.838728 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:11.838543 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8gmpm" event={"ID":"f9114e81-c2b0-41e4-9e6c-72b4f1198507","Type":"ContainerStarted","Data":"e4a4f45a09341c34b7f9aaa78ddabadbbaa3e133a83b3c7c58e220b4c2e3437f"} Apr 17 11:19:13.490650 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:13.490604 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-gcrf7" Apr 17 11:19:13.490650 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:13.490656 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-gcrf7" Apr 17 11:19:13.491076 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:13.491033 2575 scope.go:117] "RemoveContainer" containerID="7dfce0070821503ab68d25ea47d7b9a2a5aa498fb0f4c52375f04d9c0effb5e9" Apr 17 11:19:13.491223 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:19:13.491203 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-gcrf7_openshift-console-operator(3932e45a-3ab6-40aa-8c2b-48214229c367)\"" pod="openshift-console-operator/console-operator-9d4b6777b-gcrf7" podUID="3932e45a-3ab6-40aa-8c2b-48214229c367" Apr 17 11:19:13.847916 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:13.847881 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8gmpm" event={"ID":"f9114e81-c2b0-41e4-9e6c-72b4f1198507","Type":"ContainerStarted","Data":"c84901ac51aa9d93bff5080bff9aae050bc23ad2ae6bfe753aea612819ed6adc"} Apr 17 
11:19:13.867941 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:13.867889 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-8gmpm" podStartSLOduration=2.043844456 podStartE2EDuration="3.867874179s" podCreationTimestamp="2026-04-17 11:19:10 +0000 UTC" firstStartedPulling="2026-04-17 11:19:11.098378834 +0000 UTC m=+180.268185876" lastFinishedPulling="2026-04-17 11:19:12.922408552 +0000 UTC m=+182.092215599" observedRunningTime="2026-04-17 11:19:13.866972004 +0000 UTC m=+183.036779063" watchObservedRunningTime="2026-04-17 11:19:13.867874179 +0000 UTC m=+183.037681305" Apr 17 11:19:14.820242 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:14.820189 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba013853-d529-46b6-84d3-0e259d87af73-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6mrs8\" (UID: \"ba013853-d529-46b6-84d3-0e259d87af73\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mrs8" Apr 17 11:19:14.822635 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:14.822616 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ba013853-d529-46b6-84d3-0e259d87af73-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-6mrs8\" (UID: \"ba013853-d529-46b6-84d3-0e259d87af73\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mrs8" Apr 17 11:19:15.060087 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:15.060056 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-q4jkh\"" Apr 17 11:19:15.067939 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:15.067912 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mrs8" Apr 17 11:19:15.189109 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:15.189079 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-6mrs8"] Apr 17 11:19:15.191762 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:19:15.191737 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba013853_d529_46b6_84d3_0e259d87af73.slice/crio-9913a4b014f7acfa460a027626616876bed4b01ee7c7423a32faed7bba85b580 WatchSource:0}: Error finding container 9913a4b014f7acfa460a027626616876bed4b01ee7c7423a32faed7bba85b580: Status 404 returned error can't find the container with id 9913a4b014f7acfa460a027626616876bed4b01ee7c7423a32faed7bba85b580 Apr 17 11:19:15.854812 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:15.854777 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mrs8" event={"ID":"ba013853-d529-46b6-84d3-0e259d87af73","Type":"ContainerStarted","Data":"9913a4b014f7acfa460a027626616876bed4b01ee7c7423a32faed7bba85b580"} Apr 17 11:19:16.858755 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:16.858713 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mrs8" event={"ID":"ba013853-d529-46b6-84d3-0e259d87af73","Type":"ContainerStarted","Data":"7e5c2b5c61eb460841e46ad337d52762f60415bdc3f61ea675309e9f48e4ccec"} Apr 17 11:19:16.877277 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:16.877220 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-6mrs8" podStartSLOduration=33.373493945 podStartE2EDuration="34.877204135s" podCreationTimestamp="2026-04-17 11:18:42 +0000 UTC" firstStartedPulling="2026-04-17 11:19:15.193615298 +0000 UTC 
m=+184.363422345" lastFinishedPulling="2026-04-17 11:19:16.697325479 +0000 UTC m=+185.867132535" observedRunningTime="2026-04-17 11:19:16.876472846 +0000 UTC m=+186.046279912" watchObservedRunningTime="2026-04-17 11:19:16.877204135 +0000 UTC m=+186.047011200" Apr 17 11:19:17.346840 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:17.346803 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-n6kjr"] Apr 17 11:19:17.349986 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:17.349968 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-n6kjr" Apr 17 11:19:17.352054 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:17.352031 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-5lhrp\"" Apr 17 11:19:17.352472 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:17.352453 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 17 11:19:17.358584 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:17.358555 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-n6kjr"] Apr 17 11:19:17.543357 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:17.543304 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/4e57b4d6-c19a-42b9-a145-d67b092f51aa-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-n6kjr\" (UID: \"4e57b4d6-c19a-42b9-a145-d67b092f51aa\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-n6kjr" Apr 17 11:19:17.644217 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:17.644133 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/4e57b4d6-c19a-42b9-a145-d67b092f51aa-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-n6kjr\" (UID: \"4e57b4d6-c19a-42b9-a145-d67b092f51aa\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-n6kjr" Apr 17 11:19:17.646771 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:17.646748 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/4e57b4d6-c19a-42b9-a145-d67b092f51aa-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-n6kjr\" (UID: \"4e57b4d6-c19a-42b9-a145-d67b092f51aa\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-n6kjr" Apr 17 11:19:17.659764 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:17.659730 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-n6kjr" Apr 17 11:19:17.775767 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:17.775724 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-n6kjr"] Apr 17 11:19:17.779997 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:19:17.779953 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e57b4d6_c19a_42b9_a145_d67b092f51aa.slice/crio-25f447d714b11a0e09792c51e1506722b405d3c4445bdecf6dbb22c608b35486 WatchSource:0}: Error finding container 25f447d714b11a0e09792c51e1506722b405d3c4445bdecf6dbb22c608b35486: Status 404 returned error can't find the container with id 25f447d714b11a0e09792c51e1506722b405d3c4445bdecf6dbb22c608b35486 Apr 17 11:19:17.861557 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:17.861516 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-n6kjr" event={"ID":"4e57b4d6-c19a-42b9-a145-d67b092f51aa","Type":"ContainerStarted","Data":"25f447d714b11a0e09792c51e1506722b405d3c4445bdecf6dbb22c608b35486"} Apr 17 11:19:18.866202 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:18.866161 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-n6kjr" event={"ID":"4e57b4d6-c19a-42b9-a145-d67b092f51aa","Type":"ContainerStarted","Data":"6e85537ab361a05940f7097e02344112cd8d4edb8108593f6dee03cf222bad87"} Apr 17 11:19:18.866669 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:18.866368 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-n6kjr" Apr 17 11:19:18.867645 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:18.867609 2575 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-57cf98b594-n6kjr container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.132.0.18:8443/healthz\": dial tcp 10.132.0.18:8443: connect: connection refused" start-of-body= Apr 17 11:19:18.867759 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:18.867665 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-n6kjr" podUID="4e57b4d6-c19a-42b9-a145-d67b092f51aa" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.132.0.18:8443/healthz\": dial tcp 10.132.0.18:8443: connect: connection refused" Apr 17 11:19:18.880853 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:18.880807 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-n6kjr" podStartSLOduration=0.880619472 
podStartE2EDuration="1.880794322s" podCreationTimestamp="2026-04-17 11:19:17 +0000 UTC" firstStartedPulling="2026-04-17 11:19:17.781894224 +0000 UTC m=+186.951701268" lastFinishedPulling="2026-04-17 11:19:18.782069071 +0000 UTC m=+187.951876118" observedRunningTime="2026-04-17 11:19:18.879938369 +0000 UTC m=+188.049745435" watchObservedRunningTime="2026-04-17 11:19:18.880794322 +0000 UTC m=+188.050601387" Apr 17 11:19:19.873294 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:19.873263 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-n6kjr" Apr 17 11:19:24.836451 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:24.836412 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-gl5p2"] Apr 17 11:19:24.841025 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:24.841004 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gl5p2" Apr 17 11:19:24.844425 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:24.844394 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 17 11:19:24.844688 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:24.844668 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 11:19:24.845263 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:24.845244 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-457f4\"" Apr 17 11:19:24.847543 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:24.847516 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 17 11:19:24.863906 
ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:24.863872 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-gl5p2"] Apr 17 11:19:24.865553 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:24.865521 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-pdgxw"] Apr 17 11:19:24.869181 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:24.869163 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-pdgxw" Apr 17 11:19:24.872094 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:24.872066 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 11:19:24.872220 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:24.872102 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 11:19:24.872220 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:24.872111 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-2grgd\"" Apr 17 11:19:24.872303 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:24.872270 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 11:19:24.901407 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:24.901360 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/39dab659-2408-4839-a217-e44018fe6d68-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-gl5p2\" (UID: \"39dab659-2408-4839-a217-e44018fe6d68\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gl5p2" Apr 17 11:19:24.901407 
ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:24.901409 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/35d31261-af18-42f2-8ad4-20563e06ef13-node-exporter-wtmp\") pod \"node-exporter-pdgxw\" (UID: \"35d31261-af18-42f2-8ad4-20563e06ef13\") " pod="openshift-monitoring/node-exporter-pdgxw" Apr 17 11:19:24.901643 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:24.901464 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/35d31261-af18-42f2-8ad4-20563e06ef13-node-exporter-tls\") pod \"node-exporter-pdgxw\" (UID: \"35d31261-af18-42f2-8ad4-20563e06ef13\") " pod="openshift-monitoring/node-exporter-pdgxw" Apr 17 11:19:24.901643 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:24.901520 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/35d31261-af18-42f2-8ad4-20563e06ef13-root\") pod \"node-exporter-pdgxw\" (UID: \"35d31261-af18-42f2-8ad4-20563e06ef13\") " pod="openshift-monitoring/node-exporter-pdgxw" Apr 17 11:19:24.901643 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:24.901582 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/35d31261-af18-42f2-8ad4-20563e06ef13-node-exporter-accelerators-collector-config\") pod \"node-exporter-pdgxw\" (UID: \"35d31261-af18-42f2-8ad4-20563e06ef13\") " pod="openshift-monitoring/node-exporter-pdgxw" Apr 17 11:19:24.901643 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:24.901623 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/39dab659-2408-4839-a217-e44018fe6d68-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-gl5p2\" (UID: \"39dab659-2408-4839-a217-e44018fe6d68\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gl5p2" Apr 17 11:19:24.901837 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:24.901665 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4nn6\" (UniqueName: \"kubernetes.io/projected/35d31261-af18-42f2-8ad4-20563e06ef13-kube-api-access-t4nn6\") pod \"node-exporter-pdgxw\" (UID: \"35d31261-af18-42f2-8ad4-20563e06ef13\") " pod="openshift-monitoring/node-exporter-pdgxw" Apr 17 11:19:24.901837 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:24.901717 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/39dab659-2408-4839-a217-e44018fe6d68-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-gl5p2\" (UID: \"39dab659-2408-4839-a217-e44018fe6d68\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gl5p2" Apr 17 11:19:24.901837 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:24.901746 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rftlh\" (UniqueName: \"kubernetes.io/projected/39dab659-2408-4839-a217-e44018fe6d68-kube-api-access-rftlh\") pod \"openshift-state-metrics-9d44df66c-gl5p2\" (UID: \"39dab659-2408-4839-a217-e44018fe6d68\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gl5p2" Apr 17 11:19:24.901837 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:24.901776 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/35d31261-af18-42f2-8ad4-20563e06ef13-sys\") pod \"node-exporter-pdgxw\" (UID: \"35d31261-af18-42f2-8ad4-20563e06ef13\") " 
pod="openshift-monitoring/node-exporter-pdgxw" Apr 17 11:19:24.901837 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:24.901829 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/35d31261-af18-42f2-8ad4-20563e06ef13-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pdgxw\" (UID: \"35d31261-af18-42f2-8ad4-20563e06ef13\") " pod="openshift-monitoring/node-exporter-pdgxw" Apr 17 11:19:24.902075 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:24.901876 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/35d31261-af18-42f2-8ad4-20563e06ef13-metrics-client-ca\") pod \"node-exporter-pdgxw\" (UID: \"35d31261-af18-42f2-8ad4-20563e06ef13\") " pod="openshift-monitoring/node-exporter-pdgxw" Apr 17 11:19:24.902075 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:24.901955 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/35d31261-af18-42f2-8ad4-20563e06ef13-node-exporter-textfile\") pod \"node-exporter-pdgxw\" (UID: \"35d31261-af18-42f2-8ad4-20563e06ef13\") " pod="openshift-monitoring/node-exporter-pdgxw" Apr 17 11:19:24.911691 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:24.911661 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-jlcpd"] Apr 17 11:19:24.915381 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:24.915363 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-jlcpd" Apr 17 11:19:24.918428 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:24.918399 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 17 11:19:24.918428 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:24.918402 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 17 11:19:24.918596 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:24.918440 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 17 11:19:24.919543 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:24.919524 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-ks494\"" Apr 17 11:19:24.929828 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:24.929803 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-jlcpd"] Apr 17 11:19:25.002254 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.002210 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/35d31261-af18-42f2-8ad4-20563e06ef13-node-exporter-textfile\") pod \"node-exporter-pdgxw\" (UID: \"35d31261-af18-42f2-8ad4-20563e06ef13\") " pod="openshift-monitoring/node-exporter-pdgxw" Apr 17 11:19:25.002254 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.002257 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/39dab659-2408-4839-a217-e44018fe6d68-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-gl5p2\" 
(UID: \"39dab659-2408-4839-a217-e44018fe6d68\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gl5p2" Apr 17 11:19:25.002550 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.002280 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/35d31261-af18-42f2-8ad4-20563e06ef13-node-exporter-wtmp\") pod \"node-exporter-pdgxw\" (UID: \"35d31261-af18-42f2-8ad4-20563e06ef13\") " pod="openshift-monitoring/node-exporter-pdgxw" Apr 17 11:19:25.002550 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.002297 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/35d31261-af18-42f2-8ad4-20563e06ef13-node-exporter-tls\") pod \"node-exporter-pdgxw\" (UID: \"35d31261-af18-42f2-8ad4-20563e06ef13\") " pod="openshift-monitoring/node-exporter-pdgxw" Apr 17 11:19:25.002550 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:19:25.002406 2575 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 11:19:25.002550 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.002431 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/687c2dc1-3d57-43d9-ad53-91df7e933033-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-jlcpd\" (UID: \"687c2dc1-3d57-43d9-ad53-91df7e933033\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jlcpd" Apr 17 11:19:25.002550 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:19:25.002479 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35d31261-af18-42f2-8ad4-20563e06ef13-node-exporter-tls podName:35d31261-af18-42f2-8ad4-20563e06ef13 nodeName:}" failed. No retries permitted until 2026-04-17 11:19:25.502459043 +0000 UTC m=+194.672266108 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/35d31261-af18-42f2-8ad4-20563e06ef13-node-exporter-tls") pod "node-exporter-pdgxw" (UID: "35d31261-af18-42f2-8ad4-20563e06ef13") : secret "node-exporter-tls" not found Apr 17 11:19:25.002550 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.002476 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/35d31261-af18-42f2-8ad4-20563e06ef13-node-exporter-wtmp\") pod \"node-exporter-pdgxw\" (UID: \"35d31261-af18-42f2-8ad4-20563e06ef13\") " pod="openshift-monitoring/node-exporter-pdgxw" Apr 17 11:19:25.002550 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.002508 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/35d31261-af18-42f2-8ad4-20563e06ef13-root\") pod \"node-exporter-pdgxw\" (UID: \"35d31261-af18-42f2-8ad4-20563e06ef13\") " pod="openshift-monitoring/node-exporter-pdgxw" Apr 17 11:19:25.002550 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.002536 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/35d31261-af18-42f2-8ad4-20563e06ef13-node-exporter-accelerators-collector-config\") pod \"node-exporter-pdgxw\" (UID: \"35d31261-af18-42f2-8ad4-20563e06ef13\") " pod="openshift-monitoring/node-exporter-pdgxw" Apr 17 11:19:25.002927 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.002567 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/39dab659-2408-4839-a217-e44018fe6d68-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-gl5p2\" (UID: \"39dab659-2408-4839-a217-e44018fe6d68\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gl5p2" Apr 17 11:19:25.002927 
ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.002604 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t4nn6\" (UniqueName: \"kubernetes.io/projected/35d31261-af18-42f2-8ad4-20563e06ef13-kube-api-access-t4nn6\") pod \"node-exporter-pdgxw\" (UID: \"35d31261-af18-42f2-8ad4-20563e06ef13\") " pod="openshift-monitoring/node-exporter-pdgxw" Apr 17 11:19:25.002927 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.002621 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/35d31261-af18-42f2-8ad4-20563e06ef13-root\") pod \"node-exporter-pdgxw\" (UID: \"35d31261-af18-42f2-8ad4-20563e06ef13\") " pod="openshift-monitoring/node-exporter-pdgxw" Apr 17 11:19:25.002927 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.002634 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/39dab659-2408-4839-a217-e44018fe6d68-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-gl5p2\" (UID: \"39dab659-2408-4839-a217-e44018fe6d68\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gl5p2" Apr 17 11:19:25.002927 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.002655 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/35d31261-af18-42f2-8ad4-20563e06ef13-node-exporter-textfile\") pod \"node-exporter-pdgxw\" (UID: \"35d31261-af18-42f2-8ad4-20563e06ef13\") " pod="openshift-monitoring/node-exporter-pdgxw" Apr 17 11:19:25.002927 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.002692 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rftlh\" (UniqueName: \"kubernetes.io/projected/39dab659-2408-4839-a217-e44018fe6d68-kube-api-access-rftlh\") pod \"openshift-state-metrics-9d44df66c-gl5p2\" (UID: 
\"39dab659-2408-4839-a217-e44018fe6d68\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gl5p2" Apr 17 11:19:25.002927 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.002861 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/35d31261-af18-42f2-8ad4-20563e06ef13-sys\") pod \"node-exporter-pdgxw\" (UID: \"35d31261-af18-42f2-8ad4-20563e06ef13\") " pod="openshift-monitoring/node-exporter-pdgxw" Apr 17 11:19:25.002927 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.002918 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/687c2dc1-3d57-43d9-ad53-91df7e933033-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-jlcpd\" (UID: \"687c2dc1-3d57-43d9-ad53-91df7e933033\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jlcpd" Apr 17 11:19:25.003163 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.002962 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/35d31261-af18-42f2-8ad4-20563e06ef13-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pdgxw\" (UID: \"35d31261-af18-42f2-8ad4-20563e06ef13\") " pod="openshift-monitoring/node-exporter-pdgxw" Apr 17 11:19:25.003163 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.002986 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/35d31261-af18-42f2-8ad4-20563e06ef13-sys\") pod \"node-exporter-pdgxw\" (UID: \"35d31261-af18-42f2-8ad4-20563e06ef13\") " pod="openshift-monitoring/node-exporter-pdgxw" Apr 17 11:19:25.003163 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.002993 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/687c2dc1-3d57-43d9-ad53-91df7e933033-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-jlcpd\" (UID: \"687c2dc1-3d57-43d9-ad53-91df7e933033\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jlcpd" Apr 17 11:19:25.003163 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.003037 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gvjf\" (UniqueName: \"kubernetes.io/projected/687c2dc1-3d57-43d9-ad53-91df7e933033-kube-api-access-2gvjf\") pod \"kube-state-metrics-69db897b98-jlcpd\" (UID: \"687c2dc1-3d57-43d9-ad53-91df7e933033\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jlcpd" Apr 17 11:19:25.003163 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.003059 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/35d31261-af18-42f2-8ad4-20563e06ef13-metrics-client-ca\") pod \"node-exporter-pdgxw\" (UID: \"35d31261-af18-42f2-8ad4-20563e06ef13\") " pod="openshift-monitoring/node-exporter-pdgxw" Apr 17 11:19:25.003163 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.003088 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/687c2dc1-3d57-43d9-ad53-91df7e933033-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-jlcpd\" (UID: \"687c2dc1-3d57-43d9-ad53-91df7e933033\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jlcpd" Apr 17 11:19:25.003163 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.003111 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/35d31261-af18-42f2-8ad4-20563e06ef13-node-exporter-accelerators-collector-config\") pod \"node-exporter-pdgxw\" (UID: 
\"35d31261-af18-42f2-8ad4-20563e06ef13\") " pod="openshift-monitoring/node-exporter-pdgxw" Apr 17 11:19:25.003163 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.003146 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/687c2dc1-3d57-43d9-ad53-91df7e933033-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-jlcpd\" (UID: \"687c2dc1-3d57-43d9-ad53-91df7e933033\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jlcpd" Apr 17 11:19:25.003633 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.003611 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/35d31261-af18-42f2-8ad4-20563e06ef13-metrics-client-ca\") pod \"node-exporter-pdgxw\" (UID: \"35d31261-af18-42f2-8ad4-20563e06ef13\") " pod="openshift-monitoring/node-exporter-pdgxw" Apr 17 11:19:25.003942 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.003919 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/39dab659-2408-4839-a217-e44018fe6d68-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-gl5p2\" (UID: \"39dab659-2408-4839-a217-e44018fe6d68\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gl5p2" Apr 17 11:19:25.005685 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.005656 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/35d31261-af18-42f2-8ad4-20563e06ef13-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pdgxw\" (UID: \"35d31261-af18-42f2-8ad4-20563e06ef13\") " pod="openshift-monitoring/node-exporter-pdgxw" Apr 17 11:19:25.005685 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.005673 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/39dab659-2408-4839-a217-e44018fe6d68-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-gl5p2\" (UID: \"39dab659-2408-4839-a217-e44018fe6d68\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gl5p2" Apr 17 11:19:25.005847 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.005699 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/39dab659-2408-4839-a217-e44018fe6d68-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-gl5p2\" (UID: \"39dab659-2408-4839-a217-e44018fe6d68\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gl5p2" Apr 17 11:19:25.021509 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.021470 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rftlh\" (UniqueName: \"kubernetes.io/projected/39dab659-2408-4839-a217-e44018fe6d68-kube-api-access-rftlh\") pod \"openshift-state-metrics-9d44df66c-gl5p2\" (UID: \"39dab659-2408-4839-a217-e44018fe6d68\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gl5p2" Apr 17 11:19:25.021730 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.021707 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4nn6\" (UniqueName: \"kubernetes.io/projected/35d31261-af18-42f2-8ad4-20563e06ef13-kube-api-access-t4nn6\") pod \"node-exporter-pdgxw\" (UID: \"35d31261-af18-42f2-8ad4-20563e06ef13\") " pod="openshift-monitoring/node-exporter-pdgxw" Apr 17 11:19:25.103986 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.103884 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/687c2dc1-3d57-43d9-ad53-91df7e933033-metrics-client-ca\") pod 
\"kube-state-metrics-69db897b98-jlcpd\" (UID: \"687c2dc1-3d57-43d9-ad53-91df7e933033\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jlcpd" Apr 17 11:19:25.103986 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.103943 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/687c2dc1-3d57-43d9-ad53-91df7e933033-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-jlcpd\" (UID: \"687c2dc1-3d57-43d9-ad53-91df7e933033\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jlcpd" Apr 17 11:19:25.104211 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.104054 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2gvjf\" (UniqueName: \"kubernetes.io/projected/687c2dc1-3d57-43d9-ad53-91df7e933033-kube-api-access-2gvjf\") pod \"kube-state-metrics-69db897b98-jlcpd\" (UID: \"687c2dc1-3d57-43d9-ad53-91df7e933033\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jlcpd" Apr 17 11:19:25.104211 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.104108 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/687c2dc1-3d57-43d9-ad53-91df7e933033-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-jlcpd\" (UID: \"687c2dc1-3d57-43d9-ad53-91df7e933033\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jlcpd" Apr 17 11:19:25.104323 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.104210 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/687c2dc1-3d57-43d9-ad53-91df7e933033-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-jlcpd\" (UID: \"687c2dc1-3d57-43d9-ad53-91df7e933033\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-jlcpd" Apr 17 11:19:25.104323 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.104311 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/687c2dc1-3d57-43d9-ad53-91df7e933033-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-jlcpd\" (UID: \"687c2dc1-3d57-43d9-ad53-91df7e933033\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jlcpd" Apr 17 11:19:25.104518 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:19:25.104499 2575 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 17 11:19:25.104606 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:19:25.104576 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/687c2dc1-3d57-43d9-ad53-91df7e933033-kube-state-metrics-tls podName:687c2dc1-3d57-43d9-ad53-91df7e933033 nodeName:}" failed. No retries permitted until 2026-04-17 11:19:25.604556208 +0000 UTC m=+194.774363271 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/687c2dc1-3d57-43d9-ad53-91df7e933033-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-jlcpd" (UID: "687c2dc1-3d57-43d9-ad53-91df7e933033") : secret "kube-state-metrics-tls" not found Apr 17 11:19:25.104723 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.104707 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/687c2dc1-3d57-43d9-ad53-91df7e933033-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-jlcpd\" (UID: \"687c2dc1-3d57-43d9-ad53-91df7e933033\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jlcpd" Apr 17 11:19:25.104790 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.104759 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/687c2dc1-3d57-43d9-ad53-91df7e933033-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-jlcpd\" (UID: \"687c2dc1-3d57-43d9-ad53-91df7e933033\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jlcpd" Apr 17 11:19:25.104984 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.104960 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/687c2dc1-3d57-43d9-ad53-91df7e933033-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-jlcpd\" (UID: \"687c2dc1-3d57-43d9-ad53-91df7e933033\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jlcpd" Apr 17 11:19:25.107306 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.107282 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/687c2dc1-3d57-43d9-ad53-91df7e933033-kube-state-metrics-kube-rbac-proxy-config\") pod 
\"kube-state-metrics-69db897b98-jlcpd\" (UID: \"687c2dc1-3d57-43d9-ad53-91df7e933033\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jlcpd" Apr 17 11:19:25.112873 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.112845 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gvjf\" (UniqueName: \"kubernetes.io/projected/687c2dc1-3d57-43d9-ad53-91df7e933033-kube-api-access-2gvjf\") pod \"kube-state-metrics-69db897b98-jlcpd\" (UID: \"687c2dc1-3d57-43d9-ad53-91df7e933033\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jlcpd" Apr 17 11:19:25.152904 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.152865 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gl5p2" Apr 17 11:19:25.283535 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.283500 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-gl5p2"] Apr 17 11:19:25.286511 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:19:25.286484 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39dab659_2408_4839_a217_e44018fe6d68.slice/crio-d6aa8aca4cda3d7ac013adc034f74a6c5ceabf03c9506c61ef28cafec3413748 WatchSource:0}: Error finding container d6aa8aca4cda3d7ac013adc034f74a6c5ceabf03c9506c61ef28cafec3413748: Status 404 returned error can't find the container with id d6aa8aca4cda3d7ac013adc034f74a6c5ceabf03c9506c61ef28cafec3413748 Apr 17 11:19:25.508635 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.508594 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/35d31261-af18-42f2-8ad4-20563e06ef13-node-exporter-tls\") pod \"node-exporter-pdgxw\" (UID: \"35d31261-af18-42f2-8ad4-20563e06ef13\") " pod="openshift-monitoring/node-exporter-pdgxw" Apr 17 
11:19:25.511034 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.511011 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/35d31261-af18-42f2-8ad4-20563e06ef13-node-exporter-tls\") pod \"node-exporter-pdgxw\" (UID: \"35d31261-af18-42f2-8ad4-20563e06ef13\") " pod="openshift-monitoring/node-exporter-pdgxw" Apr 17 11:19:25.609629 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.609552 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/687c2dc1-3d57-43d9-ad53-91df7e933033-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-jlcpd\" (UID: \"687c2dc1-3d57-43d9-ad53-91df7e933033\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jlcpd" Apr 17 11:19:25.611916 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.611894 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/687c2dc1-3d57-43d9-ad53-91df7e933033-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-jlcpd\" (UID: \"687c2dc1-3d57-43d9-ad53-91df7e933033\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-jlcpd" Apr 17 11:19:25.778666 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.778633 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-pdgxw" Apr 17 11:19:25.788506 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:19:25.788475 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35d31261_af18_42f2_8ad4_20563e06ef13.slice/crio-430803159db93b46a2324eed1c966b404a09893d7cda7a1c10d9ff3de5dc4cf6 WatchSource:0}: Error finding container 430803159db93b46a2324eed1c966b404a09893d7cda7a1c10d9ff3de5dc4cf6: Status 404 returned error can't find the container with id 430803159db93b46a2324eed1c966b404a09893d7cda7a1c10d9ff3de5dc4cf6 Apr 17 11:19:25.826772 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.826736 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-jlcpd" Apr 17 11:19:25.874483 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.874394 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 11:19:25.881315 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.881287 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:19:25.886212 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.884124 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 11:19:25.886212 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.884357 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 17 11:19:25.886212 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.884456 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-zgsh8\"" Apr 17 11:19:25.886212 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.884628 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 17 11:19:25.886212 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.884640 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 17 11:19:25.886212 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.884792 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 17 11:19:25.886212 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.884835 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 17 11:19:25.886212 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.884913 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 11:19:25.886212 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.885011 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 17 11:19:25.886212 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.884796 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 17 11:19:25.895523 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.891810 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 11:19:25.896701 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.896655 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pdgxw" event={"ID":"35d31261-af18-42f2-8ad4-20563e06ef13","Type":"ContainerStarted","Data":"430803159db93b46a2324eed1c966b404a09893d7cda7a1c10d9ff3de5dc4cf6"} Apr 17 11:19:25.899794 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.899654 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gl5p2" event={"ID":"39dab659-2408-4839-a217-e44018fe6d68","Type":"ContainerStarted","Data":"a33866b9d3cd7a9d5ae92f9d0d39eb5084731a6250cda2362cb07f1a1b093038"} Apr 17 11:19:25.899794 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.899693 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gl5p2" event={"ID":"39dab659-2408-4839-a217-e44018fe6d68","Type":"ContainerStarted","Data":"53fdcb155db686f94700ef1bea9aa16610a6c03d4bf20f7ab2e7846ec3539852"} Apr 17 11:19:25.899794 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.899707 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gl5p2" event={"ID":"39dab659-2408-4839-a217-e44018fe6d68","Type":"ContainerStarted","Data":"d6aa8aca4cda3d7ac013adc034f74a6c5ceabf03c9506c61ef28cafec3413748"} Apr 17 11:19:25.912508 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.912464 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8e61ab94-a317-4cf2-92cb-d72cff701f92-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:19:25.912652 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.912515 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:19:25.912652 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.912540 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbdnt\" (UniqueName: \"kubernetes.io/projected/8e61ab94-a317-4cf2-92cb-d72cff701f92-kube-api-access-lbdnt\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:19:25.912652 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.912598 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:19:25.912793 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.912656 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e61ab94-a317-4cf2-92cb-d72cff701f92-metrics-client-ca\") pod 
\"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:19:25.912793 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.912736 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:19:25.912872 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.912796 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:19:25.912872 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.912824 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:19:25.912872 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.912847 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-web-config\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:19:25.912975 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.912875 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-config-volume\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:19:25.912975 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.912913 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8e61ab94-a317-4cf2-92cb-d72cff701f92-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:19:25.912975 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.912953 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8e61ab94-a317-4cf2-92cb-d72cff701f92-config-out\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:19:25.913063 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.913019 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e61ab94-a317-4cf2-92cb-d72cff701f92-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:19:25.969614 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:25.969571 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-jlcpd"] Apr 17 11:19:25.972838 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:19:25.972801 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod687c2dc1_3d57_43d9_ad53_91df7e933033.slice/crio-a7a610006829d0e86bfbe5ef06f2004c8993d3175a7337201dde42d51778c687 WatchSource:0}: Error finding container a7a610006829d0e86bfbe5ef06f2004c8993d3175a7337201dde42d51778c687: Status 404 returned error can't find the container with id a7a610006829d0e86bfbe5ef06f2004c8993d3175a7337201dde42d51778c687 Apr 17 11:19:26.014255 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:26.014208 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-config-volume\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:19:26.014255 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:26.014255 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8e61ab94-a317-4cf2-92cb-d72cff701f92-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:19:26.014512 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:26.014427 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8e61ab94-a317-4cf2-92cb-d72cff701f92-config-out\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:19:26.014512 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:26.014499 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e61ab94-a317-4cf2-92cb-d72cff701f92-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " 
pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:26.014613 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:26.014577 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8e61ab94-a317-4cf2-92cb-d72cff701f92-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:26.014670 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:26.014616 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:26.014670 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:26.014645 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lbdnt\" (UniqueName: \"kubernetes.io/projected/8e61ab94-a317-4cf2-92cb-d72cff701f92-kube-api-access-lbdnt\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:26.014774 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:26.014679 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:26.014774 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:26.014721 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e61ab94-a317-4cf2-92cb-d72cff701f92-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:26.014880 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:26.014780 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:26.014880 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:26.014824 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:26.014880 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:26.014850 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:26.014880 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:26.014875 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-web-config\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:26.016309 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:26.015074 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8e61ab94-a317-4cf2-92cb-d72cff701f92-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:26.016309 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:19:26.015568 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8e61ab94-a317-4cf2-92cb-d72cff701f92-alertmanager-trusted-ca-bundle podName:8e61ab94-a317-4cf2-92cb-d72cff701f92 nodeName:}" failed. No retries permitted until 2026-04-17 11:19:26.515544524 +0000 UTC m=+195.685351569 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/8e61ab94-a317-4cf2-92cb-d72cff701f92-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "8e61ab94-a317-4cf2-92cb-d72cff701f92") : configmap references non-existent config key: ca-bundle.crt
Apr 17 11:19:26.016309 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:26.015825 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e61ab94-a317-4cf2-92cb-d72cff701f92-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:26.016309 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:19:26.015931 2575 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found
Apr 17 11:19:26.016309 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:19:26.016006 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-secret-alertmanager-main-tls podName:8e61ab94-a317-4cf2-92cb-d72cff701f92 nodeName:}" failed. No retries permitted until 2026-04-17 11:19:26.515980622 +0000 UTC m=+195.685787676 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "8e61ab94-a317-4cf2-92cb-d72cff701f92") : secret "alertmanager-main-tls" not found
Apr 17 11:19:26.018158 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:26.018116 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-web-config\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:26.019640 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:26.019615 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8e61ab94-a317-4cf2-92cb-d72cff701f92-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:26.019733 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:26.019648 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:26.020283 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:26.020257 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:26.020979 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:26.020917 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8e61ab94-a317-4cf2-92cb-d72cff701f92-config-out\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:26.021092 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:26.021060 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-config-volume\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:26.022447 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:26.022420 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:26.023298 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:26.023276 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:26.024420 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:26.024370 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbdnt\" (UniqueName: \"kubernetes.io/projected/8e61ab94-a317-4cf2-92cb-d72cff701f92-kube-api-access-lbdnt\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:26.520858 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:26.520811 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:26.521066 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:26.521035 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e61ab94-a317-4cf2-92cb-d72cff701f92-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:26.522016 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:26.521965 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e61ab94-a317-4cf2-92cb-d72cff701f92-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:26.523839 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:26.523816 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:26.799881 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:26.799801 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:19:26.905209 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:26.905165 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-jlcpd" event={"ID":"687c2dc1-3d57-43d9-ad53-91df7e933033","Type":"ContainerStarted","Data":"a7a610006829d0e86bfbe5ef06f2004c8993d3175a7337201dde42d51778c687"}
Apr 17 11:19:26.907537 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:26.907307 2575 generic.go:358] "Generic (PLEG): container finished" podID="35d31261-af18-42f2-8ad4-20563e06ef13" containerID="1be27347054b8d47b31b2a6cca9df2eeb105003fa58bc00bf52adf92dbf3e3cc" exitCode=0
Apr 17 11:19:26.907878 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:26.907477 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pdgxw" event={"ID":"35d31261-af18-42f2-8ad4-20563e06ef13","Type":"ContainerDied","Data":"1be27347054b8d47b31b2a6cca9df2eeb105003fa58bc00bf52adf92dbf3e3cc"}
Apr 17 11:19:26.911485 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:26.911440 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gl5p2" event={"ID":"39dab659-2408-4839-a217-e44018fe6d68","Type":"ContainerStarted","Data":"c6a4485bdafbdd0df829e4b5e030a968f86c2ee53cf69786d7e6facfe03d0167"}
Apr 17 11:19:26.949221 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:26.949173 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gl5p2" podStartSLOduration=1.701276747 podStartE2EDuration="2.949157729s" podCreationTimestamp="2026-04-17 11:19:24 +0000 UTC" firstStartedPulling="2026-04-17 11:19:25.410102713 +0000 UTC m=+194.579909760" lastFinishedPulling="2026-04-17 11:19:26.657983696 +0000 UTC m=+195.827790742" observedRunningTime="2026-04-17 11:19:26.94795631 +0000 UTC m=+196.117763374" watchObservedRunningTime="2026-04-17 11:19:26.949157729 +0000 UTC m=+196.118964793"
Apr 17 11:19:26.965049 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:26.964574 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 11:19:27.263779 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:19:27.263750 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e61ab94_a317_4cf2_92cb_d72cff701f92.slice/crio-66274d7dfcf4947d19a10f15de703e11c22daf81bfa3f30819ef139960c54a7a WatchSource:0}: Error finding container 66274d7dfcf4947d19a10f15de703e11c22daf81bfa3f30819ef139960c54a7a: Status 404 returned error can't find the container with id 66274d7dfcf4947d19a10f15de703e11c22daf81bfa3f30819ef139960c54a7a
Apr 17 11:19:27.916409 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:27.916373 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e61ab94-a317-4cf2-92cb-d72cff701f92","Type":"ContainerStarted","Data":"66274d7dfcf4947d19a10f15de703e11c22daf81bfa3f30819ef139960c54a7a"}
Apr 17 11:19:27.920221 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:27.920157 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pdgxw" event={"ID":"35d31261-af18-42f2-8ad4-20563e06ef13","Type":"ContainerStarted","Data":"42a8e0ab6ad9472e7bdfe75aa4453fdb589dcc5f7a4bb58846284eb9933e90dc"}
Apr 17 11:19:27.920221 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:27.920197 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pdgxw" event={"ID":"35d31261-af18-42f2-8ad4-20563e06ef13","Type":"ContainerStarted","Data":"c0c4222fd6167072408e4358174285be5485545c7180e2f15f958e8182d29887"}
Apr 17 11:19:27.922694 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:27.922661 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-jlcpd" event={"ID":"687c2dc1-3d57-43d9-ad53-91df7e933033","Type":"ContainerStarted","Data":"eb0a6199f36e0251a607cf8fb0b01f05592a1c40eec73bad6179d712f06413af"}
Apr 17 11:19:27.922826 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:27.922700 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-jlcpd" event={"ID":"687c2dc1-3d57-43d9-ad53-91df7e933033","Type":"ContainerStarted","Data":"6d748c3191c534b1f00b13807abfc794cc3ba793086f873885c10fa89821962e"}
Apr 17 11:19:27.922826 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:27.922714 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-jlcpd" event={"ID":"687c2dc1-3d57-43d9-ad53-91df7e933033","Type":"ContainerStarted","Data":"849d7131929c95f634c7558d4653d6f75f26047a947aab2af4b8121fbd39ad65"}
Apr 17 11:19:27.946024 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:27.945961 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-pdgxw" podStartSLOduration=3.076288621 podStartE2EDuration="3.945941198s" podCreationTimestamp="2026-04-17 11:19:24 +0000 UTC" firstStartedPulling="2026-04-17 11:19:25.790534065 +0000 UTC m=+194.960341111" lastFinishedPulling="2026-04-17 11:19:26.660186641 +0000 UTC m=+195.829993688" observedRunningTime="2026-04-17 11:19:27.94382026 +0000 UTC m=+197.113627338" watchObservedRunningTime="2026-04-17 11:19:27.945941198 +0000 UTC m=+197.115748263"
Apr 17 11:19:27.966061 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:27.965999 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-jlcpd" podStartSLOduration=2.630994937 podStartE2EDuration="3.965982786s" podCreationTimestamp="2026-04-17 11:19:24 +0000 UTC" firstStartedPulling="2026-04-17 11:19:25.975293426 +0000 UTC m=+195.145100473" lastFinishedPulling="2026-04-17 11:19:27.310281277 +0000 UTC m=+196.480088322" observedRunningTime="2026-04-17 11:19:27.964238536 +0000 UTC m=+197.134045642" watchObservedRunningTime="2026-04-17 11:19:27.965982786 +0000 UTC m=+197.135789850"
Apr 17 11:19:28.338819 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:28.338793 2575 scope.go:117] "RemoveContainer" containerID="7dfce0070821503ab68d25ea47d7b9a2a5aa498fb0f4c52375f04d9c0effb5e9"
Apr 17 11:19:28.339007 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:19:28.338988 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-gcrf7_openshift-console-operator(3932e45a-3ab6-40aa-8c2b-48214229c367)\"" pod="openshift-console-operator/console-operator-9d4b6777b-gcrf7" podUID="3932e45a-3ab6-40aa-8c2b-48214229c367"
Apr 17 11:19:28.927411 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:28.927333 2575 generic.go:358] "Generic (PLEG): container finished" podID="8e61ab94-a317-4cf2-92cb-d72cff701f92" containerID="4d1233343f5ad61528b6c4e1c6306cf21d29dfe65190dfd34de5dd423979bb6a" exitCode=0
Apr 17 11:19:28.927882 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:28.927446 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e61ab94-a317-4cf2-92cb-d72cff701f92","Type":"ContainerDied","Data":"4d1233343f5ad61528b6c4e1c6306cf21d29dfe65190dfd34de5dd423979bb6a"}
Apr 17 11:19:30.938008 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:30.937973 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e61ab94-a317-4cf2-92cb-d72cff701f92","Type":"ContainerStarted","Data":"e3242fe2901d9cedc1edd50686ea721ff1db523f4075b0fffc7810b81553a3b9"}
Apr 17 11:19:30.938008 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:30.938017 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e61ab94-a317-4cf2-92cb-d72cff701f92","Type":"ContainerStarted","Data":"0d83bcd7551a1581506cbb16ca54a5ac6b39183d0a927a86d4f2c034ea58dea9"}
Apr 17 11:19:30.938542 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:30.938029 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e61ab94-a317-4cf2-92cb-d72cff701f92","Type":"ContainerStarted","Data":"1323112a2cbb8ff0c36b0b07e2ee596079c204b7cd70f2f4f5102fc3d4661e55"}
Apr 17 11:19:30.938542 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:30.938039 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e61ab94-a317-4cf2-92cb-d72cff701f92","Type":"ContainerStarted","Data":"6d7d6c2ceb458fa5deafce7f26b89a03e87acd36dc0ebc4ea8f805a62e24cc44"}
Apr 17 11:19:30.938542 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:30.938049 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e61ab94-a317-4cf2-92cb-d72cff701f92","Type":"ContainerStarted","Data":"8f92f54a7f720761b26fa2e5be69951ed6e545e81226e90a82d2f239bb6b45e9"}
Apr 17 11:19:31.943606 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:31.943571 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e61ab94-a317-4cf2-92cb-d72cff701f92","Type":"ContainerStarted","Data":"cf10d768a4b7b87f2dbd569209342e63a041b8b842c488424fb8dddc57f54d66"}
Apr 17 11:19:31.971794 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:31.971742 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=3.11700693 podStartE2EDuration="6.971729014s" podCreationTimestamp="2026-04-17 11:19:25 +0000 UTC" firstStartedPulling="2026-04-17 11:19:27.265687202 +0000 UTC m=+196.435494249" lastFinishedPulling="2026-04-17 11:19:31.120409284 +0000 UTC m=+200.290216333" observedRunningTime="2026-04-17 11:19:31.969520955 +0000 UTC m=+201.139328017" watchObservedRunningTime="2026-04-17 11:19:31.971729014 +0000 UTC m=+201.141536078"
Apr 17 11:19:33.076980 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:33.076945 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-78b7768865-zkfjk"]
Apr 17 11:19:33.077396 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:19:33.077190 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-78b7768865-zkfjk" podUID="45ae37dd-21dd-4ab9-bf36-4f2523ed8410"
Apr 17 11:19:33.949293 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:33.949262 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-78b7768865-zkfjk"
Apr 17 11:19:33.953542 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:33.953521 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-78b7768865-zkfjk"
Apr 17 11:19:33.995486 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:33.995449 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-image-registry-private-configuration\") pod \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\" (UID: \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\") "
Apr 17 11:19:33.995596 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:33.995514 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-registry-certificates\") pod \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\" (UID: \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\") "
Apr 17 11:19:33.995596 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:33.995535 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-ca-trust-extracted\") pod \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\" (UID: \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\") "
Apr 17 11:19:33.995685 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:33.995654 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-trusted-ca\") pod \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\" (UID: \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\") "
Apr 17 11:19:33.995746 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:33.995727 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-installation-pull-secrets\") pod \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\" (UID: \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\") "
Apr 17 11:19:33.995804 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:33.995775 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztbn5\" (UniqueName: \"kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-kube-api-access-ztbn5\") pod \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\" (UID: \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\") "
Apr 17 11:19:33.995804 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:33.995773 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "45ae37dd-21dd-4ab9-bf36-4f2523ed8410" (UID: "45ae37dd-21dd-4ab9-bf36-4f2523ed8410"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 11:19:33.995906 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:33.995831 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-bound-sa-token\") pod \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\" (UID: \"45ae37dd-21dd-4ab9-bf36-4f2523ed8410\") "
Apr 17 11:19:33.995906 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:33.995885 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "45ae37dd-21dd-4ab9-bf36-4f2523ed8410" (UID: "45ae37dd-21dd-4ab9-bf36-4f2523ed8410"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 11:19:33.996053 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:33.996030 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "45ae37dd-21dd-4ab9-bf36-4f2523ed8410" (UID: "45ae37dd-21dd-4ab9-bf36-4f2523ed8410"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 11:19:33.996161 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:33.996146 2575 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-ca-trust-extracted\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\""
Apr 17 11:19:33.996211 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:33.996169 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-trusted-ca\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\""
Apr 17 11:19:33.996211 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:33.996185 2575 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-registry-certificates\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\""
Apr 17 11:19:33.997993 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:33.997968 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "45ae37dd-21dd-4ab9-bf36-4f2523ed8410" (UID: "45ae37dd-21dd-4ab9-bf36-4f2523ed8410"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 11:19:33.998081 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:33.998024 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-kube-api-access-ztbn5" (OuterVolumeSpecName: "kube-api-access-ztbn5") pod "45ae37dd-21dd-4ab9-bf36-4f2523ed8410" (UID: "45ae37dd-21dd-4ab9-bf36-4f2523ed8410"). InnerVolumeSpecName "kube-api-access-ztbn5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 11:19:33.998178 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:33.998098 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "45ae37dd-21dd-4ab9-bf36-4f2523ed8410" (UID: "45ae37dd-21dd-4ab9-bf36-4f2523ed8410"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 11:19:33.998215 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:33.998181 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "45ae37dd-21dd-4ab9-bf36-4f2523ed8410" (UID: "45ae37dd-21dd-4ab9-bf36-4f2523ed8410"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 11:19:34.096952 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:34.096915 2575 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-installation-pull-secrets\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\""
Apr 17 11:19:34.096952 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:34.096948 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ztbn5\" (UniqueName: \"kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-kube-api-access-ztbn5\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\""
Apr 17 11:19:34.096952 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:34.096958 2575 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-bound-sa-token\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\""
Apr 17 11:19:34.097408 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:34.096973 2575 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-image-registry-private-configuration\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\""
Apr 17 11:19:34.952470 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:34.952441 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-78b7768865-zkfjk"
Apr 17 11:19:34.985581 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:34.985549 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-78b7768865-zkfjk"]
Apr 17 11:19:34.989214 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:34.989185 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-78b7768865-zkfjk"]
Apr 17 11:19:35.108112 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:35.108079 2575 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/45ae37dd-21dd-4ab9-bf36-4f2523ed8410-registry-tls\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\""
Apr 17 11:19:35.341945 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:35.341916 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45ae37dd-21dd-4ab9-bf36-4f2523ed8410" path="/var/lib/kubelet/pods/45ae37dd-21dd-4ab9-bf36-4f2523ed8410/volumes"
Apr 17 11:19:43.339052 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:43.339021 2575 scope.go:117] "RemoveContainer" containerID="7dfce0070821503ab68d25ea47d7b9a2a5aa498fb0f4c52375f04d9c0effb5e9"
Apr 17 11:19:43.848304 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:43.848270 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-vdnzk"]
Apr 17 11:19:43.854643 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:43.854619 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-vdnzk"
Apr 17 11:19:43.856737 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:43.856712 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-5p9zj\""
Apr 17 11:19:43.856737 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:43.856728 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 17 11:19:43.856943 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:43.856781 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 17 11:19:43.863414 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:43.863387 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-vdnzk"]
Apr 17 11:19:43.978873 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:43.978846 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gcrf7_3932e45a-3ab6-40aa-8c2b-48214229c367/console-operator/2.log"
Apr 17 11:19:43.979052 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:43.978912 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-gcrf7" event={"ID":"3932e45a-3ab6-40aa-8c2b-48214229c367","Type":"ContainerStarted","Data":"547156cdc481b2db4b71a9f809f66be865ca781097be1b07b1e170b3fe48b021"}
Apr 17 11:19:43.979210 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:43.979191 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-gcrf7"
Apr 17 11:19:43.979276 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:43.979219 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhgwb\" (UniqueName: \"kubernetes.io/projected/b3744711-cf48-4517-ae8f-f049ff2343f8-kube-api-access-zhgwb\") pod \"downloads-6bcc868b7-vdnzk\" (UID: \"b3744711-cf48-4517-ae8f-f049ff2343f8\") " pod="openshift-console/downloads-6bcc868b7-vdnzk"
Apr 17 11:19:43.984010 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:43.983984 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-gcrf7"
Apr 17 11:19:43.998018 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:43.997971 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-gcrf7" podStartSLOduration=57.285528048 podStartE2EDuration="1m0.997952559s" podCreationTimestamp="2026-04-17 11:18:43 +0000 UTC" firstStartedPulling="2026-04-17 11:18:43.621671045 +0000 UTC m=+152.791478088" lastFinishedPulling="2026-04-17 11:18:47.334095553 +0000 UTC m=+156.503902599" observedRunningTime="2026-04-17 11:19:43.997954 +0000 UTC m=+213.167761087" watchObservedRunningTime="2026-04-17 11:19:43.997952559 +0000 UTC m=+213.167759625"
Apr 17 11:19:44.080374 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:44.080318 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhgwb\" (UniqueName: \"kubernetes.io/projected/b3744711-cf48-4517-ae8f-f049ff2343f8-kube-api-access-zhgwb\") pod \"downloads-6bcc868b7-vdnzk\" (UID: \"b3744711-cf48-4517-ae8f-f049ff2343f8\") " pod="openshift-console/downloads-6bcc868b7-vdnzk"
Apr 17 11:19:44.088826 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:44.088798 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhgwb\" (UniqueName: \"kubernetes.io/projected/b3744711-cf48-4517-ae8f-f049ff2343f8-kube-api-access-zhgwb\") pod \"downloads-6bcc868b7-vdnzk\" (UID: \"b3744711-cf48-4517-ae8f-f049ff2343f8\") " pod="openshift-console/downloads-6bcc868b7-vdnzk"
Apr 17 11:19:44.164311 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:44.164220 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-vdnzk"
Apr 17 11:19:44.285869 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:44.285845 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-vdnzk"]
Apr 17 11:19:44.287508 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:19:44.287481 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3744711_cf48_4517_ae8f_f049ff2343f8.slice/crio-6b0d63cda8c8b0ffa84c0d5ac0f5e2e0bde19c939ac3a797b22538d219d6d373 WatchSource:0}: Error finding container 6b0d63cda8c8b0ffa84c0d5ac0f5e2e0bde19c939ac3a797b22538d219d6d373: Status 404 returned error can't find the container with id 6b0d63cda8c8b0ffa84c0d5ac0f5e2e0bde19c939ac3a797b22538d219d6d373
Apr 17 11:19:44.983179 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:44.983140 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-vdnzk" event={"ID":"b3744711-cf48-4517-ae8f-f049ff2343f8","Type":"ContainerStarted","Data":"6b0d63cda8c8b0ffa84c0d5ac0f5e2e0bde19c939ac3a797b22538d219d6d373"}
Apr 17 11:19:48.779443 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:48.779394 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-86b69cd8b5-lrgbb"]
Apr 17 11:19:48.783610 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:48.783586 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86b69cd8b5-lrgbb"
Apr 17 11:19:48.785869 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:48.785841 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-sxh2p\""
Apr 17 11:19:48.786276 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:48.786177 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 17 11:19:48.786837 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:48.786813 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 17 11:19:48.786837 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:48.786826 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 17 11:19:48.787002 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:48.786885 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 17 11:19:48.787115 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:48.787101 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 17 11:19:48.795993 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:48.795968 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86b69cd8b5-lrgbb"]
Apr 17 11:19:48.924395 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:48.924352 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f7c04e64-e68a-4f6e-b9d3-081eb60fe820-console-serving-cert\") pod \"console-86b69cd8b5-lrgbb\" (UID: \"f7c04e64-e68a-4f6e-b9d3-081eb60fe820\") " pod="openshift-console/console-86b69cd8b5-lrgbb"
Apr 17 11:19:48.924593 ip-10-0-135-81 kubenswrapper[2575]: I0417
11:19:48.924437 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5mk4\" (UniqueName: \"kubernetes.io/projected/f7c04e64-e68a-4f6e-b9d3-081eb60fe820-kube-api-access-p5mk4\") pod \"console-86b69cd8b5-lrgbb\" (UID: \"f7c04e64-e68a-4f6e-b9d3-081eb60fe820\") " pod="openshift-console/console-86b69cd8b5-lrgbb" Apr 17 11:19:48.924593 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:48.924474 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f7c04e64-e68a-4f6e-b9d3-081eb60fe820-oauth-serving-cert\") pod \"console-86b69cd8b5-lrgbb\" (UID: \"f7c04e64-e68a-4f6e-b9d3-081eb60fe820\") " pod="openshift-console/console-86b69cd8b5-lrgbb" Apr 17 11:19:48.924593 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:48.924504 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f7c04e64-e68a-4f6e-b9d3-081eb60fe820-service-ca\") pod \"console-86b69cd8b5-lrgbb\" (UID: \"f7c04e64-e68a-4f6e-b9d3-081eb60fe820\") " pod="openshift-console/console-86b69cd8b5-lrgbb" Apr 17 11:19:48.924728 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:48.924633 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f7c04e64-e68a-4f6e-b9d3-081eb60fe820-console-oauth-config\") pod \"console-86b69cd8b5-lrgbb\" (UID: \"f7c04e64-e68a-4f6e-b9d3-081eb60fe820\") " pod="openshift-console/console-86b69cd8b5-lrgbb" Apr 17 11:19:48.924728 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:48.924663 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f7c04e64-e68a-4f6e-b9d3-081eb60fe820-console-config\") pod \"console-86b69cd8b5-lrgbb\" (UID: 
\"f7c04e64-e68a-4f6e-b9d3-081eb60fe820\") " pod="openshift-console/console-86b69cd8b5-lrgbb" Apr 17 11:19:49.025773 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:49.025731 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f7c04e64-e68a-4f6e-b9d3-081eb60fe820-console-oauth-config\") pod \"console-86b69cd8b5-lrgbb\" (UID: \"f7c04e64-e68a-4f6e-b9d3-081eb60fe820\") " pod="openshift-console/console-86b69cd8b5-lrgbb" Apr 17 11:19:49.025773 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:49.025766 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f7c04e64-e68a-4f6e-b9d3-081eb60fe820-console-config\") pod \"console-86b69cd8b5-lrgbb\" (UID: \"f7c04e64-e68a-4f6e-b9d3-081eb60fe820\") " pod="openshift-console/console-86b69cd8b5-lrgbb" Apr 17 11:19:49.026019 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:49.025821 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f7c04e64-e68a-4f6e-b9d3-081eb60fe820-console-serving-cert\") pod \"console-86b69cd8b5-lrgbb\" (UID: \"f7c04e64-e68a-4f6e-b9d3-081eb60fe820\") " pod="openshift-console/console-86b69cd8b5-lrgbb" Apr 17 11:19:49.026019 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:49.025864 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5mk4\" (UniqueName: \"kubernetes.io/projected/f7c04e64-e68a-4f6e-b9d3-081eb60fe820-kube-api-access-p5mk4\") pod \"console-86b69cd8b5-lrgbb\" (UID: \"f7c04e64-e68a-4f6e-b9d3-081eb60fe820\") " pod="openshift-console/console-86b69cd8b5-lrgbb" Apr 17 11:19:49.026019 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:49.025901 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/f7c04e64-e68a-4f6e-b9d3-081eb60fe820-oauth-serving-cert\") pod \"console-86b69cd8b5-lrgbb\" (UID: \"f7c04e64-e68a-4f6e-b9d3-081eb60fe820\") " pod="openshift-console/console-86b69cd8b5-lrgbb" Apr 17 11:19:49.026019 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:49.025935 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f7c04e64-e68a-4f6e-b9d3-081eb60fe820-service-ca\") pod \"console-86b69cd8b5-lrgbb\" (UID: \"f7c04e64-e68a-4f6e-b9d3-081eb60fe820\") " pod="openshift-console/console-86b69cd8b5-lrgbb" Apr 17 11:19:49.026748 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:49.026695 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f7c04e64-e68a-4f6e-b9d3-081eb60fe820-console-config\") pod \"console-86b69cd8b5-lrgbb\" (UID: \"f7c04e64-e68a-4f6e-b9d3-081eb60fe820\") " pod="openshift-console/console-86b69cd8b5-lrgbb" Apr 17 11:19:49.026895 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:49.026797 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f7c04e64-e68a-4f6e-b9d3-081eb60fe820-oauth-serving-cert\") pod \"console-86b69cd8b5-lrgbb\" (UID: \"f7c04e64-e68a-4f6e-b9d3-081eb60fe820\") " pod="openshift-console/console-86b69cd8b5-lrgbb" Apr 17 11:19:49.026895 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:49.026834 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f7c04e64-e68a-4f6e-b9d3-081eb60fe820-service-ca\") pod \"console-86b69cd8b5-lrgbb\" (UID: \"f7c04e64-e68a-4f6e-b9d3-081eb60fe820\") " pod="openshift-console/console-86b69cd8b5-lrgbb" Apr 17 11:19:49.028691 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:49.028667 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/f7c04e64-e68a-4f6e-b9d3-081eb60fe820-console-oauth-config\") pod \"console-86b69cd8b5-lrgbb\" (UID: \"f7c04e64-e68a-4f6e-b9d3-081eb60fe820\") " pod="openshift-console/console-86b69cd8b5-lrgbb" Apr 17 11:19:49.028883 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:49.028862 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f7c04e64-e68a-4f6e-b9d3-081eb60fe820-console-serving-cert\") pod \"console-86b69cd8b5-lrgbb\" (UID: \"f7c04e64-e68a-4f6e-b9d3-081eb60fe820\") " pod="openshift-console/console-86b69cd8b5-lrgbb" Apr 17 11:19:49.034425 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:49.034365 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5mk4\" (UniqueName: \"kubernetes.io/projected/f7c04e64-e68a-4f6e-b9d3-081eb60fe820-kube-api-access-p5mk4\") pod \"console-86b69cd8b5-lrgbb\" (UID: \"f7c04e64-e68a-4f6e-b9d3-081eb60fe820\") " pod="openshift-console/console-86b69cd8b5-lrgbb" Apr 17 11:19:49.096437 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:49.096401 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-86b69cd8b5-lrgbb" Apr 17 11:19:49.248207 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:49.248165 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86b69cd8b5-lrgbb"] Apr 17 11:19:49.251074 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:19:49.251031 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7c04e64_e68a_4f6e_b9d3_081eb60fe820.slice/crio-3e5a16547c7201c2ec222dd9992a2fafc2dde5b49945fb671b88f187f0574a94 WatchSource:0}: Error finding container 3e5a16547c7201c2ec222dd9992a2fafc2dde5b49945fb671b88f187f0574a94: Status 404 returned error can't find the container with id 3e5a16547c7201c2ec222dd9992a2fafc2dde5b49945fb671b88f187f0574a94 Apr 17 11:19:50.000718 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:50.000676 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86b69cd8b5-lrgbb" event={"ID":"f7c04e64-e68a-4f6e-b9d3-081eb60fe820","Type":"ContainerStarted","Data":"3e5a16547c7201c2ec222dd9992a2fafc2dde5b49945fb671b88f187f0574a94"} Apr 17 11:19:53.012405 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:53.012322 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86b69cd8b5-lrgbb" event={"ID":"f7c04e64-e68a-4f6e-b9d3-081eb60fe820","Type":"ContainerStarted","Data":"86000136648ef045e825e0a1aabcd0fa56841baf1044dfa568ebe5bc97122a72"} Apr 17 11:19:53.035455 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:53.035405 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-86b69cd8b5-lrgbb" podStartSLOduration=1.9685353810000001 podStartE2EDuration="5.03538744s" podCreationTimestamp="2026-04-17 11:19:48 +0000 UTC" firstStartedPulling="2026-04-17 11:19:49.253407517 +0000 UTC m=+218.423214560" lastFinishedPulling="2026-04-17 11:19:52.320259565 +0000 UTC m=+221.490066619" 
observedRunningTime="2026-04-17 11:19:53.034827341 +0000 UTC m=+222.204634406" watchObservedRunningTime="2026-04-17 11:19:53.03538744 +0000 UTC m=+222.205194503" Apr 17 11:19:58.164067 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:58.164029 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7f5979589c-56kth"] Apr 17 11:19:58.168675 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:58.168649 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7f5979589c-56kth" Apr 17 11:19:58.181127 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:58.181086 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f5979589c-56kth"] Apr 17 11:19:58.185778 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:58.185748 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 11:19:58.318857 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:58.318809 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-trusted-ca-bundle\") pod \"console-7f5979589c-56kth\" (UID: \"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb\") " pod="openshift-console/console-7f5979589c-56kth" Apr 17 11:19:58.319048 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:58.319020 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-console-oauth-config\") pod \"console-7f5979589c-56kth\" (UID: \"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb\") " pod="openshift-console/console-7f5979589c-56kth" Apr 17 11:19:58.319118 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:58.319061 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-qvljp\" (UniqueName: \"kubernetes.io/projected/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-kube-api-access-qvljp\") pod \"console-7f5979589c-56kth\" (UID: \"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb\") " pod="openshift-console/console-7f5979589c-56kth" Apr 17 11:19:58.319175 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:58.319111 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-service-ca\") pod \"console-7f5979589c-56kth\" (UID: \"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb\") " pod="openshift-console/console-7f5979589c-56kth" Apr 17 11:19:58.319175 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:58.319147 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-oauth-serving-cert\") pod \"console-7f5979589c-56kth\" (UID: \"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb\") " pod="openshift-console/console-7f5979589c-56kth" Apr 17 11:19:58.319279 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:58.319174 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-console-serving-cert\") pod \"console-7f5979589c-56kth\" (UID: \"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb\") " pod="openshift-console/console-7f5979589c-56kth" Apr 17 11:19:58.319279 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:58.319206 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-console-config\") pod \"console-7f5979589c-56kth\" (UID: \"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb\") " pod="openshift-console/console-7f5979589c-56kth" Apr 17 11:19:58.419999 
ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:58.419912 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-console-oauth-config\") pod \"console-7f5979589c-56kth\" (UID: \"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb\") " pod="openshift-console/console-7f5979589c-56kth" Apr 17 11:19:58.419999 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:58.419958 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qvljp\" (UniqueName: \"kubernetes.io/projected/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-kube-api-access-qvljp\") pod \"console-7f5979589c-56kth\" (UID: \"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb\") " pod="openshift-console/console-7f5979589c-56kth" Apr 17 11:19:58.419999 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:58.419981 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-service-ca\") pod \"console-7f5979589c-56kth\" (UID: \"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb\") " pod="openshift-console/console-7f5979589c-56kth" Apr 17 11:19:58.420265 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:58.420010 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-oauth-serving-cert\") pod \"console-7f5979589c-56kth\" (UID: \"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb\") " pod="openshift-console/console-7f5979589c-56kth" Apr 17 11:19:58.420265 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:58.420037 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-console-serving-cert\") pod \"console-7f5979589c-56kth\" (UID: \"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb\") " 
pod="openshift-console/console-7f5979589c-56kth" Apr 17 11:19:58.420265 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:58.420080 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-console-config\") pod \"console-7f5979589c-56kth\" (UID: \"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb\") " pod="openshift-console/console-7f5979589c-56kth" Apr 17 11:19:58.420982 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:58.420463 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-trusted-ca-bundle\") pod \"console-7f5979589c-56kth\" (UID: \"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb\") " pod="openshift-console/console-7f5979589c-56kth" Apr 17 11:19:58.420982 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:58.420811 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-service-ca\") pod \"console-7f5979589c-56kth\" (UID: \"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb\") " pod="openshift-console/console-7f5979589c-56kth" Apr 17 11:19:58.420982 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:58.420875 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-console-config\") pod \"console-7f5979589c-56kth\" (UID: \"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb\") " pod="openshift-console/console-7f5979589c-56kth" Apr 17 11:19:58.420982 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:58.420978 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-oauth-serving-cert\") pod \"console-7f5979589c-56kth\" (UID: 
\"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb\") " pod="openshift-console/console-7f5979589c-56kth" Apr 17 11:19:58.421420 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:58.421400 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-trusted-ca-bundle\") pod \"console-7f5979589c-56kth\" (UID: \"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb\") " pod="openshift-console/console-7f5979589c-56kth" Apr 17 11:19:58.422790 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:58.422766 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-console-oauth-config\") pod \"console-7f5979589c-56kth\" (UID: \"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb\") " pod="openshift-console/console-7f5979589c-56kth" Apr 17 11:19:58.423023 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:58.423005 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-console-serving-cert\") pod \"console-7f5979589c-56kth\" (UID: \"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb\") " pod="openshift-console/console-7f5979589c-56kth" Apr 17 11:19:58.434584 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:58.434553 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvljp\" (UniqueName: \"kubernetes.io/projected/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-kube-api-access-qvljp\") pod \"console-7f5979589c-56kth\" (UID: \"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb\") " pod="openshift-console/console-7f5979589c-56kth" Apr 17 11:19:58.484381 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:58.484248 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7f5979589c-56kth" Apr 17 11:19:59.096579 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:59.096535 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-86b69cd8b5-lrgbb" Apr 17 11:19:59.096579 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:59.096588 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-86b69cd8b5-lrgbb" Apr 17 11:19:59.102288 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:19:59.102258 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-86b69cd8b5-lrgbb" Apr 17 11:20:00.039116 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:00.039083 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-86b69cd8b5-lrgbb" Apr 17 11:20:00.886083 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:00.886053 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f5979589c-56kth"] Apr 17 11:20:00.889484 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:20:00.889454 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc12f1ae0_5ee4_49a4_a938_141db0c1ddbb.slice/crio-a6f0cee65872d5c52822ebfa54f7c15cf1d68a6b6709d1206bdfe51a4ae9228f WatchSource:0}: Error finding container a6f0cee65872d5c52822ebfa54f7c15cf1d68a6b6709d1206bdfe51a4ae9228f: Status 404 returned error can't find the container with id a6f0cee65872d5c52822ebfa54f7c15cf1d68a6b6709d1206bdfe51a4ae9228f Apr 17 11:20:01.038400 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:01.038288 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f5979589c-56kth" event={"ID":"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb","Type":"ContainerStarted","Data":"626e3b84eade20e740c8c1fa311f02555ee8d860bfbb95db51b07d9fcfdd79a3"} Apr 17 11:20:01.038400 
ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:01.038332 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f5979589c-56kth" event={"ID":"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb","Type":"ContainerStarted","Data":"a6f0cee65872d5c52822ebfa54f7c15cf1d68a6b6709d1206bdfe51a4ae9228f"} Apr 17 11:20:01.039645 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:01.039615 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-vdnzk" event={"ID":"b3744711-cf48-4517-ae8f-f049ff2343f8","Type":"ContainerStarted","Data":"cbf12a5da5b600bc246e0036f9bed6ef0756b703c40dff7fc5707eb987c6cc95"} Apr 17 11:20:01.039989 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:01.039890 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-vdnzk" Apr 17 11:20:01.041169 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:01.041141 2575 patch_prober.go:28] interesting pod/downloads-6bcc868b7-vdnzk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.132.0.22:8080/\": dial tcp 10.132.0.22:8080: connect: connection refused" start-of-body= Apr 17 11:20:01.041285 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:01.041194 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-6bcc868b7-vdnzk" podUID="b3744711-cf48-4517-ae8f-f049ff2343f8" containerName="download-server" probeResult="failure" output="Get \"http://10.132.0.22:8080/\": dial tcp 10.132.0.22:8080: connect: connection refused" Apr 17 11:20:01.057182 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:01.057127 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7f5979589c-56kth" podStartSLOduration=3.057106342 podStartE2EDuration="3.057106342s" podCreationTimestamp="2026-04-17 11:19:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:20:01.055176083 +0000 UTC m=+230.224983150" watchObservedRunningTime="2026-04-17 11:20:01.057106342 +0000 UTC m=+230.226913405" Apr 17 11:20:01.072413 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:01.072333 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-vdnzk" podStartSLOduration=1.490874143 podStartE2EDuration="18.072313806s" podCreationTimestamp="2026-04-17 11:19:43 +0000 UTC" firstStartedPulling="2026-04-17 11:19:44.289285945 +0000 UTC m=+213.459092990" lastFinishedPulling="2026-04-17 11:20:00.870725594 +0000 UTC m=+230.040532653" observedRunningTime="2026-04-17 11:20:01.071650027 +0000 UTC m=+230.241457093" watchObservedRunningTime="2026-04-17 11:20:01.072313806 +0000 UTC m=+230.242120869" Apr 17 11:20:02.062629 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:02.062593 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-vdnzk" Apr 17 11:20:03.048392 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:03.048282 2575 generic.go:358] "Generic (PLEG): container finished" podID="06179da3-f0cd-4bd1-8c19-8e6e7e41a7be" containerID="45d63c02c840fd246f0ecfc2a4d36b57bdab2a942a0a803513effeb289e8d48b" exitCode=0 Apr 17 11:20:03.048392 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:03.048376 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s55l8" event={"ID":"06179da3-f0cd-4bd1-8c19-8e6e7e41a7be","Type":"ContainerDied","Data":"45d63c02c840fd246f0ecfc2a4d36b57bdab2a942a0a803513effeb289e8d48b"} Apr 17 11:20:03.048812 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:03.048787 2575 scope.go:117] "RemoveContainer" containerID="45d63c02c840fd246f0ecfc2a4d36b57bdab2a942a0a803513effeb289e8d48b" Apr 17 11:20:03.049956 ip-10-0-135-81 kubenswrapper[2575]: I0417 
11:20:03.049930 2575 generic.go:358] "Generic (PLEG): container finished" podID="357b4ca1-1680-47a2-96cf-40460312708f" containerID="d6fb164d6a559927c0adfa6cad500f0ec67204e61c93a59d1817ede3b76b4826" exitCode=0
Apr 17 11:20:03.050048 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:03.050020 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-k76l5" event={"ID":"357b4ca1-1680-47a2-96cf-40460312708f","Type":"ContainerDied","Data":"d6fb164d6a559927c0adfa6cad500f0ec67204e61c93a59d1817ede3b76b4826"}
Apr 17 11:20:03.050458 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:03.050439 2575 scope.go:117] "RemoveContainer" containerID="d6fb164d6a559927c0adfa6cad500f0ec67204e61c93a59d1817ede3b76b4826"
Apr 17 11:20:04.032976 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:04.032947 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8e61ab94-a317-4cf2-92cb-d72cff701f92/init-config-reloader/0.log"
Apr 17 11:20:04.039835 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:04.039790 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8e61ab94-a317-4cf2-92cb-d72cff701f92/alertmanager/0.log"
Apr 17 11:20:04.056333 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:04.056302 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-s55l8" event={"ID":"06179da3-f0cd-4bd1-8c19-8e6e7e41a7be","Type":"ContainerStarted","Data":"c3d70863910d3845c085116f0180767f4eefd6d581d6af450717077075b959a7"}
Apr 17 11:20:04.058435 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:04.058395 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-k76l5" event={"ID":"357b4ca1-1680-47a2-96cf-40460312708f","Type":"ContainerStarted","Data":"4091683a8140c96d1397a299d666c45e5b60afe88792d8cf2c2914ccb87b4373"}
Apr 17 11:20:04.185215 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:04.185182 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8e61ab94-a317-4cf2-92cb-d72cff701f92/config-reloader/0.log"
Apr 17 11:20:04.381505 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:04.381475 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8e61ab94-a317-4cf2-92cb-d72cff701f92/kube-rbac-proxy-web/0.log"
Apr 17 11:20:04.583572 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:04.583543 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8e61ab94-a317-4cf2-92cb-d72cff701f92/kube-rbac-proxy/0.log"
Apr 17 11:20:04.780825 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:04.780749 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8e61ab94-a317-4cf2-92cb-d72cff701f92/kube-rbac-proxy-metric/0.log"
Apr 17 11:20:04.981774 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:04.981749 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8e61ab94-a317-4cf2-92cb-d72cff701f92/prom-label-proxy/0.log"
Apr 17 11:20:05.183030 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:05.182978 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-6mrs8_ba013853-d529-46b6-84d3-0e259d87af73/cluster-monitoring-operator/0.log"
Apr 17 11:20:05.382502 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:05.382443 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-jlcpd_687c2dc1-3d57-43d9-ad53-91df7e933033/kube-state-metrics/0.log"
Apr 17 11:20:05.580174 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:05.580138 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-jlcpd_687c2dc1-3d57-43d9-ad53-91df7e933033/kube-rbac-proxy-main/0.log"
Apr 17 11:20:05.780231 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:05.780198 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-jlcpd_687c2dc1-3d57-43d9-ad53-91df7e933033/kube-rbac-proxy-self/0.log"
Apr 17 11:20:07.580459 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:07.580429 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pdgxw_35d31261-af18-42f2-8ad4-20563e06ef13/init-textfile/0.log"
Apr 17 11:20:07.785844 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:07.785797 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pdgxw_35d31261-af18-42f2-8ad4-20563e06ef13/node-exporter/0.log"
Apr 17 11:20:07.980430 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:07.980356 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pdgxw_35d31261-af18-42f2-8ad4-20563e06ef13/kube-rbac-proxy/0.log"
Apr 17 11:20:08.194641 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:08.194608 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-gl5p2_39dab659-2408-4839-a217-e44018fe6d68/kube-rbac-proxy-main/0.log"
Apr 17 11:20:08.381089 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:08.381058 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-gl5p2_39dab659-2408-4839-a217-e44018fe6d68/kube-rbac-proxy-self/0.log"
Apr 17 11:20:08.484857 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:08.484815 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7f5979589c-56kth"
Apr 17 11:20:08.485061 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:08.484905 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7f5979589c-56kth"
Apr 17 11:20:08.490708 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:08.490678 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7f5979589c-56kth"
Apr 17 11:20:08.581308 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:08.581279 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-gl5p2_39dab659-2408-4839-a217-e44018fe6d68/openshift-state-metrics/0.log"
Apr 17 11:20:09.077607 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:09.077573 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7f5979589c-56kth"
Apr 17 11:20:09.126705 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:09.126673 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-86b69cd8b5-lrgbb"]
Apr 17 11:20:10.583262 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:10.583231 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-n6kjr_4e57b4d6-c19a-42b9-a145-d67b092f51aa/prometheus-operator-admission-webhook/0.log"
Apr 17 11:20:12.182715 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:12.182678 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gcrf7_3932e45a-3ab6-40aa-8c2b-48214229c367/console-operator/2.log"
Apr 17 11:20:12.383175 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:12.383118 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gcrf7_3932e45a-3ab6-40aa-8c2b-48214229c367/console-operator/3.log"
Apr 17 11:20:12.601817 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:12.601790 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7f5979589c-56kth_c12f1ae0-5ee4-49a4-a938-141db0c1ddbb/console/0.log"
Apr 17 11:20:12.781492 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:12.781464 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-86b69cd8b5-lrgbb_f7c04e64-e68a-4f6e-b9d3-081eb60fe820/console/0.log"
Apr 17 11:20:12.992952 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:12.992878 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-vdnzk_b3744711-cf48-4517-ae8f-f049ff2343f8/download-server/0.log"
Apr 17 11:20:14.092827 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:14.092788 2575 generic.go:358] "Generic (PLEG): container finished" podID="77ba66d5-aa2d-4111-aa6d-c1e26b8d296e" containerID="ede125af2806987021cfe5d7e6e12ce812d974f05a7538cc12c73b88dc3ad7d8" exitCode=0
Apr 17 11:20:14.093297 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:14.092851 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2kdcb" event={"ID":"77ba66d5-aa2d-4111-aa6d-c1e26b8d296e","Type":"ContainerDied","Data":"ede125af2806987021cfe5d7e6e12ce812d974f05a7538cc12c73b88dc3ad7d8"}
Apr 17 11:20:14.093297 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:14.093172 2575 scope.go:117] "RemoveContainer" containerID="ede125af2806987021cfe5d7e6e12ce812d974f05a7538cc12c73b88dc3ad7d8"
Apr 17 11:20:15.097691 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:15.097656 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2kdcb" event={"ID":"77ba66d5-aa2d-4111-aa6d-c1e26b8d296e","Type":"ContainerStarted","Data":"56b836e00779b70ccf5106444dccf72413b3cb0cddd3b5d75dcee07f858f311b"}
Apr 17 11:20:23.158736 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:23.158678 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d129dd20-5a5b-4718-8eca-2f10184defe9-metrics-certs\") pod \"network-metrics-daemon-tvn9d\" (UID: \"d129dd20-5a5b-4718-8eca-2f10184defe9\") " pod="openshift-multus/network-metrics-daemon-tvn9d"
Apr 17 11:20:23.161018 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:23.160994 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d129dd20-5a5b-4718-8eca-2f10184defe9-metrics-certs\") pod \"network-metrics-daemon-tvn9d\" (UID: \"d129dd20-5a5b-4718-8eca-2f10184defe9\") " pod="openshift-multus/network-metrics-daemon-tvn9d"
Apr 17 11:20:23.441553 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:23.441473 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ntppc\""
Apr 17 11:20:23.449431 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:23.449408 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tvn9d"
Apr 17 11:20:23.570973 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:23.570808 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tvn9d"]
Apr 17 11:20:23.573158 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:20:23.573130 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd129dd20_5a5b_4718_8eca_2f10184defe9.slice/crio-aaa65719582b018a86cdcccbabe7cc9aab77232c523ea6917d2e806eb5030375 WatchSource:0}: Error finding container aaa65719582b018a86cdcccbabe7cc9aab77232c523ea6917d2e806eb5030375: Status 404 returned error can't find the container with id aaa65719582b018a86cdcccbabe7cc9aab77232c523ea6917d2e806eb5030375
Apr 17 11:20:24.126688 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:24.126632 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tvn9d" event={"ID":"d129dd20-5a5b-4718-8eca-2f10184defe9","Type":"ContainerStarted","Data":"aaa65719582b018a86cdcccbabe7cc9aab77232c523ea6917d2e806eb5030375"}
Apr 17 11:20:25.130837 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:25.130804 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tvn9d" event={"ID":"d129dd20-5a5b-4718-8eca-2f10184defe9","Type":"ContainerStarted","Data":"87e7449ec5044b6a634a7905d3cc1c5484e89dfef4bf038dceeae770b6246448"}
Apr 17 11:20:25.130837 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:25.130840 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tvn9d" event={"ID":"d129dd20-5a5b-4718-8eca-2f10184defe9","Type":"ContainerStarted","Data":"0bb256fcc7ee3067f9fdd70130e22b1c036eb63b10be06929b921d8566d1e2f8"}
Apr 17 11:20:25.164073 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:25.164022 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-tvn9d" podStartSLOduration=253.194170653 podStartE2EDuration="4m14.16400644s" podCreationTimestamp="2026-04-17 11:16:11 +0000 UTC" firstStartedPulling="2026-04-17 11:20:23.575121433 +0000 UTC m=+252.744928479" lastFinishedPulling="2026-04-17 11:20:24.544957209 +0000 UTC m=+253.714764266" observedRunningTime="2026-04-17 11:20:25.163439163 +0000 UTC m=+254.333246228" watchObservedRunningTime="2026-04-17 11:20:25.16400644 +0000 UTC m=+254.333813505"
Apr 17 11:20:34.150152 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:34.150022 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-86b69cd8b5-lrgbb" podUID="f7c04e64-e68a-4f6e-b9d3-081eb60fe820" containerName="console" containerID="cri-o://86000136648ef045e825e0a1aabcd0fa56841baf1044dfa568ebe5bc97122a72" gracePeriod=15
Apr 17 11:20:34.397885 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:34.397858 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-86b69cd8b5-lrgbb_f7c04e64-e68a-4f6e-b9d3-081eb60fe820/console/0.log"
Apr 17 11:20:34.398038 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:34.397929 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86b69cd8b5-lrgbb"
Apr 17 11:20:34.559900 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:34.559815 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f7c04e64-e68a-4f6e-b9d3-081eb60fe820-oauth-serving-cert\") pod \"f7c04e64-e68a-4f6e-b9d3-081eb60fe820\" (UID: \"f7c04e64-e68a-4f6e-b9d3-081eb60fe820\") "
Apr 17 11:20:34.559900 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:34.559877 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f7c04e64-e68a-4f6e-b9d3-081eb60fe820-service-ca\") pod \"f7c04e64-e68a-4f6e-b9d3-081eb60fe820\" (UID: \"f7c04e64-e68a-4f6e-b9d3-081eb60fe820\") "
Apr 17 11:20:34.560115 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:34.559916 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f7c04e64-e68a-4f6e-b9d3-081eb60fe820-console-config\") pod \"f7c04e64-e68a-4f6e-b9d3-081eb60fe820\" (UID: \"f7c04e64-e68a-4f6e-b9d3-081eb60fe820\") "
Apr 17 11:20:34.560115 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:34.559939 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f7c04e64-e68a-4f6e-b9d3-081eb60fe820-console-serving-cert\") pod \"f7c04e64-e68a-4f6e-b9d3-081eb60fe820\" (UID: \"f7c04e64-e68a-4f6e-b9d3-081eb60fe820\") "
Apr 17 11:20:34.560115 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:34.559964 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5mk4\" (UniqueName: \"kubernetes.io/projected/f7c04e64-e68a-4f6e-b9d3-081eb60fe820-kube-api-access-p5mk4\") pod \"f7c04e64-e68a-4f6e-b9d3-081eb60fe820\" (UID: \"f7c04e64-e68a-4f6e-b9d3-081eb60fe820\") "
Apr 17 11:20:34.560115 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:34.559993 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f7c04e64-e68a-4f6e-b9d3-081eb60fe820-console-oauth-config\") pod \"f7c04e64-e68a-4f6e-b9d3-081eb60fe820\" (UID: \"f7c04e64-e68a-4f6e-b9d3-081eb60fe820\") "
Apr 17 11:20:34.560378 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:34.560284 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7c04e64-e68a-4f6e-b9d3-081eb60fe820-service-ca" (OuterVolumeSpecName: "service-ca") pod "f7c04e64-e68a-4f6e-b9d3-081eb60fe820" (UID: "f7c04e64-e68a-4f6e-b9d3-081eb60fe820"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 11:20:34.560378 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:34.560290 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7c04e64-e68a-4f6e-b9d3-081eb60fe820-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f7c04e64-e68a-4f6e-b9d3-081eb60fe820" (UID: "f7c04e64-e68a-4f6e-b9d3-081eb60fe820"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 11:20:34.560494 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:34.560387 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7c04e64-e68a-4f6e-b9d3-081eb60fe820-console-config" (OuterVolumeSpecName: "console-config") pod "f7c04e64-e68a-4f6e-b9d3-081eb60fe820" (UID: "f7c04e64-e68a-4f6e-b9d3-081eb60fe820"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 11:20:34.562405 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:34.562376 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7c04e64-e68a-4f6e-b9d3-081eb60fe820-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f7c04e64-e68a-4f6e-b9d3-081eb60fe820" (UID: "f7c04e64-e68a-4f6e-b9d3-081eb60fe820"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 11:20:34.562520 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:34.562396 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7c04e64-e68a-4f6e-b9d3-081eb60fe820-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f7c04e64-e68a-4f6e-b9d3-081eb60fe820" (UID: "f7c04e64-e68a-4f6e-b9d3-081eb60fe820"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 11:20:34.562520 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:34.562488 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7c04e64-e68a-4f6e-b9d3-081eb60fe820-kube-api-access-p5mk4" (OuterVolumeSpecName: "kube-api-access-p5mk4") pod "f7c04e64-e68a-4f6e-b9d3-081eb60fe820" (UID: "f7c04e64-e68a-4f6e-b9d3-081eb60fe820"). InnerVolumeSpecName "kube-api-access-p5mk4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 11:20:34.661476 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:34.661432 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f7c04e64-e68a-4f6e-b9d3-081eb60fe820-console-serving-cert\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\""
Apr 17 11:20:34.661476 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:34.661468 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p5mk4\" (UniqueName: \"kubernetes.io/projected/f7c04e64-e68a-4f6e-b9d3-081eb60fe820-kube-api-access-p5mk4\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\""
Apr 17 11:20:34.661476 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:34.661481 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f7c04e64-e68a-4f6e-b9d3-081eb60fe820-console-oauth-config\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\""
Apr 17 11:20:34.661714 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:34.661495 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f7c04e64-e68a-4f6e-b9d3-081eb60fe820-oauth-serving-cert\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\""
Apr 17 11:20:34.661714 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:34.661509 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f7c04e64-e68a-4f6e-b9d3-081eb60fe820-service-ca\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\""
Apr 17 11:20:34.661714 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:34.661523 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f7c04e64-e68a-4f6e-b9d3-081eb60fe820-console-config\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\""
Apr 17 11:20:35.163647 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:35.163620 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-86b69cd8b5-lrgbb_f7c04e64-e68a-4f6e-b9d3-081eb60fe820/console/0.log"
Apr 17 11:20:35.164122 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:35.163660 2575 generic.go:358] "Generic (PLEG): container finished" podID="f7c04e64-e68a-4f6e-b9d3-081eb60fe820" containerID="86000136648ef045e825e0a1aabcd0fa56841baf1044dfa568ebe5bc97122a72" exitCode=2
Apr 17 11:20:35.164122 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:35.163738 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86b69cd8b5-lrgbb"
Apr 17 11:20:35.164122 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:35.163748 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86b69cd8b5-lrgbb" event={"ID":"f7c04e64-e68a-4f6e-b9d3-081eb60fe820","Type":"ContainerDied","Data":"86000136648ef045e825e0a1aabcd0fa56841baf1044dfa568ebe5bc97122a72"}
Apr 17 11:20:35.164122 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:35.163786 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86b69cd8b5-lrgbb" event={"ID":"f7c04e64-e68a-4f6e-b9d3-081eb60fe820","Type":"ContainerDied","Data":"3e5a16547c7201c2ec222dd9992a2fafc2dde5b49945fb671b88f187f0574a94"}
Apr 17 11:20:35.164122 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:35.163804 2575 scope.go:117] "RemoveContainer" containerID="86000136648ef045e825e0a1aabcd0fa56841baf1044dfa568ebe5bc97122a72"
Apr 17 11:20:35.172471 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:35.172448 2575 scope.go:117] "RemoveContainer" containerID="86000136648ef045e825e0a1aabcd0fa56841baf1044dfa568ebe5bc97122a72"
Apr 17 11:20:35.172848 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:20:35.172823 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86000136648ef045e825e0a1aabcd0fa56841baf1044dfa568ebe5bc97122a72\": container with ID starting with 86000136648ef045e825e0a1aabcd0fa56841baf1044dfa568ebe5bc97122a72 not found: ID does not exist" containerID="86000136648ef045e825e0a1aabcd0fa56841baf1044dfa568ebe5bc97122a72"
Apr 17 11:20:35.172944 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:35.172856 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86000136648ef045e825e0a1aabcd0fa56841baf1044dfa568ebe5bc97122a72"} err="failed to get container status \"86000136648ef045e825e0a1aabcd0fa56841baf1044dfa568ebe5bc97122a72\": rpc error: code = NotFound desc = could not find container \"86000136648ef045e825e0a1aabcd0fa56841baf1044dfa568ebe5bc97122a72\": container with ID starting with 86000136648ef045e825e0a1aabcd0fa56841baf1044dfa568ebe5bc97122a72 not found: ID does not exist"
Apr 17 11:20:35.185286 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:35.185254 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-86b69cd8b5-lrgbb"]
Apr 17 11:20:35.191575 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:35.191541 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-86b69cd8b5-lrgbb"]
Apr 17 11:20:35.342150 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:35.342107 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7c04e64-e68a-4f6e-b9d3-081eb60fe820" path="/var/lib/kubelet/pods/f7c04e64-e68a-4f6e-b9d3-081eb60fe820/volumes"
Apr 17 11:20:45.329289 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:45.329254 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 11:20:45.329817 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:45.329741 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8e61ab94-a317-4cf2-92cb-d72cff701f92" containerName="alertmanager" containerID="cri-o://8f92f54a7f720761b26fa2e5be69951ed6e545e81226e90a82d2f239bb6b45e9" gracePeriod=120
Apr 17 11:20:45.329956 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:45.329812 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8e61ab94-a317-4cf2-92cb-d72cff701f92" containerName="kube-rbac-proxy-metric" containerID="cri-o://e3242fe2901d9cedc1edd50686ea721ff1db523f4075b0fffc7810b81553a3b9" gracePeriod=120
Apr 17 11:20:45.329956 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:45.329849 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8e61ab94-a317-4cf2-92cb-d72cff701f92" containerName="kube-rbac-proxy" containerID="cri-o://0d83bcd7551a1581506cbb16ca54a5ac6b39183d0a927a86d4f2c034ea58dea9" gracePeriod=120
Apr 17 11:20:45.329956 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:45.329860 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8e61ab94-a317-4cf2-92cb-d72cff701f92" containerName="prom-label-proxy" containerID="cri-o://cf10d768a4b7b87f2dbd569209342e63a041b8b842c488424fb8dddc57f54d66" gracePeriod=120
Apr 17 11:20:45.329956 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:45.329812 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8e61ab94-a317-4cf2-92cb-d72cff701f92" containerName="kube-rbac-proxy-web" containerID="cri-o://1323112a2cbb8ff0c36b0b07e2ee596079c204b7cd70f2f4f5102fc3d4661e55" gracePeriod=120
Apr 17 11:20:45.330162 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:45.329984 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8e61ab94-a317-4cf2-92cb-d72cff701f92" containerName="config-reloader" containerID="cri-o://6d7d6c2ceb458fa5deafce7f26b89a03e87acd36dc0ebc4ea8f805a62e24cc44" gracePeriod=120
Apr 17 11:20:46.201949 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.201911 2575 generic.go:358] "Generic (PLEG): container finished" podID="8e61ab94-a317-4cf2-92cb-d72cff701f92" containerID="cf10d768a4b7b87f2dbd569209342e63a041b8b842c488424fb8dddc57f54d66" exitCode=0
Apr 17 11:20:46.201949 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.201941 2575 generic.go:358] "Generic (PLEG): container finished" podID="8e61ab94-a317-4cf2-92cb-d72cff701f92" containerID="0d83bcd7551a1581506cbb16ca54a5ac6b39183d0a927a86d4f2c034ea58dea9" exitCode=0
Apr 17 11:20:46.201949 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.201947 2575 generic.go:358] "Generic (PLEG): container finished" podID="8e61ab94-a317-4cf2-92cb-d72cff701f92" containerID="6d7d6c2ceb458fa5deafce7f26b89a03e87acd36dc0ebc4ea8f805a62e24cc44" exitCode=0
Apr 17 11:20:46.201949 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.201952 2575 generic.go:358] "Generic (PLEG): container finished" podID="8e61ab94-a317-4cf2-92cb-d72cff701f92" containerID="8f92f54a7f720761b26fa2e5be69951ed6e545e81226e90a82d2f239bb6b45e9" exitCode=0
Apr 17 11:20:46.202239 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.201985 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e61ab94-a317-4cf2-92cb-d72cff701f92","Type":"ContainerDied","Data":"cf10d768a4b7b87f2dbd569209342e63a041b8b842c488424fb8dddc57f54d66"}
Apr 17 11:20:46.202239 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.202023 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e61ab94-a317-4cf2-92cb-d72cff701f92","Type":"ContainerDied","Data":"0d83bcd7551a1581506cbb16ca54a5ac6b39183d0a927a86d4f2c034ea58dea9"}
Apr 17 11:20:46.202239 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.202036 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e61ab94-a317-4cf2-92cb-d72cff701f92","Type":"ContainerDied","Data":"6d7d6c2ceb458fa5deafce7f26b89a03e87acd36dc0ebc4ea8f805a62e24cc44"}
Apr 17 11:20:46.202239 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.202048 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e61ab94-a317-4cf2-92cb-d72cff701f92","Type":"ContainerDied","Data":"8f92f54a7f720761b26fa2e5be69951ed6e545e81226e90a82d2f239bb6b45e9"}
Apr 17 11:20:46.580070 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.580042 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:46.663504 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.663464 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-secret-alertmanager-kube-rbac-proxy\") pod \"8e61ab94-a317-4cf2-92cb-d72cff701f92\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") "
Apr 17 11:20:46.663504 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.663504 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-cluster-tls-config\") pod \"8e61ab94-a317-4cf2-92cb-d72cff701f92\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") "
Apr 17 11:20:46.663750 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.663524 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e61ab94-a317-4cf2-92cb-d72cff701f92-alertmanager-trusted-ca-bundle\") pod \"8e61ab94-a317-4cf2-92cb-d72cff701f92\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") "
Apr 17 11:20:46.663750 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.663568 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-secret-alertmanager-main-tls\") pod \"8e61ab94-a317-4cf2-92cb-d72cff701f92\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") "
Apr 17 11:20:46.663750 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.663599 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbdnt\" (UniqueName: \"kubernetes.io/projected/8e61ab94-a317-4cf2-92cb-d72cff701f92-kube-api-access-lbdnt\") pod \"8e61ab94-a317-4cf2-92cb-d72cff701f92\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") "
Apr 17 11:20:46.663750 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.663616 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8e61ab94-a317-4cf2-92cb-d72cff701f92-config-out\") pod \"8e61ab94-a317-4cf2-92cb-d72cff701f92\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") "
Apr 17 11:20:46.663750 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.663658 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-web-config\") pod \"8e61ab94-a317-4cf2-92cb-d72cff701f92\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") "
Apr 17 11:20:46.663750 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.663680 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-config-volume\") pod \"8e61ab94-a317-4cf2-92cb-d72cff701f92\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") "
Apr 17 11:20:46.663750 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.663704 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-secret-alertmanager-kube-rbac-proxy-web\") pod \"8e61ab94-a317-4cf2-92cb-d72cff701f92\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") "
Apr 17 11:20:46.663750 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.663730 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8e61ab94-a317-4cf2-92cb-d72cff701f92-tls-assets\") pod \"8e61ab94-a317-4cf2-92cb-d72cff701f92\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") "
Apr 17 11:20:46.664230 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.663769 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e61ab94-a317-4cf2-92cb-d72cff701f92-metrics-client-ca\") pod \"8e61ab94-a317-4cf2-92cb-d72cff701f92\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") "
Apr 17 11:20:46.664230 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.663797 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8e61ab94-a317-4cf2-92cb-d72cff701f92-alertmanager-main-db\") pod \"8e61ab94-a317-4cf2-92cb-d72cff701f92\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") "
Apr 17 11:20:46.664230 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.663831 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-secret-alertmanager-kube-rbac-proxy-metric\") pod \"8e61ab94-a317-4cf2-92cb-d72cff701f92\" (UID: \"8e61ab94-a317-4cf2-92cb-d72cff701f92\") "
Apr 17 11:20:46.664230 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.663953 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e61ab94-a317-4cf2-92cb-d72cff701f92-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "8e61ab94-a317-4cf2-92cb-d72cff701f92" (UID: "8e61ab94-a317-4cf2-92cb-d72cff701f92"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 11:20:46.664230 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.664153 2575 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e61ab94-a317-4cf2-92cb-d72cff701f92-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\""
Apr 17 11:20:46.666555 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.666406 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e61ab94-a317-4cf2-92cb-d72cff701f92-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "8e61ab94-a317-4cf2-92cb-d72cff701f92" (UID: "8e61ab94-a317-4cf2-92cb-d72cff701f92"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 11:20:46.666837 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.666727 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e61ab94-a317-4cf2-92cb-d72cff701f92-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "8e61ab94-a317-4cf2-92cb-d72cff701f92" (UID: "8e61ab94-a317-4cf2-92cb-d72cff701f92"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 11:20:46.666837 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.666809 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "8e61ab94-a317-4cf2-92cb-d72cff701f92" (UID: "8e61ab94-a317-4cf2-92cb-d72cff701f92"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 11:20:46.667769 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.667723 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e61ab94-a317-4cf2-92cb-d72cff701f92-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "8e61ab94-a317-4cf2-92cb-d72cff701f92" (UID: "8e61ab94-a317-4cf2-92cb-d72cff701f92"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 11:20:46.668415 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.668387 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "8e61ab94-a317-4cf2-92cb-d72cff701f92" (UID: "8e61ab94-a317-4cf2-92cb-d72cff701f92"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 11:20:46.668887 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.668838 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e61ab94-a317-4cf2-92cb-d72cff701f92-kube-api-access-lbdnt" (OuterVolumeSpecName: "kube-api-access-lbdnt") pod "8e61ab94-a317-4cf2-92cb-d72cff701f92" (UID: "8e61ab94-a317-4cf2-92cb-d72cff701f92"). InnerVolumeSpecName "kube-api-access-lbdnt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 11:20:46.668887 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.668842 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "8e61ab94-a317-4cf2-92cb-d72cff701f92" (UID: "8e61ab94-a317-4cf2-92cb-d72cff701f92"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 11:20:46.669265 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.669223 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-config-volume" (OuterVolumeSpecName: "config-volume") pod "8e61ab94-a317-4cf2-92cb-d72cff701f92" (UID: "8e61ab94-a317-4cf2-92cb-d72cff701f92"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 11:20:46.669365 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.669304 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "8e61ab94-a317-4cf2-92cb-d72cff701f92" (UID: "8e61ab94-a317-4cf2-92cb-d72cff701f92"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 11:20:46.669852 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.669833 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e61ab94-a317-4cf2-92cb-d72cff701f92-config-out" (OuterVolumeSpecName: "config-out") pod "8e61ab94-a317-4cf2-92cb-d72cff701f92" (UID: "8e61ab94-a317-4cf2-92cb-d72cff701f92"). InnerVolumeSpecName "config-out".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 11:20:46.672315 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.672255 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "8e61ab94-a317-4cf2-92cb-d72cff701f92" (UID: "8e61ab94-a317-4cf2-92cb-d72cff701f92"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:20:46.678736 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.678704 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-web-config" (OuterVolumeSpecName: "web-config") pod "8e61ab94-a317-4cf2-92cb-d72cff701f92" (UID: "8e61ab94-a317-4cf2-92cb-d72cff701f92"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:20:46.765602 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.765505 2575 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-web-config\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\"" Apr 17 11:20:46.765602 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.765549 2575 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-config-volume\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\"" Apr 17 11:20:46.765602 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.765559 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\"" Apr 17 11:20:46.765602 ip-10-0-135-81 kubenswrapper[2575]: I0417 
11:20:46.765568 2575 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8e61ab94-a317-4cf2-92cb-d72cff701f92-tls-assets\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\"" Apr 17 11:20:46.765602 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.765579 2575 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e61ab94-a317-4cf2-92cb-d72cff701f92-metrics-client-ca\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\"" Apr 17 11:20:46.765602 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.765588 2575 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8e61ab94-a317-4cf2-92cb-d72cff701f92-alertmanager-main-db\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\"" Apr 17 11:20:46.765602 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.765597 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\"" Apr 17 11:20:46.765602 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.765607 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\"" Apr 17 11:20:46.765602 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.765616 2575 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-cluster-tls-config\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\"" Apr 17 11:20:46.765993 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.765626 2575 
reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8e61ab94-a317-4cf2-92cb-d72cff701f92-secret-alertmanager-main-tls\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\"" Apr 17 11:20:46.765993 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.765634 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lbdnt\" (UniqueName: \"kubernetes.io/projected/8e61ab94-a317-4cf2-92cb-d72cff701f92-kube-api-access-lbdnt\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\"" Apr 17 11:20:46.765993 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:46.765644 2575 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8e61ab94-a317-4cf2-92cb-d72cff701f92-config-out\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\"" Apr 17 11:20:47.208040 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.208007 2575 generic.go:358] "Generic (PLEG): container finished" podID="8e61ab94-a317-4cf2-92cb-d72cff701f92" containerID="e3242fe2901d9cedc1edd50686ea721ff1db523f4075b0fffc7810b81553a3b9" exitCode=0 Apr 17 11:20:47.208040 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.208035 2575 generic.go:358] "Generic (PLEG): container finished" podID="8e61ab94-a317-4cf2-92cb-d72cff701f92" containerID="1323112a2cbb8ff0c36b0b07e2ee596079c204b7cd70f2f4f5102fc3d4661e55" exitCode=0 Apr 17 11:20:47.208256 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.208092 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e61ab94-a317-4cf2-92cb-d72cff701f92","Type":"ContainerDied","Data":"e3242fe2901d9cedc1edd50686ea721ff1db523f4075b0fffc7810b81553a3b9"} Apr 17 11:20:47.208256 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.208116 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 11:20:47.208256 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.208126 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e61ab94-a317-4cf2-92cb-d72cff701f92","Type":"ContainerDied","Data":"1323112a2cbb8ff0c36b0b07e2ee596079c204b7cd70f2f4f5102fc3d4661e55"} Apr 17 11:20:47.208256 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.208139 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e61ab94-a317-4cf2-92cb-d72cff701f92","Type":"ContainerDied","Data":"66274d7dfcf4947d19a10f15de703e11c22daf81bfa3f30819ef139960c54a7a"} Apr 17 11:20:47.208256 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.208156 2575 scope.go:117] "RemoveContainer" containerID="cf10d768a4b7b87f2dbd569209342e63a041b8b842c488424fb8dddc57f54d66" Apr 17 11:20:47.216421 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.216258 2575 scope.go:117] "RemoveContainer" containerID="e3242fe2901d9cedc1edd50686ea721ff1db523f4075b0fffc7810b81553a3b9" Apr 17 11:20:47.224131 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.224112 2575 scope.go:117] "RemoveContainer" containerID="0d83bcd7551a1581506cbb16ca54a5ac6b39183d0a927a86d4f2c034ea58dea9" Apr 17 11:20:47.231135 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.231115 2575 scope.go:117] "RemoveContainer" containerID="1323112a2cbb8ff0c36b0b07e2ee596079c204b7cd70f2f4f5102fc3d4661e55" Apr 17 11:20:47.238383 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.238353 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 11:20:47.239153 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.239133 2575 scope.go:117] "RemoveContainer" containerID="6d7d6c2ceb458fa5deafce7f26b89a03e87acd36dc0ebc4ea8f805a62e24cc44" Apr 17 11:20:47.246011 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.245986 2575 
kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 11:20:47.246909 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.246892 2575 scope.go:117] "RemoveContainer" containerID="8f92f54a7f720761b26fa2e5be69951ed6e545e81226e90a82d2f239bb6b45e9" Apr 17 11:20:47.254485 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.254464 2575 scope.go:117] "RemoveContainer" containerID="4d1233343f5ad61528b6c4e1c6306cf21d29dfe65190dfd34de5dd423979bb6a" Apr 17 11:20:47.262371 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.262331 2575 scope.go:117] "RemoveContainer" containerID="cf10d768a4b7b87f2dbd569209342e63a041b8b842c488424fb8dddc57f54d66" Apr 17 11:20:47.262675 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:20:47.262646 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf10d768a4b7b87f2dbd569209342e63a041b8b842c488424fb8dddc57f54d66\": container with ID starting with cf10d768a4b7b87f2dbd569209342e63a041b8b842c488424fb8dddc57f54d66 not found: ID does not exist" containerID="cf10d768a4b7b87f2dbd569209342e63a041b8b842c488424fb8dddc57f54d66" Apr 17 11:20:47.262761 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.262689 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf10d768a4b7b87f2dbd569209342e63a041b8b842c488424fb8dddc57f54d66"} err="failed to get container status \"cf10d768a4b7b87f2dbd569209342e63a041b8b842c488424fb8dddc57f54d66\": rpc error: code = NotFound desc = could not find container \"cf10d768a4b7b87f2dbd569209342e63a041b8b842c488424fb8dddc57f54d66\": container with ID starting with cf10d768a4b7b87f2dbd569209342e63a041b8b842c488424fb8dddc57f54d66 not found: ID does not exist" Apr 17 11:20:47.262761 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.262716 2575 scope.go:117] "RemoveContainer" containerID="e3242fe2901d9cedc1edd50686ea721ff1db523f4075b0fffc7810b81553a3b9" Apr 17 
11:20:47.262991 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:20:47.262967 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3242fe2901d9cedc1edd50686ea721ff1db523f4075b0fffc7810b81553a3b9\": container with ID starting with e3242fe2901d9cedc1edd50686ea721ff1db523f4075b0fffc7810b81553a3b9 not found: ID does not exist" containerID="e3242fe2901d9cedc1edd50686ea721ff1db523f4075b0fffc7810b81553a3b9" Apr 17 11:20:47.263037 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.262997 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3242fe2901d9cedc1edd50686ea721ff1db523f4075b0fffc7810b81553a3b9"} err="failed to get container status \"e3242fe2901d9cedc1edd50686ea721ff1db523f4075b0fffc7810b81553a3b9\": rpc error: code = NotFound desc = could not find container \"e3242fe2901d9cedc1edd50686ea721ff1db523f4075b0fffc7810b81553a3b9\": container with ID starting with e3242fe2901d9cedc1edd50686ea721ff1db523f4075b0fffc7810b81553a3b9 not found: ID does not exist" Apr 17 11:20:47.263037 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.263014 2575 scope.go:117] "RemoveContainer" containerID="0d83bcd7551a1581506cbb16ca54a5ac6b39183d0a927a86d4f2c034ea58dea9" Apr 17 11:20:47.263225 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:20:47.263208 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d83bcd7551a1581506cbb16ca54a5ac6b39183d0a927a86d4f2c034ea58dea9\": container with ID starting with 0d83bcd7551a1581506cbb16ca54a5ac6b39183d0a927a86d4f2c034ea58dea9 not found: ID does not exist" containerID="0d83bcd7551a1581506cbb16ca54a5ac6b39183d0a927a86d4f2c034ea58dea9" Apr 17 11:20:47.263279 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.263229 2575 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0d83bcd7551a1581506cbb16ca54a5ac6b39183d0a927a86d4f2c034ea58dea9"} err="failed to get container status \"0d83bcd7551a1581506cbb16ca54a5ac6b39183d0a927a86d4f2c034ea58dea9\": rpc error: code = NotFound desc = could not find container \"0d83bcd7551a1581506cbb16ca54a5ac6b39183d0a927a86d4f2c034ea58dea9\": container with ID starting with 0d83bcd7551a1581506cbb16ca54a5ac6b39183d0a927a86d4f2c034ea58dea9 not found: ID does not exist" Apr 17 11:20:47.263279 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.263247 2575 scope.go:117] "RemoveContainer" containerID="1323112a2cbb8ff0c36b0b07e2ee596079c204b7cd70f2f4f5102fc3d4661e55" Apr 17 11:20:47.265888 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:20:47.265863 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1323112a2cbb8ff0c36b0b07e2ee596079c204b7cd70f2f4f5102fc3d4661e55\": container with ID starting with 1323112a2cbb8ff0c36b0b07e2ee596079c204b7cd70f2f4f5102fc3d4661e55 not found: ID does not exist" containerID="1323112a2cbb8ff0c36b0b07e2ee596079c204b7cd70f2f4f5102fc3d4661e55" Apr 17 11:20:47.265980 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.265893 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1323112a2cbb8ff0c36b0b07e2ee596079c204b7cd70f2f4f5102fc3d4661e55"} err="failed to get container status \"1323112a2cbb8ff0c36b0b07e2ee596079c204b7cd70f2f4f5102fc3d4661e55\": rpc error: code = NotFound desc = could not find container \"1323112a2cbb8ff0c36b0b07e2ee596079c204b7cd70f2f4f5102fc3d4661e55\": container with ID starting with 1323112a2cbb8ff0c36b0b07e2ee596079c204b7cd70f2f4f5102fc3d4661e55 not found: ID does not exist" Apr 17 11:20:47.265980 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.265916 2575 scope.go:117] "RemoveContainer" containerID="6d7d6c2ceb458fa5deafce7f26b89a03e87acd36dc0ebc4ea8f805a62e24cc44" Apr 17 11:20:47.266429 ip-10-0-135-81 
kubenswrapper[2575]: E0417 11:20:47.266382 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d7d6c2ceb458fa5deafce7f26b89a03e87acd36dc0ebc4ea8f805a62e24cc44\": container with ID starting with 6d7d6c2ceb458fa5deafce7f26b89a03e87acd36dc0ebc4ea8f805a62e24cc44 not found: ID does not exist" containerID="6d7d6c2ceb458fa5deafce7f26b89a03e87acd36dc0ebc4ea8f805a62e24cc44" Apr 17 11:20:47.266429 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.266406 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d7d6c2ceb458fa5deafce7f26b89a03e87acd36dc0ebc4ea8f805a62e24cc44"} err="failed to get container status \"6d7d6c2ceb458fa5deafce7f26b89a03e87acd36dc0ebc4ea8f805a62e24cc44\": rpc error: code = NotFound desc = could not find container \"6d7d6c2ceb458fa5deafce7f26b89a03e87acd36dc0ebc4ea8f805a62e24cc44\": container with ID starting with 6d7d6c2ceb458fa5deafce7f26b89a03e87acd36dc0ebc4ea8f805a62e24cc44 not found: ID does not exist" Apr 17 11:20:47.266429 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.266421 2575 scope.go:117] "RemoveContainer" containerID="8f92f54a7f720761b26fa2e5be69951ed6e545e81226e90a82d2f239bb6b45e9" Apr 17 11:20:47.266702 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:20:47.266687 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f92f54a7f720761b26fa2e5be69951ed6e545e81226e90a82d2f239bb6b45e9\": container with ID starting with 8f92f54a7f720761b26fa2e5be69951ed6e545e81226e90a82d2f239bb6b45e9 not found: ID does not exist" containerID="8f92f54a7f720761b26fa2e5be69951ed6e545e81226e90a82d2f239bb6b45e9" Apr 17 11:20:47.266739 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.266705 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f92f54a7f720761b26fa2e5be69951ed6e545e81226e90a82d2f239bb6b45e9"} err="failed to 
get container status \"8f92f54a7f720761b26fa2e5be69951ed6e545e81226e90a82d2f239bb6b45e9\": rpc error: code = NotFound desc = could not find container \"8f92f54a7f720761b26fa2e5be69951ed6e545e81226e90a82d2f239bb6b45e9\": container with ID starting with 8f92f54a7f720761b26fa2e5be69951ed6e545e81226e90a82d2f239bb6b45e9 not found: ID does not exist" Apr 17 11:20:47.266739 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.266725 2575 scope.go:117] "RemoveContainer" containerID="4d1233343f5ad61528b6c4e1c6306cf21d29dfe65190dfd34de5dd423979bb6a" Apr 17 11:20:47.266950 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:20:47.266934 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d1233343f5ad61528b6c4e1c6306cf21d29dfe65190dfd34de5dd423979bb6a\": container with ID starting with 4d1233343f5ad61528b6c4e1c6306cf21d29dfe65190dfd34de5dd423979bb6a not found: ID does not exist" containerID="4d1233343f5ad61528b6c4e1c6306cf21d29dfe65190dfd34de5dd423979bb6a" Apr 17 11:20:47.267001 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.266953 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d1233343f5ad61528b6c4e1c6306cf21d29dfe65190dfd34de5dd423979bb6a"} err="failed to get container status \"4d1233343f5ad61528b6c4e1c6306cf21d29dfe65190dfd34de5dd423979bb6a\": rpc error: code = NotFound desc = could not find container \"4d1233343f5ad61528b6c4e1c6306cf21d29dfe65190dfd34de5dd423979bb6a\": container with ID starting with 4d1233343f5ad61528b6c4e1c6306cf21d29dfe65190dfd34de5dd423979bb6a not found: ID does not exist" Apr 17 11:20:47.267001 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.266965 2575 scope.go:117] "RemoveContainer" containerID="cf10d768a4b7b87f2dbd569209342e63a041b8b842c488424fb8dddc57f54d66" Apr 17 11:20:47.267167 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.267153 2575 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cf10d768a4b7b87f2dbd569209342e63a041b8b842c488424fb8dddc57f54d66"} err="failed to get container status \"cf10d768a4b7b87f2dbd569209342e63a041b8b842c488424fb8dddc57f54d66\": rpc error: code = NotFound desc = could not find container \"cf10d768a4b7b87f2dbd569209342e63a041b8b842c488424fb8dddc57f54d66\": container with ID starting with cf10d768a4b7b87f2dbd569209342e63a041b8b842c488424fb8dddc57f54d66 not found: ID does not exist" Apr 17 11:20:47.267204 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.267168 2575 scope.go:117] "RemoveContainer" containerID="e3242fe2901d9cedc1edd50686ea721ff1db523f4075b0fffc7810b81553a3b9" Apr 17 11:20:47.267352 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.267325 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3242fe2901d9cedc1edd50686ea721ff1db523f4075b0fffc7810b81553a3b9"} err="failed to get container status \"e3242fe2901d9cedc1edd50686ea721ff1db523f4075b0fffc7810b81553a3b9\": rpc error: code = NotFound desc = could not find container \"e3242fe2901d9cedc1edd50686ea721ff1db523f4075b0fffc7810b81553a3b9\": container with ID starting with e3242fe2901d9cedc1edd50686ea721ff1db523f4075b0fffc7810b81553a3b9 not found: ID does not exist" Apr 17 11:20:47.267394 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.267360 2575 scope.go:117] "RemoveContainer" containerID="0d83bcd7551a1581506cbb16ca54a5ac6b39183d0a927a86d4f2c034ea58dea9" Apr 17 11:20:47.267668 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.267641 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d83bcd7551a1581506cbb16ca54a5ac6b39183d0a927a86d4f2c034ea58dea9"} err="failed to get container status \"0d83bcd7551a1581506cbb16ca54a5ac6b39183d0a927a86d4f2c034ea58dea9\": rpc error: code = NotFound desc = could not find container \"0d83bcd7551a1581506cbb16ca54a5ac6b39183d0a927a86d4f2c034ea58dea9\": container with ID starting with 
0d83bcd7551a1581506cbb16ca54a5ac6b39183d0a927a86d4f2c034ea58dea9 not found: ID does not exist" Apr 17 11:20:47.267668 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.267669 2575 scope.go:117] "RemoveContainer" containerID="1323112a2cbb8ff0c36b0b07e2ee596079c204b7cd70f2f4f5102fc3d4661e55" Apr 17 11:20:47.267930 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.267910 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1323112a2cbb8ff0c36b0b07e2ee596079c204b7cd70f2f4f5102fc3d4661e55"} err="failed to get container status \"1323112a2cbb8ff0c36b0b07e2ee596079c204b7cd70f2f4f5102fc3d4661e55\": rpc error: code = NotFound desc = could not find container \"1323112a2cbb8ff0c36b0b07e2ee596079c204b7cd70f2f4f5102fc3d4661e55\": container with ID starting with 1323112a2cbb8ff0c36b0b07e2ee596079c204b7cd70f2f4f5102fc3d4661e55 not found: ID does not exist" Apr 17 11:20:47.268018 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.267931 2575 scope.go:117] "RemoveContainer" containerID="6d7d6c2ceb458fa5deafce7f26b89a03e87acd36dc0ebc4ea8f805a62e24cc44" Apr 17 11:20:47.268232 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.268207 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d7d6c2ceb458fa5deafce7f26b89a03e87acd36dc0ebc4ea8f805a62e24cc44"} err="failed to get container status \"6d7d6c2ceb458fa5deafce7f26b89a03e87acd36dc0ebc4ea8f805a62e24cc44\": rpc error: code = NotFound desc = could not find container \"6d7d6c2ceb458fa5deafce7f26b89a03e87acd36dc0ebc4ea8f805a62e24cc44\": container with ID starting with 6d7d6c2ceb458fa5deafce7f26b89a03e87acd36dc0ebc4ea8f805a62e24cc44 not found: ID does not exist" Apr 17 11:20:47.268364 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.268232 2575 scope.go:117] "RemoveContainer" containerID="8f92f54a7f720761b26fa2e5be69951ed6e545e81226e90a82d2f239bb6b45e9" Apr 17 11:20:47.268544 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.268522 2575 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f92f54a7f720761b26fa2e5be69951ed6e545e81226e90a82d2f239bb6b45e9"} err="failed to get container status \"8f92f54a7f720761b26fa2e5be69951ed6e545e81226e90a82d2f239bb6b45e9\": rpc error: code = NotFound desc = could not find container \"8f92f54a7f720761b26fa2e5be69951ed6e545e81226e90a82d2f239bb6b45e9\": container with ID starting with 8f92f54a7f720761b26fa2e5be69951ed6e545e81226e90a82d2f239bb6b45e9 not found: ID does not exist" Apr 17 11:20:47.268666 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.268642 2575 scope.go:117] "RemoveContainer" containerID="4d1233343f5ad61528b6c4e1c6306cf21d29dfe65190dfd34de5dd423979bb6a" Apr 17 11:20:47.268976 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.268951 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d1233343f5ad61528b6c4e1c6306cf21d29dfe65190dfd34de5dd423979bb6a"} err="failed to get container status \"4d1233343f5ad61528b6c4e1c6306cf21d29dfe65190dfd34de5dd423979bb6a\": rpc error: code = NotFound desc = could not find container \"4d1233343f5ad61528b6c4e1c6306cf21d29dfe65190dfd34de5dd423979bb6a\": container with ID starting with 4d1233343f5ad61528b6c4e1c6306cf21d29dfe65190dfd34de5dd423979bb6a not found: ID does not exist" Apr 17 11:20:47.308457 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.308417 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 11:20:47.308779 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.308762 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e61ab94-a317-4cf2-92cb-d72cff701f92" containerName="alertmanager" Apr 17 11:20:47.308779 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.308780 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e61ab94-a317-4cf2-92cb-d72cff701f92" containerName="alertmanager" Apr 17 11:20:47.308894 ip-10-0-135-81 
kubenswrapper[2575]: I0417 11:20:47.308797 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e61ab94-a317-4cf2-92cb-d72cff701f92" containerName="init-config-reloader" Apr 17 11:20:47.308894 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.308804 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e61ab94-a317-4cf2-92cb-d72cff701f92" containerName="init-config-reloader" Apr 17 11:20:47.308894 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.308810 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f7c04e64-e68a-4f6e-b9d3-081eb60fe820" containerName="console" Apr 17 11:20:47.308894 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.308816 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7c04e64-e68a-4f6e-b9d3-081eb60fe820" containerName="console" Apr 17 11:20:47.308894 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.308825 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e61ab94-a317-4cf2-92cb-d72cff701f92" containerName="config-reloader" Apr 17 11:20:47.308894 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.308830 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e61ab94-a317-4cf2-92cb-d72cff701f92" containerName="config-reloader" Apr 17 11:20:47.308894 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.308836 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e61ab94-a317-4cf2-92cb-d72cff701f92" containerName="kube-rbac-proxy-web" Apr 17 11:20:47.308894 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.308842 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e61ab94-a317-4cf2-92cb-d72cff701f92" containerName="kube-rbac-proxy-web" Apr 17 11:20:47.308894 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.308853 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e61ab94-a317-4cf2-92cb-d72cff701f92" containerName="kube-rbac-proxy" Apr 17 
11:20:47.308894 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.308858 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e61ab94-a317-4cf2-92cb-d72cff701f92" containerName="kube-rbac-proxy"
Apr 17 11:20:47.308894 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.308864 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e61ab94-a317-4cf2-92cb-d72cff701f92" containerName="prom-label-proxy"
Apr 17 11:20:47.308894 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.308870 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e61ab94-a317-4cf2-92cb-d72cff701f92" containerName="prom-label-proxy"
Apr 17 11:20:47.308894 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.308876 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e61ab94-a317-4cf2-92cb-d72cff701f92" containerName="kube-rbac-proxy-metric"
Apr 17 11:20:47.308894 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.308881 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e61ab94-a317-4cf2-92cb-d72cff701f92" containerName="kube-rbac-proxy-metric"
Apr 17 11:20:47.309281 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.308928 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e61ab94-a317-4cf2-92cb-d72cff701f92" containerName="kube-rbac-proxy-metric"
Apr 17 11:20:47.309281 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.308938 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="f7c04e64-e68a-4f6e-b9d3-081eb60fe820" containerName="console"
Apr 17 11:20:47.309281 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.308945 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e61ab94-a317-4cf2-92cb-d72cff701f92" containerName="alertmanager"
Apr 17 11:20:47.309281 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.308951 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e61ab94-a317-4cf2-92cb-d72cff701f92" containerName="kube-rbac-proxy"
Apr 17 11:20:47.309281 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.308957 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e61ab94-a317-4cf2-92cb-d72cff701f92" containerName="prom-label-proxy"
Apr 17 11:20:47.309281 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.308966 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e61ab94-a317-4cf2-92cb-d72cff701f92" containerName="config-reloader"
Apr 17 11:20:47.309281 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.308972 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e61ab94-a317-4cf2-92cb-d72cff701f92" containerName="kube-rbac-proxy-web"
Apr 17 11:20:47.312537 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.312519 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.315405 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.315381 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 17 11:20:47.315405 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.315385 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 17 11:20:47.315604 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.315589 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 17 11:20:47.315670 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.315649 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 17 11:20:47.315744 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.315722 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 17 11:20:47.315801 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.315759 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 17 11:20:47.316019 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.315997 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 17 11:20:47.316121 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.316050 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 17 11:20:47.316121 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.316087 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-zgsh8\""
Apr 17 11:20:47.322799 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.322778 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 17 11:20:47.330863 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.330835 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 11:20:47.343039 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.343006 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e61ab94-a317-4cf2-92cb-d72cff701f92" path="/var/lib/kubelet/pods/8e61ab94-a317-4cf2-92cb-d72cff701f92/volumes"
Apr 17 11:20:47.371996 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.371957 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a62de1e-19c9-42d2-b22e-7772bf535278-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.371996 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.371995 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8a62de1e-19c9-42d2-b22e-7772bf535278-config-volume\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.372197 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.372017 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8a62de1e-19c9-42d2-b22e-7772bf535278-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.372197 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.372087 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8a62de1e-19c9-42d2-b22e-7772bf535278-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.372197 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.372124 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8a62de1e-19c9-42d2-b22e-7772bf535278-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.372197 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.372142 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8a62de1e-19c9-42d2-b22e-7772bf535278-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.372323 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.372203 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8a62de1e-19c9-42d2-b22e-7772bf535278-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.372323 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.372231 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8a62de1e-19c9-42d2-b22e-7772bf535278-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.372323 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.372300 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8a62de1e-19c9-42d2-b22e-7772bf535278-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.372437 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.372360 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8a62de1e-19c9-42d2-b22e-7772bf535278-web-config\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.372437 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.372377 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8a62de1e-19c9-42d2-b22e-7772bf535278-config-out\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.372437 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.372395 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8a62de1e-19c9-42d2-b22e-7772bf535278-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.372525 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.372437 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b92s2\" (UniqueName: \"kubernetes.io/projected/8a62de1e-19c9-42d2-b22e-7772bf535278-kube-api-access-b92s2\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.473718 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.473632 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8a62de1e-19c9-42d2-b22e-7772bf535278-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.473718 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.473681 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8a62de1e-19c9-42d2-b22e-7772bf535278-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.473718 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.473709 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8a62de1e-19c9-42d2-b22e-7772bf535278-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.474521 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.473773 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8a62de1e-19c9-42d2-b22e-7772bf535278-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.474521 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.473811 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8a62de1e-19c9-42d2-b22e-7772bf535278-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.474521 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.473849 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8a62de1e-19c9-42d2-b22e-7772bf535278-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.474521 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.473891 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8a62de1e-19c9-42d2-b22e-7772bf535278-web-config\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.474521 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.473916 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8a62de1e-19c9-42d2-b22e-7772bf535278-config-out\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.474521 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.473945 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8a62de1e-19c9-42d2-b22e-7772bf535278-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.474521 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.473984 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b92s2\" (UniqueName: \"kubernetes.io/projected/8a62de1e-19c9-42d2-b22e-7772bf535278-kube-api-access-b92s2\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.474521 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.474025 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a62de1e-19c9-42d2-b22e-7772bf535278-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.474521 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.474052 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8a62de1e-19c9-42d2-b22e-7772bf535278-config-volume\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.474521 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.474083 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8a62de1e-19c9-42d2-b22e-7772bf535278-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.474521 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.474209 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8a62de1e-19c9-42d2-b22e-7772bf535278-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.475513 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.475439 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a62de1e-19c9-42d2-b22e-7772bf535278-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.475615 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.475547 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8a62de1e-19c9-42d2-b22e-7772bf535278-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.476971 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.476946 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8a62de1e-19c9-42d2-b22e-7772bf535278-config-out\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.477088 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.477017 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8a62de1e-19c9-42d2-b22e-7772bf535278-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.477611 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.477249 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8a62de1e-19c9-42d2-b22e-7772bf535278-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.477611 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.477529 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8a62de1e-19c9-42d2-b22e-7772bf535278-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.477611 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.477560 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8a62de1e-19c9-42d2-b22e-7772bf535278-web-config\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.477834 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.477768 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8a62de1e-19c9-42d2-b22e-7772bf535278-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.478072 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.478049 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8a62de1e-19c9-42d2-b22e-7772bf535278-config-volume\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.478185 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.478168 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8a62de1e-19c9-42d2-b22e-7772bf535278-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.478864 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.478846 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8a62de1e-19c9-42d2-b22e-7772bf535278-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.483295 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.483269 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b92s2\" (UniqueName: \"kubernetes.io/projected/8a62de1e-19c9-42d2-b22e-7772bf535278-kube-api-access-b92s2\") pod \"alertmanager-main-0\" (UID: \"8a62de1e-19c9-42d2-b22e-7772bf535278\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.622754 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.622709 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 11:20:47.769911 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:47.769874 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 11:20:47.771827 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:20:47.771801 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a62de1e_19c9_42d2_b22e_7772bf535278.slice/crio-88f70c31c9fabcc0f84cc0e25a27f2fecee23b593194a9c95c6fac457373926a WatchSource:0}: Error finding container 88f70c31c9fabcc0f84cc0e25a27f2fecee23b593194a9c95c6fac457373926a: Status 404 returned error can't find the container with id 88f70c31c9fabcc0f84cc0e25a27f2fecee23b593194a9c95c6fac457373926a
Apr 17 11:20:47.815548 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:20:47.814189 2575 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a62de1e_19c9_42d2_b22e_7772bf535278.slice/crio-b35f1e00011ad5096f0fb2448ffa61552f537124072bf3494ebb81490a7b96b6.scope\": RecentStats: unable to find data in memory cache]"
Apr 17 11:20:48.214331 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:48.214296 2575 generic.go:358] "Generic (PLEG): container finished" podID="8a62de1e-19c9-42d2-b22e-7772bf535278" containerID="b35f1e00011ad5096f0fb2448ffa61552f537124072bf3494ebb81490a7b96b6" exitCode=0
Apr 17 11:20:48.214531 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:48.214388 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8a62de1e-19c9-42d2-b22e-7772bf535278","Type":"ContainerDied","Data":"b35f1e00011ad5096f0fb2448ffa61552f537124072bf3494ebb81490a7b96b6"}
Apr 17 11:20:48.214531 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:48.214424 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8a62de1e-19c9-42d2-b22e-7772bf535278","Type":"ContainerStarted","Data":"88f70c31c9fabcc0f84cc0e25a27f2fecee23b593194a9c95c6fac457373926a"}
Apr 17 11:20:49.221708 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.221669 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8a62de1e-19c9-42d2-b22e-7772bf535278","Type":"ContainerStarted","Data":"7907af37994bd3805cfbec238ee5de955837a1d653a3a2139538d4c57264090d"}
Apr 17 11:20:49.221708 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.221713 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8a62de1e-19c9-42d2-b22e-7772bf535278","Type":"ContainerStarted","Data":"ae9acaa31d31a24895405cc20b6707f70f696646695c74625dbeb4ed0c5c7dd8"}
Apr 17 11:20:49.222145 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.221722 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8a62de1e-19c9-42d2-b22e-7772bf535278","Type":"ContainerStarted","Data":"0410286c9f8b5e015ac44c4b961cb9ae192aebed2f44dcbb9a5e320120b56990"}
Apr 17 11:20:49.222145 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.221731 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8a62de1e-19c9-42d2-b22e-7772bf535278","Type":"ContainerStarted","Data":"253570717d013008005b27a56db1d46b38dfeedc5028981dc87c0a036d9ca332"}
Apr 17 11:20:49.222145 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.221742 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8a62de1e-19c9-42d2-b22e-7772bf535278","Type":"ContainerStarted","Data":"66ebbfaad7c18f21ce5dd7739697ee08f57606113840ec75cc4a8270e2fa1a6a"}
Apr 17 11:20:49.222145 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.221754 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8a62de1e-19c9-42d2-b22e-7772bf535278","Type":"ContainerStarted","Data":"88639675b47131ed2ffd61a3f24b9f5ff816bad9f9ab457c4f40c36afd93d08d"}
Apr 17 11:20:49.258468 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.258397 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.25837661 podStartE2EDuration="2.25837661s" podCreationTimestamp="2026-04-17 11:20:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:20:49.25578373 +0000 UTC m=+278.425590795" watchObservedRunningTime="2026-04-17 11:20:49.25837661 +0000 UTC m=+278.428183669"
Apr 17 11:20:49.344157 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.344125 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-7c5d4cd4-kvnpp"]
Apr 17 11:20:49.347767 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.347745 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-7c5d4cd4-kvnpp"
Apr 17 11:20:49.350320 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.350291 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 17 11:20:49.350491 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.350321 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 17 11:20:49.351904 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.351883 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 17 11:20:49.352462 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.352444 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 17 11:20:49.352672 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.352648 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-tnx4f\""
Apr 17 11:20:49.352762 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.352744 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 17 11:20:49.357479 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.357122 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 17 11:20:49.358428 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.358382 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-7c5d4cd4-kvnpp"]
Apr 17 11:20:49.391085 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.391042 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65ddecc9-e83c-4ead-b118-2a8d4c960974-serving-certs-ca-bundle\") pod \"telemeter-client-7c5d4cd4-kvnpp\" (UID: \"65ddecc9-e83c-4ead-b118-2a8d4c960974\") " pod="openshift-monitoring/telemeter-client-7c5d4cd4-kvnpp"
Apr 17 11:20:49.391317 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.391109 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/65ddecc9-e83c-4ead-b118-2a8d4c960974-metrics-client-ca\") pod \"telemeter-client-7c5d4cd4-kvnpp\" (UID: \"65ddecc9-e83c-4ead-b118-2a8d4c960974\") " pod="openshift-monitoring/telemeter-client-7c5d4cd4-kvnpp"
Apr 17 11:20:49.391317 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.391162 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsh4h\" (UniqueName: \"kubernetes.io/projected/65ddecc9-e83c-4ead-b118-2a8d4c960974-kube-api-access-bsh4h\") pod \"telemeter-client-7c5d4cd4-kvnpp\" (UID: \"65ddecc9-e83c-4ead-b118-2a8d4c960974\") " pod="openshift-monitoring/telemeter-client-7c5d4cd4-kvnpp"
Apr 17 11:20:49.391317 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.391214 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/65ddecc9-e83c-4ead-b118-2a8d4c960974-secret-telemeter-client\") pod \"telemeter-client-7c5d4cd4-kvnpp\" (UID: \"65ddecc9-e83c-4ead-b118-2a8d4c960974\") " pod="openshift-monitoring/telemeter-client-7c5d4cd4-kvnpp"
Apr 17 11:20:49.391317 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.391266 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/65ddecc9-e83c-4ead-b118-2a8d4c960974-telemeter-client-tls\") pod \"telemeter-client-7c5d4cd4-kvnpp\" (UID: \"65ddecc9-e83c-4ead-b118-2a8d4c960974\") " pod="openshift-monitoring/telemeter-client-7c5d4cd4-kvnpp"
Apr 17 11:20:49.391317 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.391288 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/65ddecc9-e83c-4ead-b118-2a8d4c960974-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7c5d4cd4-kvnpp\" (UID: \"65ddecc9-e83c-4ead-b118-2a8d4c960974\") " pod="openshift-monitoring/telemeter-client-7c5d4cd4-kvnpp"
Apr 17 11:20:49.391317 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.391324 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/65ddecc9-e83c-4ead-b118-2a8d4c960974-federate-client-tls\") pod \"telemeter-client-7c5d4cd4-kvnpp\" (UID: \"65ddecc9-e83c-4ead-b118-2a8d4c960974\") " pod="openshift-monitoring/telemeter-client-7c5d4cd4-kvnpp"
Apr 17 11:20:49.391665 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.391359 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65ddecc9-e83c-4ead-b118-2a8d4c960974-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7c5d4cd4-kvnpp\" (UID: \"65ddecc9-e83c-4ead-b118-2a8d4c960974\") " pod="openshift-monitoring/telemeter-client-7c5d4cd4-kvnpp"
Apr 17 11:20:49.492552 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.492471 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/65ddecc9-e83c-4ead-b118-2a8d4c960974-metrics-client-ca\") pod \"telemeter-client-7c5d4cd4-kvnpp\" (UID: \"65ddecc9-e83c-4ead-b118-2a8d4c960974\") " pod="openshift-monitoring/telemeter-client-7c5d4cd4-kvnpp"
Apr 17 11:20:49.492552 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.492529 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bsh4h\" (UniqueName: \"kubernetes.io/projected/65ddecc9-e83c-4ead-b118-2a8d4c960974-kube-api-access-bsh4h\") pod \"telemeter-client-7c5d4cd4-kvnpp\" (UID: \"65ddecc9-e83c-4ead-b118-2a8d4c960974\") " pod="openshift-monitoring/telemeter-client-7c5d4cd4-kvnpp"
Apr 17 11:20:49.492730 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.492683 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/65ddecc9-e83c-4ead-b118-2a8d4c960974-secret-telemeter-client\") pod \"telemeter-client-7c5d4cd4-kvnpp\" (UID: \"65ddecc9-e83c-4ead-b118-2a8d4c960974\") " pod="openshift-monitoring/telemeter-client-7c5d4cd4-kvnpp"
Apr 17 11:20:49.492767 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.492740 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/65ddecc9-e83c-4ead-b118-2a8d4c960974-telemeter-client-tls\") pod \"telemeter-client-7c5d4cd4-kvnpp\" (UID: \"65ddecc9-e83c-4ead-b118-2a8d4c960974\") " pod="openshift-monitoring/telemeter-client-7c5d4cd4-kvnpp"
Apr 17 11:20:49.492800 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.492776 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/65ddecc9-e83c-4ead-b118-2a8d4c960974-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7c5d4cd4-kvnpp\" (UID: \"65ddecc9-e83c-4ead-b118-2a8d4c960974\") " pod="openshift-monitoring/telemeter-client-7c5d4cd4-kvnpp"
Apr 17 11:20:49.492833 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.492806 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/65ddecc9-e83c-4ead-b118-2a8d4c960974-federate-client-tls\") pod \"telemeter-client-7c5d4cd4-kvnpp\" (UID: \"65ddecc9-e83c-4ead-b118-2a8d4c960974\") " pod="openshift-monitoring/telemeter-client-7c5d4cd4-kvnpp"
Apr 17 11:20:49.492877 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.492838 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65ddecc9-e83c-4ead-b118-2a8d4c960974-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7c5d4cd4-kvnpp\" (UID: \"65ddecc9-e83c-4ead-b118-2a8d4c960974\") " pod="openshift-monitoring/telemeter-client-7c5d4cd4-kvnpp"
Apr 17 11:20:49.492931 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.492876 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65ddecc9-e83c-4ead-b118-2a8d4c960974-serving-certs-ca-bundle\") pod \"telemeter-client-7c5d4cd4-kvnpp\" (UID: \"65ddecc9-e83c-4ead-b118-2a8d4c960974\") " pod="openshift-monitoring/telemeter-client-7c5d4cd4-kvnpp"
Apr 17 11:20:49.493255 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.493221 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/65ddecc9-e83c-4ead-b118-2a8d4c960974-metrics-client-ca\") pod \"telemeter-client-7c5d4cd4-kvnpp\" (UID: \"65ddecc9-e83c-4ead-b118-2a8d4c960974\") " pod="openshift-monitoring/telemeter-client-7c5d4cd4-kvnpp"
Apr 17 11:20:49.493612 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.493590 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65ddecc9-e83c-4ead-b118-2a8d4c960974-serving-certs-ca-bundle\") pod \"telemeter-client-7c5d4cd4-kvnpp\" (UID: \"65ddecc9-e83c-4ead-b118-2a8d4c960974\") " pod="openshift-monitoring/telemeter-client-7c5d4cd4-kvnpp"
Apr 17 11:20:49.493932 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.493908 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65ddecc9-e83c-4ead-b118-2a8d4c960974-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7c5d4cd4-kvnpp\" (UID: \"65ddecc9-e83c-4ead-b118-2a8d4c960974\") " pod="openshift-monitoring/telemeter-client-7c5d4cd4-kvnpp"
Apr 17 11:20:49.495328 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.495300 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/65ddecc9-e83c-4ead-b118-2a8d4c960974-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7c5d4cd4-kvnpp\" (UID: \"65ddecc9-e83c-4ead-b118-2a8d4c960974\") " pod="openshift-monitoring/telemeter-client-7c5d4cd4-kvnpp"
Apr 17 11:20:49.495461 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.495431 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/65ddecc9-e83c-4ead-b118-2a8d4c960974-secret-telemeter-client\") pod \"telemeter-client-7c5d4cd4-kvnpp\" (UID: \"65ddecc9-e83c-4ead-b118-2a8d4c960974\") " pod="openshift-monitoring/telemeter-client-7c5d4cd4-kvnpp"
Apr 17 11:20:49.495532 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.495513 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/65ddecc9-e83c-4ead-b118-2a8d4c960974-telemeter-client-tls\") pod \"telemeter-client-7c5d4cd4-kvnpp\" (UID: \"65ddecc9-e83c-4ead-b118-2a8d4c960974\") " pod="openshift-monitoring/telemeter-client-7c5d4cd4-kvnpp"
Apr 17 11:20:49.495584 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.495514 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/65ddecc9-e83c-4ead-b118-2a8d4c960974-federate-client-tls\") pod \"telemeter-client-7c5d4cd4-kvnpp\" (UID: \"65ddecc9-e83c-4ead-b118-2a8d4c960974\") " pod="openshift-monitoring/telemeter-client-7c5d4cd4-kvnpp"
Apr 17 11:20:49.500851 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.500832 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsh4h\" (UniqueName: \"kubernetes.io/projected/65ddecc9-e83c-4ead-b118-2a8d4c960974-kube-api-access-bsh4h\") pod \"telemeter-client-7c5d4cd4-kvnpp\" (UID: \"65ddecc9-e83c-4ead-b118-2a8d4c960974\") " pod="openshift-monitoring/telemeter-client-7c5d4cd4-kvnpp"
Apr 17 11:20:49.659804 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.659760 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-7c5d4cd4-kvnpp"
Apr 17 11:20:49.795225 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:49.795196 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-7c5d4cd4-kvnpp"]
Apr 17 11:20:49.797362 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:20:49.797311 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65ddecc9_e83c_4ead_b118_2a8d4c960974.slice/crio-c8d869062936e11405781decfbe0aa3f5c415efb579c1447fead353328d50a53 WatchSource:0}: Error finding container c8d869062936e11405781decfbe0aa3f5c415efb579c1447fead353328d50a53: Status 404 returned error can't find the container with id c8d869062936e11405781decfbe0aa3f5c415efb579c1447fead353328d50a53
Apr 17 11:20:50.226013 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:50.225970 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7c5d4cd4-kvnpp" event={"ID":"65ddecc9-e83c-4ead-b118-2a8d4c960974","Type":"ContainerStarted","Data":"c8d869062936e11405781decfbe0aa3f5c415efb579c1447fead353328d50a53"}
Apr 17
11:20:50.762739 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:20:50.762690 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-ngx5g" podUID="d4bb7b6c-7fd2-4a72-be8b-724128cbea39" Apr 17 11:20:51.229759 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:51.229730 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ngx5g" Apr 17 11:20:52.234647 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:52.234609 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7c5d4cd4-kvnpp" event={"ID":"65ddecc9-e83c-4ead-b118-2a8d4c960974","Type":"ContainerStarted","Data":"5b80b47dce6e865e9e54f41073b2407af924be829849ab181ce6885498c55ce6"} Apr 17 11:20:52.234647 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:52.234651 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7c5d4cd4-kvnpp" event={"ID":"65ddecc9-e83c-4ead-b118-2a8d4c960974","Type":"ContainerStarted","Data":"c7b193867907db27b36140b8027293405bc198de6124e3addb8c5b4326efc589"} Apr 17 11:20:52.235139 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:52.234665 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7c5d4cd4-kvnpp" event={"ID":"65ddecc9-e83c-4ead-b118-2a8d4c960974","Type":"ContainerStarted","Data":"63532fa39b929ce2ef8e45d7f10da1171b960b158e4f0ff09ca82843d67d6fb8"} Apr 17 11:20:52.258099 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:52.258033 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-7c5d4cd4-kvnpp" podStartSLOduration=1.6699417460000001 podStartE2EDuration="3.258018339s" podCreationTimestamp="2026-04-17 11:20:49 +0000 UTC" firstStartedPulling="2026-04-17 11:20:49.799693152 +0000 UTC m=+278.969500195" 
lastFinishedPulling="2026-04-17 11:20:51.387769745 +0000 UTC m=+280.557576788" observedRunningTime="2026-04-17 11:20:52.256115462 +0000 UTC m=+281.425922552" watchObservedRunningTime="2026-04-17 11:20:52.258018339 +0000 UTC m=+281.427825403" Apr 17 11:20:52.948539 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:52.948489 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-65f6977878-kfrmh"] Apr 17 11:20:52.955801 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:52.955769 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65f6977878-kfrmh" Apr 17 11:20:52.977948 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:52.977911 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-65f6977878-kfrmh"] Apr 17 11:20:53.029811 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:53.029770 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b7f210c7-8468-4e8e-abc8-743189fb5c30-console-oauth-config\") pod \"console-65f6977878-kfrmh\" (UID: \"b7f210c7-8468-4e8e-abc8-743189fb5c30\") " pod="openshift-console/console-65f6977878-kfrmh" Apr 17 11:20:53.029982 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:53.029820 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2mg2\" (UniqueName: \"kubernetes.io/projected/b7f210c7-8468-4e8e-abc8-743189fb5c30-kube-api-access-d2mg2\") pod \"console-65f6977878-kfrmh\" (UID: \"b7f210c7-8468-4e8e-abc8-743189fb5c30\") " pod="openshift-console/console-65f6977878-kfrmh" Apr 17 11:20:53.029982 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:53.029879 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7f210c7-8468-4e8e-abc8-743189fb5c30-trusted-ca-bundle\") pod 
\"console-65f6977878-kfrmh\" (UID: \"b7f210c7-8468-4e8e-abc8-743189fb5c30\") " pod="openshift-console/console-65f6977878-kfrmh" Apr 17 11:20:53.029982 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:53.029938 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b7f210c7-8468-4e8e-abc8-743189fb5c30-service-ca\") pod \"console-65f6977878-kfrmh\" (UID: \"b7f210c7-8468-4e8e-abc8-743189fb5c30\") " pod="openshift-console/console-65f6977878-kfrmh" Apr 17 11:20:53.030142 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:53.029984 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b7f210c7-8468-4e8e-abc8-743189fb5c30-console-config\") pod \"console-65f6977878-kfrmh\" (UID: \"b7f210c7-8468-4e8e-abc8-743189fb5c30\") " pod="openshift-console/console-65f6977878-kfrmh" Apr 17 11:20:53.030142 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:53.030044 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7f210c7-8468-4e8e-abc8-743189fb5c30-console-serving-cert\") pod \"console-65f6977878-kfrmh\" (UID: \"b7f210c7-8468-4e8e-abc8-743189fb5c30\") " pod="openshift-console/console-65f6977878-kfrmh" Apr 17 11:20:53.030142 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:53.030097 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b7f210c7-8468-4e8e-abc8-743189fb5c30-oauth-serving-cert\") pod \"console-65f6977878-kfrmh\" (UID: \"b7f210c7-8468-4e8e-abc8-743189fb5c30\") " pod="openshift-console/console-65f6977878-kfrmh" Apr 17 11:20:53.131223 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:53.131186 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b7f210c7-8468-4e8e-abc8-743189fb5c30-console-oauth-config\") pod \"console-65f6977878-kfrmh\" (UID: \"b7f210c7-8468-4e8e-abc8-743189fb5c30\") " pod="openshift-console/console-65f6977878-kfrmh" Apr 17 11:20:53.131483 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:53.131247 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d2mg2\" (UniqueName: \"kubernetes.io/projected/b7f210c7-8468-4e8e-abc8-743189fb5c30-kube-api-access-d2mg2\") pod \"console-65f6977878-kfrmh\" (UID: \"b7f210c7-8468-4e8e-abc8-743189fb5c30\") " pod="openshift-console/console-65f6977878-kfrmh" Apr 17 11:20:53.131483 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:53.131278 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7f210c7-8468-4e8e-abc8-743189fb5c30-trusted-ca-bundle\") pod \"console-65f6977878-kfrmh\" (UID: \"b7f210c7-8468-4e8e-abc8-743189fb5c30\") " pod="openshift-console/console-65f6977878-kfrmh" Apr 17 11:20:53.131483 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:53.131312 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b7f210c7-8468-4e8e-abc8-743189fb5c30-service-ca\") pod \"console-65f6977878-kfrmh\" (UID: \"b7f210c7-8468-4e8e-abc8-743189fb5c30\") " pod="openshift-console/console-65f6977878-kfrmh" Apr 17 11:20:53.131483 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:53.131394 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b7f210c7-8468-4e8e-abc8-743189fb5c30-console-config\") pod \"console-65f6977878-kfrmh\" (UID: \"b7f210c7-8468-4e8e-abc8-743189fb5c30\") " pod="openshift-console/console-65f6977878-kfrmh" Apr 17 11:20:53.131483 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:53.131442 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7f210c7-8468-4e8e-abc8-743189fb5c30-console-serving-cert\") pod \"console-65f6977878-kfrmh\" (UID: \"b7f210c7-8468-4e8e-abc8-743189fb5c30\") " pod="openshift-console/console-65f6977878-kfrmh" Apr 17 11:20:53.131746 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:53.131488 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b7f210c7-8468-4e8e-abc8-743189fb5c30-oauth-serving-cert\") pod \"console-65f6977878-kfrmh\" (UID: \"b7f210c7-8468-4e8e-abc8-743189fb5c30\") " pod="openshift-console/console-65f6977878-kfrmh" Apr 17 11:20:53.132202 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:53.132180 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b7f210c7-8468-4e8e-abc8-743189fb5c30-oauth-serving-cert\") pod \"console-65f6977878-kfrmh\" (UID: \"b7f210c7-8468-4e8e-abc8-743189fb5c30\") " pod="openshift-console/console-65f6977878-kfrmh" Apr 17 11:20:53.132295 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:53.132221 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b7f210c7-8468-4e8e-abc8-743189fb5c30-service-ca\") pod \"console-65f6977878-kfrmh\" (UID: \"b7f210c7-8468-4e8e-abc8-743189fb5c30\") " pod="openshift-console/console-65f6977878-kfrmh" Apr 17 11:20:53.132295 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:53.132251 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b7f210c7-8468-4e8e-abc8-743189fb5c30-console-config\") pod \"console-65f6977878-kfrmh\" (UID: \"b7f210c7-8468-4e8e-abc8-743189fb5c30\") " pod="openshift-console/console-65f6977878-kfrmh" Apr 17 11:20:53.132295 ip-10-0-135-81 
kubenswrapper[2575]: I0417 11:20:53.132272 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7f210c7-8468-4e8e-abc8-743189fb5c30-trusted-ca-bundle\") pod \"console-65f6977878-kfrmh\" (UID: \"b7f210c7-8468-4e8e-abc8-743189fb5c30\") " pod="openshift-console/console-65f6977878-kfrmh" Apr 17 11:20:53.133769 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:53.133749 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b7f210c7-8468-4e8e-abc8-743189fb5c30-console-oauth-config\") pod \"console-65f6977878-kfrmh\" (UID: \"b7f210c7-8468-4e8e-abc8-743189fb5c30\") " pod="openshift-console/console-65f6977878-kfrmh" Apr 17 11:20:53.133928 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:53.133909 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7f210c7-8468-4e8e-abc8-743189fb5c30-console-serving-cert\") pod \"console-65f6977878-kfrmh\" (UID: \"b7f210c7-8468-4e8e-abc8-743189fb5c30\") " pod="openshift-console/console-65f6977878-kfrmh" Apr 17 11:20:53.141774 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:53.141748 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2mg2\" (UniqueName: \"kubernetes.io/projected/b7f210c7-8468-4e8e-abc8-743189fb5c30-kube-api-access-d2mg2\") pod \"console-65f6977878-kfrmh\" (UID: \"b7f210c7-8468-4e8e-abc8-743189fb5c30\") " pod="openshift-console/console-65f6977878-kfrmh" Apr 17 11:20:53.265878 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:53.265784 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-65f6977878-kfrmh" Apr 17 11:20:53.396732 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:53.396698 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-65f6977878-kfrmh"] Apr 17 11:20:53.399967 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:20:53.399935 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7f210c7_8468_4e8e_abc8_743189fb5c30.slice/crio-0ef177ab53df3c26dad839a84a99bb5df7fda8129560f620c05d23672e33e034 WatchSource:0}: Error finding container 0ef177ab53df3c26dad839a84a99bb5df7fda8129560f620c05d23672e33e034: Status 404 returned error can't find the container with id 0ef177ab53df3c26dad839a84a99bb5df7fda8129560f620c05d23672e33e034 Apr 17 11:20:54.142052 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:54.142021 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-metrics-tls\") pod \"dns-default-ngx5g\" (UID: \"d4bb7b6c-7fd2-4a72-be8b-724128cbea39\") " pod="openshift-dns/dns-default-ngx5g" Apr 17 11:20:54.144316 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:54.144291 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d4bb7b6c-7fd2-4a72-be8b-724128cbea39-metrics-tls\") pod \"dns-default-ngx5g\" (UID: \"d4bb7b6c-7fd2-4a72-be8b-724128cbea39\") " pod="openshift-dns/dns-default-ngx5g" Apr 17 11:20:54.233053 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:54.233017 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-v5zg9\"" Apr 17 11:20:54.240761 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:54.240735 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-ngx5g" Apr 17 11:20:54.242477 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:54.242450 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d82faa5-4dae-416a-8e2c-337d3966cdd0-cert\") pod \"ingress-canary-7dssr\" (UID: \"3d82faa5-4dae-416a-8e2c-337d3966cdd0\") " pod="openshift-ingress-canary/ingress-canary-7dssr" Apr 17 11:20:54.242615 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:54.242520 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65f6977878-kfrmh" event={"ID":"b7f210c7-8468-4e8e-abc8-743189fb5c30","Type":"ContainerStarted","Data":"31a4e16c25d900ebdc5b601fb061b60264baa831a06d6918431a9fcc954eb345"} Apr 17 11:20:54.242615 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:54.242553 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65f6977878-kfrmh" event={"ID":"b7f210c7-8468-4e8e-abc8-743189fb5c30","Type":"ContainerStarted","Data":"0ef177ab53df3c26dad839a84a99bb5df7fda8129560f620c05d23672e33e034"} Apr 17 11:20:54.245094 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:54.245064 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3d82faa5-4dae-416a-8e2c-337d3966cdd0-cert\") pod \"ingress-canary-7dssr\" (UID: \"3d82faa5-4dae-416a-8e2c-337d3966cdd0\") " pod="openshift-ingress-canary/ingress-canary-7dssr" Apr 17 11:20:54.259370 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:54.259281 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-65f6977878-kfrmh" podStartSLOduration=2.259259728 podStartE2EDuration="2.259259728s" podCreationTimestamp="2026-04-17 11:20:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:20:54.259044419 +0000 UTC 
m=+283.428851543" watchObservedRunningTime="2026-04-17 11:20:54.259259728 +0000 UTC m=+283.429066794" Apr 17 11:20:54.369028 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:54.369001 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ngx5g"] Apr 17 11:20:54.371261 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:20:54.371226 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4bb7b6c_7fd2_4a72_be8b_724128cbea39.slice/crio-8972654c5ada64b9f6d4a193dbf8875cc8efa119978fcacdbacfee80acce429d WatchSource:0}: Error finding container 8972654c5ada64b9f6d4a193dbf8875cc8efa119978fcacdbacfee80acce429d: Status 404 returned error can't find the container with id 8972654c5ada64b9f6d4a193dbf8875cc8efa119978fcacdbacfee80acce429d Apr 17 11:20:54.441730 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:54.441641 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-5xzqf\"" Apr 17 11:20:54.449327 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:54.449299 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7dssr" Apr 17 11:20:54.573156 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:54.573122 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7dssr"] Apr 17 11:20:54.576525 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:20:54.576494 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d82faa5_4dae_416a_8e2c_337d3966cdd0.slice/crio-0c62ea518912ce75d37a1732e533cc73f92f892c2d7f3e6791b3b97c91a9880b WatchSource:0}: Error finding container 0c62ea518912ce75d37a1732e533cc73f92f892c2d7f3e6791b3b97c91a9880b: Status 404 returned error can't find the container with id 0c62ea518912ce75d37a1732e533cc73f92f892c2d7f3e6791b3b97c91a9880b Apr 17 11:20:55.249620 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:55.249573 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ngx5g" event={"ID":"d4bb7b6c-7fd2-4a72-be8b-724128cbea39","Type":"ContainerStarted","Data":"8972654c5ada64b9f6d4a193dbf8875cc8efa119978fcacdbacfee80acce429d"} Apr 17 11:20:55.250960 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:55.250929 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7dssr" event={"ID":"3d82faa5-4dae-416a-8e2c-337d3966cdd0","Type":"ContainerStarted","Data":"0c62ea518912ce75d37a1732e533cc73f92f892c2d7f3e6791b3b97c91a9880b"} Apr 17 11:20:57.260319 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:57.260277 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ngx5g" event={"ID":"d4bb7b6c-7fd2-4a72-be8b-724128cbea39","Type":"ContainerStarted","Data":"34e36deae44db6aa5674ebab32032dcef44503cc9551ea12e39ae003670a61eb"} Apr 17 11:20:57.260319 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:57.260319 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ngx5g" 
event={"ID":"d4bb7b6c-7fd2-4a72-be8b-724128cbea39","Type":"ContainerStarted","Data":"f8d8697fed49929cc6612e98b1214c94a46100456affe7a374512b43e94b8452"} Apr 17 11:20:57.260884 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:57.260366 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-ngx5g" Apr 17 11:20:57.261687 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:57.261658 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7dssr" event={"ID":"3d82faa5-4dae-416a-8e2c-337d3966cdd0","Type":"ContainerStarted","Data":"56b4c77b53e53497ca22b47a3a8657c1550f6b2505d1834dafebb35fa9802b94"} Apr 17 11:20:57.283382 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:57.283318 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-ngx5g" podStartSLOduration=251.277436878 podStartE2EDuration="4m13.283302112s" podCreationTimestamp="2026-04-17 11:16:44 +0000 UTC" firstStartedPulling="2026-04-17 11:20:54.372937407 +0000 UTC m=+283.542744455" lastFinishedPulling="2026-04-17 11:20:56.378802646 +0000 UTC m=+285.548609689" observedRunningTime="2026-04-17 11:20:57.28148584 +0000 UTC m=+286.451292905" watchObservedRunningTime="2026-04-17 11:20:57.283302112 +0000 UTC m=+286.453109177" Apr 17 11:20:57.302585 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:20:57.302527 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-7dssr" podStartSLOduration=251.498802924 podStartE2EDuration="4m13.302508659s" podCreationTimestamp="2026-04-17 11:16:44 +0000 UTC" firstStartedPulling="2026-04-17 11:20:54.578409422 +0000 UTC m=+283.748216469" lastFinishedPulling="2026-04-17 11:20:56.382115159 +0000 UTC m=+285.551922204" observedRunningTime="2026-04-17 11:20:57.301713034 +0000 UTC m=+286.471520109" watchObservedRunningTime="2026-04-17 11:20:57.302508659 +0000 UTC m=+286.472315724" Apr 17 
11:21:03.265943 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:03.265897 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-65f6977878-kfrmh" Apr 17 11:21:03.266335 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:03.266047 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-65f6977878-kfrmh" Apr 17 11:21:03.270920 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:03.270897 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-65f6977878-kfrmh" Apr 17 11:21:03.286404 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:03.286375 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-65f6977878-kfrmh" Apr 17 11:21:03.333715 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:03.333666 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7f5979589c-56kth"] Apr 17 11:21:07.267489 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:07.267453 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-ngx5g" Apr 17 11:21:11.224483 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:11.224454 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gcrf7_3932e45a-3ab6-40aa-8c2b-48214229c367/console-operator/2.log" Apr 17 11:21:11.224948 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:11.224925 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gcrf7_3932e45a-3ab6-40aa-8c2b-48214229c367/console-operator/2.log" Apr 17 11:21:11.233915 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:11.233887 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvdth_652320e1-a7a1-4b18-a16c-59420fde1a03/ovn-acl-logging/0.log" Apr 17 11:21:11.234601 
ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:11.234576 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvdth_652320e1-a7a1-4b18-a16c-59420fde1a03/ovn-acl-logging/0.log" Apr 17 11:21:11.237554 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:11.237527 2575 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 11:21:28.356995 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:28.356949 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7f5979589c-56kth" podUID="c12f1ae0-5ee4-49a4-a938-141db0c1ddbb" containerName="console" containerID="cri-o://626e3b84eade20e740c8c1fa311f02555ee8d860bfbb95db51b07d9fcfdd79a3" gracePeriod=15 Apr 17 11:21:28.601277 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:28.601253 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7f5979589c-56kth_c12f1ae0-5ee4-49a4-a938-141db0c1ddbb/console/0.log" Apr 17 11:21:28.601445 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:28.601315 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7f5979589c-56kth" Apr 17 11:21:28.653046 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:28.652953 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-console-config\") pod \"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb\" (UID: \"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb\") " Apr 17 11:21:28.653046 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:28.653007 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-oauth-serving-cert\") pod \"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb\" (UID: \"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb\") " Apr 17 11:21:28.653046 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:28.653032 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-trusted-ca-bundle\") pod \"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb\" (UID: \"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb\") " Apr 17 11:21:28.653288 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:28.653206 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvljp\" (UniqueName: \"kubernetes.io/projected/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-kube-api-access-qvljp\") pod \"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb\" (UID: \"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb\") " Apr 17 11:21:28.653288 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:28.653254 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-console-serving-cert\") pod \"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb\" (UID: \"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb\") " Apr 17 11:21:28.653424 
ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:28.653300 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-console-config" (OuterVolumeSpecName: "console-config") pod "c12f1ae0-5ee4-49a4-a938-141db0c1ddbb" (UID: "c12f1ae0-5ee4-49a4-a938-141db0c1ddbb"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 11:21:28.653424 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:28.653313 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-service-ca\") pod \"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb\" (UID: \"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb\") "
Apr 17 11:21:28.653424 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:28.653386 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-console-oauth-config\") pod \"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb\" (UID: \"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb\") "
Apr 17 11:21:28.653578 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:28.653447 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c12f1ae0-5ee4-49a4-a938-141db0c1ddbb" (UID: "c12f1ae0-5ee4-49a4-a938-141db0c1ddbb"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 11:21:28.653578 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:28.653492 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c12f1ae0-5ee4-49a4-a938-141db0c1ddbb" (UID: "c12f1ae0-5ee4-49a4-a938-141db0c1ddbb"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 11:21:28.653692 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:28.653651 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-console-config\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\""
Apr 17 11:21:28.653692 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:28.653671 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-oauth-serving-cert\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\""
Apr 17 11:21:28.653692 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:28.653685 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-trusted-ca-bundle\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\""
Apr 17 11:21:28.653842 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:28.653739 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-service-ca" (OuterVolumeSpecName: "service-ca") pod "c12f1ae0-5ee4-49a4-a938-141db0c1ddbb" (UID: "c12f1ae0-5ee4-49a4-a938-141db0c1ddbb"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 11:21:28.655414 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:28.655387 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c12f1ae0-5ee4-49a4-a938-141db0c1ddbb" (UID: "c12f1ae0-5ee4-49a4-a938-141db0c1ddbb"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 11:21:28.655525 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:28.655433 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-kube-api-access-qvljp" (OuterVolumeSpecName: "kube-api-access-qvljp") pod "c12f1ae0-5ee4-49a4-a938-141db0c1ddbb" (UID: "c12f1ae0-5ee4-49a4-a938-141db0c1ddbb"). InnerVolumeSpecName "kube-api-access-qvljp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 11:21:28.655565 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:28.655522 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c12f1ae0-5ee4-49a4-a938-141db0c1ddbb" (UID: "c12f1ae0-5ee4-49a4-a938-141db0c1ddbb"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 11:21:28.754333 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:28.754289 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-console-serving-cert\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\""
Apr 17 11:21:28.754333 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:28.754324 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-service-ca\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\""
Apr 17 11:21:28.754333 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:28.754363 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-console-oauth-config\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\""
Apr 17 11:21:28.754333 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:28.754373 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qvljp\" (UniqueName: \"kubernetes.io/projected/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb-kube-api-access-qvljp\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\""
Apr 17 11:21:29.360786 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:29.360762 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7f5979589c-56kth_c12f1ae0-5ee4-49a4-a938-141db0c1ddbb/console/0.log"
Apr 17 11:21:29.361236 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:29.360802 2575 generic.go:358] "Generic (PLEG): container finished" podID="c12f1ae0-5ee4-49a4-a938-141db0c1ddbb" containerID="626e3b84eade20e740c8c1fa311f02555ee8d860bfbb95db51b07d9fcfdd79a3" exitCode=2
Apr 17 11:21:29.361236 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:29.360873 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f5979589c-56kth" event={"ID":"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb","Type":"ContainerDied","Data":"626e3b84eade20e740c8c1fa311f02555ee8d860bfbb95db51b07d9fcfdd79a3"}
Apr 17 11:21:29.361236 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:29.360885 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7f5979589c-56kth"
Apr 17 11:21:29.361236 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:29.360900 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f5979589c-56kth" event={"ID":"c12f1ae0-5ee4-49a4-a938-141db0c1ddbb","Type":"ContainerDied","Data":"a6f0cee65872d5c52822ebfa54f7c15cf1d68a6b6709d1206bdfe51a4ae9228f"}
Apr 17 11:21:29.361236 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:29.360915 2575 scope.go:117] "RemoveContainer" containerID="626e3b84eade20e740c8c1fa311f02555ee8d860bfbb95db51b07d9fcfdd79a3"
Apr 17 11:21:29.369047 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:29.368891 2575 scope.go:117] "RemoveContainer" containerID="626e3b84eade20e740c8c1fa311f02555ee8d860bfbb95db51b07d9fcfdd79a3"
Apr 17 11:21:29.369171 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:21:29.369154 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"626e3b84eade20e740c8c1fa311f02555ee8d860bfbb95db51b07d9fcfdd79a3\": container with ID starting with 626e3b84eade20e740c8c1fa311f02555ee8d860bfbb95db51b07d9fcfdd79a3 not found: ID does not exist" containerID="626e3b84eade20e740c8c1fa311f02555ee8d860bfbb95db51b07d9fcfdd79a3"
Apr 17 11:21:29.369214 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:29.369179 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"626e3b84eade20e740c8c1fa311f02555ee8d860bfbb95db51b07d9fcfdd79a3"} err="failed to get container status \"626e3b84eade20e740c8c1fa311f02555ee8d860bfbb95db51b07d9fcfdd79a3\": rpc error: code = NotFound desc = could not find container \"626e3b84eade20e740c8c1fa311f02555ee8d860bfbb95db51b07d9fcfdd79a3\": container with ID starting with 626e3b84eade20e740c8c1fa311f02555ee8d860bfbb95db51b07d9fcfdd79a3 not found: ID does not exist"
Apr 17 11:21:29.377499 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:29.377476 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7f5979589c-56kth"]
Apr 17 11:21:29.382760 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:29.382737 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7f5979589c-56kth"]
Apr 17 11:21:31.341899 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:21:31.341853 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c12f1ae0-5ee4-49a4-a938-141db0c1ddbb" path="/var/lib/kubelet/pods/c12f1ae0-5ee4-49a4-a938-141db0c1ddbb/volumes"
Apr 17 11:22:09.057175 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:09.057094 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-b955bd79c-sp68z"]
Apr 17 11:22:09.057673 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:09.057451 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c12f1ae0-5ee4-49a4-a938-141db0c1ddbb" containerName="console"
Apr 17 11:22:09.057673 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:09.057464 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c12f1ae0-5ee4-49a4-a938-141db0c1ddbb" containerName="console"
Apr 17 11:22:09.057673 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:09.057553 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="c12f1ae0-5ee4-49a4-a938-141db0c1ddbb" containerName="console"
Apr 17 11:22:09.060846 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:09.060822 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-b955bd79c-sp68z"
Apr 17 11:22:09.081198 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:09.081165 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b955bd79c-sp68z"]
Apr 17 11:22:09.210773 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:09.210738 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-console-oauth-config\") pod \"console-b955bd79c-sp68z\" (UID: \"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c\") " pod="openshift-console/console-b955bd79c-sp68z"
Apr 17 11:22:09.210773 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:09.210781 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-console-config\") pod \"console-b955bd79c-sp68z\" (UID: \"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c\") " pod="openshift-console/console-b955bd79c-sp68z"
Apr 17 11:22:09.211015 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:09.210894 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-oauth-serving-cert\") pod \"console-b955bd79c-sp68z\" (UID: \"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c\") " pod="openshift-console/console-b955bd79c-sp68z"
Apr 17 11:22:09.211015 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:09.210960 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-trusted-ca-bundle\") pod \"console-b955bd79c-sp68z\" (UID: \"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c\") " pod="openshift-console/console-b955bd79c-sp68z"
Apr 17 11:22:09.211015 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:09.210992 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-console-serving-cert\") pod \"console-b955bd79c-sp68z\" (UID: \"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c\") " pod="openshift-console/console-b955bd79c-sp68z"
Apr 17 11:22:09.211130 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:09.211029 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-service-ca\") pod \"console-b955bd79c-sp68z\" (UID: \"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c\") " pod="openshift-console/console-b955bd79c-sp68z"
Apr 17 11:22:09.211130 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:09.211049 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfz5b\" (UniqueName: \"kubernetes.io/projected/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-kube-api-access-gfz5b\") pod \"console-b955bd79c-sp68z\" (UID: \"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c\") " pod="openshift-console/console-b955bd79c-sp68z"
Apr 17 11:22:09.312126 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:09.312023 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-trusted-ca-bundle\") pod \"console-b955bd79c-sp68z\" (UID: \"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c\") " pod="openshift-console/console-b955bd79c-sp68z"
Apr 17 11:22:09.312126 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:09.312077 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-console-serving-cert\") pod \"console-b955bd79c-sp68z\" (UID: \"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c\") " pod="openshift-console/console-b955bd79c-sp68z"
Apr 17 11:22:09.312126 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:09.312115 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-service-ca\") pod \"console-b955bd79c-sp68z\" (UID: \"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c\") " pod="openshift-console/console-b955bd79c-sp68z"
Apr 17 11:22:09.312455 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:09.312131 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gfz5b\" (UniqueName: \"kubernetes.io/projected/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-kube-api-access-gfz5b\") pod \"console-b955bd79c-sp68z\" (UID: \"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c\") " pod="openshift-console/console-b955bd79c-sp68z"
Apr 17 11:22:09.312455 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:09.312156 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-console-oauth-config\") pod \"console-b955bd79c-sp68z\" (UID: \"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c\") " pod="openshift-console/console-b955bd79c-sp68z"
Apr 17 11:22:09.312455 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:09.312174 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-console-config\") pod \"console-b955bd79c-sp68z\" (UID: \"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c\") " pod="openshift-console/console-b955bd79c-sp68z"
Apr 17 11:22:09.312455 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:09.312217 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-oauth-serving-cert\") pod \"console-b955bd79c-sp68z\" (UID: \"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c\") " pod="openshift-console/console-b955bd79c-sp68z"
Apr 17 11:22:09.313015 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:09.312985 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-service-ca\") pod \"console-b955bd79c-sp68z\" (UID: \"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c\") " pod="openshift-console/console-b955bd79c-sp68z"
Apr 17 11:22:09.313121 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:09.313024 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-console-config\") pod \"console-b955bd79c-sp68z\" (UID: \"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c\") " pod="openshift-console/console-b955bd79c-sp68z"
Apr 17 11:22:09.313184 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:09.313133 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-oauth-serving-cert\") pod \"console-b955bd79c-sp68z\" (UID: \"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c\") " pod="openshift-console/console-b955bd79c-sp68z"
Apr 17 11:22:09.313243 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:09.313180 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-trusted-ca-bundle\") pod \"console-b955bd79c-sp68z\" (UID: \"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c\") " pod="openshift-console/console-b955bd79c-sp68z"
Apr 17 11:22:09.314807 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:09.314777 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-console-oauth-config\") pod \"console-b955bd79c-sp68z\" (UID: \"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c\") " pod="openshift-console/console-b955bd79c-sp68z"
Apr 17 11:22:09.314944 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:09.314921 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-console-serving-cert\") pod \"console-b955bd79c-sp68z\" (UID: \"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c\") " pod="openshift-console/console-b955bd79c-sp68z"
Apr 17 11:22:09.320285 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:09.320265 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfz5b\" (UniqueName: \"kubernetes.io/projected/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-kube-api-access-gfz5b\") pod \"console-b955bd79c-sp68z\" (UID: \"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c\") " pod="openshift-console/console-b955bd79c-sp68z"
Apr 17 11:22:09.371167 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:09.371118 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-b955bd79c-sp68z"
Apr 17 11:22:09.495760 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:09.495736 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b955bd79c-sp68z"]
Apr 17 11:22:09.498155 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:22:09.498123 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5c35ad8_cad3_4bf1_81e2_8eb6926d8b2c.slice/crio-a277d0f969b085da9648ef6174752c939332de1660329a1c19372a289cf319af WatchSource:0}: Error finding container a277d0f969b085da9648ef6174752c939332de1660329a1c19372a289cf319af: Status 404 returned error can't find the container with id a277d0f969b085da9648ef6174752c939332de1660329a1c19372a289cf319af
Apr 17 11:22:09.500015 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:09.500001 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 11:22:10.490090 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:10.490052 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b955bd79c-sp68z" event={"ID":"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c","Type":"ContainerStarted","Data":"1b23fa36849e48302763c406ded9242c0ff86e1990cc7b1920252b9a83de99e5"}
Apr 17 11:22:10.490090 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:10.490090 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b955bd79c-sp68z" event={"ID":"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c","Type":"ContainerStarted","Data":"a277d0f969b085da9648ef6174752c939332de1660329a1c19372a289cf319af"}
Apr 17 11:22:10.507421 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:10.507365 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-b955bd79c-sp68z" podStartSLOduration=1.5073322679999999 podStartE2EDuration="1.507332268s" podCreationTimestamp="2026-04-17 11:22:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:22:10.506034166 +0000 UTC m=+359.675841244" watchObservedRunningTime="2026-04-17 11:22:10.507332268 +0000 UTC m=+359.677139333"
Apr 17 11:22:19.371425 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:19.371380 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-b955bd79c-sp68z"
Apr 17 11:22:19.371854 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:19.371461 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-b955bd79c-sp68z"
Apr 17 11:22:19.376228 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:19.376199 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-b955bd79c-sp68z"
Apr 17 11:22:19.521137 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:19.521109 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-b955bd79c-sp68z"
Apr 17 11:22:19.572488 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:19.572444 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-65f6977878-kfrmh"]
Apr 17 11:22:44.593213 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:44.593136 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-65f6977878-kfrmh" podUID="b7f210c7-8468-4e8e-abc8-743189fb5c30" containerName="console" containerID="cri-o://31a4e16c25d900ebdc5b601fb061b60264baa831a06d6918431a9fcc954eb345" gracePeriod=15
Apr 17 11:22:44.840778 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:44.840755 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-65f6977878-kfrmh_b7f210c7-8468-4e8e-abc8-743189fb5c30/console/0.log"
Apr 17 11:22:44.840920 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:44.840816 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65f6977878-kfrmh"
Apr 17 11:22:44.934328 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:44.934233 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b7f210c7-8468-4e8e-abc8-743189fb5c30-console-config\") pod \"b7f210c7-8468-4e8e-abc8-743189fb5c30\" (UID: \"b7f210c7-8468-4e8e-abc8-743189fb5c30\") "
Apr 17 11:22:44.934328 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:44.934291 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b7f210c7-8468-4e8e-abc8-743189fb5c30-console-oauth-config\") pod \"b7f210c7-8468-4e8e-abc8-743189fb5c30\" (UID: \"b7f210c7-8468-4e8e-abc8-743189fb5c30\") "
Apr 17 11:22:44.934328 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:44.934327 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b7f210c7-8468-4e8e-abc8-743189fb5c30-oauth-serving-cert\") pod \"b7f210c7-8468-4e8e-abc8-743189fb5c30\" (UID: \"b7f210c7-8468-4e8e-abc8-743189fb5c30\") "
Apr 17 11:22:44.934605 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:44.934385 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2mg2\" (UniqueName: \"kubernetes.io/projected/b7f210c7-8468-4e8e-abc8-743189fb5c30-kube-api-access-d2mg2\") pod \"b7f210c7-8468-4e8e-abc8-743189fb5c30\" (UID: \"b7f210c7-8468-4e8e-abc8-743189fb5c30\") "
Apr 17 11:22:44.934605 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:44.934425 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b7f210c7-8468-4e8e-abc8-743189fb5c30-service-ca\") pod \"b7f210c7-8468-4e8e-abc8-743189fb5c30\" (UID: \"b7f210c7-8468-4e8e-abc8-743189fb5c30\") "
Apr 17 11:22:44.934605 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:44.934456 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7f210c7-8468-4e8e-abc8-743189fb5c30-console-serving-cert\") pod \"b7f210c7-8468-4e8e-abc8-743189fb5c30\" (UID: \"b7f210c7-8468-4e8e-abc8-743189fb5c30\") "
Apr 17 11:22:44.934605 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:44.934484 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7f210c7-8468-4e8e-abc8-743189fb5c30-trusted-ca-bundle\") pod \"b7f210c7-8468-4e8e-abc8-743189fb5c30\" (UID: \"b7f210c7-8468-4e8e-abc8-743189fb5c30\") "
Apr 17 11:22:44.934801 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:44.934708 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7f210c7-8468-4e8e-abc8-743189fb5c30-console-config" (OuterVolumeSpecName: "console-config") pod "b7f210c7-8468-4e8e-abc8-743189fb5c30" (UID: "b7f210c7-8468-4e8e-abc8-743189fb5c30"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 11:22:44.934860 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:44.934816 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7f210c7-8468-4e8e-abc8-743189fb5c30-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b7f210c7-8468-4e8e-abc8-743189fb5c30" (UID: "b7f210c7-8468-4e8e-abc8-743189fb5c30"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 11:22:44.934918 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:44.934896 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7f210c7-8468-4e8e-abc8-743189fb5c30-service-ca" (OuterVolumeSpecName: "service-ca") pod "b7f210c7-8468-4e8e-abc8-743189fb5c30" (UID: "b7f210c7-8468-4e8e-abc8-743189fb5c30"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 11:22:44.935277 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:44.935250 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7f210c7-8468-4e8e-abc8-743189fb5c30-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b7f210c7-8468-4e8e-abc8-743189fb5c30" (UID: "b7f210c7-8468-4e8e-abc8-743189fb5c30"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 11:22:44.936603 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:44.936578 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7f210c7-8468-4e8e-abc8-743189fb5c30-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b7f210c7-8468-4e8e-abc8-743189fb5c30" (UID: "b7f210c7-8468-4e8e-abc8-743189fb5c30"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 11:22:44.936706 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:44.936629 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7f210c7-8468-4e8e-abc8-743189fb5c30-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b7f210c7-8468-4e8e-abc8-743189fb5c30" (UID: "b7f210c7-8468-4e8e-abc8-743189fb5c30"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 11:22:44.936706 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:44.936666 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7f210c7-8468-4e8e-abc8-743189fb5c30-kube-api-access-d2mg2" (OuterVolumeSpecName: "kube-api-access-d2mg2") pod "b7f210c7-8468-4e8e-abc8-743189fb5c30" (UID: "b7f210c7-8468-4e8e-abc8-743189fb5c30"). InnerVolumeSpecName "kube-api-access-d2mg2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 11:22:45.035462 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:45.035413 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b7f210c7-8468-4e8e-abc8-743189fb5c30-console-config\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\""
Apr 17 11:22:45.035462 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:45.035459 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b7f210c7-8468-4e8e-abc8-743189fb5c30-console-oauth-config\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\""
Apr 17 11:22:45.035462 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:45.035470 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b7f210c7-8468-4e8e-abc8-743189fb5c30-oauth-serving-cert\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\""
Apr 17 11:22:45.035462 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:45.035480 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d2mg2\" (UniqueName: \"kubernetes.io/projected/b7f210c7-8468-4e8e-abc8-743189fb5c30-kube-api-access-d2mg2\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\""
Apr 17 11:22:45.035734 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:45.035489 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b7f210c7-8468-4e8e-abc8-743189fb5c30-service-ca\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\""
Apr 17 11:22:45.035734 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:45.035498 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b7f210c7-8468-4e8e-abc8-743189fb5c30-console-serving-cert\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\""
Apr 17 11:22:45.035734 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:45.035506 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b7f210c7-8468-4e8e-abc8-743189fb5c30-trusted-ca-bundle\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\""
Apr 17 11:22:45.600351 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:45.600252 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-65f6977878-kfrmh_b7f210c7-8468-4e8e-abc8-743189fb5c30/console/0.log"
Apr 17 11:22:45.600351 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:45.600300 2575 generic.go:358] "Generic (PLEG): container finished" podID="b7f210c7-8468-4e8e-abc8-743189fb5c30" containerID="31a4e16c25d900ebdc5b601fb061b60264baa831a06d6918431a9fcc954eb345" exitCode=2
Apr 17 11:22:45.600832 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:45.600382 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65f6977878-kfrmh" event={"ID":"b7f210c7-8468-4e8e-abc8-743189fb5c30","Type":"ContainerDied","Data":"31a4e16c25d900ebdc5b601fb061b60264baa831a06d6918431a9fcc954eb345"}
Apr 17 11:22:45.600832 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:45.600404 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65f6977878-kfrmh"
Apr 17 11:22:45.600832 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:45.600428 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65f6977878-kfrmh" event={"ID":"b7f210c7-8468-4e8e-abc8-743189fb5c30","Type":"ContainerDied","Data":"0ef177ab53df3c26dad839a84a99bb5df7fda8129560f620c05d23672e33e034"}
Apr 17 11:22:45.600832 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:45.600450 2575 scope.go:117] "RemoveContainer" containerID="31a4e16c25d900ebdc5b601fb061b60264baa831a06d6918431a9fcc954eb345"
Apr 17 11:22:45.608565 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:45.608547 2575 scope.go:117] "RemoveContainer" containerID="31a4e16c25d900ebdc5b601fb061b60264baa831a06d6918431a9fcc954eb345"
Apr 17 11:22:45.608854 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:22:45.608826 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31a4e16c25d900ebdc5b601fb061b60264baa831a06d6918431a9fcc954eb345\": container with ID starting with 31a4e16c25d900ebdc5b601fb061b60264baa831a06d6918431a9fcc954eb345 not found: ID does not exist" containerID="31a4e16c25d900ebdc5b601fb061b60264baa831a06d6918431a9fcc954eb345"
Apr 17 11:22:45.608908 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:45.608863 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31a4e16c25d900ebdc5b601fb061b60264baa831a06d6918431a9fcc954eb345"} err="failed to get container status \"31a4e16c25d900ebdc5b601fb061b60264baa831a06d6918431a9fcc954eb345\": rpc error: code = NotFound desc = could not find container \"31a4e16c25d900ebdc5b601fb061b60264baa831a06d6918431a9fcc954eb345\": container with ID starting with 31a4e16c25d900ebdc5b601fb061b60264baa831a06d6918431a9fcc954eb345 not found: ID does not exist"
Apr 17 11:22:45.622425 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:45.622382 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-65f6977878-kfrmh"]
Apr 17 11:22:45.625830 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:45.625804 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-65f6977878-kfrmh"]
Apr 17 11:22:47.342255 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:22:47.342212 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7f210c7-8468-4e8e-abc8-743189fb5c30" path="/var/lib/kubelet/pods/b7f210c7-8468-4e8e-abc8-743189fb5c30/volumes"
Apr 17 11:23:04.470399 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:04.470360 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cgtstm"]
Apr 17 11:23:04.470897 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:04.470750 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b7f210c7-8468-4e8e-abc8-743189fb5c30" containerName="console"
Apr 17 11:23:04.470897 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:04.470762 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7f210c7-8468-4e8e-abc8-743189fb5c30" containerName="console"
Apr 17 11:23:04.470897 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:04.470811 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="b7f210c7-8468-4e8e-abc8-743189fb5c30" containerName="console"
Apr 17 11:23:04.473928 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:04.473906 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cgtstm"
Apr 17 11:23:04.476395 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:04.476368 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 11:23:04.476524 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:04.476372 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-c2ghr\""
Apr 17 11:23:04.477049 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:04.477032 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 11:23:04.487438 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:04.487416 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cgtstm"]
Apr 17 11:23:04.500996 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:04.500966 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c08b3f06-bf30-48e3-aee2-fc60a69f5b21-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cgtstm\" (UID: \"c08b3f06-bf30-48e3-aee2-fc60a69f5b21\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cgtstm"
Apr 17 11:23:04.501116 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:04.501025 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c08b3f06-bf30-48e3-aee2-fc60a69f5b21-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cgtstm\" (UID: \"c08b3f06-bf30-48e3-aee2-fc60a69f5b21\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cgtstm"
Apr 17 11:23:04.501116 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:04.501097 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfcbr\" (UniqueName: \"kubernetes.io/projected/c08b3f06-bf30-48e3-aee2-fc60a69f5b21-kube-api-access-qfcbr\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cgtstm\" (UID: \"c08b3f06-bf30-48e3-aee2-fc60a69f5b21\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cgtstm"
Apr 17 11:23:04.601839 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:04.601800 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qfcbr\" (UniqueName: \"kubernetes.io/projected/c08b3f06-bf30-48e3-aee2-fc60a69f5b21-kube-api-access-qfcbr\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cgtstm\" (UID: \"c08b3f06-bf30-48e3-aee2-fc60a69f5b21\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cgtstm"
Apr 17 11:23:04.602033 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:04.601872 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c08b3f06-bf30-48e3-aee2-fc60a69f5b21-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cgtstm\" (UID: \"c08b3f06-bf30-48e3-aee2-fc60a69f5b21\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cgtstm"
Apr 17 11:23:04.602033 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:04.601911 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c08b3f06-bf30-48e3-aee2-fc60a69f5b21-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cgtstm\" (UID: \"c08b3f06-bf30-48e3-aee2-fc60a69f5b21\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cgtstm"
Apr 17 11:23:04.602276 ip-10-0-135-81
kubenswrapper[2575]: I0417 11:23:04.602259 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c08b3f06-bf30-48e3-aee2-fc60a69f5b21-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cgtstm\" (UID: \"c08b3f06-bf30-48e3-aee2-fc60a69f5b21\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cgtstm" Apr 17 11:23:04.602315 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:04.602300 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c08b3f06-bf30-48e3-aee2-fc60a69f5b21-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cgtstm\" (UID: \"c08b3f06-bf30-48e3-aee2-fc60a69f5b21\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cgtstm" Apr 17 11:23:04.611183 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:04.611156 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfcbr\" (UniqueName: \"kubernetes.io/projected/c08b3f06-bf30-48e3-aee2-fc60a69f5b21-kube-api-access-qfcbr\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cgtstm\" (UID: \"c08b3f06-bf30-48e3-aee2-fc60a69f5b21\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cgtstm" Apr 17 11:23:04.783616 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:04.783504 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cgtstm" Apr 17 11:23:04.914474 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:04.914449 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cgtstm"] Apr 17 11:23:04.916839 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:23:04.916809 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc08b3f06_bf30_48e3_aee2_fc60a69f5b21.slice/crio-0e502951e33f39efca84ca5a7d1ebc89fdc40d0a50c116a6f9269a69b9823413 WatchSource:0}: Error finding container 0e502951e33f39efca84ca5a7d1ebc89fdc40d0a50c116a6f9269a69b9823413: Status 404 returned error can't find the container with id 0e502951e33f39efca84ca5a7d1ebc89fdc40d0a50c116a6f9269a69b9823413 Apr 17 11:23:05.675727 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:05.675690 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cgtstm" event={"ID":"c08b3f06-bf30-48e3-aee2-fc60a69f5b21","Type":"ContainerStarted","Data":"0e502951e33f39efca84ca5a7d1ebc89fdc40d0a50c116a6f9269a69b9823413"} Apr 17 11:23:10.694247 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:10.694214 2575 generic.go:358] "Generic (PLEG): container finished" podID="c08b3f06-bf30-48e3-aee2-fc60a69f5b21" containerID="014d88bfcf6b2ba7c5575aa93f1b9440a49eb5a857624884f8d917b28383d98c" exitCode=0 Apr 17 11:23:10.694673 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:10.694284 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cgtstm" event={"ID":"c08b3f06-bf30-48e3-aee2-fc60a69f5b21","Type":"ContainerDied","Data":"014d88bfcf6b2ba7c5575aa93f1b9440a49eb5a857624884f8d917b28383d98c"} Apr 17 11:23:12.701975 ip-10-0-135-81 kubenswrapper[2575]: I0417 
11:23:12.701931 2575 generic.go:358] "Generic (PLEG): container finished" podID="c08b3f06-bf30-48e3-aee2-fc60a69f5b21" containerID="05c8467dfa5f9c86e44c867df737e32ae8fa2c59b6dc8526b7cdb5abd9af6b9b" exitCode=0 Apr 17 11:23:12.702333 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:12.702018 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cgtstm" event={"ID":"c08b3f06-bf30-48e3-aee2-fc60a69f5b21","Type":"ContainerDied","Data":"05c8467dfa5f9c86e44c867df737e32ae8fa2c59b6dc8526b7cdb5abd9af6b9b"} Apr 17 11:23:19.735257 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:19.735221 2575 generic.go:358] "Generic (PLEG): container finished" podID="c08b3f06-bf30-48e3-aee2-fc60a69f5b21" containerID="176c042543233d223fd9259fc035dcb18fbd31ffef24e26afd2a9cb6bf817dc5" exitCode=0 Apr 17 11:23:19.735690 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:19.735301 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cgtstm" event={"ID":"c08b3f06-bf30-48e3-aee2-fc60a69f5b21","Type":"ContainerDied","Data":"176c042543233d223fd9259fc035dcb18fbd31ffef24e26afd2a9cb6bf817dc5"} Apr 17 11:23:20.862954 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:20.862927 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cgtstm" Apr 17 11:23:20.947747 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:20.947713 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c08b3f06-bf30-48e3-aee2-fc60a69f5b21-util\") pod \"c08b3f06-bf30-48e3-aee2-fc60a69f5b21\" (UID: \"c08b3f06-bf30-48e3-aee2-fc60a69f5b21\") " Apr 17 11:23:20.947747 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:20.947761 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfcbr\" (UniqueName: \"kubernetes.io/projected/c08b3f06-bf30-48e3-aee2-fc60a69f5b21-kube-api-access-qfcbr\") pod \"c08b3f06-bf30-48e3-aee2-fc60a69f5b21\" (UID: \"c08b3f06-bf30-48e3-aee2-fc60a69f5b21\") " Apr 17 11:23:20.947990 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:20.947803 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c08b3f06-bf30-48e3-aee2-fc60a69f5b21-bundle\") pod \"c08b3f06-bf30-48e3-aee2-fc60a69f5b21\" (UID: \"c08b3f06-bf30-48e3-aee2-fc60a69f5b21\") " Apr 17 11:23:20.948354 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:20.948316 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c08b3f06-bf30-48e3-aee2-fc60a69f5b21-bundle" (OuterVolumeSpecName: "bundle") pod "c08b3f06-bf30-48e3-aee2-fc60a69f5b21" (UID: "c08b3f06-bf30-48e3-aee2-fc60a69f5b21"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 11:23:20.950036 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:20.949995 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c08b3f06-bf30-48e3-aee2-fc60a69f5b21-kube-api-access-qfcbr" (OuterVolumeSpecName: "kube-api-access-qfcbr") pod "c08b3f06-bf30-48e3-aee2-fc60a69f5b21" (UID: "c08b3f06-bf30-48e3-aee2-fc60a69f5b21"). InnerVolumeSpecName "kube-api-access-qfcbr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:23:20.951698 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:20.951669 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c08b3f06-bf30-48e3-aee2-fc60a69f5b21-util" (OuterVolumeSpecName: "util") pod "c08b3f06-bf30-48e3-aee2-fc60a69f5b21" (UID: "c08b3f06-bf30-48e3-aee2-fc60a69f5b21"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 11:23:21.049233 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:21.049152 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c08b3f06-bf30-48e3-aee2-fc60a69f5b21-util\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\"" Apr 17 11:23:21.049233 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:21.049187 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qfcbr\" (UniqueName: \"kubernetes.io/projected/c08b3f06-bf30-48e3-aee2-fc60a69f5b21-kube-api-access-qfcbr\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\"" Apr 17 11:23:21.049233 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:21.049196 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c08b3f06-bf30-48e3-aee2-fc60a69f5b21-bundle\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\"" Apr 17 11:23:21.742994 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:21.742905 2575 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cgtstm" event={"ID":"c08b3f06-bf30-48e3-aee2-fc60a69f5b21","Type":"ContainerDied","Data":"0e502951e33f39efca84ca5a7d1ebc89fdc40d0a50c116a6f9269a69b9823413"} Apr 17 11:23:21.742994 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:21.742944 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e502951e33f39efca84ca5a7d1ebc89fdc40d0a50c116a6f9269a69b9823413" Apr 17 11:23:21.742994 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:21.742950 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cgtstm" Apr 17 11:23:31.046924 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.046830 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-66kzw"] Apr 17 11:23:31.047505 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.047310 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c08b3f06-bf30-48e3-aee2-fc60a69f5b21" containerName="extract" Apr 17 11:23:31.047505 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.047331 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c08b3f06-bf30-48e3-aee2-fc60a69f5b21" containerName="extract" Apr 17 11:23:31.047505 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.047376 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c08b3f06-bf30-48e3-aee2-fc60a69f5b21" containerName="pull" Apr 17 11:23:31.047505 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.047385 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c08b3f06-bf30-48e3-aee2-fc60a69f5b21" containerName="pull" Apr 17 11:23:31.047505 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.047400 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c08b3f06-bf30-48e3-aee2-fc60a69f5b21" 
containerName="util" Apr 17 11:23:31.047505 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.047409 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c08b3f06-bf30-48e3-aee2-fc60a69f5b21" containerName="util" Apr 17 11:23:31.047505 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.047503 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="c08b3f06-bf30-48e3-aee2-fc60a69f5b21" containerName="extract" Apr 17 11:23:31.095453 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.095417 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-66kzw"] Apr 17 11:23:31.095619 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.095541 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-66kzw" Apr 17 11:23:31.098620 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.098588 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 17 11:23:31.098800 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.098594 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 17 11:23:31.098800 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.098653 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 17 11:23:31.098972 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.098873 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 17 11:23:31.099033 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.099009 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-gjr8j\"" Apr 17 11:23:31.099119 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.099101 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 17 11:23:31.234834 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.234796 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kwrw\" (UniqueName: \"kubernetes.io/projected/97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e-kube-api-access-9kwrw\") pod \"keda-operator-ffbb595cb-66kzw\" (UID: \"97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e\") " pod="openshift-keda/keda-operator-ffbb595cb-66kzw" Apr 17 11:23:31.235032 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.234854 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e-certificates\") pod \"keda-operator-ffbb595cb-66kzw\" (UID: \"97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e\") " pod="openshift-keda/keda-operator-ffbb595cb-66kzw" Apr 17 11:23:31.235032 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.234962 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e-cabundle0\") pod \"keda-operator-ffbb595cb-66kzw\" (UID: \"97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e\") " pod="openshift-keda/keda-operator-ffbb595cb-66kzw" Apr 17 11:23:31.335869 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.335826 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e-cabundle0\") pod \"keda-operator-ffbb595cb-66kzw\" (UID: \"97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e\") " pod="openshift-keda/keda-operator-ffbb595cb-66kzw" Apr 17 11:23:31.336070 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.335887 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9kwrw\" (UniqueName: 
\"kubernetes.io/projected/97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e-kube-api-access-9kwrw\") pod \"keda-operator-ffbb595cb-66kzw\" (UID: \"97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e\") " pod="openshift-keda/keda-operator-ffbb595cb-66kzw" Apr 17 11:23:31.336070 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.335948 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e-certificates\") pod \"keda-operator-ffbb595cb-66kzw\" (UID: \"97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e\") " pod="openshift-keda/keda-operator-ffbb595cb-66kzw" Apr 17 11:23:31.336070 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:23:31.336055 2575 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found Apr 17 11:23:31.336070 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:23:31.336072 2575 secret.go:281] references non-existent secret key: ca.crt Apr 17 11:23:31.336354 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:23:31.336083 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 17 11:23:31.336354 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:23:31.336100 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-66kzw: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 17 11:23:31.336354 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:23:31.336171 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e-certificates podName:97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e nodeName:}" failed. No retries permitted until 2026-04-17 11:23:31.836150372 +0000 UTC m=+441.005957422 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e-certificates") pod "keda-operator-ffbb595cb-66kzw" (UID: "97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 17 11:23:31.336655 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.336631 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e-cabundle0\") pod \"keda-operator-ffbb595cb-66kzw\" (UID: \"97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e\") " pod="openshift-keda/keda-operator-ffbb595cb-66kzw" Apr 17 11:23:31.351678 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.351646 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kwrw\" (UniqueName: \"kubernetes.io/projected/97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e-kube-api-access-9kwrw\") pod \"keda-operator-ffbb595cb-66kzw\" (UID: \"97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e\") " pod="openshift-keda/keda-operator-ffbb595cb-66kzw" Apr 17 11:23:31.468695 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.468655 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-df74r"] Apr 17 11:23:31.481120 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.481092 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-df74r" Apr 17 11:23:31.481592 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.481560 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-df74r"] Apr 17 11:23:31.483317 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.483294 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 17 11:23:31.537819 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.537782 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b063129f-066d-4899-afee-fb2e67667b69-certificates\") pod \"keda-metrics-apiserver-7c9f485588-df74r\" (UID: \"b063129f-066d-4899-afee-fb2e67667b69\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-df74r" Apr 17 11:23:31.538219 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.538187 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/b063129f-066d-4899-afee-fb2e67667b69-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-df74r\" (UID: \"b063129f-066d-4899-afee-fb2e67667b69\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-df74r" Apr 17 11:23:31.538319 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.538262 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrhk7\" (UniqueName: \"kubernetes.io/projected/b063129f-066d-4899-afee-fb2e67667b69-kube-api-access-jrhk7\") pod \"keda-metrics-apiserver-7c9f485588-df74r\" (UID: \"b063129f-066d-4899-afee-fb2e67667b69\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-df74r" Apr 17 11:23:31.638986 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.638884 2575 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/b063129f-066d-4899-afee-fb2e67667b69-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-df74r\" (UID: \"b063129f-066d-4899-afee-fb2e67667b69\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-df74r" Apr 17 11:23:31.638986 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.638934 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jrhk7\" (UniqueName: \"kubernetes.io/projected/b063129f-066d-4899-afee-fb2e67667b69-kube-api-access-jrhk7\") pod \"keda-metrics-apiserver-7c9f485588-df74r\" (UID: \"b063129f-066d-4899-afee-fb2e67667b69\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-df74r" Apr 17 11:23:31.639278 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.638988 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b063129f-066d-4899-afee-fb2e67667b69-certificates\") pod \"keda-metrics-apiserver-7c9f485588-df74r\" (UID: \"b063129f-066d-4899-afee-fb2e67667b69\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-df74r" Apr 17 11:23:31.639278 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:23:31.639112 2575 secret.go:281] references non-existent secret key: tls.crt Apr 17 11:23:31.639278 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:23:31.639129 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 17 11:23:31.639278 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:23:31.639147 2575 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found Apr 17 11:23:31.639278 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:23:31.639166 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-df74r: [references non-existent secret key: tls.crt, 
secret "keda-metrics-apiserver-certs" not found] Apr 17 11:23:31.639278 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:23:31.639229 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b063129f-066d-4899-afee-fb2e67667b69-certificates podName:b063129f-066d-4899-afee-fb2e67667b69 nodeName:}" failed. No retries permitted until 2026-04-17 11:23:32.139210787 +0000 UTC m=+441.309017830 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b063129f-066d-4899-afee-fb2e67667b69-certificates") pod "keda-metrics-apiserver-7c9f485588-df74r" (UID: "b063129f-066d-4899-afee-fb2e67667b69") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 17 11:23:31.639278 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.639259 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/b063129f-066d-4899-afee-fb2e67667b69-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-df74r\" (UID: \"b063129f-066d-4899-afee-fb2e67667b69\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-df74r" Apr 17 11:23:31.654058 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.654024 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrhk7\" (UniqueName: \"kubernetes.io/projected/b063129f-066d-4899-afee-fb2e67667b69-kube-api-access-jrhk7\") pod \"keda-metrics-apiserver-7c9f485588-df74r\" (UID: \"b063129f-066d-4899-afee-fb2e67667b69\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-df74r" Apr 17 11:23:31.773425 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.773394 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-7wvfm"] Apr 17 11:23:31.786815 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.786781 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-7wvfm" Apr 17 11:23:31.787239 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.787100 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-7wvfm"] Apr 17 11:23:31.788902 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.788877 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 17 11:23:31.841208 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.841169 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e-certificates\") pod \"keda-operator-ffbb595cb-66kzw\" (UID: \"97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e\") " pod="openshift-keda/keda-operator-ffbb595cb-66kzw" Apr 17 11:23:31.841415 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:23:31.841392 2575 secret.go:281] references non-existent secret key: ca.crt Apr 17 11:23:31.841481 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:23:31.841417 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 17 11:23:31.841481 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:23:31.841432 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-66kzw: references non-existent secret key: ca.crt Apr 17 11:23:31.841583 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:23:31.841509 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e-certificates podName:97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e nodeName:}" failed. No retries permitted until 2026-04-17 11:23:32.841487384 +0000 UTC m=+442.011294430 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e-certificates") pod "keda-operator-ffbb595cb-66kzw" (UID: "97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e") : references non-existent secret key: ca.crt Apr 17 11:23:31.941964 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.941865 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d0e885c9-e8cc-4fdb-9a03-fd9a9a0e2c26-certificates\") pod \"keda-admission-cf49989db-7wvfm\" (UID: \"d0e885c9-e8cc-4fdb-9a03-fd9a9a0e2c26\") " pod="openshift-keda/keda-admission-cf49989db-7wvfm" Apr 17 11:23:31.942158 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:31.941978 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cdgh\" (UniqueName: \"kubernetes.io/projected/d0e885c9-e8cc-4fdb-9a03-fd9a9a0e2c26-kube-api-access-4cdgh\") pod \"keda-admission-cf49989db-7wvfm\" (UID: \"d0e885c9-e8cc-4fdb-9a03-fd9a9a0e2c26\") " pod="openshift-keda/keda-admission-cf49989db-7wvfm" Apr 17 11:23:32.042670 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:32.042632 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d0e885c9-e8cc-4fdb-9a03-fd9a9a0e2c26-certificates\") pod \"keda-admission-cf49989db-7wvfm\" (UID: \"d0e885c9-e8cc-4fdb-9a03-fd9a9a0e2c26\") " pod="openshift-keda/keda-admission-cf49989db-7wvfm" Apr 17 11:23:32.042873 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:32.042708 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4cdgh\" (UniqueName: \"kubernetes.io/projected/d0e885c9-e8cc-4fdb-9a03-fd9a9a0e2c26-kube-api-access-4cdgh\") pod \"keda-admission-cf49989db-7wvfm\" (UID: \"d0e885c9-e8cc-4fdb-9a03-fd9a9a0e2c26\") " pod="openshift-keda/keda-admission-cf49989db-7wvfm" 
Apr 17 11:23:32.042873 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:23:32.042793 2575 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found Apr 17 11:23:32.042873 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:23:32.042824 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-7wvfm: secret "keda-admission-webhooks-certs" not found Apr 17 11:23:32.042998 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:23:32.042887 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d0e885c9-e8cc-4fdb-9a03-fd9a9a0e2c26-certificates podName:d0e885c9-e8cc-4fdb-9a03-fd9a9a0e2c26 nodeName:}" failed. No retries permitted until 2026-04-17 11:23:32.542870488 +0000 UTC m=+441.712677535 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/d0e885c9-e8cc-4fdb-9a03-fd9a9a0e2c26-certificates") pod "keda-admission-cf49989db-7wvfm" (UID: "d0e885c9-e8cc-4fdb-9a03-fd9a9a0e2c26") : secret "keda-admission-webhooks-certs" not found Apr 17 11:23:32.051915 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:32.051885 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cdgh\" (UniqueName: \"kubernetes.io/projected/d0e885c9-e8cc-4fdb-9a03-fd9a9a0e2c26-kube-api-access-4cdgh\") pod \"keda-admission-cf49989db-7wvfm\" (UID: \"d0e885c9-e8cc-4fdb-9a03-fd9a9a0e2c26\") " pod="openshift-keda/keda-admission-cf49989db-7wvfm" Apr 17 11:23:32.144117 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:32.144080 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b063129f-066d-4899-afee-fb2e67667b69-certificates\") pod \"keda-metrics-apiserver-7c9f485588-df74r\" (UID: \"b063129f-066d-4899-afee-fb2e67667b69\") " 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-df74r" Apr 17 11:23:32.144273 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:23:32.144225 2575 secret.go:281] references non-existent secret key: tls.crt Apr 17 11:23:32.144273 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:23:32.144241 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 17 11:23:32.144273 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:23:32.144260 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-df74r: references non-existent secret key: tls.crt Apr 17 11:23:32.144405 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:23:32.144314 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b063129f-066d-4899-afee-fb2e67667b69-certificates podName:b063129f-066d-4899-afee-fb2e67667b69 nodeName:}" failed. No retries permitted until 2026-04-17 11:23:33.144298229 +0000 UTC m=+442.314105294 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b063129f-066d-4899-afee-fb2e67667b69-certificates") pod "keda-metrics-apiserver-7c9f485588-df74r" (UID: "b063129f-066d-4899-afee-fb2e67667b69") : references non-existent secret key: tls.crt Apr 17 11:23:32.547298 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:32.547251 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d0e885c9-e8cc-4fdb-9a03-fd9a9a0e2c26-certificates\") pod \"keda-admission-cf49989db-7wvfm\" (UID: \"d0e885c9-e8cc-4fdb-9a03-fd9a9a0e2c26\") " pod="openshift-keda/keda-admission-cf49989db-7wvfm" Apr 17 11:23:32.549763 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:32.549741 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d0e885c9-e8cc-4fdb-9a03-fd9a9a0e2c26-certificates\") pod \"keda-admission-cf49989db-7wvfm\" (UID: \"d0e885c9-e8cc-4fdb-9a03-fd9a9a0e2c26\") " pod="openshift-keda/keda-admission-cf49989db-7wvfm" Apr 17 11:23:32.699558 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:32.699517 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-7wvfm" Apr 17 11:23:32.828062 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:32.828030 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-7wvfm"] Apr 17 11:23:32.830847 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:23:32.830816 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0e885c9_e8cc_4fdb_9a03_fd9a9a0e2c26.slice/crio-d68ff87087f56515654a0d4363c05d44a2c4914382a6753c40f27752d756f7d8 WatchSource:0}: Error finding container d68ff87087f56515654a0d4363c05d44a2c4914382a6753c40f27752d756f7d8: Status 404 returned error can't find the container with id d68ff87087f56515654a0d4363c05d44a2c4914382a6753c40f27752d756f7d8 Apr 17 11:23:32.850523 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:32.850482 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e-certificates\") pod \"keda-operator-ffbb595cb-66kzw\" (UID: \"97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e\") " pod="openshift-keda/keda-operator-ffbb595cb-66kzw" Apr 17 11:23:32.850708 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:23:32.850637 2575 secret.go:281] references non-existent secret key: ca.crt Apr 17 11:23:32.850708 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:23:32.850658 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 17 11:23:32.850708 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:23:32.850668 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-66kzw: references non-existent secret key: ca.crt Apr 17 11:23:32.850814 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:23:32.850722 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e-certificates podName:97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e nodeName:}" failed. No retries permitted until 2026-04-17 11:23:34.850705523 +0000 UTC m=+444.020512588 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e-certificates") pod "keda-operator-ffbb595cb-66kzw" (UID: "97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e") : references non-existent secret key: ca.crt Apr 17 11:23:33.153476 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:33.153373 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b063129f-066d-4899-afee-fb2e67667b69-certificates\") pod \"keda-metrics-apiserver-7c9f485588-df74r\" (UID: \"b063129f-066d-4899-afee-fb2e67667b69\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-df74r" Apr 17 11:23:33.153861 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:23:33.153499 2575 secret.go:281] references non-existent secret key: tls.crt Apr 17 11:23:33.153861 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:23:33.153512 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 17 11:23:33.153861 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:23:33.153548 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-df74r: references non-existent secret key: tls.crt Apr 17 11:23:33.153861 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:23:33.153596 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b063129f-066d-4899-afee-fb2e67667b69-certificates podName:b063129f-066d-4899-afee-fb2e67667b69 nodeName:}" failed. No retries permitted until 2026-04-17 11:23:35.153584373 +0000 UTC m=+444.323391417 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b063129f-066d-4899-afee-fb2e67667b69-certificates") pod "keda-metrics-apiserver-7c9f485588-df74r" (UID: "b063129f-066d-4899-afee-fb2e67667b69") : references non-existent secret key: tls.crt Apr 17 11:23:33.780460 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:33.780420 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-7wvfm" event={"ID":"d0e885c9-e8cc-4fdb-9a03-fd9a9a0e2c26","Type":"ContainerStarted","Data":"d68ff87087f56515654a0d4363c05d44a2c4914382a6753c40f27752d756f7d8"} Apr 17 11:23:34.790453 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:34.790408 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-7wvfm" event={"ID":"d0e885c9-e8cc-4fdb-9a03-fd9a9a0e2c26","Type":"ContainerStarted","Data":"ef170fbc1d72366525d6535142bf1278fe90dce5cfbb7356d664b2af31012955"} Apr 17 11:23:34.790996 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:34.790552 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-7wvfm" Apr 17 11:23:34.806818 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:34.806760 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-7wvfm" podStartSLOduration=2.710989135 podStartE2EDuration="3.80674253s" podCreationTimestamp="2026-04-17 11:23:31 +0000 UTC" firstStartedPulling="2026-04-17 11:23:32.832114979 +0000 UTC m=+442.001922025" lastFinishedPulling="2026-04-17 11:23:33.927868377 +0000 UTC m=+443.097675420" observedRunningTime="2026-04-17 11:23:34.804550336 +0000 UTC m=+443.974357396" watchObservedRunningTime="2026-04-17 11:23:34.80674253 +0000 UTC m=+443.976549594" Apr 17 11:23:34.870454 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:34.870413 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"certificates\" (UniqueName: \"kubernetes.io/projected/97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e-certificates\") pod \"keda-operator-ffbb595cb-66kzw\" (UID: \"97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e\") " pod="openshift-keda/keda-operator-ffbb595cb-66kzw" Apr 17 11:23:34.870641 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:23:34.870539 2575 secret.go:281] references non-existent secret key: ca.crt Apr 17 11:23:34.870641 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:23:34.870552 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 17 11:23:34.870641 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:23:34.870561 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-66kzw: references non-existent secret key: ca.crt Apr 17 11:23:34.870641 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:23:34.870604 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e-certificates podName:97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e nodeName:}" failed. No retries permitted until 2026-04-17 11:23:38.870591122 +0000 UTC m=+448.040398168 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e-certificates") pod "keda-operator-ffbb595cb-66kzw" (UID: "97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e") : references non-existent secret key: ca.crt Apr 17 11:23:35.174217 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:35.174174 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b063129f-066d-4899-afee-fb2e67667b69-certificates\") pod \"keda-metrics-apiserver-7c9f485588-df74r\" (UID: \"b063129f-066d-4899-afee-fb2e67667b69\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-df74r" Apr 17 11:23:35.174434 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:23:35.174321 2575 secret.go:281] references non-existent secret key: tls.crt Apr 17 11:23:35.174434 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:23:35.174376 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 17 11:23:35.174434 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:23:35.174401 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-df74r: references non-existent secret key: tls.crt Apr 17 11:23:35.174558 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:23:35.174455 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b063129f-066d-4899-afee-fb2e67667b69-certificates podName:b063129f-066d-4899-afee-fb2e67667b69 nodeName:}" failed. No retries permitted until 2026-04-17 11:23:39.174441211 +0000 UTC m=+448.344248254 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/b063129f-066d-4899-afee-fb2e67667b69-certificates") pod "keda-metrics-apiserver-7c9f485588-df74r" (UID: "b063129f-066d-4899-afee-fb2e67667b69") : references non-existent secret key: tls.crt Apr 17 11:23:38.908246 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:38.908208 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e-certificates\") pod \"keda-operator-ffbb595cb-66kzw\" (UID: \"97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e\") " pod="openshift-keda/keda-operator-ffbb595cb-66kzw" Apr 17 11:23:38.910707 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:38.910685 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e-certificates\") pod \"keda-operator-ffbb595cb-66kzw\" (UID: \"97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e\") " pod="openshift-keda/keda-operator-ffbb595cb-66kzw" Apr 17 11:23:39.207054 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:39.206951 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-66kzw" Apr 17 11:23:39.211026 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:39.211000 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b063129f-066d-4899-afee-fb2e67667b69-certificates\") pod \"keda-metrics-apiserver-7c9f485588-df74r\" (UID: \"b063129f-066d-4899-afee-fb2e67667b69\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-df74r" Apr 17 11:23:39.213457 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:39.213431 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/b063129f-066d-4899-afee-fb2e67667b69-certificates\") pod \"keda-metrics-apiserver-7c9f485588-df74r\" (UID: \"b063129f-066d-4899-afee-fb2e67667b69\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-df74r" Apr 17 11:23:39.292992 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:39.292952 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-df74r" Apr 17 11:23:39.349543 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:39.349462 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-66kzw"] Apr 17 11:23:39.352805 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:23:39.352768 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97e1eb0b_aaa0_4f3a_a7fc_65ace196d94e.slice/crio-2d1a6cbd993df19192cc352bd86c7524891db21c65a042bb3c679d17408eb168 WatchSource:0}: Error finding container 2d1a6cbd993df19192cc352bd86c7524891db21c65a042bb3c679d17408eb168: Status 404 returned error can't find the container with id 2d1a6cbd993df19192cc352bd86c7524891db21c65a042bb3c679d17408eb168 Apr 17 11:23:39.444965 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:39.444936 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-df74r"] Apr 17 11:23:39.447633 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:23:39.447604 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb063129f_066d_4899_afee_fb2e67667b69.slice/crio-81763ad35affe8706dae76bfe295ff376af79134e3c7f118739c79c9cb57fe8f WatchSource:0}: Error finding container 81763ad35affe8706dae76bfe295ff376af79134e3c7f118739c79c9cb57fe8f: Status 404 returned error can't find the container with id 81763ad35affe8706dae76bfe295ff376af79134e3c7f118739c79c9cb57fe8f Apr 17 11:23:39.809737 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:39.809691 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-df74r" event={"ID":"b063129f-066d-4899-afee-fb2e67667b69","Type":"ContainerStarted","Data":"81763ad35affe8706dae76bfe295ff376af79134e3c7f118739c79c9cb57fe8f"} Apr 17 11:23:39.810634 ip-10-0-135-81 kubenswrapper[2575]: I0417 
11:23:39.810607 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-66kzw" event={"ID":"97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e","Type":"ContainerStarted","Data":"2d1a6cbd993df19192cc352bd86c7524891db21c65a042bb3c679d17408eb168"} Apr 17 11:23:43.832130 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:43.832075 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-df74r" event={"ID":"b063129f-066d-4899-afee-fb2e67667b69","Type":"ContainerStarted","Data":"efc88e45b86028703c4d78b825c5b9ab7bebdf95d84336809b65b921b4af0d2a"} Apr 17 11:23:43.832671 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:43.832189 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-df74r" Apr 17 11:23:43.833592 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:43.833564 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-66kzw" event={"ID":"97e1eb0b-aaa0-4f3a-a7fc-65ace196d94e","Type":"ContainerStarted","Data":"925c8013ac5573e99edbdae4fb82d24bdf65df449c998ee5b6be84c575859482"} Apr 17 11:23:43.833724 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:43.833665 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-66kzw" Apr 17 11:23:43.862742 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:43.862691 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-66kzw" podStartSLOduration=9.29014347 podStartE2EDuration="12.862674685s" podCreationTimestamp="2026-04-17 11:23:31 +0000 UTC" firstStartedPulling="2026-04-17 11:23:39.35393504 +0000 UTC m=+448.523742082" lastFinishedPulling="2026-04-17 11:23:42.92646625 +0000 UTC m=+452.096273297" observedRunningTime="2026-04-17 11:23:43.862633883 +0000 UTC m=+453.032440949" watchObservedRunningTime="2026-04-17 
11:23:43.862674685 +0000 UTC m=+453.032481751" Apr 17 11:23:43.864563 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:43.864512 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-df74r" podStartSLOduration=9.39265793 podStartE2EDuration="12.864500292s" podCreationTimestamp="2026-04-17 11:23:31 +0000 UTC" firstStartedPulling="2026-04-17 11:23:39.44904097 +0000 UTC m=+448.618848012" lastFinishedPulling="2026-04-17 11:23:42.920883329 +0000 UTC m=+452.090690374" observedRunningTime="2026-04-17 11:23:43.848324526 +0000 UTC m=+453.018131592" watchObservedRunningTime="2026-04-17 11:23:43.864500292 +0000 UTC m=+453.034307356" Apr 17 11:23:54.841878 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:54.841848 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-df74r" Apr 17 11:23:55.796135 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:23:55.796102 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-7wvfm" Apr 17 11:24:04.839815 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:24:04.839783 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-66kzw" Apr 17 11:24:40.066363 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:24:40.066306 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-7dcb9f9f85-vlhpc"] Apr 17 11:24:40.070033 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:24:40.070008 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7dcb9f9f85-vlhpc" Apr 17 11:24:40.072329 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:24:40.072305 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 17 11:24:40.072503 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:24:40.072396 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-bw56v\"" Apr 17 11:24:40.072909 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:24:40.072886 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 17 11:24:40.072909 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:24:40.072907 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 17 11:24:40.081014 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:24:40.080986 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7dcb9f9f85-vlhpc"] Apr 17 11:24:40.166331 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:24:40.166289 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsfck\" (UniqueName: \"kubernetes.io/projected/3295d3b7-d5b3-43f4-af04-577ad46bebe0-kube-api-access-jsfck\") pod \"kserve-controller-manager-7dcb9f9f85-vlhpc\" (UID: \"3295d3b7-d5b3-43f4-af04-577ad46bebe0\") " pod="kserve/kserve-controller-manager-7dcb9f9f85-vlhpc" Apr 17 11:24:40.166551 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:24:40.166420 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3295d3b7-d5b3-43f4-af04-577ad46bebe0-cert\") pod \"kserve-controller-manager-7dcb9f9f85-vlhpc\" (UID: \"3295d3b7-d5b3-43f4-af04-577ad46bebe0\") " pod="kserve/kserve-controller-manager-7dcb9f9f85-vlhpc" Apr 17 
11:24:40.267487 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:24:40.267447 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jsfck\" (UniqueName: \"kubernetes.io/projected/3295d3b7-d5b3-43f4-af04-577ad46bebe0-kube-api-access-jsfck\") pod \"kserve-controller-manager-7dcb9f9f85-vlhpc\" (UID: \"3295d3b7-d5b3-43f4-af04-577ad46bebe0\") " pod="kserve/kserve-controller-manager-7dcb9f9f85-vlhpc" Apr 17 11:24:40.267682 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:24:40.267542 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3295d3b7-d5b3-43f4-af04-577ad46bebe0-cert\") pod \"kserve-controller-manager-7dcb9f9f85-vlhpc\" (UID: \"3295d3b7-d5b3-43f4-af04-577ad46bebe0\") " pod="kserve/kserve-controller-manager-7dcb9f9f85-vlhpc" Apr 17 11:24:40.270032 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:24:40.270006 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3295d3b7-d5b3-43f4-af04-577ad46bebe0-cert\") pod \"kserve-controller-manager-7dcb9f9f85-vlhpc\" (UID: \"3295d3b7-d5b3-43f4-af04-577ad46bebe0\") " pod="kserve/kserve-controller-manager-7dcb9f9f85-vlhpc" Apr 17 11:24:40.289474 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:24:40.289440 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsfck\" (UniqueName: \"kubernetes.io/projected/3295d3b7-d5b3-43f4-af04-577ad46bebe0-kube-api-access-jsfck\") pod \"kserve-controller-manager-7dcb9f9f85-vlhpc\" (UID: \"3295d3b7-d5b3-43f4-af04-577ad46bebe0\") " pod="kserve/kserve-controller-manager-7dcb9f9f85-vlhpc" Apr 17 11:24:40.381894 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:24:40.381860 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7dcb9f9f85-vlhpc" Apr 17 11:24:40.544446 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:24:40.544416 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7dcb9f9f85-vlhpc"] Apr 17 11:24:40.546191 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:24:40.546161 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3295d3b7_d5b3_43f4_af04_577ad46bebe0.slice/crio-1e12feb37d7360873063f955ae860162204e61e865e5caf2ea6df55eed071875 WatchSource:0}: Error finding container 1e12feb37d7360873063f955ae860162204e61e865e5caf2ea6df55eed071875: Status 404 returned error can't find the container with id 1e12feb37d7360873063f955ae860162204e61e865e5caf2ea6df55eed071875 Apr 17 11:24:41.026252 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:24:41.026214 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7dcb9f9f85-vlhpc" event={"ID":"3295d3b7-d5b3-43f4-af04-577ad46bebe0","Type":"ContainerStarted","Data":"1e12feb37d7360873063f955ae860162204e61e865e5caf2ea6df55eed071875"} Apr 17 11:24:43.035230 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:24:43.035189 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7dcb9f9f85-vlhpc" event={"ID":"3295d3b7-d5b3-43f4-af04-577ad46bebe0","Type":"ContainerStarted","Data":"e1a44207b5f8b1b3b5deb1217941ad82f1107d1ab8577836b52cb704f61aaed9"} Apr 17 11:24:43.035651 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:24:43.035320 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-7dcb9f9f85-vlhpc" Apr 17 11:24:43.054398 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:24:43.054334 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-7dcb9f9f85-vlhpc" podStartSLOduration=0.747992304 
podStartE2EDuration="3.054314024s" podCreationTimestamp="2026-04-17 11:24:40 +0000 UTC" firstStartedPulling="2026-04-17 11:24:40.54762641 +0000 UTC m=+509.717433457" lastFinishedPulling="2026-04-17 11:24:42.853948121 +0000 UTC m=+512.023755177" observedRunningTime="2026-04-17 11:24:43.053076746 +0000 UTC m=+512.222883827" watchObservedRunningTime="2026-04-17 11:24:43.054314024 +0000 UTC m=+512.224121089" Apr 17 11:25:14.045524 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:14.045446 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-7dcb9f9f85-vlhpc" Apr 17 11:25:16.432401 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:16.432363 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-7dcb9f9f85-vlhpc"] Apr 17 11:25:16.432884 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:16.432633 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-7dcb9f9f85-vlhpc" podUID="3295d3b7-d5b3-43f4-af04-577ad46bebe0" containerName="manager" containerID="cri-o://e1a44207b5f8b1b3b5deb1217941ad82f1107d1ab8577836b52cb704f61aaed9" gracePeriod=10 Apr 17 11:25:16.461485 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:16.461457 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-7dcb9f9f85-t6wc6"] Apr 17 11:25:16.466677 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:16.466645 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7dcb9f9f85-t6wc6" Apr 17 11:25:16.474673 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:16.474629 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7dcb9f9f85-t6wc6"] Apr 17 11:25:16.602296 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:16.602246 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/55ada3df-f067-4273-8df1-d131da59f289-cert\") pod \"kserve-controller-manager-7dcb9f9f85-t6wc6\" (UID: \"55ada3df-f067-4273-8df1-d131da59f289\") " pod="kserve/kserve-controller-manager-7dcb9f9f85-t6wc6" Apr 17 11:25:16.602536 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:16.602364 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbbdk\" (UniqueName: \"kubernetes.io/projected/55ada3df-f067-4273-8df1-d131da59f289-kube-api-access-tbbdk\") pod \"kserve-controller-manager-7dcb9f9f85-t6wc6\" (UID: \"55ada3df-f067-4273-8df1-d131da59f289\") " pod="kserve/kserve-controller-manager-7dcb9f9f85-t6wc6" Apr 17 11:25:16.679923 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:16.679893 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7dcb9f9f85-vlhpc" Apr 17 11:25:16.703313 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:16.703228 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/55ada3df-f067-4273-8df1-d131da59f289-cert\") pod \"kserve-controller-manager-7dcb9f9f85-t6wc6\" (UID: \"55ada3df-f067-4273-8df1-d131da59f289\") " pod="kserve/kserve-controller-manager-7dcb9f9f85-t6wc6" Apr 17 11:25:16.703313 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:16.703300 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tbbdk\" (UniqueName: \"kubernetes.io/projected/55ada3df-f067-4273-8df1-d131da59f289-kube-api-access-tbbdk\") pod \"kserve-controller-manager-7dcb9f9f85-t6wc6\" (UID: \"55ada3df-f067-4273-8df1-d131da59f289\") " pod="kserve/kserve-controller-manager-7dcb9f9f85-t6wc6" Apr 17 11:25:16.706125 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:16.706078 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/55ada3df-f067-4273-8df1-d131da59f289-cert\") pod \"kserve-controller-manager-7dcb9f9f85-t6wc6\" (UID: \"55ada3df-f067-4273-8df1-d131da59f289\") " pod="kserve/kserve-controller-manager-7dcb9f9f85-t6wc6" Apr 17 11:25:16.715073 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:16.715035 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbbdk\" (UniqueName: \"kubernetes.io/projected/55ada3df-f067-4273-8df1-d131da59f289-kube-api-access-tbbdk\") pod \"kserve-controller-manager-7dcb9f9f85-t6wc6\" (UID: \"55ada3df-f067-4273-8df1-d131da59f289\") " pod="kserve/kserve-controller-manager-7dcb9f9f85-t6wc6" Apr 17 11:25:16.804048 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:16.804007 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/3295d3b7-d5b3-43f4-af04-577ad46bebe0-cert\") pod \"3295d3b7-d5b3-43f4-af04-577ad46bebe0\" (UID: \"3295d3b7-d5b3-43f4-af04-577ad46bebe0\") " Apr 17 11:25:16.804228 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:16.804108 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsfck\" (UniqueName: \"kubernetes.io/projected/3295d3b7-d5b3-43f4-af04-577ad46bebe0-kube-api-access-jsfck\") pod \"3295d3b7-d5b3-43f4-af04-577ad46bebe0\" (UID: \"3295d3b7-d5b3-43f4-af04-577ad46bebe0\") " Apr 17 11:25:16.806361 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:16.806317 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3295d3b7-d5b3-43f4-af04-577ad46bebe0-kube-api-access-jsfck" (OuterVolumeSpecName: "kube-api-access-jsfck") pod "3295d3b7-d5b3-43f4-af04-577ad46bebe0" (UID: "3295d3b7-d5b3-43f4-af04-577ad46bebe0"). InnerVolumeSpecName "kube-api-access-jsfck". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 11:25:16.806474 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:16.806310 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3295d3b7-d5b3-43f4-af04-577ad46bebe0-cert" (OuterVolumeSpecName: "cert") pod "3295d3b7-d5b3-43f4-af04-577ad46bebe0" (UID: "3295d3b7-d5b3-43f4-af04-577ad46bebe0"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 11:25:16.810449 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:16.810424 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-7dcb9f9f85-t6wc6" Apr 17 11:25:16.904965 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:16.904924 2575 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3295d3b7-d5b3-43f4-af04-577ad46bebe0-cert\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\"" Apr 17 11:25:16.904965 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:16.904959 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jsfck\" (UniqueName: \"kubernetes.io/projected/3295d3b7-d5b3-43f4-af04-577ad46bebe0-kube-api-access-jsfck\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\"" Apr 17 11:25:16.940672 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:16.940644 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7dcb9f9f85-t6wc6"] Apr 17 11:25:16.942740 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:25:16.942703 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55ada3df_f067_4273_8df1_d131da59f289.slice/crio-e2faa79c5b9a3b2189ea7a3cc19ebe7812e21ac38f2efc4ce7443ca09bb4dea4 WatchSource:0}: Error finding container e2faa79c5b9a3b2189ea7a3cc19ebe7812e21ac38f2efc4ce7443ca09bb4dea4: Status 404 returned error can't find the container with id e2faa79c5b9a3b2189ea7a3cc19ebe7812e21ac38f2efc4ce7443ca09bb4dea4 Apr 17 11:25:17.155297 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:17.155259 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7dcb9f9f85-t6wc6" event={"ID":"55ada3df-f067-4273-8df1-d131da59f289","Type":"ContainerStarted","Data":"e2faa79c5b9a3b2189ea7a3cc19ebe7812e21ac38f2efc4ce7443ca09bb4dea4"} Apr 17 11:25:17.156517 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:17.156495 2575 generic.go:358] "Generic (PLEG): container finished" podID="3295d3b7-d5b3-43f4-af04-577ad46bebe0" 
containerID="e1a44207b5f8b1b3b5deb1217941ad82f1107d1ab8577836b52cb704f61aaed9" exitCode=0 Apr 17 11:25:17.156674 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:17.156563 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7dcb9f9f85-vlhpc" event={"ID":"3295d3b7-d5b3-43f4-af04-577ad46bebe0","Type":"ContainerDied","Data":"e1a44207b5f8b1b3b5deb1217941ad82f1107d1ab8577836b52cb704f61aaed9"} Apr 17 11:25:17.156674 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:17.156567 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7dcb9f9f85-vlhpc" Apr 17 11:25:17.156674 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:17.156589 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7dcb9f9f85-vlhpc" event={"ID":"3295d3b7-d5b3-43f4-af04-577ad46bebe0","Type":"ContainerDied","Data":"1e12feb37d7360873063f955ae860162204e61e865e5caf2ea6df55eed071875"} Apr 17 11:25:17.156674 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:17.156608 2575 scope.go:117] "RemoveContainer" containerID="e1a44207b5f8b1b3b5deb1217941ad82f1107d1ab8577836b52cb704f61aaed9" Apr 17 11:25:17.165621 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:17.165600 2575 scope.go:117] "RemoveContainer" containerID="e1a44207b5f8b1b3b5deb1217941ad82f1107d1ab8577836b52cb704f61aaed9" Apr 17 11:25:17.165927 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:25:17.165906 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1a44207b5f8b1b3b5deb1217941ad82f1107d1ab8577836b52cb704f61aaed9\": container with ID starting with e1a44207b5f8b1b3b5deb1217941ad82f1107d1ab8577836b52cb704f61aaed9 not found: ID does not exist" containerID="e1a44207b5f8b1b3b5deb1217941ad82f1107d1ab8577836b52cb704f61aaed9" Apr 17 11:25:17.165970 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:17.165938 2575 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"e1a44207b5f8b1b3b5deb1217941ad82f1107d1ab8577836b52cb704f61aaed9"} err="failed to get container status \"e1a44207b5f8b1b3b5deb1217941ad82f1107d1ab8577836b52cb704f61aaed9\": rpc error: code = NotFound desc = could not find container \"e1a44207b5f8b1b3b5deb1217941ad82f1107d1ab8577836b52cb704f61aaed9\": container with ID starting with e1a44207b5f8b1b3b5deb1217941ad82f1107d1ab8577836b52cb704f61aaed9 not found: ID does not exist" Apr 17 11:25:17.187995 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:17.187950 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-7dcb9f9f85-vlhpc"] Apr 17 11:25:17.195813 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:17.195779 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-7dcb9f9f85-vlhpc"] Apr 17 11:25:17.342496 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:17.342461 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3295d3b7-d5b3-43f4-af04-577ad46bebe0" path="/var/lib/kubelet/pods/3295d3b7-d5b3-43f4-af04-577ad46bebe0/volumes" Apr 17 11:25:18.160777 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:18.160737 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7dcb9f9f85-t6wc6" event={"ID":"55ada3df-f067-4273-8df1-d131da59f289","Type":"ContainerStarted","Data":"56d792865ec33cb629a78cc15c5d5e4652667f97ec0524f05910fa2191d46d73"} Apr 17 11:25:18.161249 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:18.160842 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-7dcb9f9f85-t6wc6" Apr 17 11:25:18.183458 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:18.183390 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-7dcb9f9f85-t6wc6" podStartSLOduration=1.742089403 podStartE2EDuration="2.183369686s" 
podCreationTimestamp="2026-04-17 11:25:16 +0000 UTC" firstStartedPulling="2026-04-17 11:25:16.943996876 +0000 UTC m=+546.113803922" lastFinishedPulling="2026-04-17 11:25:17.385277162 +0000 UTC m=+546.555084205" observedRunningTime="2026-04-17 11:25:18.18071739 +0000 UTC m=+547.350524454" watchObservedRunningTime="2026-04-17 11:25:18.183369686 +0000 UTC m=+547.353176753" Apr 17 11:25:49.169838 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:49.169806 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-7dcb9f9f85-t6wc6" Apr 17 11:25:50.022208 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:50.022172 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-tqpng"] Apr 17 11:25:50.022775 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:50.022759 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3295d3b7-d5b3-43f4-af04-577ad46bebe0" containerName="manager" Apr 17 11:25:50.022824 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:50.022779 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3295d3b7-d5b3-43f4-af04-577ad46bebe0" containerName="manager" Apr 17 11:25:50.022885 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:50.022873 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="3295d3b7-d5b3-43f4-af04-577ad46bebe0" containerName="manager" Apr 17 11:25:50.025861 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:50.025842 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-tqpng" Apr 17 11:25:50.029204 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:50.029178 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 17 11:25:50.029366 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:50.029223 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-v5vk5\"" Apr 17 11:25:50.042454 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:50.042421 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-tqpng"] Apr 17 11:25:50.081957 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:50.081913 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c3c7d3bb-efc3-47fb-a2c0-490948c41035-tls-certs\") pod \"model-serving-api-86f7b4b499-tqpng\" (UID: \"c3c7d3bb-efc3-47fb-a2c0-490948c41035\") " pod="kserve/model-serving-api-86f7b4b499-tqpng" Apr 17 11:25:50.082168 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:50.081967 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7n7t\" (UniqueName: \"kubernetes.io/projected/c3c7d3bb-efc3-47fb-a2c0-490948c41035-kube-api-access-x7n7t\") pod \"model-serving-api-86f7b4b499-tqpng\" (UID: \"c3c7d3bb-efc3-47fb-a2c0-490948c41035\") " pod="kserve/model-serving-api-86f7b4b499-tqpng" Apr 17 11:25:50.183087 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:50.183043 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c3c7d3bb-efc3-47fb-a2c0-490948c41035-tls-certs\") pod \"model-serving-api-86f7b4b499-tqpng\" (UID: \"c3c7d3bb-efc3-47fb-a2c0-490948c41035\") " pod="kserve/model-serving-api-86f7b4b499-tqpng" Apr 17 11:25:50.183603 ip-10-0-135-81 kubenswrapper[2575]: 
I0417 11:25:50.183101 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x7n7t\" (UniqueName: \"kubernetes.io/projected/c3c7d3bb-efc3-47fb-a2c0-490948c41035-kube-api-access-x7n7t\") pod \"model-serving-api-86f7b4b499-tqpng\" (UID: \"c3c7d3bb-efc3-47fb-a2c0-490948c41035\") " pod="kserve/model-serving-api-86f7b4b499-tqpng" Apr 17 11:25:50.183603 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:25:50.183218 2575 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 17 11:25:50.183603 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:25:50.183314 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3c7d3bb-efc3-47fb-a2c0-490948c41035-tls-certs podName:c3c7d3bb-efc3-47fb-a2c0-490948c41035 nodeName:}" failed. No retries permitted until 2026-04-17 11:25:50.683293073 +0000 UTC m=+579.853100128 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/c3c7d3bb-efc3-47fb-a2c0-490948c41035-tls-certs") pod "model-serving-api-86f7b4b499-tqpng" (UID: "c3c7d3bb-efc3-47fb-a2c0-490948c41035") : secret "model-serving-api-tls" not found Apr 17 11:25:50.198656 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:50.198630 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7n7t\" (UniqueName: \"kubernetes.io/projected/c3c7d3bb-efc3-47fb-a2c0-490948c41035-kube-api-access-x7n7t\") pod \"model-serving-api-86f7b4b499-tqpng\" (UID: \"c3c7d3bb-efc3-47fb-a2c0-490948c41035\") " pod="kserve/model-serving-api-86f7b4b499-tqpng" Apr 17 11:25:50.687013 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:50.686970 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c3c7d3bb-efc3-47fb-a2c0-490948c41035-tls-certs\") pod \"model-serving-api-86f7b4b499-tqpng\" (UID: \"c3c7d3bb-efc3-47fb-a2c0-490948c41035\") " 
pod="kserve/model-serving-api-86f7b4b499-tqpng" Apr 17 11:25:50.689416 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:50.689393 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c3c7d3bb-efc3-47fb-a2c0-490948c41035-tls-certs\") pod \"model-serving-api-86f7b4b499-tqpng\" (UID: \"c3c7d3bb-efc3-47fb-a2c0-490948c41035\") " pod="kserve/model-serving-api-86f7b4b499-tqpng" Apr 17 11:25:50.936703 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:50.936664 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-tqpng" Apr 17 11:25:51.067386 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:51.067361 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-tqpng"] Apr 17 11:25:51.069786 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:25:51.069755 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3c7d3bb_efc3_47fb_a2c0_490948c41035.slice/crio-7249550ff7bb6096d205817c12d230678a0d2f38a7b17266799ca866ee32f215 WatchSource:0}: Error finding container 7249550ff7bb6096d205817c12d230678a0d2f38a7b17266799ca866ee32f215: Status 404 returned error can't find the container with id 7249550ff7bb6096d205817c12d230678a0d2f38a7b17266799ca866ee32f215 Apr 17 11:25:51.275666 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:51.275580 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-tqpng" event={"ID":"c3c7d3bb-efc3-47fb-a2c0-490948c41035","Type":"ContainerStarted","Data":"7249550ff7bb6096d205817c12d230678a0d2f38a7b17266799ca866ee32f215"} Apr 17 11:25:53.283774 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:53.283737 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-tqpng" 
event={"ID":"c3c7d3bb-efc3-47fb-a2c0-490948c41035","Type":"ContainerStarted","Data":"ba4ef75bea82a76d455955aa663bbeeeba363ade749e871db4ce8b4f662885de"} Apr 17 11:25:53.284254 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:53.283858 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-tqpng" Apr 17 11:25:53.301406 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:53.301323 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-tqpng" podStartSLOduration=3.041853049 podStartE2EDuration="4.301306038s" podCreationTimestamp="2026-04-17 11:25:49 +0000 UTC" firstStartedPulling="2026-04-17 11:25:51.071444488 +0000 UTC m=+580.241251531" lastFinishedPulling="2026-04-17 11:25:52.330897473 +0000 UTC m=+581.500704520" observedRunningTime="2026-04-17 11:25:53.300814272 +0000 UTC m=+582.470621337" watchObservedRunningTime="2026-04-17 11:25:53.301306038 +0000 UTC m=+582.471113104" Apr 17 11:25:54.942116 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:54.942067 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7b6fd9dfbf-4qdp8"] Apr 17 11:25:54.945768 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:54.945735 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b6fd9dfbf-4qdp8" Apr 17 11:25:54.958953 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:54.958922 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b6fd9dfbf-4qdp8"] Apr 17 11:25:55.026943 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:55.026908 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/851d5c9b-b431-4a97-b3d1-87420575c9ad-service-ca\") pod \"console-7b6fd9dfbf-4qdp8\" (UID: \"851d5c9b-b431-4a97-b3d1-87420575c9ad\") " pod="openshift-console/console-7b6fd9dfbf-4qdp8" Apr 17 11:25:55.027120 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:55.026953 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/851d5c9b-b431-4a97-b3d1-87420575c9ad-trusted-ca-bundle\") pod \"console-7b6fd9dfbf-4qdp8\" (UID: \"851d5c9b-b431-4a97-b3d1-87420575c9ad\") " pod="openshift-console/console-7b6fd9dfbf-4qdp8" Apr 17 11:25:55.027120 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:55.026984 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/851d5c9b-b431-4a97-b3d1-87420575c9ad-console-oauth-config\") pod \"console-7b6fd9dfbf-4qdp8\" (UID: \"851d5c9b-b431-4a97-b3d1-87420575c9ad\") " pod="openshift-console/console-7b6fd9dfbf-4qdp8" Apr 17 11:25:55.027120 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:55.027011 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/851d5c9b-b431-4a97-b3d1-87420575c9ad-oauth-serving-cert\") pod \"console-7b6fd9dfbf-4qdp8\" (UID: \"851d5c9b-b431-4a97-b3d1-87420575c9ad\") " pod="openshift-console/console-7b6fd9dfbf-4qdp8" Apr 17 
11:25:55.027120 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:55.027038 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/851d5c9b-b431-4a97-b3d1-87420575c9ad-console-config\") pod \"console-7b6fd9dfbf-4qdp8\" (UID: \"851d5c9b-b431-4a97-b3d1-87420575c9ad\") " pod="openshift-console/console-7b6fd9dfbf-4qdp8" Apr 17 11:25:55.027120 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:55.027062 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvnj5\" (UniqueName: \"kubernetes.io/projected/851d5c9b-b431-4a97-b3d1-87420575c9ad-kube-api-access-wvnj5\") pod \"console-7b6fd9dfbf-4qdp8\" (UID: \"851d5c9b-b431-4a97-b3d1-87420575c9ad\") " pod="openshift-console/console-7b6fd9dfbf-4qdp8" Apr 17 11:25:55.027279 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:55.027146 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/851d5c9b-b431-4a97-b3d1-87420575c9ad-console-serving-cert\") pod \"console-7b6fd9dfbf-4qdp8\" (UID: \"851d5c9b-b431-4a97-b3d1-87420575c9ad\") " pod="openshift-console/console-7b6fd9dfbf-4qdp8" Apr 17 11:25:55.127806 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:55.127768 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/851d5c9b-b431-4a97-b3d1-87420575c9ad-console-serving-cert\") pod \"console-7b6fd9dfbf-4qdp8\" (UID: \"851d5c9b-b431-4a97-b3d1-87420575c9ad\") " pod="openshift-console/console-7b6fd9dfbf-4qdp8" Apr 17 11:25:55.127989 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:55.127819 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/851d5c9b-b431-4a97-b3d1-87420575c9ad-service-ca\") pod 
\"console-7b6fd9dfbf-4qdp8\" (UID: \"851d5c9b-b431-4a97-b3d1-87420575c9ad\") " pod="openshift-console/console-7b6fd9dfbf-4qdp8" Apr 17 11:25:55.127989 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:55.127946 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/851d5c9b-b431-4a97-b3d1-87420575c9ad-trusted-ca-bundle\") pod \"console-7b6fd9dfbf-4qdp8\" (UID: \"851d5c9b-b431-4a97-b3d1-87420575c9ad\") " pod="openshift-console/console-7b6fd9dfbf-4qdp8" Apr 17 11:25:55.128062 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:55.127998 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/851d5c9b-b431-4a97-b3d1-87420575c9ad-console-oauth-config\") pod \"console-7b6fd9dfbf-4qdp8\" (UID: \"851d5c9b-b431-4a97-b3d1-87420575c9ad\") " pod="openshift-console/console-7b6fd9dfbf-4qdp8" Apr 17 11:25:55.128102 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:55.128058 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/851d5c9b-b431-4a97-b3d1-87420575c9ad-oauth-serving-cert\") pod \"console-7b6fd9dfbf-4qdp8\" (UID: \"851d5c9b-b431-4a97-b3d1-87420575c9ad\") " pod="openshift-console/console-7b6fd9dfbf-4qdp8" Apr 17 11:25:55.128136 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:55.128113 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/851d5c9b-b431-4a97-b3d1-87420575c9ad-console-config\") pod \"console-7b6fd9dfbf-4qdp8\" (UID: \"851d5c9b-b431-4a97-b3d1-87420575c9ad\") " pod="openshift-console/console-7b6fd9dfbf-4qdp8" Apr 17 11:25:55.128169 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:55.128148 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wvnj5\" (UniqueName: 
\"kubernetes.io/projected/851d5c9b-b431-4a97-b3d1-87420575c9ad-kube-api-access-wvnj5\") pod \"console-7b6fd9dfbf-4qdp8\" (UID: \"851d5c9b-b431-4a97-b3d1-87420575c9ad\") " pod="openshift-console/console-7b6fd9dfbf-4qdp8" Apr 17 11:25:55.128723 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:55.128696 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/851d5c9b-b431-4a97-b3d1-87420575c9ad-service-ca\") pod \"console-7b6fd9dfbf-4qdp8\" (UID: \"851d5c9b-b431-4a97-b3d1-87420575c9ad\") " pod="openshift-console/console-7b6fd9dfbf-4qdp8" Apr 17 11:25:55.128920 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:55.128895 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/851d5c9b-b431-4a97-b3d1-87420575c9ad-oauth-serving-cert\") pod \"console-7b6fd9dfbf-4qdp8\" (UID: \"851d5c9b-b431-4a97-b3d1-87420575c9ad\") " pod="openshift-console/console-7b6fd9dfbf-4qdp8" Apr 17 11:25:55.129002 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:55.128917 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/851d5c9b-b431-4a97-b3d1-87420575c9ad-console-config\") pod \"console-7b6fd9dfbf-4qdp8\" (UID: \"851d5c9b-b431-4a97-b3d1-87420575c9ad\") " pod="openshift-console/console-7b6fd9dfbf-4qdp8" Apr 17 11:25:55.129175 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:55.129153 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/851d5c9b-b431-4a97-b3d1-87420575c9ad-trusted-ca-bundle\") pod \"console-7b6fd9dfbf-4qdp8\" (UID: \"851d5c9b-b431-4a97-b3d1-87420575c9ad\") " pod="openshift-console/console-7b6fd9dfbf-4qdp8" Apr 17 11:25:55.130464 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:55.130439 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/851d5c9b-b431-4a97-b3d1-87420575c9ad-console-oauth-config\") pod \"console-7b6fd9dfbf-4qdp8\" (UID: \"851d5c9b-b431-4a97-b3d1-87420575c9ad\") " pod="openshift-console/console-7b6fd9dfbf-4qdp8" Apr 17 11:25:55.130640 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:55.130624 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/851d5c9b-b431-4a97-b3d1-87420575c9ad-console-serving-cert\") pod \"console-7b6fd9dfbf-4qdp8\" (UID: \"851d5c9b-b431-4a97-b3d1-87420575c9ad\") " pod="openshift-console/console-7b6fd9dfbf-4qdp8" Apr 17 11:25:55.137310 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:55.137272 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvnj5\" (UniqueName: \"kubernetes.io/projected/851d5c9b-b431-4a97-b3d1-87420575c9ad-kube-api-access-wvnj5\") pod \"console-7b6fd9dfbf-4qdp8\" (UID: \"851d5c9b-b431-4a97-b3d1-87420575c9ad\") " pod="openshift-console/console-7b6fd9dfbf-4qdp8" Apr 17 11:25:55.256824 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:55.256729 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b6fd9dfbf-4qdp8" Apr 17 11:25:55.394867 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:55.394838 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b6fd9dfbf-4qdp8"] Apr 17 11:25:55.396800 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:25:55.396763 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod851d5c9b_b431_4a97_b3d1_87420575c9ad.slice/crio-fe0e62dd23bf08c8803477d3d56ba737e735123560b9be2279ba6a2336ea276a WatchSource:0}: Error finding container fe0e62dd23bf08c8803477d3d56ba737e735123560b9be2279ba6a2336ea276a: Status 404 returned error can't find the container with id fe0e62dd23bf08c8803477d3d56ba737e735123560b9be2279ba6a2336ea276a Apr 17 11:25:56.295688 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:56.295654 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b6fd9dfbf-4qdp8" event={"ID":"851d5c9b-b431-4a97-b3d1-87420575c9ad","Type":"ContainerStarted","Data":"d6c3b0ea230f405fa1809f75c6635128714f12d07b8b6c8469434838efcf5f25"} Apr 17 11:25:56.295688 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:56.295692 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b6fd9dfbf-4qdp8" event={"ID":"851d5c9b-b431-4a97-b3d1-87420575c9ad","Type":"ContainerStarted","Data":"fe0e62dd23bf08c8803477d3d56ba737e735123560b9be2279ba6a2336ea276a"} Apr 17 11:25:56.315745 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:25:56.315688 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7b6fd9dfbf-4qdp8" podStartSLOduration=2.315672445 podStartE2EDuration="2.315672445s" podCreationTimestamp="2026-04-17 11:25:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:25:56.31465186 +0000 UTC m=+585.484458949" 
watchObservedRunningTime="2026-04-17 11:25:56.315672445 +0000 UTC m=+585.485479509" Apr 17 11:26:04.291710 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:04.291682 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-tqpng" Apr 17 11:26:05.257954 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:05.257913 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7b6fd9dfbf-4qdp8" Apr 17 11:26:05.257954 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:05.257961 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7b6fd9dfbf-4qdp8" Apr 17 11:26:05.263022 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:05.262995 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7b6fd9dfbf-4qdp8" Apr 17 11:26:05.331281 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:05.331253 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7b6fd9dfbf-4qdp8" Apr 17 11:26:05.378535 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:05.378495 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-b955bd79c-sp68z"] Apr 17 11:26:11.261123 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:11.261089 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gcrf7_3932e45a-3ab6-40aa-8c2b-48214229c367/console-operator/2.log" Apr 17 11:26:11.262225 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:11.262205 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gcrf7_3932e45a-3ab6-40aa-8c2b-48214229c367/console-operator/2.log" Apr 17 11:26:11.268359 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:11.268322 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvdth_652320e1-a7a1-4b18-a16c-59420fde1a03/ovn-acl-logging/0.log" Apr 17 11:26:11.269456 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:11.269436 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvdth_652320e1-a7a1-4b18-a16c-59420fde1a03/ovn-acl-logging/0.log" Apr 17 11:26:30.403438 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:30.403315 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-b955bd79c-sp68z" podUID="f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c" containerName="console" containerID="cri-o://1b23fa36849e48302763c406ded9242c0ff86e1990cc7b1920252b9a83de99e5" gracePeriod=15 Apr 17 11:26:30.645781 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:30.645753 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-b955bd79c-sp68z_f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c/console/0.log" Apr 17 11:26:30.645917 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:30.645817 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-b955bd79c-sp68z" Apr 17 11:26:30.742011 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:30.741908 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-service-ca\") pod \"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c\" (UID: \"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c\") " Apr 17 11:26:30.742011 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:30.741977 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-console-oauth-config\") pod \"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c\" (UID: \"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c\") " Apr 17 11:26:30.742259 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:30.742052 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfz5b\" (UniqueName: \"kubernetes.io/projected/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-kube-api-access-gfz5b\") pod \"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c\" (UID: \"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c\") " Apr 17 11:26:30.742259 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:30.742084 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-oauth-serving-cert\") pod \"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c\" (UID: \"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c\") " Apr 17 11:26:30.742259 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:30.742131 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-console-serving-cert\") pod \"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c\" (UID: \"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c\") " Apr 17 11:26:30.742259 
ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:30.742159 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-console-config\") pod \"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c\" (UID: \"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c\") "
Apr 17 11:26:30.742259 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:30.742185 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-trusted-ca-bundle\") pod \"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c\" (UID: \"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c\") "
Apr 17 11:26:30.742539 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:30.742388 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-service-ca" (OuterVolumeSpecName: "service-ca") pod "f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c" (UID: "f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 11:26:30.742601 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:30.742569 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c" (UID: "f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 11:26:30.742678 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:30.742590 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-console-config" (OuterVolumeSpecName: "console-config") pod "f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c" (UID: "f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 11:26:30.742794 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:30.742765 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c" (UID: "f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 11:26:30.744395 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:30.744365 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c" (UID: "f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 11:26:30.744395 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:30.744373 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-kube-api-access-gfz5b" (OuterVolumeSpecName: "kube-api-access-gfz5b") pod "f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c" (UID: "f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c"). InnerVolumeSpecName "kube-api-access-gfz5b". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 11:26:30.744553 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:30.744418 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c" (UID: "f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 11:26:30.842907 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:30.842864 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gfz5b\" (UniqueName: \"kubernetes.io/projected/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-kube-api-access-gfz5b\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\""
Apr 17 11:26:30.842907 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:30.842899 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-oauth-serving-cert\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\""
Apr 17 11:26:30.842907 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:30.842909 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-console-serving-cert\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\""
Apr 17 11:26:30.842907 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:30.842919 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-console-config\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\""
Apr 17 11:26:30.843234 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:30.842928 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-trusted-ca-bundle\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\""
Apr 17 11:26:30.843234 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:30.842938 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-service-ca\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\""
Apr 17 11:26:30.843234 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:30.842947 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c-console-oauth-config\") on node \"ip-10-0-135-81.ec2.internal\" DevicePath \"\""
Apr 17 11:26:31.424622 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:31.424592 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-b955bd79c-sp68z_f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c/console/0.log"
Apr 17 11:26:31.425032 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:31.424635 2575 generic.go:358] "Generic (PLEG): container finished" podID="f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c" containerID="1b23fa36849e48302763c406ded9242c0ff86e1990cc7b1920252b9a83de99e5" exitCode=2
Apr 17 11:26:31.425032 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:31.424667 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b955bd79c-sp68z" event={"ID":"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c","Type":"ContainerDied","Data":"1b23fa36849e48302763c406ded9242c0ff86e1990cc7b1920252b9a83de99e5"}
Apr 17 11:26:31.425032 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:31.424704 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b955bd79c-sp68z" event={"ID":"f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c","Type":"ContainerDied","Data":"a277d0f969b085da9648ef6174752c939332de1660329a1c19372a289cf319af"}
Apr 17 11:26:31.425032 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:31.424722 2575 scope.go:117] "RemoveContainer" containerID="1b23fa36849e48302763c406ded9242c0ff86e1990cc7b1920252b9a83de99e5"
Apr 17 11:26:31.425032 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:31.424723 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-b955bd79c-sp68z"
Apr 17 11:26:31.433255 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:31.433238 2575 scope.go:117] "RemoveContainer" containerID="1b23fa36849e48302763c406ded9242c0ff86e1990cc7b1920252b9a83de99e5"
Apr 17 11:26:31.433532 ip-10-0-135-81 kubenswrapper[2575]: E0417 11:26:31.433511 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b23fa36849e48302763c406ded9242c0ff86e1990cc7b1920252b9a83de99e5\": container with ID starting with 1b23fa36849e48302763c406ded9242c0ff86e1990cc7b1920252b9a83de99e5 not found: ID does not exist" containerID="1b23fa36849e48302763c406ded9242c0ff86e1990cc7b1920252b9a83de99e5"
Apr 17 11:26:31.433616 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:31.433538 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b23fa36849e48302763c406ded9242c0ff86e1990cc7b1920252b9a83de99e5"} err="failed to get container status \"1b23fa36849e48302763c406ded9242c0ff86e1990cc7b1920252b9a83de99e5\": rpc error: code = NotFound desc = could not find container \"1b23fa36849e48302763c406ded9242c0ff86e1990cc7b1920252b9a83de99e5\": container with ID starting with 1b23fa36849e48302763c406ded9242c0ff86e1990cc7b1920252b9a83de99e5 not found: ID does not exist"
Apr 17 11:26:31.440988 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:31.440959 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-b955bd79c-sp68z"]
Apr 17 11:26:31.444562 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:31.444538 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-b955bd79c-sp68z"]
Apr 17 11:26:33.343002 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:33.342968 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c" path="/var/lib/kubelet/pods/f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c/volumes"
Apr 17 11:26:46.003470 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:46.003427 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vp6ck/must-gather-pv4sr"]
Apr 17 11:26:46.003868 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:46.003794 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c" containerName="console"
Apr 17 11:26:46.003868 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:46.003804 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c" containerName="console"
Apr 17 11:26:46.003868 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:46.003863 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="f5c35ad8-cad3-4bf1-81e2-8eb6926d8b2c" containerName="console"
Apr 17 11:26:46.007148 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:46.007125 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vp6ck/must-gather-pv4sr"
Apr 17 11:26:46.009431 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:46.009401 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-vp6ck\"/\"default-dockercfg-hjq8z\""
Apr 17 11:26:46.009589 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:46.009568 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vp6ck\"/\"openshift-service-ca.crt\""
Apr 17 11:26:46.009700 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:46.009681 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vp6ck\"/\"kube-root-ca.crt\""
Apr 17 11:26:46.013122 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:46.013097 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vp6ck/must-gather-pv4sr"]
Apr 17 11:26:46.077713 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:46.077669 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cd465324-0f39-4703-b65a-5245f22070c7-must-gather-output\") pod \"must-gather-pv4sr\" (UID: \"cd465324-0f39-4703-b65a-5245f22070c7\") " pod="openshift-must-gather-vp6ck/must-gather-pv4sr"
Apr 17 11:26:46.077713 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:46.077717 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv6gw\" (UniqueName: \"kubernetes.io/projected/cd465324-0f39-4703-b65a-5245f22070c7-kube-api-access-lv6gw\") pod \"must-gather-pv4sr\" (UID: \"cd465324-0f39-4703-b65a-5245f22070c7\") " pod="openshift-must-gather-vp6ck/must-gather-pv4sr"
Apr 17 11:26:46.178297 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:46.178251 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cd465324-0f39-4703-b65a-5245f22070c7-must-gather-output\") pod \"must-gather-pv4sr\" (UID: \"cd465324-0f39-4703-b65a-5245f22070c7\") " pod="openshift-must-gather-vp6ck/must-gather-pv4sr"
Apr 17 11:26:46.178297 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:46.178302 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lv6gw\" (UniqueName: \"kubernetes.io/projected/cd465324-0f39-4703-b65a-5245f22070c7-kube-api-access-lv6gw\") pod \"must-gather-pv4sr\" (UID: \"cd465324-0f39-4703-b65a-5245f22070c7\") " pod="openshift-must-gather-vp6ck/must-gather-pv4sr"
Apr 17 11:26:46.178662 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:46.178641 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cd465324-0f39-4703-b65a-5245f22070c7-must-gather-output\") pod \"must-gather-pv4sr\" (UID: \"cd465324-0f39-4703-b65a-5245f22070c7\") " pod="openshift-must-gather-vp6ck/must-gather-pv4sr"
Apr 17 11:26:46.186297 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:46.186271 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv6gw\" (UniqueName: \"kubernetes.io/projected/cd465324-0f39-4703-b65a-5245f22070c7-kube-api-access-lv6gw\") pod \"must-gather-pv4sr\" (UID: \"cd465324-0f39-4703-b65a-5245f22070c7\") " pod="openshift-must-gather-vp6ck/must-gather-pv4sr"
Apr 17 11:26:46.324414 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:46.324375 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vp6ck/must-gather-pv4sr"
Apr 17 11:26:46.452049 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:46.452023 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vp6ck/must-gather-pv4sr"]
Apr 17 11:26:46.454294 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:26:46.454260 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd465324_0f39_4703_b65a_5245f22070c7.slice/crio-77c4736a8d69e6d397f4b587d9038505ec17d0d34069a10e98f6d7aa1aa542b0 WatchSource:0}: Error finding container 77c4736a8d69e6d397f4b587d9038505ec17d0d34069a10e98f6d7aa1aa542b0: Status 404 returned error can't find the container with id 77c4736a8d69e6d397f4b587d9038505ec17d0d34069a10e98f6d7aa1aa542b0
Apr 17 11:26:46.482381 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:46.482299 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vp6ck/must-gather-pv4sr" event={"ID":"cd465324-0f39-4703-b65a-5245f22070c7","Type":"ContainerStarted","Data":"77c4736a8d69e6d397f4b587d9038505ec17d0d34069a10e98f6d7aa1aa542b0"}
Apr 17 11:26:47.489707 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:47.489203 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vp6ck/must-gather-pv4sr" event={"ID":"cd465324-0f39-4703-b65a-5245f22070c7","Type":"ContainerStarted","Data":"0272391d9961e13bb16924e0a5cce7ad63909ec59c1d981692afbc444bfd5f36"}
Apr 17 11:26:47.489707 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:47.489249 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vp6ck/must-gather-pv4sr" event={"ID":"cd465324-0f39-4703-b65a-5245f22070c7","Type":"ContainerStarted","Data":"c53bb4e37d1810d704c9f51c66c78c96af5a8d873f80d945070e91bc697c9f81"}
Apr 17 11:26:47.504637 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:47.504574 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vp6ck/must-gather-pv4sr" podStartSLOduration=1.747421846 podStartE2EDuration="2.504555276s" podCreationTimestamp="2026-04-17 11:26:45 +0000 UTC" firstStartedPulling="2026-04-17 11:26:46.456090214 +0000 UTC m=+635.625897260" lastFinishedPulling="2026-04-17 11:26:47.213223629 +0000 UTC m=+636.383030690" observedRunningTime="2026-04-17 11:26:47.503879665 +0000 UTC m=+636.673686732" watchObservedRunningTime="2026-04-17 11:26:47.504555276 +0000 UTC m=+636.674362341"
Apr 17 11:26:48.760085 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:48.760049 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-hddsl_8adcdab8-9195-4dbb-838d-3ac5065f81ed/global-pull-secret-syncer/0.log"
Apr 17 11:26:48.978495 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:48.978465 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-t7kcp_f6877aa1-7b7b-4025-b8e9-f96f15bfab82/konnectivity-agent/0.log"
Apr 17 11:26:49.000853 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:49.000812 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-81.ec2.internal_94b45018193276143fc473b8ac9f9152/haproxy/0.log"
Apr 17 11:26:52.228085 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:52.228054 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8a62de1e-19c9-42d2-b22e-7772bf535278/alertmanager/0.log"
Apr 17 11:26:52.256466 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:52.256439 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8a62de1e-19c9-42d2-b22e-7772bf535278/config-reloader/0.log"
Apr 17 11:26:52.285091 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:52.285058 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8a62de1e-19c9-42d2-b22e-7772bf535278/kube-rbac-proxy-web/0.log"
Apr 17 11:26:52.315554 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:52.315528 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8a62de1e-19c9-42d2-b22e-7772bf535278/kube-rbac-proxy/0.log"
Apr 17 11:26:52.342322 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:52.342287 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8a62de1e-19c9-42d2-b22e-7772bf535278/kube-rbac-proxy-metric/0.log"
Apr 17 11:26:52.370093 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:52.370068 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8a62de1e-19c9-42d2-b22e-7772bf535278/prom-label-proxy/0.log"
Apr 17 11:26:52.399740 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:52.399709 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8a62de1e-19c9-42d2-b22e-7772bf535278/init-config-reloader/0.log"
Apr 17 11:26:52.441790 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:52.441756 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-6mrs8_ba013853-d529-46b6-84d3-0e259d87af73/cluster-monitoring-operator/0.log"
Apr 17 11:26:52.471152 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:52.471109 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-jlcpd_687c2dc1-3d57-43d9-ad53-91df7e933033/kube-state-metrics/0.log"
Apr 17 11:26:52.495415 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:52.495325 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-jlcpd_687c2dc1-3d57-43d9-ad53-91df7e933033/kube-rbac-proxy-main/0.log"
Apr 17 11:26:52.520589 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:52.520549 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-jlcpd_687c2dc1-3d57-43d9-ad53-91df7e933033/kube-rbac-proxy-self/0.log"
Apr 17 11:26:52.782043 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:52.781946 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pdgxw_35d31261-af18-42f2-8ad4-20563e06ef13/node-exporter/0.log"
Apr 17 11:26:52.807963 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:52.807933 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pdgxw_35d31261-af18-42f2-8ad4-20563e06ef13/kube-rbac-proxy/0.log"
Apr 17 11:26:52.831400 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:52.831329 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pdgxw_35d31261-af18-42f2-8ad4-20563e06ef13/init-textfile/0.log"
Apr 17 11:26:52.858968 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:52.858931 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-gl5p2_39dab659-2408-4839-a217-e44018fe6d68/kube-rbac-proxy-main/0.log"
Apr 17 11:26:52.883703 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:52.883662 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-gl5p2_39dab659-2408-4839-a217-e44018fe6d68/kube-rbac-proxy-self/0.log"
Apr 17 11:26:52.907561 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:52.907533 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-gl5p2_39dab659-2408-4839-a217-e44018fe6d68/openshift-state-metrics/0.log"
Apr 17 11:26:53.303301 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:53.303263 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-n6kjr_4e57b4d6-c19a-42b9-a145-d67b092f51aa/prometheus-operator-admission-webhook/0.log"
Apr 17 11:26:53.335762 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:53.335721 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7c5d4cd4-kvnpp_65ddecc9-e83c-4ead-b118-2a8d4c960974/telemeter-client/0.log"
Apr 17 11:26:53.362197 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:53.362166 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7c5d4cd4-kvnpp_65ddecc9-e83c-4ead-b118-2a8d4c960974/reload/0.log"
Apr 17 11:26:53.389757 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:53.389726 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7c5d4cd4-kvnpp_65ddecc9-e83c-4ead-b118-2a8d4c960974/kube-rbac-proxy/0.log"
Apr 17 11:26:55.321554 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:55.321517 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gcrf7_3932e45a-3ab6-40aa-8c2b-48214229c367/console-operator/2.log"
Apr 17 11:26:55.331091 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:55.331049 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-gcrf7_3932e45a-3ab6-40aa-8c2b-48214229c367/console-operator/3.log"
Apr 17 11:26:55.768769 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:55.768731 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b6fd9dfbf-4qdp8_851d5c9b-b431-4a97-b3d1-87420575c9ad/console/0.log"
Apr 17 11:26:55.804134 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:55.804103 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-vdnzk_b3744711-cf48-4517-ae8f-f049ff2343f8/download-server/0.log"
Apr 17 11:26:56.063208 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:56.063122 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vp6ck/perf-node-gather-daemonset-qtj9f"]
Apr 17 11:26:56.068173 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:56.068144 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-qtj9f"
Apr 17 11:26:56.097844 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:56.097809 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vp6ck/perf-node-gather-daemonset-qtj9f"]
Apr 17 11:26:56.178041 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:56.178003 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7cb78a2f-88c3-4749-85a2-1e27ffc59417-lib-modules\") pod \"perf-node-gather-daemonset-qtj9f\" (UID: \"7cb78a2f-88c3-4749-85a2-1e27ffc59417\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-qtj9f"
Apr 17 11:26:56.178396 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:56.178369 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7cb78a2f-88c3-4749-85a2-1e27ffc59417-podres\") pod \"perf-node-gather-daemonset-qtj9f\" (UID: \"7cb78a2f-88c3-4749-85a2-1e27ffc59417\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-qtj9f"
Apr 17 11:26:56.178552 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:56.178535 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2jmv\" (UniqueName: \"kubernetes.io/projected/7cb78a2f-88c3-4749-85a2-1e27ffc59417-kube-api-access-s2jmv\") pod \"perf-node-gather-daemonset-qtj9f\" (UID: \"7cb78a2f-88c3-4749-85a2-1e27ffc59417\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-qtj9f"
Apr 17 11:26:56.178667 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:56.178655 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7cb78a2f-88c3-4749-85a2-1e27ffc59417-proc\") pod \"perf-node-gather-daemonset-qtj9f\" (UID: \"7cb78a2f-88c3-4749-85a2-1e27ffc59417\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-qtj9f"
Apr 17 11:26:56.178774 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:56.178762 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7cb78a2f-88c3-4749-85a2-1e27ffc59417-sys\") pod \"perf-node-gather-daemonset-qtj9f\" (UID: \"7cb78a2f-88c3-4749-85a2-1e27ffc59417\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-qtj9f"
Apr 17 11:26:56.280090 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:56.280055 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7cb78a2f-88c3-4749-85a2-1e27ffc59417-sys\") pod \"perf-node-gather-daemonset-qtj9f\" (UID: \"7cb78a2f-88c3-4749-85a2-1e27ffc59417\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-qtj9f"
Apr 17 11:26:56.280260 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:56.280131 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7cb78a2f-88c3-4749-85a2-1e27ffc59417-lib-modules\") pod \"perf-node-gather-daemonset-qtj9f\" (UID: \"7cb78a2f-88c3-4749-85a2-1e27ffc59417\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-qtj9f"
Apr 17 11:26:56.280260 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:56.280179 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7cb78a2f-88c3-4749-85a2-1e27ffc59417-sys\") pod \"perf-node-gather-daemonset-qtj9f\" (UID: \"7cb78a2f-88c3-4749-85a2-1e27ffc59417\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-qtj9f"
Apr 17 11:26:56.280260 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:56.280193 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7cb78a2f-88c3-4749-85a2-1e27ffc59417-podres\") pod \"perf-node-gather-daemonset-qtj9f\" (UID: \"7cb78a2f-88c3-4749-85a2-1e27ffc59417\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-qtj9f"
Apr 17 11:26:56.280399 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:56.280258 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s2jmv\" (UniqueName: \"kubernetes.io/projected/7cb78a2f-88c3-4749-85a2-1e27ffc59417-kube-api-access-s2jmv\") pod \"perf-node-gather-daemonset-qtj9f\" (UID: \"7cb78a2f-88c3-4749-85a2-1e27ffc59417\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-qtj9f"
Apr 17 11:26:56.280399 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:56.280289 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7cb78a2f-88c3-4749-85a2-1e27ffc59417-lib-modules\") pod \"perf-node-gather-daemonset-qtj9f\" (UID: \"7cb78a2f-88c3-4749-85a2-1e27ffc59417\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-qtj9f"
Apr 17 11:26:56.280399 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:56.280291 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7cb78a2f-88c3-4749-85a2-1e27ffc59417-proc\") pod \"perf-node-gather-daemonset-qtj9f\" (UID: \"7cb78a2f-88c3-4749-85a2-1e27ffc59417\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-qtj9f"
Apr 17 11:26:56.280399 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:56.280324 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7cb78a2f-88c3-4749-85a2-1e27ffc59417-proc\") pod \"perf-node-gather-daemonset-qtj9f\" (UID: \"7cb78a2f-88c3-4749-85a2-1e27ffc59417\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-qtj9f"
Apr 17 11:26:56.280399 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:56.280293 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7cb78a2f-88c3-4749-85a2-1e27ffc59417-podres\") pod \"perf-node-gather-daemonset-qtj9f\" (UID: \"7cb78a2f-88c3-4749-85a2-1e27ffc59417\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-qtj9f"
Apr 17 11:26:56.288353 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:56.288319 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2jmv\" (UniqueName: \"kubernetes.io/projected/7cb78a2f-88c3-4749-85a2-1e27ffc59417-kube-api-access-s2jmv\") pod \"perf-node-gather-daemonset-qtj9f\" (UID: \"7cb78a2f-88c3-4749-85a2-1e27ffc59417\") " pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-qtj9f"
Apr 17 11:26:56.304252 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:56.304223 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-5m8z8_a2090436-524c-47c4-ac0f-8b94ceec083d/volume-data-source-validator/0.log"
Apr 17 11:26:56.380228 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:56.380195 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-qtj9f"
Apr 17 11:26:56.524864 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:56.524836 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vp6ck/perf-node-gather-daemonset-qtj9f"]
Apr 17 11:26:56.529499 ip-10-0-135-81 kubenswrapper[2575]: W0417 11:26:56.529460 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7cb78a2f_88c3_4749_85a2_1e27ffc59417.slice/crio-2b0425b4c0baf93eb48ec0e54a08cf274f2763637f40c42ba979dd3fb3494486 WatchSource:0}: Error finding container 2b0425b4c0baf93eb48ec0e54a08cf274f2763637f40c42ba979dd3fb3494486: Status 404 returned error can't find the container with id 2b0425b4c0baf93eb48ec0e54a08cf274f2763637f40c42ba979dd3fb3494486
Apr 17 11:26:57.102918 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:57.102891 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ngx5g_d4bb7b6c-7fd2-4a72-be8b-724128cbea39/dns/0.log"
Apr 17 11:26:57.129845 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:57.129809 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ngx5g_d4bb7b6c-7fd2-4a72-be8b-724128cbea39/kube-rbac-proxy/0.log"
Apr 17 11:26:57.351869 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:57.351841 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4r4xj_ec79fc9e-eb92-4d6d-9ea6-2a309575b035/dns-node-resolver/0.log"
Apr 17 11:26:57.542715 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:57.542624 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-qtj9f" event={"ID":"7cb78a2f-88c3-4749-85a2-1e27ffc59417","Type":"ContainerStarted","Data":"b04df33caac28bb880b5865884901fa1b77ec26d7d7ad91df5d481935fd1e815"}
Apr 17 11:26:57.543223 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:57.543197 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-qtj9f" event={"ID":"7cb78a2f-88c3-4749-85a2-1e27ffc59417","Type":"ContainerStarted","Data":"2b0425b4c0baf93eb48ec0e54a08cf274f2763637f40c42ba979dd3fb3494486"}
Apr 17 11:26:57.543301 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:57.543245 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-qtj9f"
Apr 17 11:26:57.562249 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:57.562199 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-qtj9f" podStartSLOduration=1.5621823080000001 podStartE2EDuration="1.562182308s" podCreationTimestamp="2026-04-17 11:26:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 11:26:57.560739127 +0000 UTC m=+646.730546203" watchObservedRunningTime="2026-04-17 11:26:57.562182308 +0000 UTC m=+646.731989372"
Apr 17 11:26:57.973724 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:57.973686 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-pms2x_8bc04c84-5033-4522-bdb4-8ff714571072/node-ca/0.log"
Apr 17 11:26:59.284644 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:59.284614 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-7dssr_3d82faa5-4dae-416a-8e2c-337d3966cdd0/serve-healthcheck-canary/0.log"
Apr 17 11:26:59.808053 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:59.808022 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-k76l5_357b4ca1-1680-47a2-96cf-40460312708f/insights-operator/0.log"
Apr 17 11:26:59.809898 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:59.809869 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-k76l5_357b4ca1-1680-47a2-96cf-40460312708f/insights-operator/1.log"
Apr 17 11:26:59.833712 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:59.833682 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8gmpm_f9114e81-c2b0-41e4-9e6c-72b4f1198507/kube-rbac-proxy/0.log"
Apr 17 11:26:59.860789 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:59.860759 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8gmpm_f9114e81-c2b0-41e4-9e6c-72b4f1198507/exporter/0.log"
Apr 17 11:26:59.887842 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:26:59.887806 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8gmpm_f9114e81-c2b0-41e4-9e6c-72b4f1198507/extractor/0.log"
Apr 17 11:27:02.057600 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:27:02.057559 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-7dcb9f9f85-t6wc6_55ada3df-f067-4273-8df1-d131da59f289/manager/0.log"
Apr 17 11:27:02.111474 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:27:02.111437 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-tqpng_c3c7d3bb-efc3-47fb-a2c0-490948c41035/server/0.log"
Apr 17 11:27:04.564789 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:27:04.564090 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-vp6ck/perf-node-gather-daemonset-qtj9f"
Apr 17 11:27:06.787210 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:27:06.787127 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-s55l8_06179da3-f0cd-4bd1-8c19-8e6e7e41a7be/kube-storage-version-migrator-operator/1.log"
Apr 17 11:27:06.788940 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:27:06.788902 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-s55l8_06179da3-f0cd-4bd1-8c19-8e6e7e41a7be/kube-storage-version-migrator-operator/0.log"
Apr 17 11:27:07.891921 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:27:07.891892 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4w8lp_b1d77920-3c64-40cf-82ce-24b1244a48e0/kube-multus-additional-cni-plugins/0.log"
Apr 17 11:27:07.923798 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:27:07.923767 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4w8lp_b1d77920-3c64-40cf-82ce-24b1244a48e0/egress-router-binary-copy/0.log"
Apr 17 11:27:07.953048 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:27:07.953011 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4w8lp_b1d77920-3c64-40cf-82ce-24b1244a48e0/cni-plugins/0.log"
Apr 17 11:27:07.979060 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:27:07.979032 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4w8lp_b1d77920-3c64-40cf-82ce-24b1244a48e0/bond-cni-plugin/0.log"
Apr 17 11:27:08.003099 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:27:08.003047 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4w8lp_b1d77920-3c64-40cf-82ce-24b1244a48e0/routeoverride-cni/0.log"
Apr 17 11:27:08.027292 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:27:08.027262 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4w8lp_b1d77920-3c64-40cf-82ce-24b1244a48e0/whereabouts-cni-bincopy/0.log"
Apr 17 11:27:08.053948 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:27:08.053916 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4w8lp_b1d77920-3c64-40cf-82ce-24b1244a48e0/whereabouts-cni/0.log"
Apr 17 11:27:08.523532 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:27:08.523501 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xcwz5_7f0f3aa1-28b4-49b7-8498-04ccddc9bacf/kube-multus/0.log"
Apr 17 11:27:08.649370 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:27:08.649318 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-tvn9d_d129dd20-5a5b-4718-8eca-2f10184defe9/network-metrics-daemon/0.log"
Apr 17 11:27:08.672667 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:27:08.672641 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-tvn9d_d129dd20-5a5b-4718-8eca-2f10184defe9/kube-rbac-proxy/0.log"
Apr 17 11:27:09.790066 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:27:09.790032 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvdth_652320e1-a7a1-4b18-a16c-59420fde1a03/ovn-controller/0.log"
Apr 17 11:27:09.808704 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:27:09.808670 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvdth_652320e1-a7a1-4b18-a16c-59420fde1a03/ovn-acl-logging/0.log"
Apr 17 11:27:09.814822 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:27:09.814796 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvdth_652320e1-a7a1-4b18-a16c-59420fde1a03/ovn-acl-logging/1.log"
Apr 17 11:27:09.843734 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:27:09.843703 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvdth_652320e1-a7a1-4b18-a16c-59420fde1a03/kube-rbac-proxy-node/0.log"
Apr 17 11:27:09.874631 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:27:09.874601 2575 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvdth_652320e1-a7a1-4b18-a16c-59420fde1a03/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 11:27:09.893323 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:27:09.893290 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvdth_652320e1-a7a1-4b18-a16c-59420fde1a03/northd/0.log" Apr 17 11:27:09.919903 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:27:09.919865 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvdth_652320e1-a7a1-4b18-a16c-59420fde1a03/nbdb/0.log" Apr 17 11:27:09.982136 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:27:09.982097 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvdth_652320e1-a7a1-4b18-a16c-59420fde1a03/sbdb/0.log" Apr 17 11:27:10.186857 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:27:10.186827 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bvdth_652320e1-a7a1-4b18-a16c-59420fde1a03/ovnkube-controller/0.log" Apr 17 11:27:11.683066 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:27:11.683032 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-vqr58_5de58c93-9df1-4e96-8236-6bcce80c59c7/check-endpoints/0.log" Apr 17 11:27:11.735662 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:27:11.735633 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-cvt8g_055f933b-358b-4058-aa0d-4808293e4549/network-check-target-container/0.log" Apr 17 11:27:12.774768 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:27:12.774737 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-s9dkv_541df6c8-cc79-40aa-9b07-2084d74abdbd/iptables-alerter/0.log" Apr 17 11:27:13.432891 ip-10-0-135-81 kubenswrapper[2575]: I0417 11:27:13.432848 2575 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-f28wr_12cc2cb3-5799-482e-9110-985521fc52eb/tuned/0.log"