Apr 20 07:00:28.493735 ip-10-0-138-178 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 20 07:00:28.493749 ip-10-0-138-178 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 20 07:00:28.493756 ip-10-0-138-178 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 20 07:00:28.494107 ip-10-0-138-178 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 20 07:00:38.699868 ip-10-0-138-178 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 20 07:00:38.699882 ip-10-0-138-178 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 2f9dc35ec1b24cee8c45ae0418131ed6 --
Apr 20 07:02:57.002658 ip-10-0-138-178 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 07:02:57.456998 ip-10-0-138-178 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 07:02:57.456998 ip-10-0-138-178 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 07:02:57.456998 ip-10-0-138-178 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 07:02:57.456998 ip-10-0-138-178 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 07:02:57.456998 ip-10-0-138-178 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 07:02:57.459688 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.459604 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 07:02:57.462528 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462513 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 07:02:57.462528 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462529 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 07:02:57.462589 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462533 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 07:02:57.462589 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462537 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 07:02:57.462589 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462541 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 07:02:57.462589 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462543 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 07:02:57.462589 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462547 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 07:02:57.462589 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462549 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 07:02:57.462589 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462556 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 07:02:57.462589 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462559 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 07:02:57.462589 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462562 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 07:02:57.462589 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462565 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 07:02:57.462589 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462567 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 07:02:57.462589 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462570 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 07:02:57.462589 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462572 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 07:02:57.462589 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462575 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 07:02:57.462589 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462578 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 07:02:57.462589 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462582 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 07:02:57.462589 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462586 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 07:02:57.462589 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462589 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 07:02:57.462589 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462592 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 07:02:57.463031 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462595 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 07:02:57.463031 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462598 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 07:02:57.463031 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462601 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 07:02:57.463031 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462604 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 07:02:57.463031 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462607 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 07:02:57.463031 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462610 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 07:02:57.463031 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462613 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 07:02:57.463031 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462616 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 07:02:57.463031 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462619 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 07:02:57.463031 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462622 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 07:02:57.463031 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462625 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 07:02:57.463031 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462628 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 07:02:57.463031 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462631 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 07:02:57.463031 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462633 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 07:02:57.463031 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462636 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 07:02:57.463031 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462638 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 07:02:57.463031 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462641 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 07:02:57.463031 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462643 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 07:02:57.463031 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462645 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 07:02:57.463031 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462648 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 07:02:57.463523 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462650 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 07:02:57.463523 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462653 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 07:02:57.463523 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462655 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 07:02:57.463523 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462657 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 07:02:57.463523 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462661 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 07:02:57.463523 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462663 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 07:02:57.463523 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462666 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 07:02:57.463523 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462668 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 07:02:57.463523 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462671 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 07:02:57.463523 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462674 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 07:02:57.463523 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462678 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 07:02:57.463523 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462681 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 07:02:57.463523 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462683 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 07:02:57.463523 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462687 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 07:02:57.463523 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462689 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 07:02:57.463523 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462692 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 07:02:57.463523 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462694 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 07:02:57.463523 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462697 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 07:02:57.463523 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462700 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 07:02:57.463523 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462702 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 07:02:57.464002 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462705 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 07:02:57.464002 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462711 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 07:02:57.464002 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462714 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 07:02:57.464002 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462717 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 07:02:57.464002 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462719 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 07:02:57.464002 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462722 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 07:02:57.464002 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462724 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 07:02:57.464002 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462727 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 07:02:57.464002 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462730 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 07:02:57.464002 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462732 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 07:02:57.464002 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462735 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 07:02:57.464002 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462737 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 07:02:57.464002 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462739 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 07:02:57.464002 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462742 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 07:02:57.464002 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462744 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 07:02:57.464002 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462747 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 07:02:57.464002 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462749 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 07:02:57.464002 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462751 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 07:02:57.464002 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462754 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 07:02:57.464002 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462757 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 07:02:57.464510 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462759 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 07:02:57.464510 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462762 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 07:02:57.464510 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462764 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 07:02:57.464510 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462767 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 07:02:57.464510 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.462770 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 07:02:57.464510 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463152 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 07:02:57.464510 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463157 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 07:02:57.464510 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463160 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 07:02:57.464510 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463163 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 07:02:57.464510 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463165 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 07:02:57.464510 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463168 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 07:02:57.464510 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463171 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 07:02:57.464510 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463173 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 07:02:57.464510 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463176 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 07:02:57.464510 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463179 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 07:02:57.464510 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463182 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 07:02:57.464510 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463185 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 07:02:57.464510 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463188 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 07:02:57.464510 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463190 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 07:02:57.465085 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463193 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 07:02:57.465085 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463195 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 07:02:57.465085 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463198 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 07:02:57.465085 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463200 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 07:02:57.465085 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463203 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 07:02:57.465085 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463205 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 07:02:57.465085 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463208 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 07:02:57.465085 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463210 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 07:02:57.465085 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463213 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 07:02:57.465085 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463215 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 07:02:57.465085 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463218 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 07:02:57.465085 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463220 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 07:02:57.465085 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463222 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 07:02:57.465085 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463225 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 07:02:57.465085 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463228 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 07:02:57.465085 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463230 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 07:02:57.465085 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463233 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 07:02:57.465085 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463236 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 07:02:57.465085 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463239 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 07:02:57.465737 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463242 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 07:02:57.465737 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463244 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 07:02:57.465737 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463247 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 07:02:57.465737 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463249 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 07:02:57.465737 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463252 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 07:02:57.465737 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463254 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 07:02:57.465737 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463256 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 07:02:57.465737 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463259 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 07:02:57.465737 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463261 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 07:02:57.465737 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463264 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 07:02:57.465737 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463269 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 07:02:57.465737 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463272 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 07:02:57.465737 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463274 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 07:02:57.465737 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463277 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 07:02:57.465737 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463279 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 07:02:57.465737 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463282 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 07:02:57.465737 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463284 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 07:02:57.465737 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463286 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 07:02:57.465737 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463289 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 07:02:57.465737 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463291 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 07:02:57.466235 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463294 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 07:02:57.466235 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463296 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 07:02:57.466235 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463299 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 07:02:57.466235 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463301 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 07:02:57.466235 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463304 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 07:02:57.466235 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463306 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 07:02:57.466235 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463309 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 07:02:57.466235 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463311 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 07:02:57.466235 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463313 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 07:02:57.466235 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463333 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 07:02:57.466235 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463336 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 07:02:57.466235 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463338 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 07:02:57.466235 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463342 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 07:02:57.466235 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463345 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 07:02:57.466235 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463347 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 07:02:57.466235 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463350 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 07:02:57.466235 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463353 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 07:02:57.466235 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463355 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 07:02:57.466235 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463358 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 07:02:57.466720 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463360 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 07:02:57.466720 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463364 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 07:02:57.466720 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463368 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 07:02:57.466720 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463371 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 07:02:57.466720 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463375 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 07:02:57.466720 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463377 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 07:02:57.466720 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463381 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 07:02:57.466720 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463386 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 07:02:57.466720 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463389 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 07:02:57.466720 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463393 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 07:02:57.466720 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463395 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 07:02:57.466720 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463398 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 07:02:57.466720 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463401 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 07:02:57.466720 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.463403 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 07:02:57.466720 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464120 2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 07:02:57.466720 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464129 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 07:02:57.466720 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464135 2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 07:02:57.466720 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464140 2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 07:02:57.466720 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464144 2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 07:02:57.466720 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464147 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 07:02:57.466720 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464151 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 07:02:57.467222 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464157 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 07:02:57.467222 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464160 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 07:02:57.467222 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464163 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 07:02:57.467222 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464166 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 07:02:57.467222 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464169 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 07:02:57.467222 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464172 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 07:02:57.467222 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464175 2577 flags.go:64] FLAG: --cgroup-root=""
Apr 20 07:02:57.467222 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464178 2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 07:02:57.467222 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464181 2577 flags.go:64] FLAG: --client-ca-file=""
Apr 20 07:02:57.467222 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464183 2577 flags.go:64] FLAG: --cloud-config=""
Apr 20 07:02:57.467222 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464186 2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 07:02:57.467222 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464189 2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 07:02:57.467222 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464193 2577 flags.go:64] FLAG: --cluster-domain=""
Apr 20 07:02:57.467222 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464196 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 07:02:57.467222 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464199 2577 flags.go:64] FLAG: --config-dir=""
Apr 20 07:02:57.467222 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464201 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 07:02:57.467222 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464205 2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 07:02:57.467222 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464209 2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 07:02:57.467222 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464212 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 07:02:57.467222 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464215 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 07:02:57.467222 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464219 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 07:02:57.467222 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464222 2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 07:02:57.467222 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464225 2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 07:02:57.467222 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464227 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 07:02:57.467809 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464231 2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 07:02:57.467809 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464234 2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 07:02:57.467809 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464238 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 07:02:57.467809 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464241 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 07:02:57.467809 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464244 2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 07:02:57.467809 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464247 2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 07:02:57.467809 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464250 2577 flags.go:64] FLAG: --enable-server="true"
Apr 20 07:02:57.467809 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464253 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 07:02:57.467809 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464257 2577 flags.go:64] FLAG: --event-burst="100"
Apr 20 07:02:57.467809 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464261 2577 flags.go:64] FLAG: --event-qps="50"
Apr 20 07:02:57.467809 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464263 2577 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 07:02:57.467809 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464267 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 07:02:57.467809 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464273 2577 flags.go:64] FLAG: --eviction-hard=""
Apr 20 07:02:57.467809 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464277 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 07:02:57.467809 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464280 2577 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 07:02:57.467809 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464282 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 07:02:57.467809 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464286 2577 flags.go:64] FLAG: --eviction-soft=""
Apr 20 07:02:57.467809 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464288 2577 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 07:02:57.467809 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464291 2577 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 07:02:57.467809 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464294 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 07:02:57.467809 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464297 2577 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 07:02:57.467809 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464300 2577 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 20 07:02:57.467809 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464303 2577 flags.go:64] FLAG: --fail-swap-on="true"
Apr 20 07:02:57.467809 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464305 2577 flags.go:64] FLAG: --feature-gates=""
Apr 20 07:02:57.467809 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464309 2577 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 20 07:02:57.468434 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464313 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 20 07:02:57.468434 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464330 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 20 07:02:57.468434 ip-10-0-138-178
kubenswrapper[2577]: I0420 07:02:57.464333 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 20 07:02:57.468434 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464336 2577 flags.go:64] FLAG: --healthz-port="10248" Apr 20 07:02:57.468434 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464340 2577 flags.go:64] FLAG: --help="false" Apr 20 07:02:57.468434 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464342 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-138-178.ec2.internal" Apr 20 07:02:57.468434 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464345 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 07:02:57.468434 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464348 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 07:02:57.468434 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464351 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 07:02:57.468434 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464354 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 07:02:57.468434 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464358 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 07:02:57.468434 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464361 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 07:02:57.468434 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464363 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 07:02:57.468434 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464366 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 07:02:57.468434 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464369 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 07:02:57.468434 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464371 2577 flags.go:64] FLAG: 
--kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 07:02:57.468434 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464375 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 07:02:57.468434 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464377 2577 flags.go:64] FLAG: --kube-reserved="" Apr 20 07:02:57.468434 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464380 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 07:02:57.468434 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464384 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 07:02:57.468434 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464387 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 07:02:57.468434 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464390 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 07:02:57.468434 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464393 2577 flags.go:64] FLAG: --lock-file="" Apr 20 07:02:57.468434 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464396 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 07:02:57.468995 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464398 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 07:02:57.468995 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464404 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 07:02:57.468995 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464410 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 07:02:57.468995 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464413 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 07:02:57.468995 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464415 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 07:02:57.468995 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464418 2577 flags.go:64] FLAG: --logging-format="text" Apr 20 07:02:57.468995 ip-10-0-138-178 kubenswrapper[2577]: 
I0420 07:02:57.464421 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 07:02:57.468995 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464424 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 07:02:57.468995 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464427 2577 flags.go:64] FLAG: --manifest-url="" Apr 20 07:02:57.468995 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464430 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 20 07:02:57.468995 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464434 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 07:02:57.468995 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464437 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 07:02:57.468995 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464442 2577 flags.go:64] FLAG: --max-pods="110" Apr 20 07:02:57.468995 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464445 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 07:02:57.468995 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464448 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 07:02:57.468995 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464451 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 07:02:57.468995 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464454 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 07:02:57.468995 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464456 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 07:02:57.468995 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464459 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 07:02:57.468995 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464462 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 07:02:57.468995 ip-10-0-138-178 kubenswrapper[2577]: I0420 
07:02:57.464470 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 07:02:57.468995 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464473 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 07:02:57.468995 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464475 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 07:02:57.468995 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464478 2577 flags.go:64] FLAG: --pod-cidr="" Apr 20 07:02:57.469586 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464485 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 07:02:57.469586 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464491 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 07:02:57.469586 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464494 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 07:02:57.469586 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464499 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 20 07:02:57.469586 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464501 2577 flags.go:64] FLAG: --port="10250" Apr 20 07:02:57.469586 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464504 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 07:02:57.469586 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464507 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0a437911859a5024e" Apr 20 07:02:57.469586 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464511 2577 flags.go:64] FLAG: --qos-reserved="" Apr 20 07:02:57.469586 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464514 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 20 07:02:57.469586 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464517 2577 flags.go:64] FLAG: --register-node="true" Apr 20 07:02:57.469586 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464520 2577 
flags.go:64] FLAG: --register-schedulable="true" Apr 20 07:02:57.469586 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464522 2577 flags.go:64] FLAG: --register-with-taints="" Apr 20 07:02:57.469586 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464526 2577 flags.go:64] FLAG: --registry-burst="10" Apr 20 07:02:57.469586 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464529 2577 flags.go:64] FLAG: --registry-qps="5" Apr 20 07:02:57.469586 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464532 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 20 07:02:57.469586 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464535 2577 flags.go:64] FLAG: --reserved-memory="" Apr 20 07:02:57.469586 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464539 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 07:02:57.469586 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464542 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 07:02:57.469586 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464545 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 07:02:57.469586 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464548 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 07:02:57.469586 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464550 2577 flags.go:64] FLAG: --runonce="false" Apr 20 07:02:57.469586 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464553 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 07:02:57.469586 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464556 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 07:02:57.469586 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464559 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 20 07:02:57.469586 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464562 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 07:02:57.470229 ip-10-0-138-178 kubenswrapper[2577]: I0420 
07:02:57.464564 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 07:02:57.470229 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464567 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 07:02:57.470229 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464570 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 07:02:57.470229 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464573 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 07:02:57.470229 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464576 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 07:02:57.470229 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464579 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 07:02:57.470229 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464582 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 07:02:57.470229 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464584 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 07:02:57.470229 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464588 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 07:02:57.470229 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464591 2577 flags.go:64] FLAG: --system-cgroups="" Apr 20 07:02:57.470229 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464594 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 07:02:57.470229 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464600 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 07:02:57.470229 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464603 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 20 07:02:57.470229 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464606 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 07:02:57.470229 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464610 2577 flags.go:64] FLAG: 
--tls-min-version="" Apr 20 07:02:57.470229 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464612 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 07:02:57.470229 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464615 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 07:02:57.470229 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464618 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 07:02:57.470229 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464621 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 07:02:57.470229 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464623 2577 flags.go:64] FLAG: --v="2" Apr 20 07:02:57.470229 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464628 2577 flags.go:64] FLAG: --version="false" Apr 20 07:02:57.470229 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464632 2577 flags.go:64] FLAG: --vmodule="" Apr 20 07:02:57.470229 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464636 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 07:02:57.470229 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.464639 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 07:02:57.470229 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466287 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 07:02:57.470859 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466302 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 07:02:57.470859 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466306 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 07:02:57.470859 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466309 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 07:02:57.470859 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466312 2577 feature_gate.go:328] unrecognized feature 
gate: IrreconcilableMachineConfig Apr 20 07:02:57.470859 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466315 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 07:02:57.470859 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466330 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 07:02:57.470859 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466332 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 07:02:57.470859 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466337 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 07:02:57.470859 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466341 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 07:02:57.470859 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466344 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 07:02:57.470859 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466348 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 07:02:57.470859 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466351 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 07:02:57.470859 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466354 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 07:02:57.470859 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466357 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 07:02:57.470859 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466360 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 07:02:57.470859 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466363 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 07:02:57.470859 ip-10-0-138-178 
kubenswrapper[2577]: W0420 07:02:57.466366 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 07:02:57.470859 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466369 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 07:02:57.470859 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466371 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 07:02:57.470859 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466374 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 07:02:57.471368 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466377 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 07:02:57.471368 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466380 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 07:02:57.471368 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466382 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 07:02:57.471368 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466385 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 07:02:57.471368 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466387 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 07:02:57.471368 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466390 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 07:02:57.471368 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466392 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 07:02:57.471368 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466395 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 07:02:57.471368 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466397 2577 feature_gate.go:328] unrecognized feature gate: 
GCPCustomAPIEndpoints Apr 20 07:02:57.471368 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466400 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 07:02:57.471368 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466403 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 07:02:57.471368 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466406 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 07:02:57.471368 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466409 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 07:02:57.471368 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466412 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 07:02:57.471368 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466414 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 07:02:57.471368 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466426 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 07:02:57.471368 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466429 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 07:02:57.471368 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466432 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 07:02:57.471368 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466435 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 20 07:02:57.471368 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466438 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 07:02:57.471368 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466440 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 07:02:57.471892 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466443 2577 feature_gate.go:328] unrecognized feature gate: 
InsightsOnDemandDataGather Apr 20 07:02:57.471892 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466445 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 07:02:57.471892 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466448 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 07:02:57.471892 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466450 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 07:02:57.471892 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466453 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 07:02:57.471892 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466455 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 07:02:57.471892 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466458 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 07:02:57.471892 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466460 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 07:02:57.471892 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466463 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 07:02:57.471892 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466466 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 07:02:57.471892 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466468 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 07:02:57.471892 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466471 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 07:02:57.471892 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466473 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 07:02:57.471892 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466476 2577 
feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 07:02:57.471892 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466479 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 07:02:57.471892 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466481 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 07:02:57.471892 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466484 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 07:02:57.471892 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466486 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 07:02:57.471892 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466489 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 07:02:57.471892 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466491 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 07:02:57.472401 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466494 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 07:02:57.472401 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466497 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 07:02:57.472401 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466499 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 07:02:57.472401 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466504 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 07:02:57.472401 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466507 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 07:02:57.472401 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466509 2577 feature_gate.go:328] unrecognized feature gate: 
BootImageSkewEnforcement Apr 20 07:02:57.472401 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466512 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 07:02:57.472401 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466514 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 07:02:57.472401 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466517 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 07:02:57.472401 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466519 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 07:02:57.472401 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466522 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 07:02:57.472401 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466525 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 07:02:57.472401 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466527 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 07:02:57.472401 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466530 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 07:02:57.472401 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466534 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 07:02:57.472401 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466538 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 07:02:57.472401 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466543 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 07:02:57.472401 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466546 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 07:02:57.472401 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466549 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 07:02:57.472879 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466551 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 07:02:57.472879 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466554 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 07:02:57.472879 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466556 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 07:02:57.472879 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466559 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 07:02:57.472879 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.466561 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 07:02:57.472879 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.467401 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 07:02:57.474049 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.473944 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 07:02:57.474088 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.474052 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 07:02:57.474117 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474101 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 07:02:57.474117 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474106 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 07:02:57.474117 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474109 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 07:02:57.474117 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474112 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 07:02:57.474117 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474115 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 07:02:57.474117 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474118 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 07:02:57.474304 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474121 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 07:02:57.474304 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474123 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 07:02:57.474304 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474126 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 07:02:57.474304 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474129 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 07:02:57.474304 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474131 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 07:02:57.474304 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474134 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 07:02:57.474304 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474137 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 07:02:57.474304 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474140 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 07:02:57.474304 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474143 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 07:02:57.474304 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474145 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 07:02:57.474304 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474148 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 07:02:57.474304 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474150 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 07:02:57.474304 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474153 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 07:02:57.474304 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474155 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 07:02:57.474304 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474158 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 07:02:57.474304 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474162 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 07:02:57.474304 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474165 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 07:02:57.474304 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474167 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 07:02:57.474304 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474170 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 07:02:57.474304 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474172 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 07:02:57.474826 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474176 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 07:02:57.474826 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474178 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 07:02:57.474826 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474181 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 07:02:57.474826 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474183 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 07:02:57.474826 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474186 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 07:02:57.474826 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474188 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 07:02:57.474826 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474191 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 07:02:57.474826 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474194 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 07:02:57.474826 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474196 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 07:02:57.474826 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474199 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 07:02:57.474826 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474202 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 07:02:57.474826 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474204 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 07:02:57.474826 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474207 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 07:02:57.474826 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474210 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 07:02:57.474826 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474213 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 07:02:57.474826 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474216 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 07:02:57.474826 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474218 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 07:02:57.474826 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474221 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 07:02:57.474826 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474224 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 07:02:57.474826 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474226 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 07:02:57.475314 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474231 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 07:02:57.475314 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474234 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 07:02:57.475314 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474237 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 07:02:57.475314 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474240 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 07:02:57.475314 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474243 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 07:02:57.475314 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474245 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 07:02:57.475314 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474248 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 07:02:57.475314 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474250 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 07:02:57.475314 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474253 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 07:02:57.475314 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474256 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 07:02:57.475314 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474258 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 07:02:57.475314 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474260 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 07:02:57.475314 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474263 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 07:02:57.475314 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474265 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 07:02:57.475314 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474268 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 07:02:57.475314 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474270 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 07:02:57.475314 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474273 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 07:02:57.475314 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474275 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 07:02:57.475314 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474278 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 07:02:57.475314 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474281 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 07:02:57.475817 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474283 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 07:02:57.475817 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474286 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 07:02:57.475817 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474288 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 07:02:57.475817 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474291 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 07:02:57.475817 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474293 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 07:02:57.475817 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474296 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 07:02:57.475817 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474299 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 07:02:57.475817 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474302 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 07:02:57.475817 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474304 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 07:02:57.475817 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474307 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 07:02:57.475817 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474309 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 07:02:57.475817 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474312 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 07:02:57.475817 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474315 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 07:02:57.475817 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474335 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 07:02:57.475817 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474338 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 07:02:57.475817 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474342 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 07:02:57.475817 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474345 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 07:02:57.475817 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474347 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 07:02:57.475817 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474350 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 07:02:57.476281 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474353 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 07:02:57.476281 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.474358 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 07:02:57.476281 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474462 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 07:02:57.476281 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474468 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 07:02:57.476281 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474471 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 07:02:57.476281 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474474 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 07:02:57.476281 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474476 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 07:02:57.476281 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474479 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 07:02:57.476281 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474481 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 07:02:57.476281 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474484 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 07:02:57.476281 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474486 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 07:02:57.476281 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474489 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 07:02:57.476281 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474491 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 07:02:57.476281 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474494 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 07:02:57.476281 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474496 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 07:02:57.476699 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474498 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 07:02:57.476699 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474501 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 07:02:57.476699 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474503 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 07:02:57.476699 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474507 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 07:02:57.476699 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474512 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 07:02:57.476699 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474514 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 07:02:57.476699 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474517 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 07:02:57.476699 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474520 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 07:02:57.476699 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474522 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 07:02:57.476699 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474524 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 07:02:57.476699 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474527 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 07:02:57.476699 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474529 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 07:02:57.476699 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474532 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 07:02:57.476699 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474534 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 07:02:57.476699 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474537 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 07:02:57.476699 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474539 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 07:02:57.476699 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474541 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 07:02:57.476699 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474544 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 07:02:57.476699 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474546 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 07:02:57.477155 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474548 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 07:02:57.477155 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474552 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 07:02:57.477155 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474554 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 07:02:57.477155 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474556 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 07:02:57.477155 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474559 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 07:02:57.477155 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474561 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 07:02:57.477155 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474564 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 07:02:57.477155 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474566 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 07:02:57.477155 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474569 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 07:02:57.477155 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474571 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 07:02:57.477155 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474574 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 07:02:57.477155 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474576 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 07:02:57.477155 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474579 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 07:02:57.477155 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474581 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 07:02:57.477155 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474583 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 07:02:57.477155 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474586 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 07:02:57.477155 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474589 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 07:02:57.477155 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474592 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 07:02:57.477155 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474594 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 07:02:57.477640 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474597 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 07:02:57.477640 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474599 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 07:02:57.477640 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474601 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 07:02:57.477640 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474604 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 07:02:57.477640 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474606 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 07:02:57.477640 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474609 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 07:02:57.477640 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474611 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 07:02:57.477640 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474614 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 07:02:57.477640 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474616 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 07:02:57.477640 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474618 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 07:02:57.477640 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474621 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 07:02:57.477640 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474623 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 07:02:57.477640 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474626 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 07:02:57.477640 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474628 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 07:02:57.477640 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474631 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 07:02:57.477640 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474634 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 07:02:57.477640 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474636 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 07:02:57.477640 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474639 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 07:02:57.477640 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474641 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 07:02:57.477640 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474644 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 07:02:57.477640 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474646 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 07:02:57.478162 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474649 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 07:02:57.478162 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474651 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 07:02:57.478162 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474653 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 07:02:57.478162 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474656 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 07:02:57.478162 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474658 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 07:02:57.478162 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474661 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 07:02:57.478162 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474663 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 07:02:57.478162 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474665 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 07:02:57.478162 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474670 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 07:02:57.478162 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474673 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 07:02:57.478162 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474676 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 07:02:57.478162 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474679 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 07:02:57.478162 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474682 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 07:02:57.478162 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:57.474685 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 07:02:57.478162 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.474689 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 07:02:57.478557 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.475386 2577 server.go:962] "Client rotation is on, will bootstrap in background" Apr 20 07:02:57.478557 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.478476 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 20 07:02:57.479469 ip-10-0-138-178 
kubenswrapper[2577]: I0420 07:02:57.479458 2577 server.go:1019] "Starting client certificate rotation" Apr 20 07:02:57.479577 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.479559 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 07:02:57.479615 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.479602 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 07:02:57.504814 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.504792 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 07:02:57.507453 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.507431 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 07:02:57.523251 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.523228 2577 log.go:25] "Validated CRI v1 runtime API" Apr 20 07:02:57.528851 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.528834 2577 log.go:25] "Validated CRI v1 image API" Apr 20 07:02:57.530104 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.530084 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 20 07:02:57.533335 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.533286 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 07:02:57.535837 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.535816 2577 fs.go:135] Filesystem UUIDs: map[1cd19e4c-bc96-4857-8b28-ada6051543c0:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 c1ddcc88-776c-4d11-a4e6-54e0e80ac80d:/dev/nvme0n1p4] Apr 20 07:02:57.535894 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.535838 2577 fs.go:136] Filesystem partitions: 
map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 20 07:02:57.541130 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.541029 2577 manager.go:217] Machine: {Timestamp:2026-04-20 07:02:57.539787199 +0000 UTC m=+0.414512051 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3093863 MemoryCapacity:32812167168 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2c87e2c9410945fd30c06485456439 SystemUUID:ec2c87e2-c941-0945-fd30-c06485456439 BootID:2f9dc35e-c1b2-4cee-8c45-ae0418131ed6 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:c4:e6:f5:8c:a5 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:c4:e6:f5:8c:a5 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ae:4c:ca:71:5b:9c Speed:0 Mtu:1500}] 
Topology:[{Id:0 Memory:32812167168 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 20 07:02:57.541130 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.541126 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 20 07:02:57.541233 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.541207 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 07:02:57.541514 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.541494 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 07:02:57.541647 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.541516 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-178.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 07:02:57.541693 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.541659 2577 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 07:02:57.541693 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.541668 2577 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 07:02:57.541693 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.541681 2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 07:02:57.541693 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.541691 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 07:02:57.542949 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.542938 2577 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 07:02:57.543050 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.543042 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 20 07:02:57.545822 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.545813 2577 kubelet.go:491] "Attempting to sync node with API server"
Apr 20 07:02:57.545855 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.545826 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 20 07:02:57.545855 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.545839 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 20 07:02:57.545855 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.545848 2577 kubelet.go:397] "Adding apiserver pod source"
Apr 20 07:02:57.545855 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.545856 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 20 07:02:57.547085 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.547073 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 07:02:57.547142 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.547093 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 07:02:57.550344 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.550314 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 20 07:02:57.552040 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.552027 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 20 07:02:57.553300 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.553288 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 20 07:02:57.553349 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.553306 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 20 07:02:57.553349 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.553313 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 20 07:02:57.553349 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.553333 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 20 07:02:57.553349 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.553339 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 20 07:02:57.553349 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.553345 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 20 07:02:57.553349 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.553350 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 20 07:02:57.553524 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.553357 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 20 07:02:57.553524 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.553364 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 20 07:02:57.553524 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.553371 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 20 07:02:57.553524 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.553379 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 20 07:02:57.553524 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.553387 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 20 07:02:57.554383 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.554371 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 20 07:02:57.554431 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.554386 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 20 07:02:57.555240 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.555223 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-kp8rz"
Apr 20 07:02:57.558150 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.558136 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 20 07:02:57.558228 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.558176 2577 server.go:1295] "Started kubelet"
Apr 20 07:02:57.558268 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.558220 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 20 07:02:57.558458 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.558385 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 20 07:02:57.558496 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.558488 2577 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 20 07:02:57.559113 ip-10-0-138-178 systemd[1]: Started Kubernetes Kubelet.
Apr 20 07:02:57.559518 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:02:57.559481 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 20 07:02:57.559518 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:02:57.559479 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-178.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 20 07:02:57.559645 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.559611 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-178.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 20 07:02:57.561029 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.561014 2577 server.go:317] "Adding debug handlers to kubelet server"
Apr 20 07:02:57.561796 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.561775 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 20 07:02:57.563032 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.563015 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-kp8rz"
Apr 20 07:02:57.565783 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:02:57.564612 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-178.ec2.internal.18a7fea878f3b7aa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-178.ec2.internal,UID:ip-10-0-138-178.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-138-178.ec2.internal,},FirstTimestamp:2026-04-20 07:02:57.558149034 +0000 UTC m=+0.432873887,LastTimestamp:2026-04-20 07:02:57.558149034 +0000 UTC m=+0.432873887,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-178.ec2.internal,}"
Apr 20 07:02:57.566934 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:02:57.566914 2577 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 20 07:02:57.567432 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.567419 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 20 07:02:57.568351 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.568332 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 20 07:02:57.569225 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.569200 2577 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 20 07:02:57.569308 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.569226 2577 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 20 07:02:57.569308 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:02:57.569271 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-178.ec2.internal\" not found"
Apr 20 07:02:57.569425 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.569388 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 20 07:02:57.569961 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.569944 2577 reconstruct.go:97] "Volume reconstruction finished"
Apr 20 07:02:57.569961 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.569960 2577 reconciler.go:26] "Reconciler: start to sync state"
Apr 20 07:02:57.570097 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.569971 2577 factory.go:55] Registering systemd factory
Apr 20 07:02:57.570097 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.569992 2577 factory.go:223] Registration of the systemd container factory successfully
Apr 20 07:02:57.570396 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.570381 2577 factory.go:153] Registering CRI-O factory
Apr 20 07:02:57.570484 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.570399 2577 factory.go:223] Registration of the crio container factory successfully
Apr 20 07:02:57.570484 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.570458 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 20 07:02:57.570484 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.570480 2577 factory.go:103] Registering Raw factory
Apr 20 07:02:57.570624 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.570501 2577 manager.go:1196] Started watching for new ooms in manager
Apr 20 07:02:57.571068 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.571050 2577 manager.go:319] Starting recovery of all containers
Apr 20 07:02:57.577265 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.577245 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 07:02:57.580572 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:02:57.580548 2577 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-138-178.ec2.internal\" not found" node="ip-10-0-138-178.ec2.internal"
Apr 20 07:02:57.581197 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.581179 2577 manager.go:324] Recovery completed
Apr 20 07:02:57.583256 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:02:57.583180 2577 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 20 07:02:57.586113 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.586101 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 07:02:57.588384 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.588367 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-178.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 07:02:57.588459 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.588406 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-178.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 07:02:57.588459 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.588416 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-178.ec2.internal" event="NodeHasSufficientPID"
Apr 20 07:02:57.588885 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.588868 2577 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 20 07:02:57.588885 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.588884 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 20 07:02:57.588970 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.588900 2577 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 07:02:57.591113 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.591102 2577 policy_none.go:49] "None policy: Start"
Apr 20 07:02:57.591167 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.591119 2577 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 20 07:02:57.591167 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.591128 2577 state_mem.go:35] "Initializing new in-memory state store"
Apr 20 07:02:57.631918 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.631897 2577 manager.go:341] "Starting Device Plugin manager"
Apr 20 07:02:57.636810 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:02:57.631968 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 20 07:02:57.636810 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.631990 2577 server.go:85] "Starting device plugin registration server"
Apr 20 07:02:57.636810 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.632249 2577 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 20 07:02:57.636810 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.632263 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 20 07:02:57.636810 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.632345 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 20 07:02:57.636810 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.632422 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 20 07:02:57.636810 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.632431 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 20 07:02:57.636810 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:02:57.633300 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 20 07:02:57.636810 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:02:57.633350 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-178.ec2.internal\" not found"
Apr 20 07:02:57.637290 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.637262 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 20 07:02:57.638615 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.638593 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 20 07:02:57.638683 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.638649 2577 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 20 07:02:57.638683 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.638668 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 20 07:02:57.638683 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.638676 2577 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 20 07:02:57.638788 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:02:57.638752 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 20 07:02:57.640870 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.640847 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 07:02:57.733184 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.733091 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 07:02:57.734232 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.734217 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-178.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 07:02:57.734344 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.734245 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-178.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 07:02:57.734344 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.734264 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-178.ec2.internal" event="NodeHasSufficientPID"
Apr 20 07:02:57.734344 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.734291 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-178.ec2.internal"
Apr 20 07:02:57.739372 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.739352 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-178.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-178.ec2.internal"]
Apr 20 07:02:57.739466 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.739425 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 07:02:57.741335 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.741305 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-178.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 07:02:57.741433 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.741351 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-178.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 07:02:57.741433 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.741361 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-178.ec2.internal" event="NodeHasSufficientPID"
Apr 20 07:02:57.742494 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.742482 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 07:02:57.742646 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.742632 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-178.ec2.internal"
Apr 20 07:02:57.742680 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.742659 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 07:02:57.743184 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.743167 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-178.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 07:02:57.743271 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.743198 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-178.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 07:02:57.743271 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.743210 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-178.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 07:02:57.743271 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.743238 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-178.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 07:02:57.743271 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.743252 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-178.ec2.internal" event="NodeHasSufficientPID"
Apr 20 07:02:57.743271 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.743213 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-178.ec2.internal" event="NodeHasSufficientPID"
Apr 20 07:02:57.744635 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.744619 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-178.ec2.internal"
Apr 20 07:02:57.744702 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.744652 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 07:02:57.745310 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.745290 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-178.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 07:02:57.745414 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.745344 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-178.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 07:02:57.745414 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.745390 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-178.ec2.internal" event="NodeHasSufficientPID"
Apr 20 07:02:57.746267 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.746250 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-178.ec2.internal"
Apr 20 07:02:57.746307 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:02:57.746278 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-178.ec2.internal\": node \"ip-10-0-138-178.ec2.internal\" not found"
Apr 20 07:02:57.764951 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:02:57.764926 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-178.ec2.internal\" not found" node="ip-10-0-138-178.ec2.internal"
Apr 20 07:02:57.765576 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:02:57.765549 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-178.ec2.internal\" not found"
Apr 20 07:02:57.769234 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:02:57.769217 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-178.ec2.internal\" not found" node="ip-10-0-138-178.ec2.internal"
Apr 20 07:02:57.771958 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.771937 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/94adb6b4b2c338418efcb12d086e4f21-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-178.ec2.internal\" (UID: \"94adb6b4b2c338418efcb12d086e4f21\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-178.ec2.internal"
Apr 20 07:02:57.772045 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.771966 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/94adb6b4b2c338418efcb12d086e4f21-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-178.ec2.internal\" (UID: \"94adb6b4b2c338418efcb12d086e4f21\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-178.ec2.internal"
Apr 20 07:02:57.772045 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.771988 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e599c436f00e4332b7e91bed4b9b4c68-config\") pod \"kube-apiserver-proxy-ip-10-0-138-178.ec2.internal\" (UID: \"e599c436f00e4332b7e91bed4b9b4c68\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-178.ec2.internal"
Apr 20 07:02:57.866486 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:02:57.866449 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-178.ec2.internal\" not found"
Apr 20 07:02:57.873019 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.872989 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e599c436f00e4332b7e91bed4b9b4c68-config\") pod \"kube-apiserver-proxy-ip-10-0-138-178.ec2.internal\" (UID: \"e599c436f00e4332b7e91bed4b9b4c68\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-178.ec2.internal"
Apr 20 07:02:57.873123 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.873029 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/94adb6b4b2c338418efcb12d086e4f21-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-178.ec2.internal\" (UID: \"94adb6b4b2c338418efcb12d086e4f21\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-178.ec2.internal"
Apr 20 07:02:57.873123 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.873052 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/94adb6b4b2c338418efcb12d086e4f21-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-178.ec2.internal\" (UID: \"94adb6b4b2c338418efcb12d086e4f21\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-178.ec2.internal"
Apr 20 07:02:57.873123 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.873097 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e599c436f00e4332b7e91bed4b9b4c68-config\") pod \"kube-apiserver-proxy-ip-10-0-138-178.ec2.internal\" (UID: \"e599c436f00e4332b7e91bed4b9b4c68\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-178.ec2.internal"
Apr 20 07:02:57.873123 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.873107 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/94adb6b4b2c338418efcb12d086e4f21-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-178.ec2.internal\" (UID: \"94adb6b4b2c338418efcb12d086e4f21\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-178.ec2.internal"
Apr 20 07:02:57.873123 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:57.873114 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/94adb6b4b2c338418efcb12d086e4f21-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-178.ec2.internal\" (UID: \"94adb6b4b2c338418efcb12d086e4f21\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-178.ec2.internal"
Apr 20 07:02:57.966681 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:02:57.966646 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-178.ec2.internal\" not found"
Apr 20 07:02:58.067510 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:02:58.067415 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-178.ec2.internal\" not found"
Apr 20 07:02:58.067510 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.067452 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-178.ec2.internal"
Apr 20 07:02:58.071169 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.071150 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-178.ec2.internal"
Apr 20 07:02:58.168034 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:02:58.168001 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-178.ec2.internal\" not found"
Apr 20 07:02:58.268507 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:02:58.268466 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-178.ec2.internal\" not found"
Apr 20 07:02:58.368958 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:02:58.368878 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-178.ec2.internal\" not found"
Apr 20 07:02:58.464171 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.464133 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 07:02:58.469398 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:02:58.469368 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-178.ec2.internal\" not found"
Apr 20 07:02:58.479797 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.479769 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 20 07:02:58.479939 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.479928 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 07:02:58.480003 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.479937 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 07:02:58.480003 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.479939 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 07:02:58.521886 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.521859 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 07:02:58.547019 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.546987 2577 apiserver.go:52] "Watching apiserver"
Apr 20 07:02:58.556552 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.556519 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 20 07:02:58.559703 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.559667 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-p77qf","openshift-image-registry/node-ca-956b6","openshift-multus/multus-j55n5","openshift-multus/network-metrics-daemon-xw95j","openshift-network-diagnostics/network-check-target-sq8g5","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8vfj","openshift-cluster-node-tuning-operator/tuned-2z746","openshift-multus/multus-additional-cni-plugins-j8rhj","openshift-network-operator/iptables-alerter-qgz5k","openshift-ovn-kubernetes/ovnkube-node-qvqrn"]
Apr 20 07:02:58.562655 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.562632 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-p77qf"
Apr 20 07:02:58.563998 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.563975 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-956b6"
Apr 20 07:02:58.565164 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.565133 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 06:57:57 +0000 UTC" deadline="2027-10-02 20:35:23.697889269 +0000 UTC"
Apr 20 07:02:58.565164 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.565161 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12733h32m25.132730205s"
Apr 20 07:02:58.565164 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.565152 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-j55n5"
Apr 20 07:02:58.566565 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.566548 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw95j"
Apr 20 07:02:58.566647 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:02:58.566629 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-xw95j" podUID="3bdbb47e-e79d-4aaa-9671-3899c229b1a2" Apr 20 07:02:58.567380 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.567363 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 07:02:58.567429 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.567371 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 07:02:58.567542 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.567528 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-8kc5w\"" Apr 20 07:02:58.567594 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.567581 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq8g5" Apr 20 07:02:58.567659 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:02:58.567632 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq8g5" podUID="fb646649-ea68-4bb4-87df-a2eed82cc86c" Apr 20 07:02:58.568840 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.568822 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 20 07:02:58.568927 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.568850 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8vfj" Apr 20 07:02:58.568978 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.568963 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-2z746" Apr 20 07:02:58.569208 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.569191 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-178.ec2.internal" Apr 20 07:02:58.570044 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.570028 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-j8rhj" Apr 20 07:02:58.571507 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.571489 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qgz5k" Apr 20 07:02:58.572700 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.572685 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" Apr 20 07:02:58.576270 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.576244 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 07:02:58.576670 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.576653 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 07:02:58.577026 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.577003 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 07:02:58.577095 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.577081 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 07:02:58.577273 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.577254 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 07:02:58.578065 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.577546 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqltm\" (UniqueName: \"kubernetes.io/projected/6176a82b-f332-4179-a624-af63e267945f-kube-api-access-sqltm\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5" Apr 20 07:02:58.578065 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.577603 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cad302be-8478-42d0-b082-35206647ea39-etc-sysconfig\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746" Apr 20 07:02:58.578065 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.577635 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cad302be-8478-42d0-b082-35206647ea39-tmp\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746" Apr 20 07:02:58.578065 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.577667 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6176a82b-f332-4179-a624-af63e267945f-multus-daemon-config\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5" Apr 20 07:02:58.578065 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.577672 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 07:02:58.578065 ip-10-0-138-178 kubenswrapper[2577]: I0420 
07:02:58.577702 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cad302be-8478-42d0-b082-35206647ea39-etc-kubernetes\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746" Apr 20 07:02:58.578065 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.577725 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 07:02:58.578065 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.577736 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cad302be-8478-42d0-b082-35206647ea39-etc-sysctl-d\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746" Apr 20 07:02:58.578065 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.577743 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-xqsng\"" Apr 20 07:02:58.578065 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.577767 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtsxf\" (UniqueName: \"kubernetes.io/projected/d333a39d-cbf2-4ee7-b67a-a1c9fa762779-kube-api-access-dtsxf\") pod \"multus-additional-cni-plugins-j8rhj\" (UID: \"d333a39d-cbf2-4ee7-b67a-a1c9fa762779\") " pod="openshift-multus/multus-additional-cni-plugins-j8rhj" Apr 20 07:02:58.578065 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.577789 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 07:02:58.578065 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.577801 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-host-run-k8s-cni-cncf-io\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5" Apr 20 07:02:58.578065 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.577830 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-host-run-multus-certs\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5" Apr 20 07:02:58.578065 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.577864 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cad302be-8478-42d0-b082-35206647ea39-run\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746" Apr 20 07:02:58.578065 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.577897 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d333a39d-cbf2-4ee7-b67a-a1c9fa762779-cnibin\") pod \"multus-additional-cni-plugins-j8rhj\" (UID: \"d333a39d-cbf2-4ee7-b67a-a1c9fa762779\") " pod="openshift-multus/multus-additional-cni-plugins-j8rhj" Apr 20 07:02:58.578065 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.577931 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d333a39d-cbf2-4ee7-b67a-a1c9fa762779-tuning-conf-dir\") pod \"multus-additional-cni-plugins-j8rhj\" (UID: \"d333a39d-cbf2-4ee7-b67a-a1c9fa762779\") " 
pod="openshift-multus/multus-additional-cni-plugins-j8rhj" Apr 20 07:02:58.578065 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.577965 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcv72\" (UniqueName: \"kubernetes.io/projected/3bdbb47e-e79d-4aaa-9671-3899c229b1a2-kube-api-access-kcv72\") pod \"network-metrics-daemon-xw95j\" (UID: \"3bdbb47e-e79d-4aaa-9671-3899c229b1a2\") " pod="openshift-multus/network-metrics-daemon-xw95j" Apr 20 07:02:58.578065 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.577989 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/63684cad-527f-4cc2-8192-bbf6485c0f70-kubelet-dir\") pod \"aws-ebs-csi-driver-node-m8vfj\" (UID: \"63684cad-527f-4cc2-8192-bbf6485c0f70\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8vfj" Apr 20 07:02:58.578942 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.578122 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3bdbb47e-e79d-4aaa-9671-3899c229b1a2-metrics-certs\") pod \"network-metrics-daemon-xw95j\" (UID: \"3bdbb47e-e79d-4aaa-9671-3899c229b1a2\") " pod="openshift-multus/network-metrics-daemon-xw95j" Apr 20 07:02:58.578942 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.578161 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cad302be-8478-42d0-b082-35206647ea39-etc-systemd\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746" Apr 20 07:02:58.578942 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.578217 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-kffvv\" (UniqueName: \"kubernetes.io/projected/fb646649-ea68-4bb4-87df-a2eed82cc86c-kube-api-access-kffvv\") pod \"network-check-target-sq8g5\" (UID: \"fb646649-ea68-4bb4-87df-a2eed82cc86c\") " pod="openshift-network-diagnostics/network-check-target-sq8g5" Apr 20 07:02:58.578942 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.578250 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-host-var-lib-kubelet\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5" Apr 20 07:02:58.578942 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.578264 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 07:02:58.578942 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.578286 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-fwvms\"" Apr 20 07:02:58.578942 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.578290 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/63684cad-527f-4cc2-8192-bbf6485c0f70-socket-dir\") pod \"aws-ebs-csi-driver-node-m8vfj\" (UID: \"63684cad-527f-4cc2-8192-bbf6485c0f70\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8vfj" Apr 20 07:02:58.578942 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.578349 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 07:02:58.578942 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.578358 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: 
\"kubernetes.io/host-path/63684cad-527f-4cc2-8192-bbf6485c0f70-etc-selinux\") pod \"aws-ebs-csi-driver-node-m8vfj\" (UID: \"63684cad-527f-4cc2-8192-bbf6485c0f70\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8vfj" Apr 20 07:02:58.578942 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.578402 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cad302be-8478-42d0-b082-35206647ea39-etc-sysctl-conf\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746" Apr 20 07:02:58.578942 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.578444 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d333a39d-cbf2-4ee7-b67a-a1c9fa762779-cni-binary-copy\") pod \"multus-additional-cni-plugins-j8rhj\" (UID: \"d333a39d-cbf2-4ee7-b67a-a1c9fa762779\") " pod="openshift-multus/multus-additional-cni-plugins-j8rhj" Apr 20 07:02:58.578942 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.578497 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 07:02:58.578942 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.578544 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnw9t\" (UniqueName: \"kubernetes.io/projected/9a4f5c92-f99d-423c-a0fa-9f9d79abbd7c-kube-api-access-cnw9t\") pod \"node-ca-956b6\" (UID: \"9a4f5c92-f99d-423c-a0fa-9f9d79abbd7c\") " pod="openshift-image-registry/node-ca-956b6" Apr 20 07:02:58.578942 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.578585 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 
07:02:58.578942 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.578591 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 07:02:58.578942 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.578708 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 07:02:58.578942 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.578728 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 07:02:58.578942 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.578738 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 07:02:58.578942 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.578810 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 07:02:58.578942 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.578583 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/63684cad-527f-4cc2-8192-bbf6485c0f70-device-dir\") pod \"aws-ebs-csi-driver-node-m8vfj\" (UID: \"63684cad-527f-4cc2-8192-bbf6485c0f70\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8vfj" Apr 20 07:02:58.578942 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.578739 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 07:02:58.579879 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.579069 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 07:02:58.579879 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.579081 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 07:02:58.579879 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.579241 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-vkbg7\"" Apr 20 07:02:58.579879 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.579282 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-wj9m6\"" Apr 20 07:02:58.579879 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.579374 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-q9tnl\"" Apr 20 07:02:58.579879 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.578891 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d333a39d-cbf2-4ee7-b67a-a1c9fa762779-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-j8rhj\" (UID: \"d333a39d-cbf2-4ee7-b67a-a1c9fa762779\") " pod="openshift-multus/multus-additional-cni-plugins-j8rhj" Apr 20 07:02:58.579879 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.579414 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-8qzzl\"" Apr 20 07:02:58.579879 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.579454 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 07:02:58.579879 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.579538 
2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a4f5c92-f99d-423c-a0fa-9f9d79abbd7c-host\") pod \"node-ca-956b6\" (UID: \"9a4f5c92-f99d-423c-a0fa-9f9d79abbd7c\") " pod="openshift-image-registry/node-ca-956b6" Apr 20 07:02:58.579879 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.579577 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-multus-cni-dir\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5" Apr 20 07:02:58.579879 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.579582 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-4x97x\"" Apr 20 07:02:58.579879 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.579644 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-host-run-netns\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5" Apr 20 07:02:58.579879 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.579679 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhrmb\" (UniqueName: \"kubernetes.io/projected/cad302be-8478-42d0-b082-35206647ea39-kube-api-access-bhrmb\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746" Apr 20 07:02:58.579879 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.579726 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/d333a39d-cbf2-4ee7-b67a-a1c9fa762779-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-j8rhj\" (UID: \"d333a39d-cbf2-4ee7-b67a-a1c9fa762779\") " pod="openshift-multus/multus-additional-cni-plugins-j8rhj" Apr 20 07:02:58.579879 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.579762 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-etc-kubernetes\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5" Apr 20 07:02:58.579879 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.579774 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 07:02:58.579879 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.579789 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/72490a6e-b0ae-4620-b092-0da4bc44739a-agent-certs\") pod \"konnectivity-agent-p77qf\" (UID: \"72490a6e-b0ae-4620-b092-0da4bc44739a\") " pod="kube-system/konnectivity-agent-p77qf" Apr 20 07:02:58.579879 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.579816 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzxfp\" (UniqueName: \"kubernetes.io/projected/63684cad-527f-4cc2-8192-bbf6485c0f70-kube-api-access-lzxfp\") pod \"aws-ebs-csi-driver-node-m8vfj\" (UID: \"63684cad-527f-4cc2-8192-bbf6485c0f70\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8vfj" Apr 20 07:02:58.579879 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.579791 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 07:02:58.580547 ip-10-0-138-178 
kubenswrapper[2577]: I0420 07:02:58.579907 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9a4f5c92-f99d-423c-a0fa-9f9d79abbd7c-serviceca\") pod \"node-ca-956b6\" (UID: \"9a4f5c92-f99d-423c-a0fa-9f9d79abbd7c\") " pod="openshift-image-registry/node-ca-956b6" Apr 20 07:02:58.580547 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.579950 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-system-cni-dir\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5" Apr 20 07:02:58.580547 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.580000 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-host-var-lib-cni-bin\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5" Apr 20 07:02:58.580547 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.580085 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-host-var-lib-cni-multus\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5" Apr 20 07:02:58.580547 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.580504 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/63684cad-527f-4cc2-8192-bbf6485c0f70-sys-fs\") pod \"aws-ebs-csi-driver-node-m8vfj\" (UID: \"63684cad-527f-4cc2-8192-bbf6485c0f70\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8vfj" Apr 20 07:02:58.580767 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.580556 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cad302be-8478-42d0-b082-35206647ea39-etc-modprobe-d\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746" Apr 20 07:02:58.580767 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.580584 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cad302be-8478-42d0-b082-35206647ea39-etc-tuned\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746" Apr 20 07:02:58.580767 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.580607 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/72490a6e-b0ae-4620-b092-0da4bc44739a-konnectivity-ca\") pod \"konnectivity-agent-p77qf\" (UID: \"72490a6e-b0ae-4620-b092-0da4bc44739a\") " pod="kube-system/konnectivity-agent-p77qf" Apr 20 07:02:58.580767 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.580631 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-cnibin\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5" Apr 20 07:02:58.580767 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.580656 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-multus-socket-dir-parent\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5" Apr 20 07:02:58.580767 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.580673 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-hostroot\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5" Apr 20 07:02:58.580767 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.580687 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-multus-conf-dir\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5" Apr 20 07:02:58.580767 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.580702 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cad302be-8478-42d0-b082-35206647ea39-lib-modules\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746" Apr 20 07:02:58.580767 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.580718 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cad302be-8478-42d0-b082-35206647ea39-var-lib-kubelet\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746" Apr 20 07:02:58.580767 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.580739 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d333a39d-cbf2-4ee7-b67a-a1c9fa762779-os-release\") pod \"multus-additional-cni-plugins-j8rhj\" (UID: \"d333a39d-cbf2-4ee7-b67a-a1c9fa762779\") " pod="openshift-multus/multus-additional-cni-plugins-j8rhj" Apr 20 07:02:58.580767 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.580762 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6176a82b-f332-4179-a624-af63e267945f-cni-binary-copy\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5" Apr 20 07:02:58.581098 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.580786 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/63684cad-527f-4cc2-8192-bbf6485c0f70-registration-dir\") pod \"aws-ebs-csi-driver-node-m8vfj\" (UID: \"63684cad-527f-4cc2-8192-bbf6485c0f70\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8vfj" Apr 20 07:02:58.581098 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.580808 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-os-release\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5" Apr 20 07:02:58.581098 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.580825 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cad302be-8478-42d0-b082-35206647ea39-sys\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746" Apr 20 07:02:58.581098 ip-10-0-138-178 kubenswrapper[2577]: I0420 
07:02:58.580856 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cad302be-8478-42d0-b082-35206647ea39-host\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746" Apr 20 07:02:58.581098 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.580901 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d333a39d-cbf2-4ee7-b67a-a1c9fa762779-system-cni-dir\") pod \"multus-additional-cni-plugins-j8rhj\" (UID: \"d333a39d-cbf2-4ee7-b67a-a1c9fa762779\") " pod="openshift-multus/multus-additional-cni-plugins-j8rhj" Apr 20 07:02:58.591291 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.591270 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-178.ec2.internal"] Apr 20 07:02:58.591783 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.591760 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 07:02:58.591873 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.591845 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-178.ec2.internal" Apr 20 07:02:58.603385 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.603354 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 07:02:58.608374 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.608352 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-138-178.ec2.internal"] Apr 20 07:02:58.609675 ip-10-0-138-178 
kubenswrapper[2577]: I0420 07:02:58.609657 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 07:02:58.621812 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:58.621571 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94adb6b4b2c338418efcb12d086e4f21.slice/crio-41503ac468bcd7c9e2f2a4f704230f1858e8538bc5fb3001df061bcfe41d9556 WatchSource:0}: Error finding container 41503ac468bcd7c9e2f2a4f704230f1858e8538bc5fb3001df061bcfe41d9556: Status 404 returned error can't find the container with id 41503ac468bcd7c9e2f2a4f704230f1858e8538bc5fb3001df061bcfe41d9556 Apr 20 07:02:58.622030 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:58.622010 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode599c436f00e4332b7e91bed4b9b4c68.slice/crio-a486378803f881253b748657bd5570a2d88d8ea7d71fc4d457d2181373027b71 WatchSource:0}: Error finding container a486378803f881253b748657bd5570a2d88d8ea7d71fc4d457d2181373027b71: Status 404 returned error can't find the container with id a486378803f881253b748657bd5570a2d88d8ea7d71fc4d457d2181373027b71 Apr 20 07:02:58.628202 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.628176 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 07:02:58.633185 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.633165 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-wsgz2" Apr 20 07:02:58.641889 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.641838 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-178.ec2.internal" 
event={"ID":"e599c436f00e4332b7e91bed4b9b4c68","Type":"ContainerStarted","Data":"a486378803f881253b748657bd5570a2d88d8ea7d71fc4d457d2181373027b71"} Apr 20 07:02:58.642989 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.642968 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-178.ec2.internal" event={"ID":"94adb6b4b2c338418efcb12d086e4f21","Type":"ContainerStarted","Data":"41503ac468bcd7c9e2f2a4f704230f1858e8538bc5fb3001df061bcfe41d9556"} Apr 20 07:02:58.648295 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.648278 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-wsgz2" Apr 20 07:02:58.670155 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.670124 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 07:02:58.681238 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.681213 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9a4f5c92-f99d-423c-a0fa-9f9d79abbd7c-serviceca\") pod \"node-ca-956b6\" (UID: \"9a4f5c92-f99d-423c-a0fa-9f9d79abbd7c\") " pod="openshift-image-registry/node-ca-956b6" Apr 20 07:02:58.681238 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.681242 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-system-cni-dir\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5" Apr 20 07:02:58.681454 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.681259 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-host-var-lib-cni-bin\") pod \"multus-j55n5\" (UID: 
\"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5" Apr 20 07:02:58.681454 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.681276 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-host-var-lib-cni-multus\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5" Apr 20 07:02:58.681454 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.681299 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/63684cad-527f-4cc2-8192-bbf6485c0f70-sys-fs\") pod \"aws-ebs-csi-driver-node-m8vfj\" (UID: \"63684cad-527f-4cc2-8192-bbf6485c0f70\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8vfj" Apr 20 07:02:58.681454 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.681358 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cad302be-8478-42d0-b082-35206647ea39-etc-modprobe-d\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746" Apr 20 07:02:58.681454 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.681382 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cad302be-8478-42d0-b082-35206647ea39-etc-tuned\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746" Apr 20 07:02:58.681454 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.681381 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-host-var-lib-cni-multus\") pod \"multus-j55n5\" 
(UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5" Apr 20 07:02:58.681454 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.681376 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-system-cni-dir\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5" Apr 20 07:02:58.681454 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.681410 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/72490a6e-b0ae-4620-b092-0da4bc44739a-konnectivity-ca\") pod \"konnectivity-agent-p77qf\" (UID: \"72490a6e-b0ae-4620-b092-0da4bc44739a\") " pod="kube-system/konnectivity-agent-p77qf" Apr 20 07:02:58.681454 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.681414 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/63684cad-527f-4cc2-8192-bbf6485c0f70-sys-fs\") pod \"aws-ebs-csi-driver-node-m8vfj\" (UID: \"63684cad-527f-4cc2-8192-bbf6485c0f70\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8vfj" Apr 20 07:02:58.681454 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.681381 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-host-var-lib-cni-bin\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5" Apr 20 07:02:58.681454 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.681434 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-cnibin\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " 
pod="openshift-multus/multus-j55n5" Apr 20 07:02:58.681969 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.681466 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-cnibin\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5" Apr 20 07:02:58.681969 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.681478 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-multus-socket-dir-parent\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5" Apr 20 07:02:58.681969 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.681498 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cad302be-8478-42d0-b082-35206647ea39-etc-modprobe-d\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746" Apr 20 07:02:58.681969 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.681507 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-hostroot\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5" Apr 20 07:02:58.681969 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.681535 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-multus-conf-dir\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5" Apr 20 07:02:58.681969 ip-10-0-138-178 
kubenswrapper[2577]: I0420 07:02:58.681541 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-multus-socket-dir-parent\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5" Apr 20 07:02:58.681969 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.681560 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cad302be-8478-42d0-b082-35206647ea39-lib-modules\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746" Apr 20 07:02:58.681969 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.681600 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-multus-conf-dir\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5" Apr 20 07:02:58.681969 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.681637 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-hostroot\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5" Apr 20 07:02:58.681969 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.681663 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cad302be-8478-42d0-b082-35206647ea39-lib-modules\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746" Apr 20 07:02:58.681969 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.681665 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cad302be-8478-42d0-b082-35206647ea39-var-lib-kubelet\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746" Apr 20 07:02:58.681969 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.681701 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cad302be-8478-42d0-b082-35206647ea39-var-lib-kubelet\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746" Apr 20 07:02:58.681969 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.681701 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d333a39d-cbf2-4ee7-b67a-a1c9fa762779-os-release\") pod \"multus-additional-cni-plugins-j8rhj\" (UID: \"d333a39d-cbf2-4ee7-b67a-a1c9fa762779\") " pod="openshift-multus/multus-additional-cni-plugins-j8rhj" Apr 20 07:02:58.681969 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.681734 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6176a82b-f332-4179-a624-af63e267945f-cni-binary-copy\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5" Apr 20 07:02:58.681969 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.681723 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 07:02:58.681969 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.681751 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d333a39d-cbf2-4ee7-b67a-a1c9fa762779-os-release\") pod \"multus-additional-cni-plugins-j8rhj\" (UID: \"d333a39d-cbf2-4ee7-b67a-a1c9fa762779\") " pod="openshift-multus/multus-additional-cni-plugins-j8rhj" Apr 20 07:02:58.681969 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.681765 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/63684cad-527f-4cc2-8192-bbf6485c0f70-registration-dir\") pod \"aws-ebs-csi-driver-node-m8vfj\" (UID: \"63684cad-527f-4cc2-8192-bbf6485c0f70\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8vfj" Apr 20 07:02:58.681969 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.681772 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9a4f5c92-f99d-423c-a0fa-9f9d79abbd7c-serviceca\") pod \"node-ca-956b6\" (UID: \"9a4f5c92-f99d-423c-a0fa-9f9d79abbd7c\") " pod="openshift-image-registry/node-ca-956b6" Apr 20 07:02:58.682802 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.681795 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-systemd-units\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" Apr 20 07:02:58.682802 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.681851 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/63684cad-527f-4cc2-8192-bbf6485c0f70-registration-dir\") pod \"aws-ebs-csi-driver-node-m8vfj\" (UID: \"63684cad-527f-4cc2-8192-bbf6485c0f70\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8vfj" Apr 20 07:02:58.682802 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.681909 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8105e34c-977b-4426-8e5f-aacd3adbb4c9-env-overrides\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" Apr 20 07:02:58.682802 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.681935 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8105e34c-977b-4426-8e5f-aacd3adbb4c9-ovn-node-metrics-cert\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" Apr 20 07:02:58.682802 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.681952 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-os-release\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5" Apr 20 07:02:58.682802 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.681971 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cad302be-8478-42d0-b082-35206647ea39-sys\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746" Apr 20 07:02:58.682802 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.681974 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/72490a6e-b0ae-4620-b092-0da4bc44739a-konnectivity-ca\") pod \"konnectivity-agent-p77qf\" (UID: \"72490a6e-b0ae-4620-b092-0da4bc44739a\") " pod="kube-system/konnectivity-agent-p77qf" Apr 20 07:02:58.682802 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.681995 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8105e34c-977b-4426-8e5f-aacd3adbb4c9-ovnkube-script-lib\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" Apr 20 07:02:58.682802 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682022 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cad302be-8478-42d0-b082-35206647ea39-host\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746" Apr 20 07:02:58.682802 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682031 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cad302be-8478-42d0-b082-35206647ea39-sys\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746" Apr 20 07:02:58.682802 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682041 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-os-release\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5" Apr 20 07:02:58.682802 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682048 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/d333a39d-cbf2-4ee7-b67a-a1c9fa762779-system-cni-dir\") pod \"multus-additional-cni-plugins-j8rhj\" (UID: \"d333a39d-cbf2-4ee7-b67a-a1c9fa762779\") " pod="openshift-multus/multus-additional-cni-plugins-j8rhj" Apr 20 07:02:58.682802 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682080 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk9wr\" (UniqueName: \"kubernetes.io/projected/959c5b72-4f9a-4701-953a-fb6d9cd24115-kube-api-access-dk9wr\") pod \"iptables-alerter-qgz5k\" (UID: \"959c5b72-4f9a-4701-953a-fb6d9cd24115\") " pod="openshift-network-operator/iptables-alerter-qgz5k" Apr 20 07:02:58.682802 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682083 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cad302be-8478-42d0-b082-35206647ea39-host\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746" Apr 20 07:02:58.682802 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682088 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d333a39d-cbf2-4ee7-b67a-a1c9fa762779-system-cni-dir\") pod \"multus-additional-cni-plugins-j8rhj\" (UID: \"d333a39d-cbf2-4ee7-b67a-a1c9fa762779\") " pod="openshift-multus/multus-additional-cni-plugins-j8rhj" Apr 20 07:02:58.682802 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682111 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8105e34c-977b-4426-8e5f-aacd3adbb4c9-ovnkube-config\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" Apr 20 07:02:58.682802 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682136 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sqltm\" (UniqueName: \"kubernetes.io/projected/6176a82b-f332-4179-a624-af63e267945f-kube-api-access-sqltm\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5" Apr 20 07:02:58.683475 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682153 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cad302be-8478-42d0-b082-35206647ea39-etc-sysconfig\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746" Apr 20 07:02:58.683475 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682167 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cad302be-8478-42d0-b082-35206647ea39-tmp\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746" Apr 20 07:02:58.683475 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682187 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/959c5b72-4f9a-4701-953a-fb6d9cd24115-iptables-alerter-script\") pod \"iptables-alerter-qgz5k\" (UID: \"959c5b72-4f9a-4701-953a-fb6d9cd24115\") " pod="openshift-network-operator/iptables-alerter-qgz5k" Apr 20 07:02:58.683475 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682232 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-host-kubelet\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" Apr 20 07:02:58.683475 ip-10-0-138-178 
kubenswrapper[2577]: I0420 07:02:58.682247 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cad302be-8478-42d0-b082-35206647ea39-etc-sysconfig\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746" Apr 20 07:02:58.683475 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682259 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-host-cni-netd\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" Apr 20 07:02:58.683475 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682288 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6176a82b-f332-4179-a624-af63e267945f-multus-daemon-config\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5" Apr 20 07:02:58.683475 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682348 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6176a82b-f332-4179-a624-af63e267945f-cni-binary-copy\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5" Apr 20 07:02:58.683475 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682408 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cad302be-8478-42d0-b082-35206647ea39-etc-kubernetes\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746" Apr 20 07:02:58.683475 ip-10-0-138-178 
kubenswrapper[2577]: I0420 07:02:58.682428 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cad302be-8478-42d0-b082-35206647ea39-etc-sysctl-d\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746"
Apr 20 07:02:58.683475 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682446 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dtsxf\" (UniqueName: \"kubernetes.io/projected/d333a39d-cbf2-4ee7-b67a-a1c9fa762779-kube-api-access-dtsxf\") pod \"multus-additional-cni-plugins-j8rhj\" (UID: \"d333a39d-cbf2-4ee7-b67a-a1c9fa762779\") " pod="openshift-multus/multus-additional-cni-plugins-j8rhj"
Apr 20 07:02:58.683475 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682463 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-run-ovn\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn"
Apr 20 07:02:58.683475 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682479 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-host-run-k8s-cni-cncf-io\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5"
Apr 20 07:02:58.683475 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682490 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cad302be-8478-42d0-b082-35206647ea39-etc-kubernetes\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746"
Apr 20 07:02:58.683475 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682502 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-etc-openvswitch\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn"
Apr 20 07:02:58.683475 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682527 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brqcg\" (UniqueName: \"kubernetes.io/projected/8105e34c-977b-4426-8e5f-aacd3adbb4c9-kube-api-access-brqcg\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn"
Apr 20 07:02:58.683475 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682573 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-host-run-multus-certs\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5"
Apr 20 07:02:58.684219 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682603 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cad302be-8478-42d0-b082-35206647ea39-etc-sysctl-d\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746"
Apr 20 07:02:58.684219 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682617 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-host-run-k8s-cni-cncf-io\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5"
Apr 20 07:02:58.684219 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682627 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cad302be-8478-42d0-b082-35206647ea39-run\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746"
Apr 20 07:02:58.684219 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682655 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d333a39d-cbf2-4ee7-b67a-a1c9fa762779-cnibin\") pod \"multus-additional-cni-plugins-j8rhj\" (UID: \"d333a39d-cbf2-4ee7-b67a-a1c9fa762779\") " pod="openshift-multus/multus-additional-cni-plugins-j8rhj"
Apr 20 07:02:58.684219 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682677 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-host-run-multus-certs\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5"
Apr 20 07:02:58.684219 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682679 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d333a39d-cbf2-4ee7-b67a-a1c9fa762779-tuning-conf-dir\") pod \"multus-additional-cni-plugins-j8rhj\" (UID: \"d333a39d-cbf2-4ee7-b67a-a1c9fa762779\") " pod="openshift-multus/multus-additional-cni-plugins-j8rhj"
Apr 20 07:02:58.684219 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682698 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cad302be-8478-42d0-b082-35206647ea39-run\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746"
Apr 20 07:02:58.684219 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682719 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/959c5b72-4f9a-4701-953a-fb6d9cd24115-host-slash\") pod \"iptables-alerter-qgz5k\" (UID: \"959c5b72-4f9a-4701-953a-fb6d9cd24115\") " pod="openshift-network-operator/iptables-alerter-qgz5k"
Apr 20 07:02:58.684219 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682724 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d333a39d-cbf2-4ee7-b67a-a1c9fa762779-cnibin\") pod \"multus-additional-cni-plugins-j8rhj\" (UID: \"d333a39d-cbf2-4ee7-b67a-a1c9fa762779\") " pod="openshift-multus/multus-additional-cni-plugins-j8rhj"
Apr 20 07:02:58.684219 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682737 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-host-run-netns\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn"
Apr 20 07:02:58.684219 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682754 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6176a82b-f332-4179-a624-af63e267945f-multus-daemon-config\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5"
Apr 20 07:02:58.684219 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682764 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-log-socket\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn"
Apr 20 07:02:58.684219 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682789 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-host-cni-bin\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn"
Apr 20 07:02:58.684219 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682813 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcv72\" (UniqueName: \"kubernetes.io/projected/3bdbb47e-e79d-4aaa-9671-3899c229b1a2-kube-api-access-kcv72\") pod \"network-metrics-daemon-xw95j\" (UID: \"3bdbb47e-e79d-4aaa-9671-3899c229b1a2\") " pod="openshift-multus/network-metrics-daemon-xw95j"
Apr 20 07:02:58.684219 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682822 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d333a39d-cbf2-4ee7-b67a-a1c9fa762779-tuning-conf-dir\") pod \"multus-additional-cni-plugins-j8rhj\" (UID: \"d333a39d-cbf2-4ee7-b67a-a1c9fa762779\") " pod="openshift-multus/multus-additional-cni-plugins-j8rhj"
Apr 20 07:02:58.684219 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682837 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/63684cad-527f-4cc2-8192-bbf6485c0f70-kubelet-dir\") pod \"aws-ebs-csi-driver-node-m8vfj\" (UID: \"63684cad-527f-4cc2-8192-bbf6485c0f70\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8vfj"
Apr 20 07:02:58.684219 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682866 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-node-log\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn"
Apr 20 07:02:58.684747 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682898 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3bdbb47e-e79d-4aaa-9671-3899c229b1a2-metrics-certs\") pod \"network-metrics-daemon-xw95j\" (UID: \"3bdbb47e-e79d-4aaa-9671-3899c229b1a2\") " pod="openshift-multus/network-metrics-daemon-xw95j"
Apr 20 07:02:58.684747 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682933 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cad302be-8478-42d0-b082-35206647ea39-etc-systemd\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746"
Apr 20 07:02:58.684747 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.682994 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/63684cad-527f-4cc2-8192-bbf6485c0f70-kubelet-dir\") pod \"aws-ebs-csi-driver-node-m8vfj\" (UID: \"63684cad-527f-4cc2-8192-bbf6485c0f70\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8vfj"
Apr 20 07:02:58.684747 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.683034 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cad302be-8478-42d0-b082-35206647ea39-etc-systemd\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746"
Apr 20 07:02:58.684747 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.683043 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-var-lib-openvswitch\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn"
Apr 20 07:02:58.684747 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:02:58.683074 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 07:02:58.684747 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.683077 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kffvv\" (UniqueName: \"kubernetes.io/projected/fb646649-ea68-4bb4-87df-a2eed82cc86c-kube-api-access-kffvv\") pod \"network-check-target-sq8g5\" (UID: \"fb646649-ea68-4bb4-87df-a2eed82cc86c\") " pod="openshift-network-diagnostics/network-check-target-sq8g5"
Apr 20 07:02:58.684747 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.683105 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-host-var-lib-kubelet\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5"
Apr 20 07:02:58.684747 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:02:58.683132 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bdbb47e-e79d-4aaa-9671-3899c229b1a2-metrics-certs podName:3bdbb47e-e79d-4aaa-9671-3899c229b1a2 nodeName:}" failed. No retries permitted until 2026-04-20 07:02:59.183112084 +0000 UTC m=+2.057836938 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3bdbb47e-e79d-4aaa-9671-3899c229b1a2-metrics-certs") pod "network-metrics-daemon-xw95j" (UID: "3bdbb47e-e79d-4aaa-9671-3899c229b1a2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 07:02:58.684747 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.683151 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-host-var-lib-kubelet\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5"
Apr 20 07:02:58.684747 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.683213 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/63684cad-527f-4cc2-8192-bbf6485c0f70-socket-dir\") pod \"aws-ebs-csi-driver-node-m8vfj\" (UID: \"63684cad-527f-4cc2-8192-bbf6485c0f70\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8vfj"
Apr 20 07:02:58.684747 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.683237 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/63684cad-527f-4cc2-8192-bbf6485c0f70-etc-selinux\") pod \"aws-ebs-csi-driver-node-m8vfj\" (UID: \"63684cad-527f-4cc2-8192-bbf6485c0f70\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8vfj"
Apr 20 07:02:58.684747 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.683253 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cad302be-8478-42d0-b082-35206647ea39-etc-sysctl-conf\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746"
Apr 20 07:02:58.684747 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.683272 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d333a39d-cbf2-4ee7-b67a-a1c9fa762779-cni-binary-copy\") pod \"multus-additional-cni-plugins-j8rhj\" (UID: \"d333a39d-cbf2-4ee7-b67a-a1c9fa762779\") " pod="openshift-multus/multus-additional-cni-plugins-j8rhj"
Apr 20 07:02:58.684747 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.683303 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn"
Apr 20 07:02:58.684747 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.683313 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/63684cad-527f-4cc2-8192-bbf6485c0f70-socket-dir\") pod \"aws-ebs-csi-driver-node-m8vfj\" (UID: \"63684cad-527f-4cc2-8192-bbf6485c0f70\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8vfj"
Apr 20 07:02:58.685193 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.683350 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cnw9t\" (UniqueName: \"kubernetes.io/projected/9a4f5c92-f99d-423c-a0fa-9f9d79abbd7c-kube-api-access-cnw9t\") pod \"node-ca-956b6\" (UID: \"9a4f5c92-f99d-423c-a0fa-9f9d79abbd7c\") " pod="openshift-image-registry/node-ca-956b6"
Apr 20 07:02:58.685193 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.683353 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/63684cad-527f-4cc2-8192-bbf6485c0f70-etc-selinux\") pod \"aws-ebs-csi-driver-node-m8vfj\" (UID: \"63684cad-527f-4cc2-8192-bbf6485c0f70\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8vfj"
Apr 20 07:02:58.685193 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.683406 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/63684cad-527f-4cc2-8192-bbf6485c0f70-device-dir\") pod \"aws-ebs-csi-driver-node-m8vfj\" (UID: \"63684cad-527f-4cc2-8192-bbf6485c0f70\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8vfj"
Apr 20 07:02:58.685193 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.683426 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cad302be-8478-42d0-b082-35206647ea39-etc-sysctl-conf\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746"
Apr 20 07:02:58.685193 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.683437 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d333a39d-cbf2-4ee7-b67a-a1c9fa762779-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-j8rhj\" (UID: \"d333a39d-cbf2-4ee7-b67a-a1c9fa762779\") " pod="openshift-multus/multus-additional-cni-plugins-j8rhj"
Apr 20 07:02:58.685193 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.683468 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-run-systemd\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn"
Apr 20 07:02:58.685193 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.683490 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a4f5c92-f99d-423c-a0fa-9f9d79abbd7c-host\") pod \"node-ca-956b6\" (UID: \"9a4f5c92-f99d-423c-a0fa-9f9d79abbd7c\") " pod="openshift-image-registry/node-ca-956b6"
Apr 20 07:02:58.685193 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.683495 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/63684cad-527f-4cc2-8192-bbf6485c0f70-device-dir\") pod \"aws-ebs-csi-driver-node-m8vfj\" (UID: \"63684cad-527f-4cc2-8192-bbf6485c0f70\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8vfj"
Apr 20 07:02:58.685193 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.683517 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-multus-cni-dir\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5"
Apr 20 07:02:58.685193 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.683551 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a4f5c92-f99d-423c-a0fa-9f9d79abbd7c-host\") pod \"node-ca-956b6\" (UID: \"9a4f5c92-f99d-423c-a0fa-9f9d79abbd7c\") " pod="openshift-image-registry/node-ca-956b6"
Apr 20 07:02:58.685193 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.683576 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-host-run-netns\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5"
Apr 20 07:02:58.685193 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.683585 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-multus-cni-dir\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5"
Apr 20 07:02:58.685193 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.683601 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bhrmb\" (UniqueName: \"kubernetes.io/projected/cad302be-8478-42d0-b082-35206647ea39-kube-api-access-bhrmb\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746"
Apr 20 07:02:58.685193 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.683626 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d333a39d-cbf2-4ee7-b67a-a1c9fa762779-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-j8rhj\" (UID: \"d333a39d-cbf2-4ee7-b67a-a1c9fa762779\") " pod="openshift-multus/multus-additional-cni-plugins-j8rhj"
Apr 20 07:02:58.685193 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.683653 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-run-openvswitch\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn"
Apr 20 07:02:58.685193 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.683626 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-host-run-netns\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5"
Apr 20 07:02:58.685193 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.683681 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-etc-kubernetes\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5"
Apr 20 07:02:58.685651 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.683713 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6176a82b-f332-4179-a624-af63e267945f-etc-kubernetes\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5"
Apr 20 07:02:58.685651 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.683719 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-host-slash\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn"
Apr 20 07:02:58.685651 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.683769 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-host-run-ovn-kubernetes\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn"
Apr 20 07:02:58.685651 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.683800 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/72490a6e-b0ae-4620-b092-0da4bc44739a-agent-certs\") pod \"konnectivity-agent-p77qf\" (UID: \"72490a6e-b0ae-4620-b092-0da4bc44739a\") " pod="kube-system/konnectivity-agent-p77qf"
Apr 20 07:02:58.685651 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.683827 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lzxfp\" (UniqueName: \"kubernetes.io/projected/63684cad-527f-4cc2-8192-bbf6485c0f70-kube-api-access-lzxfp\") pod \"aws-ebs-csi-driver-node-m8vfj\" (UID: \"63684cad-527f-4cc2-8192-bbf6485c0f70\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8vfj"
Apr 20 07:02:58.685651 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.683849 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d333a39d-cbf2-4ee7-b67a-a1c9fa762779-cni-binary-copy\") pod \"multus-additional-cni-plugins-j8rhj\" (UID: \"d333a39d-cbf2-4ee7-b67a-a1c9fa762779\") " pod="openshift-multus/multus-additional-cni-plugins-j8rhj"
Apr 20 07:02:58.685651 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.683970 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d333a39d-cbf2-4ee7-b67a-a1c9fa762779-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-j8rhj\" (UID: \"d333a39d-cbf2-4ee7-b67a-a1c9fa762779\") " pod="openshift-multus/multus-additional-cni-plugins-j8rhj"
Apr 20 07:02:58.685651 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.684153 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d333a39d-cbf2-4ee7-b67a-a1c9fa762779-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-j8rhj\" (UID: \"d333a39d-cbf2-4ee7-b67a-a1c9fa762779\") " pod="openshift-multus/multus-additional-cni-plugins-j8rhj"
Apr 20 07:02:58.685651 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.684926 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cad302be-8478-42d0-b082-35206647ea39-etc-tuned\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746"
Apr 20 07:02:58.685651 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.685008 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cad302be-8478-42d0-b082-35206647ea39-tmp\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746"
Apr 20 07:02:58.686230 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.686214 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/72490a6e-b0ae-4620-b092-0da4bc44739a-agent-certs\") pod \"konnectivity-agent-p77qf\" (UID: \"72490a6e-b0ae-4620-b092-0da4bc44739a\") " pod="kube-system/konnectivity-agent-p77qf"
Apr 20 07:02:58.699001 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.698934 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcv72\" (UniqueName: \"kubernetes.io/projected/3bdbb47e-e79d-4aaa-9671-3899c229b1a2-kube-api-access-kcv72\") pod \"network-metrics-daemon-xw95j\" (UID: \"3bdbb47e-e79d-4aaa-9671-3899c229b1a2\") " pod="openshift-multus/network-metrics-daemon-xw95j"
Apr 20 07:02:58.699129 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.699098 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqltm\" (UniqueName: \"kubernetes.io/projected/6176a82b-f332-4179-a624-af63e267945f-kube-api-access-sqltm\") pod \"multus-j55n5\" (UID: \"6176a82b-f332-4179-a624-af63e267945f\") " pod="openshift-multus/multus-j55n5"
Apr 20 07:02:58.699129 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:02:58.699103 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 07:02:58.699129 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:02:58.699126 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 07:02:58.699236 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:02:58.699139 2577 projected.go:194] Error preparing data for projected volume kube-api-access-kffvv for pod openshift-network-diagnostics/network-check-target-sq8g5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 07:02:58.699236 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:02:58.699205 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fb646649-ea68-4bb4-87df-a2eed82cc86c-kube-api-access-kffvv podName:fb646649-ea68-4bb4-87df-a2eed82cc86c nodeName:}" failed. No retries permitted until 2026-04-20 07:02:59.199187289 +0000 UTC m=+2.073912150 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-kffvv" (UniqueName: "kubernetes.io/projected/fb646649-ea68-4bb4-87df-a2eed82cc86c-kube-api-access-kffvv") pod "network-check-target-sq8g5" (UID: "fb646649-ea68-4bb4-87df-a2eed82cc86c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 07:02:58.699236 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.699202 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhrmb\" (UniqueName: \"kubernetes.io/projected/cad302be-8478-42d0-b082-35206647ea39-kube-api-access-bhrmb\") pod \"tuned-2z746\" (UID: \"cad302be-8478-42d0-b082-35206647ea39\") " pod="openshift-cluster-node-tuning-operator/tuned-2z746"
Apr 20 07:02:58.700721 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.700706 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnw9t\" (UniqueName: \"kubernetes.io/projected/9a4f5c92-f99d-423c-a0fa-9f9d79abbd7c-kube-api-access-cnw9t\") pod \"node-ca-956b6\" (UID: \"9a4f5c92-f99d-423c-a0fa-9f9d79abbd7c\") " pod="openshift-image-registry/node-ca-956b6"
Apr 20 07:02:58.700791 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.700768 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzxfp\" (UniqueName: \"kubernetes.io/projected/63684cad-527f-4cc2-8192-bbf6485c0f70-kube-api-access-lzxfp\") pod \"aws-ebs-csi-driver-node-m8vfj\" (UID: \"63684cad-527f-4cc2-8192-bbf6485c0f70\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8vfj"
Apr 20 07:02:58.700881 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.700865 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtsxf\" (UniqueName: \"kubernetes.io/projected/d333a39d-cbf2-4ee7-b67a-a1c9fa762779-kube-api-access-dtsxf\") pod \"multus-additional-cni-plugins-j8rhj\" (UID: \"d333a39d-cbf2-4ee7-b67a-a1c9fa762779\") " pod="openshift-multus/multus-additional-cni-plugins-j8rhj"
Apr 20 07:02:58.784520 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.784490 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-run-openvswitch\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn"
Apr 20 07:02:58.784520 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.784523 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-host-slash\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn"
Apr 20 07:02:58.784726 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.784541 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-host-run-ovn-kubernetes\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn"
Apr 20 07:02:58.784726 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.784562 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-systemd-units\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn"
Apr 20 07:02:58.784726 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.784578 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8105e34c-977b-4426-8e5f-aacd3adbb4c9-env-overrides\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn"
Apr 20 07:02:58.784726 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.784594 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8105e34c-977b-4426-8e5f-aacd3adbb4c9-ovn-node-metrics-cert\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn"
Apr 20 07:02:58.784726 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.784596 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-run-openvswitch\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn"
Apr 20 07:02:58.784726 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.784612 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8105e34c-977b-4426-8e5f-aacd3adbb4c9-ovnkube-script-lib\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn"
Apr 20 07:02:58.784726 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.784621 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-host-run-ovn-kubernetes\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn"
Apr 20 07:02:58.784726 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.784621 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-host-slash\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn"
Apr 20 07:02:58.784726 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.784649 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-systemd-units\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn"
Apr 20 07:02:58.784726 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.784661 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dk9wr\" (UniqueName: \"kubernetes.io/projected/959c5b72-4f9a-4701-953a-fb6d9cd24115-kube-api-access-dk9wr\") pod \"iptables-alerter-qgz5k\" (UID: \"959c5b72-4f9a-4701-953a-fb6d9cd24115\") " pod="openshift-network-operator/iptables-alerter-qgz5k"
Apr 20 07:02:58.784726 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.784687 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8105e34c-977b-4426-8e5f-aacd3adbb4c9-ovnkube-config\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn"
Apr 20 07:02:58.784726 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.784714 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/959c5b72-4f9a-4701-953a-fb6d9cd24115-iptables-alerter-script\") pod \"iptables-alerter-qgz5k\" (UID: \"959c5b72-4f9a-4701-953a-fb6d9cd24115\") " pod="openshift-network-operator/iptables-alerter-qgz5k"
Apr 20 07:02:58.785254 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.784738 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-host-kubelet\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn"
Apr 20 07:02:58.785254 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.784761 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-host-cni-netd\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn"
Apr 20 07:02:58.785254 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.784788 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-run-ovn\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn"
Apr 20 07:02:58.785254 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.784815 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-etc-openvswitch\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn"
Apr 20 07:02:58.785254 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.784840 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-brqcg\" (UniqueName: \"kubernetes.io/projected/8105e34c-977b-4426-8e5f-aacd3adbb4c9-kube-api-access-brqcg\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn"
Apr 20 07:02:58.785254 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.784866 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/959c5b72-4f9a-4701-953a-fb6d9cd24115-host-slash\") pod \"iptables-alerter-qgz5k\" (UID: \"959c5b72-4f9a-4701-953a-fb6d9cd24115\") " pod="openshift-network-operator/iptables-alerter-qgz5k"
Apr 20 07:02:58.785254 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.784886 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-host-run-netns\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn"
Apr 20 07:02:58.785254 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.784909 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-log-socket\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn"
Apr 20 07:02:58.785254 ip-10-0-138-178 kubenswrapper[2577]: 
I0420 07:02:58.784931 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-host-cni-bin\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" Apr 20 07:02:58.785254 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.784958 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-node-log\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" Apr 20 07:02:58.785254 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.784962 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-etc-openvswitch\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" Apr 20 07:02:58.785254 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.784967 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/959c5b72-4f9a-4701-953a-fb6d9cd24115-host-slash\") pod \"iptables-alerter-qgz5k\" (UID: \"959c5b72-4f9a-4701-953a-fb6d9cd24115\") " pod="openshift-network-operator/iptables-alerter-qgz5k" Apr 20 07:02:58.785254 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.785006 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-host-kubelet\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" Apr 20 07:02:58.785254 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.785008 
2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-var-lib-openvswitch\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" Apr 20 07:02:58.785254 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.785017 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-node-log\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" Apr 20 07:02:58.785254 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.785055 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-host-run-netns\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" Apr 20 07:02:58.785254 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.785057 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" Apr 20 07:02:58.785946 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.785091 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-run-systemd\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" Apr 20 07:02:58.785946 ip-10-0-138-178 
kubenswrapper[2577]: I0420 07:02:58.785094 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-log-socket\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" Apr 20 07:02:58.785946 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.785100 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-host-cni-bin\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" Apr 20 07:02:58.785946 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.785060 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-run-ovn\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" Apr 20 07:02:58.785946 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.785126 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-host-cni-netd\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" Apr 20 07:02:58.785946 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.785151 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" Apr 20 07:02:58.785946 ip-10-0-138-178 
kubenswrapper[2577]: I0420 07:02:58.785159 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-run-systemd\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" Apr 20 07:02:58.785946 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.785151 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8105e34c-977b-4426-8e5f-aacd3adbb4c9-env-overrides\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" Apr 20 07:02:58.785946 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.785191 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8105e34c-977b-4426-8e5f-aacd3adbb4c9-var-lib-openvswitch\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" Apr 20 07:02:58.785946 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.785269 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/959c5b72-4f9a-4701-953a-fb6d9cd24115-iptables-alerter-script\") pod \"iptables-alerter-qgz5k\" (UID: \"959c5b72-4f9a-4701-953a-fb6d9cd24115\") " pod="openshift-network-operator/iptables-alerter-qgz5k" Apr 20 07:02:58.785946 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.785280 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8105e34c-977b-4426-8e5f-aacd3adbb4c9-ovnkube-script-lib\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" Apr 20 07:02:58.785946 
ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.785677 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8105e34c-977b-4426-8e5f-aacd3adbb4c9-ovnkube-config\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" Apr 20 07:02:58.787284 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.787266 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8105e34c-977b-4426-8e5f-aacd3adbb4c9-ovn-node-metrics-cert\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" Apr 20 07:02:58.796887 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.796866 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-brqcg\" (UniqueName: \"kubernetes.io/projected/8105e34c-977b-4426-8e5f-aacd3adbb4c9-kube-api-access-brqcg\") pod \"ovnkube-node-qvqrn\" (UID: \"8105e34c-977b-4426-8e5f-aacd3adbb4c9\") " pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" Apr 20 07:02:58.798041 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.798027 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk9wr\" (UniqueName: \"kubernetes.io/projected/959c5b72-4f9a-4701-953a-fb6d9cd24115-kube-api-access-dk9wr\") pod \"iptables-alerter-qgz5k\" (UID: \"959c5b72-4f9a-4701-953a-fb6d9cd24115\") " pod="openshift-network-operator/iptables-alerter-qgz5k" Apr 20 07:02:58.883248 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.883154 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-p77qf" Apr 20 07:02:58.889025 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.889003 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-956b6" Apr 20 07:02:58.890974 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:58.890947 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72490a6e_b0ae_4620_b092_0da4bc44739a.slice/crio-8414d5d6a38f26a9bd3dccc60f05a2ae9b19b68099c509f7ea2652932e9f7e8c WatchSource:0}: Error finding container 8414d5d6a38f26a9bd3dccc60f05a2ae9b19b68099c509f7ea2652932e9f7e8c: Status 404 returned error can't find the container with id 8414d5d6a38f26a9bd3dccc60f05a2ae9b19b68099c509f7ea2652932e9f7e8c Apr 20 07:02:58.896028 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:58.896005 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a4f5c92_f99d_423c_a0fa_9f9d79abbd7c.slice/crio-88ec89b41ba6fbbd557bcfce43ebd55834e95f1461f0a9bb02062d71fd40b1b2 WatchSource:0}: Error finding container 88ec89b41ba6fbbd557bcfce43ebd55834e95f1461f0a9bb02062d71fd40b1b2: Status 404 returned error can't find the container with id 88ec89b41ba6fbbd557bcfce43ebd55834e95f1461f0a9bb02062d71fd40b1b2 Apr 20 07:02:58.904545 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.904529 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-j55n5" Apr 20 07:02:58.911017 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:58.910995 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6176a82b_f332_4179_a624_af63e267945f.slice/crio-c84d8c13ed799fa1ffeb6c11db2dd489af1e6423ba07265dab67a46c36c7604d WatchSource:0}: Error finding container c84d8c13ed799fa1ffeb6c11db2dd489af1e6423ba07265dab67a46c36c7604d: Status 404 returned error can't find the container with id c84d8c13ed799fa1ffeb6c11db2dd489af1e6423ba07265dab67a46c36c7604d Apr 20 07:02:58.923099 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.923078 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8vfj" Apr 20 07:02:58.928434 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:58.928414 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63684cad_527f_4cc2_8192_bbf6485c0f70.slice/crio-becef04ac14f6c86e6ea92778caf096a5cebc652cbaa0334071a62f216a4beca WatchSource:0}: Error finding container becef04ac14f6c86e6ea92778caf096a5cebc652cbaa0334071a62f216a4beca: Status 404 returned error can't find the container with id becef04ac14f6c86e6ea92778caf096a5cebc652cbaa0334071a62f216a4beca Apr 20 07:02:58.929142 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.929121 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-2z746" Apr 20 07:02:58.934577 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:58.934557 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcad302be_8478_42d0_b082_35206647ea39.slice/crio-13aefe445c7d63d86f67b30b6759ec4c225a6ce18563b571d04db0fd7683a904 WatchSource:0}: Error finding container 13aefe445c7d63d86f67b30b6759ec4c225a6ce18563b571d04db0fd7683a904: Status 404 returned error can't find the container with id 13aefe445c7d63d86f67b30b6759ec4c225a6ce18563b571d04db0fd7683a904 Apr 20 07:02:58.935509 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.935494 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-j8rhj" Apr 20 07:02:58.941198 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:58.941177 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd333a39d_cbf2_4ee7_b67a_a1c9fa762779.slice/crio-92eec5018af3976c9d0eb5312213a477b386951019c86a85a3952819d67a78a9 WatchSource:0}: Error finding container 92eec5018af3976c9d0eb5312213a477b386951019c86a85a3952819d67a78a9: Status 404 returned error can't find the container with id 92eec5018af3976c9d0eb5312213a477b386951019c86a85a3952819d67a78a9 Apr 20 07:02:58.942870 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.942853 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qgz5k" Apr 20 07:02:58.947484 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:58.947468 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" Apr 20 07:02:58.948387 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:58.948368 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod959c5b72_4f9a_4701_953a_fb6d9cd24115.slice/crio-56571343eb1da491755cd1478dcc5d662c446196473aff2d4825efca3e695558 WatchSource:0}: Error finding container 56571343eb1da491755cd1478dcc5d662c446196473aff2d4825efca3e695558: Status 404 returned error can't find the container with id 56571343eb1da491755cd1478dcc5d662c446196473aff2d4825efca3e695558 Apr 20 07:02:58.953469 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:02:58.953450 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8105e34c_977b_4426_8e5f_aacd3adbb4c9.slice/crio-c089780138b52e61d3b6fe380a4c1bc2b0f705ffe17ec9c3a1e62dedb5102dfc WatchSource:0}: Error finding container c089780138b52e61d3b6fe380a4c1bc2b0f705ffe17ec9c3a1e62dedb5102dfc: Status 404 returned error can't find the container with id c089780138b52e61d3b6fe380a4c1bc2b0f705ffe17ec9c3a1e62dedb5102dfc Apr 20 07:02:59.188053 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:59.187956 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3bdbb47e-e79d-4aaa-9671-3899c229b1a2-metrics-certs\") pod \"network-metrics-daemon-xw95j\" (UID: \"3bdbb47e-e79d-4aaa-9671-3899c229b1a2\") " pod="openshift-multus/network-metrics-daemon-xw95j" Apr 20 07:02:59.188185 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:02:59.188129 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:02:59.188251 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:02:59.188186 2577 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/3bdbb47e-e79d-4aaa-9671-3899c229b1a2-metrics-certs podName:3bdbb47e-e79d-4aaa-9671-3899c229b1a2 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:00.188167583 +0000 UTC m=+3.062892426 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3bdbb47e-e79d-4aaa-9671-3899c229b1a2-metrics-certs") pod "network-metrics-daemon-xw95j" (UID: "3bdbb47e-e79d-4aaa-9671-3899c229b1a2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:02:59.289135 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:59.289101 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kffvv\" (UniqueName: \"kubernetes.io/projected/fb646649-ea68-4bb4-87df-a2eed82cc86c-kube-api-access-kffvv\") pod \"network-check-target-sq8g5\" (UID: \"fb646649-ea68-4bb4-87df-a2eed82cc86c\") " pod="openshift-network-diagnostics/network-check-target-sq8g5" Apr 20 07:02:59.289302 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:02:59.289263 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 07:02:59.289302 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:02:59.289283 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 07:02:59.289302 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:02:59.289294 2577 projected.go:194] Error preparing data for projected volume kube-api-access-kffvv for pod openshift-network-diagnostics/network-check-target-sq8g5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:02:59.289482 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:02:59.289369 2577 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fb646649-ea68-4bb4-87df-a2eed82cc86c-kube-api-access-kffvv podName:fb646649-ea68-4bb4-87df-a2eed82cc86c nodeName:}" failed. No retries permitted until 2026-04-20 07:03:00.289349721 +0000 UTC m=+3.164074578 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-kffvv" (UniqueName: "kubernetes.io/projected/fb646649-ea68-4bb4-87df-a2eed82cc86c-kube-api-access-kffvv") pod "network-check-target-sq8g5" (UID: "fb646649-ea68-4bb4-87df-a2eed82cc86c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:02:59.464875 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:59.464777 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 07:02:59.649034 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:59.648954 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 06:57:58 +0000 UTC" deadline="2027-11-09 11:59:32.623309212 +0000 UTC" Apr 20 07:02:59.649034 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:59.648988 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13636h56m32.974325074s" Apr 20 07:02:59.667659 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:59.667579 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-956b6" event={"ID":"9a4f5c92-f99d-423c-a0fa-9f9d79abbd7c","Type":"ContainerStarted","Data":"88ec89b41ba6fbbd557bcfce43ebd55834e95f1461f0a9bb02062d71fd40b1b2"} Apr 20 07:02:59.674872 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:59.674813 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qgz5k" 
event={"ID":"959c5b72-4f9a-4701-953a-fb6d9cd24115","Type":"ContainerStarted","Data":"56571343eb1da491755cd1478dcc5d662c446196473aff2d4825efca3e695558"} Apr 20 07:02:59.677362 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:59.677296 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-2z746" event={"ID":"cad302be-8478-42d0-b082-35206647ea39","Type":"ContainerStarted","Data":"13aefe445c7d63d86f67b30b6759ec4c225a6ce18563b571d04db0fd7683a904"} Apr 20 07:02:59.680582 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:59.680554 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8vfj" event={"ID":"63684cad-527f-4cc2-8192-bbf6485c0f70","Type":"ContainerStarted","Data":"becef04ac14f6c86e6ea92778caf096a5cebc652cbaa0334071a62f216a4beca"} Apr 20 07:02:59.687837 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:59.687807 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-p77qf" event={"ID":"72490a6e-b0ae-4620-b092-0da4bc44739a","Type":"ContainerStarted","Data":"8414d5d6a38f26a9bd3dccc60f05a2ae9b19b68099c509f7ea2652932e9f7e8c"} Apr 20 07:02:59.694829 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:59.694665 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" event={"ID":"8105e34c-977b-4426-8e5f-aacd3adbb4c9","Type":"ContainerStarted","Data":"c089780138b52e61d3b6fe380a4c1bc2b0f705ffe17ec9c3a1e62dedb5102dfc"} Apr 20 07:02:59.703398 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:59.703372 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j8rhj" event={"ID":"d333a39d-cbf2-4ee7-b67a-a1c9fa762779","Type":"ContainerStarted","Data":"92eec5018af3976c9d0eb5312213a477b386951019c86a85a3952819d67a78a9"} Apr 20 07:02:59.710680 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:59.710654 2577 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-multus/multus-j55n5" event={"ID":"6176a82b-f332-4179-a624-af63e267945f","Type":"ContainerStarted","Data":"c84d8c13ed799fa1ffeb6c11db2dd489af1e6423ba07265dab67a46c36c7604d"} Apr 20 07:02:59.958981 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:02:59.958948 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 07:03:00.196416 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:00.196377 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3bdbb47e-e79d-4aaa-9671-3899c229b1a2-metrics-certs\") pod \"network-metrics-daemon-xw95j\" (UID: \"3bdbb47e-e79d-4aaa-9671-3899c229b1a2\") " pod="openshift-multus/network-metrics-daemon-xw95j" Apr 20 07:03:00.196581 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:00.196559 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:03:00.196656 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:00.196622 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bdbb47e-e79d-4aaa-9671-3899c229b1a2-metrics-certs podName:3bdbb47e-e79d-4aaa-9671-3899c229b1a2 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:02.196603337 +0000 UTC m=+5.071328191 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3bdbb47e-e79d-4aaa-9671-3899c229b1a2-metrics-certs") pod "network-metrics-daemon-xw95j" (UID: "3bdbb47e-e79d-4aaa-9671-3899c229b1a2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:03:00.297696 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:00.297611 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kffvv\" (UniqueName: \"kubernetes.io/projected/fb646649-ea68-4bb4-87df-a2eed82cc86c-kube-api-access-kffvv\") pod \"network-check-target-sq8g5\" (UID: \"fb646649-ea68-4bb4-87df-a2eed82cc86c\") " pod="openshift-network-diagnostics/network-check-target-sq8g5" Apr 20 07:03:00.297866 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:00.297817 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 07:03:00.297866 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:00.297838 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 07:03:00.297866 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:00.297850 2577 projected.go:194] Error preparing data for projected volume kube-api-access-kffvv for pod openshift-network-diagnostics/network-check-target-sq8g5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:03:00.298029 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:00.297908 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fb646649-ea68-4bb4-87df-a2eed82cc86c-kube-api-access-kffvv podName:fb646649-ea68-4bb4-87df-a2eed82cc86c nodeName:}" failed. 
No retries permitted until 2026-04-20 07:03:02.29788931 +0000 UTC m=+5.172614164 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-kffvv" (UniqueName: "kubernetes.io/projected/fb646649-ea68-4bb4-87df-a2eed82cc86c-kube-api-access-kffvv") pod "network-check-target-sq8g5" (UID: "fb646649-ea68-4bb4-87df-a2eed82cc86c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:03:00.638973 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:00.638927 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq8g5" Apr 20 07:03:00.639479 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:00.639051 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq8g5" podUID="fb646649-ea68-4bb4-87df-a2eed82cc86c" Apr 20 07:03:00.639479 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:00.639448 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw95j" Apr 20 07:03:00.639589 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:00.639554 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xw95j" podUID="3bdbb47e-e79d-4aaa-9671-3899c229b1a2" Apr 20 07:03:00.650095 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:00.650016 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 06:57:58 +0000 UTC" deadline="2028-01-14 08:26:26.627968844 +0000 UTC" Apr 20 07:03:00.650095 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:00.650046 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15217h23m25.977926773s" Apr 20 07:03:00.818506 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:00.818255 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 07:03:02.212727 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:02.212695 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3bdbb47e-e79d-4aaa-9671-3899c229b1a2-metrics-certs\") pod \"network-metrics-daemon-xw95j\" (UID: \"3bdbb47e-e79d-4aaa-9671-3899c229b1a2\") " pod="openshift-multus/network-metrics-daemon-xw95j" Apr 20 07:03:02.213150 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:02.212833 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:03:02.213150 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:02.212883 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bdbb47e-e79d-4aaa-9671-3899c229b1a2-metrics-certs podName:3bdbb47e-e79d-4aaa-9671-3899c229b1a2 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:06.212870415 +0000 UTC m=+9.087595259 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3bdbb47e-e79d-4aaa-9671-3899c229b1a2-metrics-certs") pod "network-metrics-daemon-xw95j" (UID: "3bdbb47e-e79d-4aaa-9671-3899c229b1a2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:03:02.313784 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:02.313741 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kffvv\" (UniqueName: \"kubernetes.io/projected/fb646649-ea68-4bb4-87df-a2eed82cc86c-kube-api-access-kffvv\") pod \"network-check-target-sq8g5\" (UID: \"fb646649-ea68-4bb4-87df-a2eed82cc86c\") " pod="openshift-network-diagnostics/network-check-target-sq8g5" Apr 20 07:03:02.313971 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:02.313901 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 07:03:02.313971 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:02.313920 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 07:03:02.313971 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:02.313933 2577 projected.go:194] Error preparing data for projected volume kube-api-access-kffvv for pod openshift-network-diagnostics/network-check-target-sq8g5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:03:02.314105 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:02.313990 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fb646649-ea68-4bb4-87df-a2eed82cc86c-kube-api-access-kffvv podName:fb646649-ea68-4bb4-87df-a2eed82cc86c nodeName:}" failed. 
No retries permitted until 2026-04-20 07:03:06.313971791 +0000 UTC m=+9.188696646 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-kffvv" (UniqueName: "kubernetes.io/projected/fb646649-ea68-4bb4-87df-a2eed82cc86c-kube-api-access-kffvv") pod "network-check-target-sq8g5" (UID: "fb646649-ea68-4bb4-87df-a2eed82cc86c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:03:02.638889 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:02.638857 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq8g5" Apr 20 07:03:02.639084 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:02.638990 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq8g5" podUID="fb646649-ea68-4bb4-87df-a2eed82cc86c" Apr 20 07:03:02.639501 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:02.639462 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw95j" Apr 20 07:03:02.639602 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:02.639569 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xw95j" podUID="3bdbb47e-e79d-4aaa-9671-3899c229b1a2" Apr 20 07:03:02.971862 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:02.971049 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-cz5pm"] Apr 20 07:03:02.974948 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:02.974264 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-cz5pm" Apr 20 07:03:02.977950 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:02.977714 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 07:03:02.977950 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:02.977897 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 07:03:02.978116 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:02.977978 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-n8f2t\"" Apr 20 07:03:03.019632 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:03.019593 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c340eac2-744d-4e2d-a3d3-b064aeed4bde-hosts-file\") pod \"node-resolver-cz5pm\" (UID: \"c340eac2-744d-4e2d-a3d3-b064aeed4bde\") " pod="openshift-dns/node-resolver-cz5pm" Apr 20 07:03:03.019808 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:03.019652 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c340eac2-744d-4e2d-a3d3-b064aeed4bde-tmp-dir\") pod \"node-resolver-cz5pm\" (UID: \"c340eac2-744d-4e2d-a3d3-b064aeed4bde\") " pod="openshift-dns/node-resolver-cz5pm" Apr 20 07:03:03.019808 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:03.019683 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvpdl\" (UniqueName: \"kubernetes.io/projected/c340eac2-744d-4e2d-a3d3-b064aeed4bde-kube-api-access-pvpdl\") pod \"node-resolver-cz5pm\" (UID: \"c340eac2-744d-4e2d-a3d3-b064aeed4bde\") " pod="openshift-dns/node-resolver-cz5pm" Apr 20 07:03:03.120735 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:03.120689 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c340eac2-744d-4e2d-a3d3-b064aeed4bde-hosts-file\") pod \"node-resolver-cz5pm\" (UID: \"c340eac2-744d-4e2d-a3d3-b064aeed4bde\") " pod="openshift-dns/node-resolver-cz5pm" Apr 20 07:03:03.120887 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:03.120752 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c340eac2-744d-4e2d-a3d3-b064aeed4bde-tmp-dir\") pod \"node-resolver-cz5pm\" (UID: \"c340eac2-744d-4e2d-a3d3-b064aeed4bde\") " pod="openshift-dns/node-resolver-cz5pm" Apr 20 07:03:03.120887 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:03.120782 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pvpdl\" (UniqueName: \"kubernetes.io/projected/c340eac2-744d-4e2d-a3d3-b064aeed4bde-kube-api-access-pvpdl\") pod \"node-resolver-cz5pm\" (UID: \"c340eac2-744d-4e2d-a3d3-b064aeed4bde\") " pod="openshift-dns/node-resolver-cz5pm" Apr 20 07:03:03.121296 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:03.121275 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c340eac2-744d-4e2d-a3d3-b064aeed4bde-hosts-file\") pod \"node-resolver-cz5pm\" (UID: \"c340eac2-744d-4e2d-a3d3-b064aeed4bde\") " pod="openshift-dns/node-resolver-cz5pm" Apr 20 07:03:03.122140 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:03.122071 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c340eac2-744d-4e2d-a3d3-b064aeed4bde-tmp-dir\") pod \"node-resolver-cz5pm\" (UID: \"c340eac2-744d-4e2d-a3d3-b064aeed4bde\") " pod="openshift-dns/node-resolver-cz5pm" Apr 20 07:03:03.143487 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:03.143456 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvpdl\" (UniqueName: \"kubernetes.io/projected/c340eac2-744d-4e2d-a3d3-b064aeed4bde-kube-api-access-pvpdl\") pod \"node-resolver-cz5pm\" (UID: \"c340eac2-744d-4e2d-a3d3-b064aeed4bde\") " pod="openshift-dns/node-resolver-cz5pm" Apr 20 07:03:03.290185 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:03.290101 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-cz5pm" Apr 20 07:03:04.640356 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:04.639891 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq8g5" Apr 20 07:03:04.640356 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:04.639891 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw95j" Apr 20 07:03:04.640356 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:04.640044 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-sq8g5" podUID="fb646649-ea68-4bb4-87df-a2eed82cc86c" Apr 20 07:03:04.640356 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:04.640073 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xw95j" podUID="3bdbb47e-e79d-4aaa-9671-3899c229b1a2" Apr 20 07:03:06.247176 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:06.247132 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3bdbb47e-e79d-4aaa-9671-3899c229b1a2-metrics-certs\") pod \"network-metrics-daemon-xw95j\" (UID: \"3bdbb47e-e79d-4aaa-9671-3899c229b1a2\") " pod="openshift-multus/network-metrics-daemon-xw95j" Apr 20 07:03:06.247736 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:06.247265 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:03:06.247736 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:06.247380 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bdbb47e-e79d-4aaa-9671-3899c229b1a2-metrics-certs podName:3bdbb47e-e79d-4aaa-9671-3899c229b1a2 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:14.247356027 +0000 UTC m=+17.122080882 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3bdbb47e-e79d-4aaa-9671-3899c229b1a2-metrics-certs") pod "network-metrics-daemon-xw95j" (UID: "3bdbb47e-e79d-4aaa-9671-3899c229b1a2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:03:06.348440 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:06.348401 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kffvv\" (UniqueName: \"kubernetes.io/projected/fb646649-ea68-4bb4-87df-a2eed82cc86c-kube-api-access-kffvv\") pod \"network-check-target-sq8g5\" (UID: \"fb646649-ea68-4bb4-87df-a2eed82cc86c\") " pod="openshift-network-diagnostics/network-check-target-sq8g5" Apr 20 07:03:06.348627 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:06.348590 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 07:03:06.348627 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:06.348615 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 07:03:06.348627 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:06.348629 2577 projected.go:194] Error preparing data for projected volume kube-api-access-kffvv for pod openshift-network-diagnostics/network-check-target-sq8g5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:03:06.348796 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:06.348689 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fb646649-ea68-4bb4-87df-a2eed82cc86c-kube-api-access-kffvv podName:fb646649-ea68-4bb4-87df-a2eed82cc86c nodeName:}" failed. 
No retries permitted until 2026-04-20 07:03:14.348670513 +0000 UTC m=+17.223395365 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-kffvv" (UniqueName: "kubernetes.io/projected/fb646649-ea68-4bb4-87df-a2eed82cc86c-kube-api-access-kffvv") pod "network-check-target-sq8g5" (UID: "fb646649-ea68-4bb4-87df-a2eed82cc86c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:03:06.639008 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:06.638974 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq8g5" Apr 20 07:03:06.639175 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:06.639106 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq8g5" podUID="fb646649-ea68-4bb4-87df-a2eed82cc86c" Apr 20 07:03:06.639677 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:06.639515 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw95j" Apr 20 07:03:06.639677 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:06.639634 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xw95j" podUID="3bdbb47e-e79d-4aaa-9671-3899c229b1a2" Apr 20 07:03:08.639916 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:08.639877 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw95j" Apr 20 07:03:08.640343 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:08.639877 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq8g5" Apr 20 07:03:08.640343 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:08.640028 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xw95j" podUID="3bdbb47e-e79d-4aaa-9671-3899c229b1a2" Apr 20 07:03:08.640343 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:08.640098 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq8g5" podUID="fb646649-ea68-4bb4-87df-a2eed82cc86c" Apr 20 07:03:10.638995 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:10.638958 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw95j" Apr 20 07:03:10.639438 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:10.638958 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq8g5" Apr 20 07:03:10.639438 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:10.639102 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xw95j" podUID="3bdbb47e-e79d-4aaa-9671-3899c229b1a2" Apr 20 07:03:10.639438 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:10.639216 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq8g5" podUID="fb646649-ea68-4bb4-87df-a2eed82cc86c" Apr 20 07:03:12.639875 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:12.639842 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw95j" Apr 20 07:03:12.639875 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:12.639876 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq8g5" Apr 20 07:03:12.640297 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:12.639964 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xw95j" podUID="3bdbb47e-e79d-4aaa-9671-3899c229b1a2" Apr 20 07:03:12.640297 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:12.640079 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq8g5" podUID="fb646649-ea68-4bb4-87df-a2eed82cc86c" Apr 20 07:03:14.302608 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:14.302573 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3bdbb47e-e79d-4aaa-9671-3899c229b1a2-metrics-certs\") pod \"network-metrics-daemon-xw95j\" (UID: \"3bdbb47e-e79d-4aaa-9671-3899c229b1a2\") " pod="openshift-multus/network-metrics-daemon-xw95j" Apr 20 07:03:14.303083 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:14.302718 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:03:14.303083 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:14.302780 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bdbb47e-e79d-4aaa-9671-3899c229b1a2-metrics-certs podName:3bdbb47e-e79d-4aaa-9671-3899c229b1a2 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:30.302765912 +0000 UTC m=+33.177490755 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3bdbb47e-e79d-4aaa-9671-3899c229b1a2-metrics-certs") pod "network-metrics-daemon-xw95j" (UID: "3bdbb47e-e79d-4aaa-9671-3899c229b1a2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:03:14.403496 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:14.403454 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kffvv\" (UniqueName: \"kubernetes.io/projected/fb646649-ea68-4bb4-87df-a2eed82cc86c-kube-api-access-kffvv\") pod \"network-check-target-sq8g5\" (UID: \"fb646649-ea68-4bb4-87df-a2eed82cc86c\") " pod="openshift-network-diagnostics/network-check-target-sq8g5" Apr 20 07:03:14.403671 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:14.403648 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 07:03:14.403745 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:14.403687 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 07:03:14.403745 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:14.403701 2577 projected.go:194] Error preparing data for projected volume kube-api-access-kffvv for pod openshift-network-diagnostics/network-check-target-sq8g5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:03:14.403825 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:14.403777 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fb646649-ea68-4bb4-87df-a2eed82cc86c-kube-api-access-kffvv podName:fb646649-ea68-4bb4-87df-a2eed82cc86c nodeName:}" failed. 
No retries permitted until 2026-04-20 07:03:30.403756651 +0000 UTC m=+33.278481494 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-kffvv" (UniqueName: "kubernetes.io/projected/fb646649-ea68-4bb4-87df-a2eed82cc86c-kube-api-access-kffvv") pod "network-check-target-sq8g5" (UID: "fb646649-ea68-4bb4-87df-a2eed82cc86c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:03:14.638989 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:14.638960 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw95j" Apr 20 07:03:14.639269 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:14.638962 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq8g5" Apr 20 07:03:14.639269 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:14.639102 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xw95j" podUID="3bdbb47e-e79d-4aaa-9671-3899c229b1a2" Apr 20 07:03:14.639269 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:14.639165 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-sq8g5" podUID="fb646649-ea68-4bb4-87df-a2eed82cc86c" Apr 20 07:03:16.639829 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:16.639764 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq8g5" Apr 20 07:03:16.640266 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:16.639771 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw95j" Apr 20 07:03:16.640266 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:16.639894 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq8g5" podUID="fb646649-ea68-4bb4-87df-a2eed82cc86c" Apr 20 07:03:16.640266 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:16.639956 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xw95j" podUID="3bdbb47e-e79d-4aaa-9671-3899c229b1a2" Apr 20 07:03:17.755227 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:17.754158 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-178.ec2.internal" event={"ID":"e599c436f00e4332b7e91bed4b9b4c68","Type":"ContainerStarted","Data":"512d60b1048cf6ae970d168ca55149bc519a39f3d25d5502d83abfaecf7bdc06"} Apr 20 07:03:17.759947 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:17.759396 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-2z746" event={"ID":"cad302be-8478-42d0-b082-35206647ea39","Type":"ContainerStarted","Data":"55985aee50fe8d0cbdf67fd82c210a939cbded42af832957ccbd59ca467b8acf"} Apr 20 07:03:17.763949 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:17.763908 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cz5pm" event={"ID":"c340eac2-744d-4e2d-a3d3-b064aeed4bde","Type":"ContainerStarted","Data":"b86f0c7faeafbc5090d7f68d1c559f2a573648161adbdae9086e331e8f325e53"} Apr 20 07:03:17.767585 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:17.767489 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" event={"ID":"8105e34c-977b-4426-8e5f-aacd3adbb4c9","Type":"ContainerStarted","Data":"ee1273d1163a03dac496a0a339848929c76e0b33bf603511a13ce66bd48b0fe7"} Apr 20 07:03:17.767669 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:17.767650 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" event={"ID":"8105e34c-977b-4426-8e5f-aacd3adbb4c9","Type":"ContainerStarted","Data":"691081e84c1596179dcbde578178f8b7a0036a7b33662e6454003d048f89656d"} Apr 20 07:03:17.767723 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:17.767669 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" 
event={"ID":"8105e34c-977b-4426-8e5f-aacd3adbb4c9","Type":"ContainerStarted","Data":"e69a0b7f9d2fba28f49f0b2d2562d2d19955ca6f4511d159853d4140ce3aef29"} Apr 20 07:03:17.767723 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:17.767678 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" event={"ID":"8105e34c-977b-4426-8e5f-aacd3adbb4c9","Type":"ContainerStarted","Data":"c2e6542a25093bfb4373c3787956954c938d45c9a171f1c73380ca365a3d7686"} Apr 20 07:03:17.767723 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:17.767686 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" event={"ID":"8105e34c-977b-4426-8e5f-aacd3adbb4c9","Type":"ContainerStarted","Data":"01ac166727539f43e1122daaf4ccd61c2985023a29500a7a1a4caacfa363ac38"} Apr 20 07:03:17.767723 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:17.767694 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" event={"ID":"8105e34c-977b-4426-8e5f-aacd3adbb4c9","Type":"ContainerStarted","Data":"2ee5627feae14537c33b10fcf8d2a30dd888d8d152b27755ea38671e27d5eb05"} Apr 20 07:03:17.769843 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:17.769270 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-j55n5" event={"ID":"6176a82b-f332-4179-a624-af63e267945f","Type":"ContainerStarted","Data":"090ed4cbacc17364a7b230ff1f567c6544aa87b80ee74a6e4101ef2a6ceb5901"} Apr 20 07:03:17.830720 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:17.830679 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-178.ec2.internal" podStartSLOduration=19.830662077 podStartE2EDuration="19.830662077s" podCreationTimestamp="2026-04-20 07:02:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:03:17.792355325 +0000 
UTC m=+20.667080184" watchObservedRunningTime="2026-04-20 07:03:17.830662077 +0000 UTC m=+20.705386940"
Apr 20 07:03:17.872980 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:17.872940 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-j55n5" podStartSLOduration=2.726634733 podStartE2EDuration="20.872924381s" podCreationTimestamp="2026-04-20 07:02:57 +0000 UTC" firstStartedPulling="2026-04-20 07:02:58.912989918 +0000 UTC m=+1.787714759" lastFinishedPulling="2026-04-20 07:03:17.059279551 +0000 UTC m=+19.934004407" observedRunningTime="2026-04-20 07:03:17.834648257 +0000 UTC m=+20.709373120" watchObservedRunningTime="2026-04-20 07:03:17.872924381 +0000 UTC m=+20.747649224"
Apr 20 07:03:18.638998 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:18.638975 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw95j"
Apr 20 07:03:18.639116 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:18.638975 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq8g5"
Apr 20 07:03:18.639116 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:18.639085 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xw95j" podUID="3bdbb47e-e79d-4aaa-9671-3899c229b1a2"
Apr 20 07:03:18.639213 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:18.639185 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq8g5" podUID="fb646649-ea68-4bb4-87df-a2eed82cc86c"
Apr 20 07:03:18.677712 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:18.677683 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 20 07:03:18.771830 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:18.771792 2577 generic.go:358] "Generic (PLEG): container finished" podID="d333a39d-cbf2-4ee7-b67a-a1c9fa762779" containerID="6b28bb16de3ffc2b9d58f4f2163986eac6d125b3ddf24e95b1a5b7f51d5cd980" exitCode=0
Apr 20 07:03:18.772266 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:18.771888 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j8rhj" event={"ID":"d333a39d-cbf2-4ee7-b67a-a1c9fa762779","Type":"ContainerDied","Data":"6b28bb16de3ffc2b9d58f4f2163986eac6d125b3ddf24e95b1a5b7f51d5cd980"}
Apr 20 07:03:18.773175 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:18.773144 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-956b6" event={"ID":"9a4f5c92-f99d-423c-a0fa-9f9d79abbd7c","Type":"ContainerStarted","Data":"6585e057aebc3df47c103e833a43c68438af2f5302d55bd940ad98eb758e3537"}
Apr 20 07:03:18.774515 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:18.774495 2577 generic.go:358] "Generic (PLEG): container finished" podID="94adb6b4b2c338418efcb12d086e4f21" containerID="82683c19a82d027256e54207462b60f37eb8a3d6c0b364923fef7a255b62f68a" exitCode=0
Apr 20 07:03:18.774616 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:18.774564 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-178.ec2.internal" event={"ID":"94adb6b4b2c338418efcb12d086e4f21","Type":"ContainerDied","Data":"82683c19a82d027256e54207462b60f37eb8a3d6c0b364923fef7a255b62f68a"}
Apr 20 07:03:18.775827 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:18.775809 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qgz5k" event={"ID":"959c5b72-4f9a-4701-953a-fb6d9cd24115","Type":"ContainerStarted","Data":"931db7f14f8082e34179873908f19b065cf47665000c952f49ec3963c8419077"}
Apr 20 07:03:18.777419 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:18.777385 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8vfj" event={"ID":"63684cad-527f-4cc2-8192-bbf6485c0f70","Type":"ContainerStarted","Data":"b0d253dd8669be5964e45cf2ca14d036492701b9e414e3f909f3d32ff55fcf54"}
Apr 20 07:03:18.777419 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:18.777409 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8vfj" event={"ID":"63684cad-527f-4cc2-8192-bbf6485c0f70","Type":"ContainerStarted","Data":"b09c67e1b6803412bb9e2c08e3b46d89199092b2735c790e80e2f20244c197c4"}
Apr 20 07:03:18.778691 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:18.778673 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-p77qf" event={"ID":"72490a6e-b0ae-4620-b092-0da4bc44739a","Type":"ContainerStarted","Data":"2a10862aa74296d5fed7a52c49a17f5683ecd46ccff9a8b3e0347d8ad3f0e513"}
Apr 20 07:03:18.780062 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:18.780041 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cz5pm" event={"ID":"c340eac2-744d-4e2d-a3d3-b064aeed4bde","Type":"ContainerStarted","Data":"8bcd0b495e7302458b20f8d2bffdf68651f21356d09602e61ff4198a5be7143f"}
Apr 20 07:03:18.816814 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:18.816729 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-2z746" podStartSLOduration=3.718367674 podStartE2EDuration="21.816713331s" podCreationTimestamp="2026-04-20 07:02:57 +0000 UTC" firstStartedPulling="2026-04-20 07:02:58.936392749 +0000 UTC m=+1.811117589" lastFinishedPulling="2026-04-20 07:03:17.034738403 +0000 UTC m=+19.909463246" observedRunningTime="2026-04-20 07:03:17.87440913 +0000 UTC m=+20.749133993" watchObservedRunningTime="2026-04-20 07:03:18.816713331 +0000 UTC m=+21.691438193"
Apr 20 07:03:18.841983 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:18.841945 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-qgz5k" podStartSLOduration=3.764069564 podStartE2EDuration="21.841933532s" podCreationTimestamp="2026-04-20 07:02:57 +0000 UTC" firstStartedPulling="2026-04-20 07:02:58.95010351 +0000 UTC m=+1.824828354" lastFinishedPulling="2026-04-20 07:03:17.02796747 +0000 UTC m=+19.902692322" observedRunningTime="2026-04-20 07:03:18.84191663 +0000 UTC m=+21.716641494" watchObservedRunningTime="2026-04-20 07:03:18.841933532 +0000 UTC m=+21.716658394"
Apr 20 07:03:18.908001 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:18.907952 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-p77qf" podStartSLOduration=3.772859365 podStartE2EDuration="21.90793698s" podCreationTimestamp="2026-04-20 07:02:57 +0000 UTC" firstStartedPulling="2026-04-20 07:02:58.892787691 +0000 UTC m=+1.767512530" lastFinishedPulling="2026-04-20 07:03:17.02786529 +0000 UTC m=+19.902590145" observedRunningTime="2026-04-20 07:03:18.874148345 +0000 UTC m=+21.748873207" watchObservedRunningTime="2026-04-20 07:03:18.90793698 +0000 UTC m=+21.782661842"
Apr 20 07:03:18.908115 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:18.908030 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-956b6" podStartSLOduration=3.7776234410000002 podStartE2EDuration="21.908026131s" podCreationTimestamp="2026-04-20 07:02:57 +0000 UTC" firstStartedPulling="2026-04-20 07:02:58.897517867 +0000 UTC m=+1.772242715" lastFinishedPulling="2026-04-20 07:03:17.027920552 +0000 UTC m=+19.902645405" observedRunningTime="2026-04-20 07:03:18.907870023 +0000 UTC m=+21.782594887" watchObservedRunningTime="2026-04-20 07:03:18.908026131 +0000 UTC m=+21.782750992"
Apr 20 07:03:18.969088 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:18.969033 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-cz5pm" podStartSLOduration=16.969018067 podStartE2EDuration="16.969018067s" podCreationTimestamp="2026-04-20 07:03:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:03:18.937873365 +0000 UTC m=+21.812598237" watchObservedRunningTime="2026-04-20 07:03:18.969018067 +0000 UTC m=+21.843742946"
Apr 20 07:03:19.642304 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:19.641998 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T07:03:18.677710135Z","UUID":"a5d06489-05dd-4934-ba35-e0ad2627af2e","Handler":null,"Name":"","Endpoint":""}
Apr 20 07:03:19.643873 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:19.643851 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 20 07:03:19.643873 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:19.643871 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 20 07:03:19.783444 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:19.783419 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-178.ec2.internal" event={"ID":"94adb6b4b2c338418efcb12d086e4f21","Type":"ContainerStarted","Data":"57662010561dafbe3c4807916afa48d4b3becbca9155eef9a0f1a3f4f43511d3"}
Apr 20 07:03:19.786371 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:19.785647 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8vfj" event={"ID":"63684cad-527f-4cc2-8192-bbf6485c0f70","Type":"ContainerStarted","Data":"46295ace7dec10081866354a1244c9421a6bc746aaa9fb649f39814c3397f517"}
Apr 20 07:03:19.789182 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:19.789152 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" event={"ID":"8105e34c-977b-4426-8e5f-aacd3adbb4c9","Type":"ContainerStarted","Data":"063fc8ab6c05772f8dfe16d424aee5d817f6003d219efe5d10e9ef0c0bdff2ef"}
Apr 20 07:03:19.816717 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:19.816612 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-178.ec2.internal" podStartSLOduration=21.816587245 podStartE2EDuration="21.816587245s" podCreationTimestamp="2026-04-20 07:02:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:03:19.814777525 +0000 UTC m=+22.689502386" watchObservedRunningTime="2026-04-20 07:03:19.816587245 +0000 UTC m=+22.691312108"
Apr 20 07:03:20.639203 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:20.639178 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq8g5"
Apr 20 07:03:20.639380 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:20.639178 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw95j"
Apr 20 07:03:20.639440 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:20.639267 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq8g5" podUID="fb646649-ea68-4bb4-87df-a2eed82cc86c"
Apr 20 07:03:20.639440 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:20.639401 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xw95j" podUID="3bdbb47e-e79d-4aaa-9671-3899c229b1a2"
Apr 20 07:03:21.046233 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:21.046162 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-p77qf"
Apr 20 07:03:21.081281 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:21.081252 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-p77qf"
Apr 20 07:03:21.081995 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:21.081967 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-p77qf"
Apr 20 07:03:21.113908 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:21.113862 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-m8vfj" podStartSLOduration=3.493273246 podStartE2EDuration="24.113849562s" podCreationTimestamp="2026-04-20 07:02:57 +0000 UTC" firstStartedPulling="2026-04-20 07:02:58.930079562 +0000 UTC m=+1.804804406" lastFinishedPulling="2026-04-20 07:03:19.550655882 +0000 UTC m=+22.425380722" observedRunningTime="2026-04-20 07:03:19.845855236 +0000 UTC m=+22.720580097" watchObservedRunningTime="2026-04-20 07:03:21.113849562 +0000 UTC m=+23.988574424"
Apr 20 07:03:21.792912 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:21.792885 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-p77qf"
Apr 20 07:03:22.639164 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:22.639135 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq8g5"
Apr 20 07:03:22.639164 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:22.639149 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw95j"
Apr 20 07:03:22.639749 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:22.639255 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq8g5" podUID="fb646649-ea68-4bb4-87df-a2eed82cc86c"
Apr 20 07:03:22.639749 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:22.639417 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xw95j" podUID="3bdbb47e-e79d-4aaa-9671-3899c229b1a2"
Apr 20 07:03:23.801547 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:23.801295 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" event={"ID":"8105e34c-977b-4426-8e5f-aacd3adbb4c9","Type":"ContainerStarted","Data":"f6edc705da854a1ed7aa78a22bb0697a6920c45e794f57c5c8b2042fb6856b6b"}
Apr 20 07:03:23.891698 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:23.891674 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-cz5pm_c340eac2-744d-4e2d-a3d3-b064aeed4bde/dns-node-resolver/0.log"
Apr 20 07:03:24.639033 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:24.638998 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw95j"
Apr 20 07:03:24.639177 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:24.639098 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xw95j" podUID="3bdbb47e-e79d-4aaa-9671-3899c229b1a2"
Apr 20 07:03:24.639177 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:24.639155 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq8g5"
Apr 20 07:03:24.639250 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:24.639232 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq8g5" podUID="fb646649-ea68-4bb4-87df-a2eed82cc86c"
Apr 20 07:03:24.805085 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:24.805053 2577 generic.go:358] "Generic (PLEG): container finished" podID="d333a39d-cbf2-4ee7-b67a-a1c9fa762779" containerID="395d026e65025ed00f7a458fe7eab84ee29ec99b6e41e50a1bd4a74a81c3bfa5" exitCode=0
Apr 20 07:03:24.805533 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:24.805139 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j8rhj" event={"ID":"d333a39d-cbf2-4ee7-b67a-a1c9fa762779","Type":"ContainerDied","Data":"395d026e65025ed00f7a458fe7eab84ee29ec99b6e41e50a1bd4a74a81c3bfa5"}
Apr 20 07:03:24.805641 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:24.805623 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn"
Apr 20 07:03:24.805688 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:24.805655 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn"
Apr 20 07:03:24.805688 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:24.805668 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn"
Apr 20 07:03:24.820168 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:24.820141 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn"
Apr 20 07:03:24.820802 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:24.820779 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn"
Apr 20 07:03:24.822673 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:24.822658 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-956b6_9a4f5c92-f99d-423c-a0fa-9f9d79abbd7c/node-ca/0.log"
Apr 20 07:03:24.848749 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:24.848710 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn" podStartSLOduration=9.700571923 podStartE2EDuration="27.848698358s" podCreationTimestamp="2026-04-20 07:02:57 +0000 UTC" firstStartedPulling="2026-04-20 07:02:58.955092968 +0000 UTC m=+1.829817808" lastFinishedPulling="2026-04-20 07:03:17.10321939 +0000 UTC m=+19.977944243" observedRunningTime="2026-04-20 07:03:23.871279941 +0000 UTC m=+26.746004803" watchObservedRunningTime="2026-04-20 07:03:24.848698358 +0000 UTC m=+27.723423244"
Apr 20 07:03:25.683439 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:25.683411 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-sq8g5"]
Apr 20 07:03:25.683575 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:25.683495 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq8g5"
Apr 20 07:03:25.683613 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:25.683568 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq8g5" podUID="fb646649-ea68-4bb4-87df-a2eed82cc86c"
Apr 20 07:03:25.687421 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:25.687395 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xw95j"]
Apr 20 07:03:25.687517 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:25.687467 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw95j"
Apr 20 07:03:25.687564 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:25.687551 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xw95j" podUID="3bdbb47e-e79d-4aaa-9671-3899c229b1a2"
Apr 20 07:03:26.809408 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:26.809374 2577 generic.go:358] "Generic (PLEG): container finished" podID="d333a39d-cbf2-4ee7-b67a-a1c9fa762779" containerID="1c1d0f1ad99f20d7c05d6e1d2fe65f8af980b603ada08e4cf31f68b8d6ba4f67" exitCode=0
Apr 20 07:03:26.809937 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:26.809471 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j8rhj" event={"ID":"d333a39d-cbf2-4ee7-b67a-a1c9fa762779","Type":"ContainerDied","Data":"1c1d0f1ad99f20d7c05d6e1d2fe65f8af980b603ada08e4cf31f68b8d6ba4f67"}
Apr 20 07:03:27.640099 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:27.640071 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq8g5"
Apr 20 07:03:27.640281 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:27.640151 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw95j"
Apr 20 07:03:27.640281 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:27.640194 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq8g5" podUID="fb646649-ea68-4bb4-87df-a2eed82cc86c"
Apr 20 07:03:27.640281 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:27.640229 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xw95j" podUID="3bdbb47e-e79d-4aaa-9671-3899c229b1a2"
Apr 20 07:03:28.818044 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:28.818007 2577 generic.go:358] "Generic (PLEG): container finished" podID="d333a39d-cbf2-4ee7-b67a-a1c9fa762779" containerID="dccf093bb61cb882ad0ade4cb7fb7cdb48bdb4a9d55eb9f5598c34da0d727e27" exitCode=0
Apr 20 07:03:28.818547 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:28.818060 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j8rhj" event={"ID":"d333a39d-cbf2-4ee7-b67a-a1c9fa762779","Type":"ContainerDied","Data":"dccf093bb61cb882ad0ade4cb7fb7cdb48bdb4a9d55eb9f5598c34da0d727e27"}
Apr 20 07:03:29.639228 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:29.639196 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq8g5"
Apr 20 07:03:29.639422 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:29.639292 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq8g5" podUID="fb646649-ea68-4bb4-87df-a2eed82cc86c"
Apr 20 07:03:29.639564 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:29.639539 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw95j"
Apr 20 07:03:29.639720 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:29.639675 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xw95j" podUID="3bdbb47e-e79d-4aaa-9671-3899c229b1a2"
Apr 20 07:03:30.318577 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:30.318543 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3bdbb47e-e79d-4aaa-9671-3899c229b1a2-metrics-certs\") pod \"network-metrics-daemon-xw95j\" (UID: \"3bdbb47e-e79d-4aaa-9671-3899c229b1a2\") " pod="openshift-multus/network-metrics-daemon-xw95j"
Apr 20 07:03:30.319013 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:30.318707 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 07:03:30.319013 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:30.318775 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bdbb47e-e79d-4aaa-9671-3899c229b1a2-metrics-certs podName:3bdbb47e-e79d-4aaa-9671-3899c229b1a2 nodeName:}" failed. No retries permitted until 2026-04-20 07:04:02.318754695 +0000 UTC m=+65.193479549 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3bdbb47e-e79d-4aaa-9671-3899c229b1a2-metrics-certs") pod "network-metrics-daemon-xw95j" (UID: "3bdbb47e-e79d-4aaa-9671-3899c229b1a2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 07:03:30.419271 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:30.419236 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kffvv\" (UniqueName: \"kubernetes.io/projected/fb646649-ea68-4bb4-87df-a2eed82cc86c-kube-api-access-kffvv\") pod \"network-check-target-sq8g5\" (UID: \"fb646649-ea68-4bb4-87df-a2eed82cc86c\") " pod="openshift-network-diagnostics/network-check-target-sq8g5"
Apr 20 07:03:30.419474 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:30.419435 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 07:03:30.419474 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:30.419460 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 07:03:30.419474 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:30.419472 2577 projected.go:194] Error preparing data for projected volume kube-api-access-kffvv for pod openshift-network-diagnostics/network-check-target-sq8g5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 07:03:30.419637 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:30.419532 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fb646649-ea68-4bb4-87df-a2eed82cc86c-kube-api-access-kffvv podName:fb646649-ea68-4bb4-87df-a2eed82cc86c nodeName:}" failed. No retries permitted until 2026-04-20 07:04:02.41951174 +0000 UTC m=+65.294236583 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-kffvv" (UniqueName: "kubernetes.io/projected/fb646649-ea68-4bb4-87df-a2eed82cc86c-kube-api-access-kffvv") pod "network-check-target-sq8g5" (UID: "fb646649-ea68-4bb4-87df-a2eed82cc86c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 07:03:31.639186 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:31.639142 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw95j"
Apr 20 07:03:31.639812 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:31.639271 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xw95j" podUID="3bdbb47e-e79d-4aaa-9671-3899c229b1a2"
Apr 20 07:03:31.639812 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:31.639347 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq8g5"
Apr 20 07:03:31.639812 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:31.639425 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq8g5" podUID="fb646649-ea68-4bb4-87df-a2eed82cc86c"
Apr 20 07:03:33.640034 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:33.639754 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw95j"
Apr 20 07:03:33.640034 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:33.639898 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xw95j" podUID="3bdbb47e-e79d-4aaa-9671-3899c229b1a2"
Apr 20 07:03:33.640034 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:33.639937 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq8g5"
Apr 20 07:03:33.640034 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:33.640006 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq8g5" podUID="fb646649-ea68-4bb4-87df-a2eed82cc86c"
Apr 20 07:03:35.639459 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:35.639423 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw95j"
Apr 20 07:03:35.639459 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:35.639441 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq8g5"
Apr 20 07:03:35.639857 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:35.639543 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq8g5" podUID="fb646649-ea68-4bb4-87df-a2eed82cc86c"
Apr 20 07:03:35.639857 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:35.639670 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xw95j" podUID="3bdbb47e-e79d-4aaa-9671-3899c229b1a2"
Apr 20 07:03:36.835079 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:36.835050 2577 generic.go:358] "Generic (PLEG): container finished" podID="d333a39d-cbf2-4ee7-b67a-a1c9fa762779" containerID="48d97b3f14f672371c1261dc3a931b6f152302959f5122ad84a42020a3f1738e" exitCode=0
Apr 20 07:03:36.835522 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:36.835096 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j8rhj" event={"ID":"d333a39d-cbf2-4ee7-b67a-a1c9fa762779","Type":"ContainerDied","Data":"48d97b3f14f672371c1261dc3a931b6f152302959f5122ad84a42020a3f1738e"}
Apr 20 07:03:37.639709 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:37.639677 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw95j"
Apr 20 07:03:37.639887 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:37.639780 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq8g5"
Apr 20 07:03:37.639887 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:37.639780 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xw95j" podUID="3bdbb47e-e79d-4aaa-9671-3899c229b1a2"
Apr 20 07:03:37.639887 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:37.639845 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq8g5" podUID="fb646649-ea68-4bb4-87df-a2eed82cc86c"
Apr 20 07:03:37.839718 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:37.839681 2577 generic.go:358] "Generic (PLEG): container finished" podID="d333a39d-cbf2-4ee7-b67a-a1c9fa762779" containerID="76cc2ef00da063c4a84b299dc332e7e26fd14528239787bd462d8a15bf003a59" exitCode=0
Apr 20 07:03:37.840079 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:37.839738 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j8rhj" event={"ID":"d333a39d-cbf2-4ee7-b67a-a1c9fa762779","Type":"ContainerDied","Data":"76cc2ef00da063c4a84b299dc332e7e26fd14528239787bd462d8a15bf003a59"}
Apr 20 07:03:38.844700 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:38.844669 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j8rhj" event={"ID":"d333a39d-cbf2-4ee7-b67a-a1c9fa762779","Type":"ContainerStarted","Data":"6b85f41dc3922e4ccc60009815781a51d2642f4f7a60918c6ea07c7fffde719a"}
Apr 20 07:03:38.870715 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:38.870664 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-j8rhj" podStartSLOduration=5.076226183 podStartE2EDuration="41.870647946s" podCreationTimestamp="2026-04-20 07:02:57 +0000 UTC" firstStartedPulling="2026-04-20 07:02:58.942793384 +0000 UTC m=+1.817518224" lastFinishedPulling="2026-04-20 07:03:35.737215134 +0000 UTC m=+38.611939987" observedRunningTime="2026-04-20 07:03:38.869314173 +0000 UTC m=+41.744039035" watchObservedRunningTime="2026-04-20 07:03:38.870647946 +0000 UTC m=+41.745372810"
Apr 20 07:03:39.638863 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:39.638825 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw95j"
Apr 20 07:03:39.639050 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:39.638928 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xw95j" podUID="3bdbb47e-e79d-4aaa-9671-3899c229b1a2"
Apr 20 07:03:39.639050 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:39.638984 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq8g5"
Apr 20 07:03:39.639163 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:39.639054 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq8g5" podUID="fb646649-ea68-4bb4-87df-a2eed82cc86c"
Apr 20 07:03:41.639478 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:41.639444 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq8g5"
Apr 20 07:03:41.639917 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:41.639457 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw95j"
Apr 20 07:03:41.639917 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:41.639559 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq8g5" podUID="fb646649-ea68-4bb4-87df-a2eed82cc86c"
Apr 20 07:03:41.639917 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:41.639680 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xw95j" podUID="3bdbb47e-e79d-4aaa-9671-3899c229b1a2"
Apr 20 07:03:43.638879 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:43.638847 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq8g5"
Apr 20 07:03:43.639272 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:43.638949 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq8g5" podUID="fb646649-ea68-4bb4-87df-a2eed82cc86c"
Apr 20 07:03:43.639272 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:43.639028 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw95j" Apr 20 07:03:43.639272 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:43.639119 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xw95j" podUID="3bdbb47e-e79d-4aaa-9671-3899c229b1a2" Apr 20 07:03:45.639451 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:45.639419 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw95j" Apr 20 07:03:45.639451 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:45.639434 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq8g5" Apr 20 07:03:45.639830 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:45.639530 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xw95j" podUID="3bdbb47e-e79d-4aaa-9671-3899c229b1a2" Apr 20 07:03:45.639830 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:45.639595 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-sq8g5" podUID="fb646649-ea68-4bb4-87df-a2eed82cc86c" Apr 20 07:03:47.638976 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:47.638944 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq8g5" Apr 20 07:03:47.638976 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:47.638958 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw95j" Apr 20 07:03:47.640169 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:47.640130 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq8g5" podUID="fb646649-ea68-4bb4-87df-a2eed82cc86c" Apr 20 07:03:47.640355 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:47.640229 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xw95j" podUID="3bdbb47e-e79d-4aaa-9671-3899c229b1a2" Apr 20 07:03:49.639622 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:49.639586 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq8g5" Apr 20 07:03:49.640147 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:49.639592 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw95j" Apr 20 07:03:49.640147 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:49.639691 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sq8g5" podUID="fb646649-ea68-4bb4-87df-a2eed82cc86c" Apr 20 07:03:49.640147 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:03:49.639833 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xw95j" podUID="3bdbb47e-e79d-4aaa-9671-3899c229b1a2" Apr 20 07:03:50.976135 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:50.976109 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-178.ec2.internal" event="NodeReady" Apr 20 07:03:50.976578 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:50.976218 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 07:03:51.182430 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.182396 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-x528h"] Apr 20 07:03:51.195793 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.195766 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-x528h" Apr 20 07:03:51.208532 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.208510 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vhcpq\"" Apr 20 07:03:51.208769 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.208755 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 07:03:51.209107 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.209085 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 07:03:51.236421 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.236350 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-x528h"] Apr 20 07:03:51.275653 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.275627 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-gtfrg"] Apr 20 07:03:51.290989 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.290963 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gtfrg" Apr 20 07:03:51.302398 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.302367 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 07:03:51.302398 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.302400 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 07:03:51.302602 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.302479 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-75ldp\"" Apr 20 07:03:51.302659 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.302643 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 07:03:51.326759 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.326719 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gtfrg"] Apr 20 07:03:51.369465 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.369417 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d38a5ffa-c2cb-44b1-823f-627443e996e0-tmp-dir\") pod \"dns-default-x528h\" (UID: \"d38a5ffa-c2cb-44b1-823f-627443e996e0\") " pod="openshift-dns/dns-default-x528h" Apr 20 07:03:51.369465 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.369465 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nfcb\" (UniqueName: \"kubernetes.io/projected/d38a5ffa-c2cb-44b1-823f-627443e996e0-kube-api-access-7nfcb\") pod \"dns-default-x528h\" (UID: \"d38a5ffa-c2cb-44b1-823f-627443e996e0\") " pod="openshift-dns/dns-default-x528h" Apr 20 07:03:51.369693 ip-10-0-138-178 
kubenswrapper[2577]: I0420 07:03:51.369489 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d38a5ffa-c2cb-44b1-823f-627443e996e0-metrics-tls\") pod \"dns-default-x528h\" (UID: \"d38a5ffa-c2cb-44b1-823f-627443e996e0\") " pod="openshift-dns/dns-default-x528h" Apr 20 07:03:51.369693 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.369578 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d38a5ffa-c2cb-44b1-823f-627443e996e0-config-volume\") pod \"dns-default-x528h\" (UID: \"d38a5ffa-c2cb-44b1-823f-627443e996e0\") " pod="openshift-dns/dns-default-x528h" Apr 20 07:03:51.382831 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.382799 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-fr8mx"] Apr 20 07:03:51.426967 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.426935 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fr8mx" Apr 20 07:03:51.430444 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.430415 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-nfwjb\"" Apr 20 07:03:51.431034 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.431008 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 07:03:51.432947 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.431702 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 07:03:51.433071 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.432036 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 07:03:51.433071 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.432053 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fr8mx"] Apr 20 07:03:51.433071 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.432490 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 07:03:51.470503 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.470459 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d38a5ffa-c2cb-44b1-823f-627443e996e0-config-volume\") pod \"dns-default-x528h\" (UID: \"d38a5ffa-c2cb-44b1-823f-627443e996e0\") " pod="openshift-dns/dns-default-x528h" Apr 20 07:03:51.470680 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.470524 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/d38a5ffa-c2cb-44b1-823f-627443e996e0-tmp-dir\") pod \"dns-default-x528h\" (UID: \"d38a5ffa-c2cb-44b1-823f-627443e996e0\") " pod="openshift-dns/dns-default-x528h" Apr 20 07:03:51.470680 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.470633 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsj8l\" (UniqueName: \"kubernetes.io/projected/810eac21-fbc0-4eb1-b7bc-e6cb6a6b694c-kube-api-access-zsj8l\") pod \"ingress-canary-gtfrg\" (UID: \"810eac21-fbc0-4eb1-b7bc-e6cb6a6b694c\") " pod="openshift-ingress-canary/ingress-canary-gtfrg" Apr 20 07:03:51.470765 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.470687 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7nfcb\" (UniqueName: \"kubernetes.io/projected/d38a5ffa-c2cb-44b1-823f-627443e996e0-kube-api-access-7nfcb\") pod \"dns-default-x528h\" (UID: \"d38a5ffa-c2cb-44b1-823f-627443e996e0\") " pod="openshift-dns/dns-default-x528h" Apr 20 07:03:51.470765 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.470737 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/810eac21-fbc0-4eb1-b7bc-e6cb6a6b694c-cert\") pod \"ingress-canary-gtfrg\" (UID: \"810eac21-fbc0-4eb1-b7bc-e6cb6a6b694c\") " pod="openshift-ingress-canary/ingress-canary-gtfrg" Apr 20 07:03:51.470854 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.470793 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d38a5ffa-c2cb-44b1-823f-627443e996e0-metrics-tls\") pod \"dns-default-x528h\" (UID: \"d38a5ffa-c2cb-44b1-823f-627443e996e0\") " pod="openshift-dns/dns-default-x528h" Apr 20 07:03:51.470854 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.470806 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/d38a5ffa-c2cb-44b1-823f-627443e996e0-tmp-dir\") pod \"dns-default-x528h\" (UID: \"d38a5ffa-c2cb-44b1-823f-627443e996e0\") " pod="openshift-dns/dns-default-x528h" Apr 20 07:03:51.471125 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.471104 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d38a5ffa-c2cb-44b1-823f-627443e996e0-config-volume\") pod \"dns-default-x528h\" (UID: \"d38a5ffa-c2cb-44b1-823f-627443e996e0\") " pod="openshift-dns/dns-default-x528h" Apr 20 07:03:51.475243 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.475222 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d38a5ffa-c2cb-44b1-823f-627443e996e0-metrics-tls\") pod \"dns-default-x528h\" (UID: \"d38a5ffa-c2cb-44b1-823f-627443e996e0\") " pod="openshift-dns/dns-default-x528h" Apr 20 07:03:51.503513 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.503438 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nfcb\" (UniqueName: \"kubernetes.io/projected/d38a5ffa-c2cb-44b1-823f-627443e996e0-kube-api-access-7nfcb\") pod \"dns-default-x528h\" (UID: \"d38a5ffa-c2cb-44b1-823f-627443e996e0\") " pod="openshift-dns/dns-default-x528h" Apr 20 07:03:51.504818 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.504800 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-x528h" Apr 20 07:03:51.571448 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.571406 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e21e2393-ce5e-4856-84ba-7b921e943d5f-data-volume\") pod \"insights-runtime-extractor-fr8mx\" (UID: \"e21e2393-ce5e-4856-84ba-7b921e943d5f\") " pod="openshift-insights/insights-runtime-extractor-fr8mx" Apr 20 07:03:51.571448 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.571449 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qf6r\" (UniqueName: \"kubernetes.io/projected/e21e2393-ce5e-4856-84ba-7b921e943d5f-kube-api-access-8qf6r\") pod \"insights-runtime-extractor-fr8mx\" (UID: \"e21e2393-ce5e-4856-84ba-7b921e943d5f\") " pod="openshift-insights/insights-runtime-extractor-fr8mx" Apr 20 07:03:51.571664 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.571531 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e21e2393-ce5e-4856-84ba-7b921e943d5f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fr8mx\" (UID: \"e21e2393-ce5e-4856-84ba-7b921e943d5f\") " pod="openshift-insights/insights-runtime-extractor-fr8mx" Apr 20 07:03:51.571664 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.571568 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e21e2393-ce5e-4856-84ba-7b921e943d5f-crio-socket\") pod \"insights-runtime-extractor-fr8mx\" (UID: \"e21e2393-ce5e-4856-84ba-7b921e943d5f\") " pod="openshift-insights/insights-runtime-extractor-fr8mx" Apr 20 07:03:51.571664 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.571592 2577 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-zsj8l\" (UniqueName: \"kubernetes.io/projected/810eac21-fbc0-4eb1-b7bc-e6cb6a6b694c-kube-api-access-zsj8l\") pod \"ingress-canary-gtfrg\" (UID: \"810eac21-fbc0-4eb1-b7bc-e6cb6a6b694c\") " pod="openshift-ingress-canary/ingress-canary-gtfrg" Apr 20 07:03:51.571664 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.571619 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e21e2393-ce5e-4856-84ba-7b921e943d5f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fr8mx\" (UID: \"e21e2393-ce5e-4856-84ba-7b921e943d5f\") " pod="openshift-insights/insights-runtime-extractor-fr8mx" Apr 20 07:03:51.571664 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.571638 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/810eac21-fbc0-4eb1-b7bc-e6cb6a6b694c-cert\") pod \"ingress-canary-gtfrg\" (UID: \"810eac21-fbc0-4eb1-b7bc-e6cb6a6b694c\") " pod="openshift-ingress-canary/ingress-canary-gtfrg" Apr 20 07:03:51.574137 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.574119 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/810eac21-fbc0-4eb1-b7bc-e6cb6a6b694c-cert\") pod \"ingress-canary-gtfrg\" (UID: \"810eac21-fbc0-4eb1-b7bc-e6cb6a6b694c\") " pod="openshift-ingress-canary/ingress-canary-gtfrg" Apr 20 07:03:51.639045 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.639016 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw95j" Apr 20 07:03:51.639045 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.639063 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq8g5" Apr 20 07:03:51.641554 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.641527 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsj8l\" (UniqueName: \"kubernetes.io/projected/810eac21-fbc0-4eb1-b7bc-e6cb6a6b694c-kube-api-access-zsj8l\") pod \"ingress-canary-gtfrg\" (UID: \"810eac21-fbc0-4eb1-b7bc-e6cb6a6b694c\") " pod="openshift-ingress-canary/ingress-canary-gtfrg" Apr 20 07:03:51.644624 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.644599 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 07:03:51.646010 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.645989 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-mf5sv\"" Apr 20 07:03:51.652018 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.651992 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 07:03:51.652143 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.652061 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-g4l9z\"" Apr 20 07:03:51.652143 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.652102 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 07:03:51.672962 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.672939 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e21e2393-ce5e-4856-84ba-7b921e943d5f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fr8mx\" (UID: \"e21e2393-ce5e-4856-84ba-7b921e943d5f\") " 
pod="openshift-insights/insights-runtime-extractor-fr8mx" Apr 20 07:03:51.673073 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.672981 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e21e2393-ce5e-4856-84ba-7b921e943d5f-data-volume\") pod \"insights-runtime-extractor-fr8mx\" (UID: \"e21e2393-ce5e-4856-84ba-7b921e943d5f\") " pod="openshift-insights/insights-runtime-extractor-fr8mx" Apr 20 07:03:51.673073 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.673008 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8qf6r\" (UniqueName: \"kubernetes.io/projected/e21e2393-ce5e-4856-84ba-7b921e943d5f-kube-api-access-8qf6r\") pod \"insights-runtime-extractor-fr8mx\" (UID: \"e21e2393-ce5e-4856-84ba-7b921e943d5f\") " pod="openshift-insights/insights-runtime-extractor-fr8mx" Apr 20 07:03:51.673073 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.673067 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e21e2393-ce5e-4856-84ba-7b921e943d5f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fr8mx\" (UID: \"e21e2393-ce5e-4856-84ba-7b921e943d5f\") " pod="openshift-insights/insights-runtime-extractor-fr8mx" Apr 20 07:03:51.673242 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.673093 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e21e2393-ce5e-4856-84ba-7b921e943d5f-crio-socket\") pod \"insights-runtime-extractor-fr8mx\" (UID: \"e21e2393-ce5e-4856-84ba-7b921e943d5f\") " pod="openshift-insights/insights-runtime-extractor-fr8mx" Apr 20 07:03:51.673242 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.673172 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/e21e2393-ce5e-4856-84ba-7b921e943d5f-crio-socket\") pod \"insights-runtime-extractor-fr8mx\" (UID: \"e21e2393-ce5e-4856-84ba-7b921e943d5f\") " pod="openshift-insights/insights-runtime-extractor-fr8mx" Apr 20 07:03:51.673393 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.673375 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e21e2393-ce5e-4856-84ba-7b921e943d5f-data-volume\") pod \"insights-runtime-extractor-fr8mx\" (UID: \"e21e2393-ce5e-4856-84ba-7b921e943d5f\") " pod="openshift-insights/insights-runtime-extractor-fr8mx" Apr 20 07:03:51.673651 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.673626 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e21e2393-ce5e-4856-84ba-7b921e943d5f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fr8mx\" (UID: \"e21e2393-ce5e-4856-84ba-7b921e943d5f\") " pod="openshift-insights/insights-runtime-extractor-fr8mx" Apr 20 07:03:51.675330 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.675300 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e21e2393-ce5e-4856-84ba-7b921e943d5f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fr8mx\" (UID: \"e21e2393-ce5e-4856-84ba-7b921e943d5f\") " pod="openshift-insights/insights-runtime-extractor-fr8mx" Apr 20 07:03:51.716635 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.716605 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-x528h"] Apr 20 07:03:51.719589 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:03:51.719565 2577 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd38a5ffa_c2cb_44b1_823f_627443e996e0.slice/crio-baee1afc9907b804cae40393ea34ded86e185e614bdd9570c1eee0a09d4e568c WatchSource:0}: Error finding container baee1afc9907b804cae40393ea34ded86e185e614bdd9570c1eee0a09d4e568c: Status 404 returned error can't find the container with id baee1afc9907b804cae40393ea34ded86e185e614bdd9570c1eee0a09d4e568c Apr 20 07:03:51.742246 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.742214 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qf6r\" (UniqueName: \"kubernetes.io/projected/e21e2393-ce5e-4856-84ba-7b921e943d5f-kube-api-access-8qf6r\") pod \"insights-runtime-extractor-fr8mx\" (UID: \"e21e2393-ce5e-4856-84ba-7b921e943d5f\") " pod="openshift-insights/insights-runtime-extractor-fr8mx" Apr 20 07:03:51.869480 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.869453 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x528h" event={"ID":"d38a5ffa-c2cb-44b1-823f-627443e996e0","Type":"ContainerStarted","Data":"baee1afc9907b804cae40393ea34ded86e185e614bdd9570c1eee0a09d4e568c"} Apr 20 07:03:51.898930 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:51.898897 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gtfrg" Apr 20 07:03:52.038057 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:52.038021 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fr8mx"
Apr 20 07:03:52.044716 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:52.044683 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gtfrg"]
Apr 20 07:03:52.048058 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:03:52.048029 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod810eac21_fbc0_4eb1_b7bc_e6cb6a6b694c.slice/crio-ea393513226ae50c948f84de69f0680bbdcec3702793b4b435685987c3adcfaa WatchSource:0}: Error finding container ea393513226ae50c948f84de69f0680bbdcec3702793b4b435685987c3adcfaa: Status 404 returned error can't find the container with id ea393513226ae50c948f84de69f0680bbdcec3702793b4b435685987c3adcfaa
Apr 20 07:03:52.205135 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:52.205096 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fr8mx"]
Apr 20 07:03:52.209145 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:03:52.209115 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode21e2393_ce5e_4856_84ba_7b921e943d5f.slice/crio-44142743071f0de42b2cfc8111ed3745508ad3531c6b5e71cc918911820f4a44 WatchSource:0}: Error finding container 44142743071f0de42b2cfc8111ed3745508ad3531c6b5e71cc918911820f4a44: Status 404 returned error can't find the container with id 44142743071f0de42b2cfc8111ed3745508ad3531c6b5e71cc918911820f4a44
Apr 20 07:03:52.872801 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:52.872761 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gtfrg" event={"ID":"810eac21-fbc0-4eb1-b7bc-e6cb6a6b694c","Type":"ContainerStarted","Data":"ea393513226ae50c948f84de69f0680bbdcec3702793b4b435685987c3adcfaa"}
Apr 20 07:03:52.874168 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:52.874139 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fr8mx" event={"ID":"e21e2393-ce5e-4856-84ba-7b921e943d5f","Type":"ContainerStarted","Data":"c31309237f5a82f8cee685dfbe44b2569fc7aa03a5855b40296899f82d90b061"}
Apr 20 07:03:52.874168 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:52.874176 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fr8mx" event={"ID":"e21e2393-ce5e-4856-84ba-7b921e943d5f","Type":"ContainerStarted","Data":"44142743071f0de42b2cfc8111ed3745508ad3531c6b5e71cc918911820f4a44"}
Apr 20 07:03:53.324595 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:53.324521 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5c798648d8-qlxs5"]
Apr 20 07:03:53.339093 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:53.339070 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c798648d8-qlxs5"
Apr 20 07:03:53.345003 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:53.344780 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 20 07:03:53.345546 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:53.345524 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 20 07:03:53.345697 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:53.345580 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 20 07:03:53.345811 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:53.345710 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 20 07:03:53.345811 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:53.345794 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 20 07:03:53.345925 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:53.345795 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 20 07:03:53.346120 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:53.346084 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 20 07:03:53.347177 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:53.347159 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-n4xjx\""
Apr 20 07:03:53.355809 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:53.355792 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 20 07:03:53.374807 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:53.374784 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c798648d8-qlxs5"]
Apr 20 07:03:53.488160 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:53.488112 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/91c62464-c965-4578-b487-d17860e017d3-console-oauth-config\") pod \"console-5c798648d8-qlxs5\" (UID: \"91c62464-c965-4578-b487-d17860e017d3\") " pod="openshift-console/console-5c798648d8-qlxs5"
Apr 20 07:03:53.488349 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:53.488180 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9szc\" (UniqueName: \"kubernetes.io/projected/91c62464-c965-4578-b487-d17860e017d3-kube-api-access-q9szc\") pod \"console-5c798648d8-qlxs5\" (UID: \"91c62464-c965-4578-b487-d17860e017d3\") " pod="openshift-console/console-5c798648d8-qlxs5"
Apr 20 07:03:53.488349 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:53.488290 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/91c62464-c965-4578-b487-d17860e017d3-console-serving-cert\") pod \"console-5c798648d8-qlxs5\" (UID: \"91c62464-c965-4578-b487-d17860e017d3\") " pod="openshift-console/console-5c798648d8-qlxs5"
Apr 20 07:03:53.488474 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:53.488350 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/91c62464-c965-4578-b487-d17860e017d3-console-config\") pod \"console-5c798648d8-qlxs5\" (UID: \"91c62464-c965-4578-b487-d17860e017d3\") " pod="openshift-console/console-5c798648d8-qlxs5"
Apr 20 07:03:53.488474 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:53.488411 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91c62464-c965-4578-b487-d17860e017d3-trusted-ca-bundle\") pod \"console-5c798648d8-qlxs5\" (UID: \"91c62464-c965-4578-b487-d17860e017d3\") " pod="openshift-console/console-5c798648d8-qlxs5"
Apr 20 07:03:53.488474 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:53.488453 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/91c62464-c965-4578-b487-d17860e017d3-oauth-serving-cert\") pod \"console-5c798648d8-qlxs5\" (UID: \"91c62464-c965-4578-b487-d17860e017d3\") " pod="openshift-console/console-5c798648d8-qlxs5"
Apr 20 07:03:53.488607 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:53.488483 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/91c62464-c965-4578-b487-d17860e017d3-service-ca\") pod \"console-5c798648d8-qlxs5\" (UID: \"91c62464-c965-4578-b487-d17860e017d3\") " pod="openshift-console/console-5c798648d8-qlxs5"
Apr 20 07:03:53.589353 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:53.589297 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91c62464-c965-4578-b487-d17860e017d3-trusted-ca-bundle\") pod \"console-5c798648d8-qlxs5\" (UID: \"91c62464-c965-4578-b487-d17860e017d3\") " pod="openshift-console/console-5c798648d8-qlxs5"
Apr 20 07:03:53.589522 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:53.589365 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/91c62464-c965-4578-b487-d17860e017d3-oauth-serving-cert\") pod \"console-5c798648d8-qlxs5\" (UID: \"91c62464-c965-4578-b487-d17860e017d3\") " pod="openshift-console/console-5c798648d8-qlxs5"
Apr 20 07:03:53.589522 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:53.589396 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/91c62464-c965-4578-b487-d17860e017d3-service-ca\") pod \"console-5c798648d8-qlxs5\" (UID: \"91c62464-c965-4578-b487-d17860e017d3\") " pod="openshift-console/console-5c798648d8-qlxs5"
Apr 20 07:03:53.589522 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:53.589421 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/91c62464-c965-4578-b487-d17860e017d3-console-oauth-config\") pod \"console-5c798648d8-qlxs5\" (UID: \"91c62464-c965-4578-b487-d17860e017d3\") " pod="openshift-console/console-5c798648d8-qlxs5"
Apr 20 07:03:53.589944 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:53.589911 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q9szc\" (UniqueName: \"kubernetes.io/projected/91c62464-c965-4578-b487-d17860e017d3-kube-api-access-q9szc\") pod \"console-5c798648d8-qlxs5\" (UID: \"91c62464-c965-4578-b487-d17860e017d3\") " pod="openshift-console/console-5c798648d8-qlxs5"
Apr 20 07:03:53.590067 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:53.590003 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/91c62464-c965-4578-b487-d17860e017d3-console-serving-cert\") pod \"console-5c798648d8-qlxs5\" (UID: \"91c62464-c965-4578-b487-d17860e017d3\") " pod="openshift-console/console-5c798648d8-qlxs5"
Apr 20 07:03:53.590067 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:53.590032 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/91c62464-c965-4578-b487-d17860e017d3-console-config\") pod \"console-5c798648d8-qlxs5\" (UID: \"91c62464-c965-4578-b487-d17860e017d3\") " pod="openshift-console/console-5c798648d8-qlxs5"
Apr 20 07:03:53.590189 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:53.590164 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/91c62464-c965-4578-b487-d17860e017d3-service-ca\") pod \"console-5c798648d8-qlxs5\" (UID: \"91c62464-c965-4578-b487-d17860e017d3\") " pod="openshift-console/console-5c798648d8-qlxs5"
Apr 20 07:03:53.590815 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:53.590755 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/91c62464-c965-4578-b487-d17860e017d3-console-config\") pod \"console-5c798648d8-qlxs5\" (UID: \"91c62464-c965-4578-b487-d17860e017d3\") " pod="openshift-console/console-5c798648d8-qlxs5"
Apr 20 07:03:53.590815 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:53.590755 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/91c62464-c965-4578-b487-d17860e017d3-oauth-serving-cert\") pod \"console-5c798648d8-qlxs5\" (UID: \"91c62464-c965-4578-b487-d17860e017d3\") " pod="openshift-console/console-5c798648d8-qlxs5"
Apr 20 07:03:53.591018 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:53.590829 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91c62464-c965-4578-b487-d17860e017d3-trusted-ca-bundle\") pod \"console-5c798648d8-qlxs5\" (UID: \"91c62464-c965-4578-b487-d17860e017d3\") " pod="openshift-console/console-5c798648d8-qlxs5"
Apr 20 07:03:53.593251 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:53.593226 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/91c62464-c965-4578-b487-d17860e017d3-console-oauth-config\") pod \"console-5c798648d8-qlxs5\" (UID: \"91c62464-c965-4578-b487-d17860e017d3\") " pod="openshift-console/console-5c798648d8-qlxs5"
Apr 20 07:03:53.595057 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:53.595036 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/91c62464-c965-4578-b487-d17860e017d3-console-serving-cert\") pod \"console-5c798648d8-qlxs5\" (UID: \"91c62464-c965-4578-b487-d17860e017d3\") " pod="openshift-console/console-5c798648d8-qlxs5"
Apr 20 07:03:53.610720 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:53.610692 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9szc\" (UniqueName: \"kubernetes.io/projected/91c62464-c965-4578-b487-d17860e017d3-kube-api-access-q9szc\") pod \"console-5c798648d8-qlxs5\" (UID: \"91c62464-c965-4578-b487-d17860e017d3\") " pod="openshift-console/console-5c798648d8-qlxs5"
Apr 20 07:03:53.649659 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:53.649628 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c798648d8-qlxs5"
Apr 20 07:03:54.516700 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:54.516672 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c798648d8-qlxs5"]
Apr 20 07:03:54.595069 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:03:54.595032 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91c62464_c965_4578_b487_d17860e017d3.slice/crio-f6a7b1386b3a5f3cdc3a613b39c85060b4c44a6f29366ee36877f416ae861938 WatchSource:0}: Error finding container f6a7b1386b3a5f3cdc3a613b39c85060b4c44a6f29366ee36877f416ae861938: Status 404 returned error can't find the container with id f6a7b1386b3a5f3cdc3a613b39c85060b4c44a6f29366ee36877f416ae861938
Apr 20 07:03:54.880189 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:54.880154 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x528h" event={"ID":"d38a5ffa-c2cb-44b1-823f-627443e996e0","Type":"ContainerStarted","Data":"8e0a6f32c60740c1996d53a50fbe3c2d7f2719fc708a67a859202aae76c85411"}
Apr 20 07:03:54.880570 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:54.880197 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x528h" event={"ID":"d38a5ffa-c2cb-44b1-823f-627443e996e0","Type":"ContainerStarted","Data":"68b1999795f9176ac1385493898f35677b396a4cc3b144ff6747d283e061a6ec"}
Apr 20 07:03:54.880570 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:54.880299 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-x528h"
Apr 20 07:03:54.881795 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:54.881770 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fr8mx" event={"ID":"e21e2393-ce5e-4856-84ba-7b921e943d5f","Type":"ContainerStarted","Data":"c0e65fe7b11e0d0e2dba19f291faf810941b17ada150a8de545e60c6621625e4"}
Apr 20 07:03:54.883012 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:54.882990 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gtfrg" event={"ID":"810eac21-fbc0-4eb1-b7bc-e6cb6a6b694c","Type":"ContainerStarted","Data":"eef24ac0dacfff9340ae29f53f1afe8d58586e88a2f3054e17e31114e2c43196"}
Apr 20 07:03:54.883993 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:54.883974 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c798648d8-qlxs5" event={"ID":"91c62464-c965-4578-b487-d17860e017d3","Type":"ContainerStarted","Data":"f6a7b1386b3a5f3cdc3a613b39c85060b4c44a6f29366ee36877f416ae861938"}
Apr 20 07:03:54.943955 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:54.943870 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-x528h" podStartSLOduration=1.324514314 podStartE2EDuration="3.943850586s" podCreationTimestamp="2026-04-20 07:03:51 +0000 UTC" firstStartedPulling="2026-04-20 07:03:51.721508297 +0000 UTC m=+54.596233136" lastFinishedPulling="2026-04-20 07:03:54.340844555 +0000 UTC m=+57.215569408" observedRunningTime="2026-04-20 07:03:54.943502015 +0000 UTC m=+57.818226877" watchObservedRunningTime="2026-04-20 07:03:54.943850586 +0000 UTC m=+57.818575447"
Apr 20 07:03:54.982019 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:54.981974 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-gtfrg" podStartSLOduration=1.6909974559999998 podStartE2EDuration="3.981958636s" podCreationTimestamp="2026-04-20 07:03:51 +0000 UTC" firstStartedPulling="2026-04-20 07:03:52.049884741 +0000 UTC m=+54.924609582" lastFinishedPulling="2026-04-20 07:03:54.340845908 +0000 UTC m=+57.215570762" observedRunningTime="2026-04-20 07:03:54.981658105 +0000 UTC m=+57.856382967" watchObservedRunningTime="2026-04-20 07:03:54.981958636 +0000 UTC m=+57.856683527"
Apr 20 07:03:56.264308 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:56.264272 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-rjzn8"]
Apr 20 07:03:56.290385 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:56.290359 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-rjzn8"
Apr 20 07:03:56.306008 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:56.305985 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 20 07:03:56.307233 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:56.307214 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 20 07:03:56.307632 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:56.307611 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5ea4232c-c32a-4b9d-9d64-bcbd9a892ef5-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-rjzn8\" (UID: \"5ea4232c-c32a-4b9d-9d64-bcbd9a892ef5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rjzn8"
Apr 20 07:03:56.307707 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:56.307645 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5ea4232c-c32a-4b9d-9d64-bcbd9a892ef5-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-rjzn8\" (UID: \"5ea4232c-c32a-4b9d-9d64-bcbd9a892ef5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rjzn8"
Apr 20 07:03:56.307707 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:56.307665 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/5ea4232c-c32a-4b9d-9d64-bcbd9a892ef5-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-rjzn8\" (UID: \"5ea4232c-c32a-4b9d-9d64-bcbd9a892ef5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rjzn8"
Apr 20 07:03:56.307806 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:56.307760 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwc7r\" (UniqueName: \"kubernetes.io/projected/5ea4232c-c32a-4b9d-9d64-bcbd9a892ef5-kube-api-access-dwc7r\") pod \"prometheus-operator-5676c8c784-rjzn8\" (UID: \"5ea4232c-c32a-4b9d-9d64-bcbd9a892ef5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rjzn8"
Apr 20 07:03:56.313989 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:56.313970 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 20 07:03:56.314074 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:56.314004 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 20 07:03:56.314142 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:56.314114 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 20 07:03:56.322382 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:56.322359 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-h9ddr\""
Apr 20 07:03:56.323742 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:56.323722 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-rjzn8"]
Apr 20 07:03:56.408998 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:56.408961 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5ea4232c-c32a-4b9d-9d64-bcbd9a892ef5-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-rjzn8\" (UID: \"5ea4232c-c32a-4b9d-9d64-bcbd9a892ef5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rjzn8"
Apr 20 07:03:56.409161 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:56.409005 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5ea4232c-c32a-4b9d-9d64-bcbd9a892ef5-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-rjzn8\" (UID: \"5ea4232c-c32a-4b9d-9d64-bcbd9a892ef5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rjzn8"
Apr 20 07:03:56.409161 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:56.409038 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/5ea4232c-c32a-4b9d-9d64-bcbd9a892ef5-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-rjzn8\" (UID: \"5ea4232c-c32a-4b9d-9d64-bcbd9a892ef5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rjzn8"
Apr 20 07:03:56.409161 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:56.409086 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dwc7r\" (UniqueName: \"kubernetes.io/projected/5ea4232c-c32a-4b9d-9d64-bcbd9a892ef5-kube-api-access-dwc7r\") pod \"prometheus-operator-5676c8c784-rjzn8\" (UID: \"5ea4232c-c32a-4b9d-9d64-bcbd9a892ef5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rjzn8"
Apr 20 07:03:56.410018 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:56.409989 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5ea4232c-c32a-4b9d-9d64-bcbd9a892ef5-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-rjzn8\" (UID: \"5ea4232c-c32a-4b9d-9d64-bcbd9a892ef5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rjzn8"
Apr 20 07:03:56.411739 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:56.411716 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5ea4232c-c32a-4b9d-9d64-bcbd9a892ef5-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-rjzn8\" (UID: \"5ea4232c-c32a-4b9d-9d64-bcbd9a892ef5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rjzn8"
Apr 20 07:03:56.411989 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:56.411964 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/5ea4232c-c32a-4b9d-9d64-bcbd9a892ef5-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-rjzn8\" (UID: \"5ea4232c-c32a-4b9d-9d64-bcbd9a892ef5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rjzn8"
Apr 20 07:03:56.434744 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:56.434719 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwc7r\" (UniqueName: \"kubernetes.io/projected/5ea4232c-c32a-4b9d-9d64-bcbd9a892ef5-kube-api-access-dwc7r\") pod \"prometheus-operator-5676c8c784-rjzn8\" (UID: \"5ea4232c-c32a-4b9d-9d64-bcbd9a892ef5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rjzn8"
Apr 20 07:03:56.599431 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:56.599395 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-rjzn8"
Apr 20 07:03:56.822515 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:56.822484 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qvqrn"
Apr 20 07:03:57.574380 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:57.574341 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-rjzn8"]
Apr 20 07:03:57.577265 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:03:57.577235 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ea4232c_c32a_4b9d_9d64_bcbd9a892ef5.slice/crio-0d721efbc17eabbb74ff433299f334c083358616884703c87b3591c24cdbe68f WatchSource:0}: Error finding container 0d721efbc17eabbb74ff433299f334c083358616884703c87b3591c24cdbe68f: Status 404 returned error can't find the container with id 0d721efbc17eabbb74ff433299f334c083358616884703c87b3591c24cdbe68f
Apr 20 07:03:57.893982 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:57.893943 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fr8mx" event={"ID":"e21e2393-ce5e-4856-84ba-7b921e943d5f","Type":"ContainerStarted","Data":"5b054df24a04049a2aa0ea305db86ac57f53ff1b5dc76b3575d253d0551224cf"}
Apr 20 07:03:57.895063 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:57.895036 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-rjzn8" event={"ID":"5ea4232c-c32a-4b9d-9d64-bcbd9a892ef5","Type":"ContainerStarted","Data":"0d721efbc17eabbb74ff433299f334c083358616884703c87b3591c24cdbe68f"}
Apr 20 07:03:57.896292 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:57.896270 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c798648d8-qlxs5" event={"ID":"91c62464-c965-4578-b487-d17860e017d3","Type":"ContainerStarted","Data":"fe97ce0051b8fd2a814688ada71ab3d9cb836d50149b7043a285ff8d1ee51df0"}
Apr 20 07:03:57.941419 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:57.941369 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-fr8mx" podStartSLOduration=1.8757253980000002 podStartE2EDuration="6.941355525s" podCreationTimestamp="2026-04-20 07:03:51 +0000 UTC" firstStartedPulling="2026-04-20 07:03:52.363204202 +0000 UTC m=+55.237929045" lastFinishedPulling="2026-04-20 07:03:57.42883432 +0000 UTC m=+60.303559172" observedRunningTime="2026-04-20 07:03:57.938432548 +0000 UTC m=+60.813157409" watchObservedRunningTime="2026-04-20 07:03:57.941355525 +0000 UTC m=+60.816080386"
Apr 20 07:03:57.976746 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:57.976703 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5c798648d8-qlxs5" podStartSLOduration=2.146867021 podStartE2EDuration="4.976688517s" podCreationTimestamp="2026-04-20 07:03:53 +0000 UTC" firstStartedPulling="2026-04-20 07:03:54.603307687 +0000 UTC m=+57.478032527" lastFinishedPulling="2026-04-20 07:03:57.433129182 +0000 UTC m=+60.307854023" observedRunningTime="2026-04-20 07:03:57.976157287 +0000 UTC m=+60.850882150" watchObservedRunningTime="2026-04-20 07:03:57.976688517 +0000 UTC m=+60.851413380"
Apr 20 07:03:59.903466 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:59.903424 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-rjzn8" event={"ID":"5ea4232c-c32a-4b9d-9d64-bcbd9a892ef5","Type":"ContainerStarted","Data":"6b4eff98d8c10096efc3b0a4f642cc45df5968322352ae1c9c9845d538df981b"}
Apr 20 07:03:59.903466 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:03:59.903466 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-rjzn8" event={"ID":"5ea4232c-c32a-4b9d-9d64-bcbd9a892ef5","Type":"ContainerStarted","Data":"b27b1dd1678e4ed00b50d18726e59dabca492696c584623b5eb8bd5ee78678ac"}
Apr 20 07:04:02.354220 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.354184 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3bdbb47e-e79d-4aaa-9671-3899c229b1a2-metrics-certs\") pod \"network-metrics-daemon-xw95j\" (UID: \"3bdbb47e-e79d-4aaa-9671-3899c229b1a2\") " pod="openshift-multus/network-metrics-daemon-xw95j"
Apr 20 07:04:02.370286 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.370257 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 20 07:04:02.377137 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.377116 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3bdbb47e-e79d-4aaa-9671-3899c229b1a2-metrics-certs\") pod \"network-metrics-daemon-xw95j\" (UID: \"3bdbb47e-e79d-4aaa-9671-3899c229b1a2\") " pod="openshift-multus/network-metrics-daemon-xw95j"
Apr 20 07:04:02.454670 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.454638 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kffvv\" (UniqueName: \"kubernetes.io/projected/fb646649-ea68-4bb4-87df-a2eed82cc86c-kube-api-access-kffvv\") pod \"network-check-target-sq8g5\" (UID: \"fb646649-ea68-4bb4-87df-a2eed82cc86c\") " pod="openshift-network-diagnostics/network-check-target-sq8g5"
Apr 20 07:04:02.456537 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.456518 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-mf5sv\""
Apr 20 07:04:02.459407 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.459391 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xw95j"
Apr 20 07:04:02.459499 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.459466 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 20 07:04:02.470800 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.470770 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 20 07:04:02.478168 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.478149 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kffvv\" (UniqueName: \"kubernetes.io/projected/fb646649-ea68-4bb4-87df-a2eed82cc86c-kube-api-access-kffvv\") pod \"network-check-target-sq8g5\" (UID: \"fb646649-ea68-4bb4-87df-a2eed82cc86c\") " pod="openshift-network-diagnostics/network-check-target-sq8g5"
Apr 20 07:04:02.607054 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.606964 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-rjzn8" podStartSLOduration=5.134504928 podStartE2EDuration="6.606947397s" podCreationTimestamp="2026-04-20 07:03:56 +0000 UTC" firstStartedPulling="2026-04-20 07:03:57.579173948 +0000 UTC m=+60.453898787" lastFinishedPulling="2026-04-20 07:03:59.0516164 +0000 UTC m=+61.926341256" observedRunningTime="2026-04-20 07:03:59.982788064 +0000 UTC m=+62.857512926" watchObservedRunningTime="2026-04-20 07:04:02.606947397 +0000 UTC m=+65.481672260"
Apr 20 07:04:02.607621 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.607604 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-pb6zc"]
Apr 20 07:04:02.612827 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.612811 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pb6zc"
Apr 20 07:04:02.621785 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.621761 2577 status_manager.go:895] "Failed to get status for pod" podUID="ae37004f-672e-407b-a0d3-69378d08f058" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pb6zc" err="pods \"openshift-state-metrics-9d44df66c-pb6zc\" is forbidden: User \"system:node:ip-10-0-138-178.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-monitoring\": no relationship found between node 'ip-10-0-138-178.ec2.internal' and this object"
Apr 20 07:04:02.629295 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.629279 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 20 07:04:02.629569 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.629555 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 20 07:04:02.632344 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.632310 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-ftlm4\""
Apr 20 07:04:02.643575 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.643550 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-whfbx"]
Apr 20 07:04:02.647076 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.647055 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xw95j"]
Apr 20 07:04:02.647199 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.647186 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-whfbx"
Apr 20 07:04:02.653228 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.653207 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 20 07:04:02.653463 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.653439 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 20 07:04:02.653764 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.653736 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 20 07:04:02.654148 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:04:02.654129 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bdbb47e_e79d_4aaa_9671_3899c229b1a2.slice/crio-abc09c633f4beabd6ba47c8531850a21cd4ff897c5fb778e652f899117819953 WatchSource:0}: Error finding container abc09c633f4beabd6ba47c8531850a21cd4ff897c5fb778e652f899117819953: Status 404 returned error can't find the container with id abc09c633f4beabd6ba47c8531850a21cd4ff897c5fb778e652f899117819953
Apr 20 07:04:02.655475 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.655457 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ae37004f-672e-407b-a0d3-69378d08f058-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-pb6zc\" (UID: \"ae37004f-672e-407b-a0d3-69378d08f058\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pb6zc"
Apr 20 07:04:02.655551 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.655485 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ae37004f-672e-407b-a0d3-69378d08f058-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-pb6zc\" (UID: \"ae37004f-672e-407b-a0d3-69378d08f058\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pb6zc"
Apr 20 07:04:02.655551 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.655523 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ae37004f-672e-407b-a0d3-69378d08f058-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-pb6zc\" (UID: \"ae37004f-672e-407b-a0d3-69378d08f058\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pb6zc"
Apr 20 07:04:02.655664 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.655556 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4qm2\" (UniqueName: \"kubernetes.io/projected/ae37004f-672e-407b-a0d3-69378d08f058-kube-api-access-v4qm2\") pod \"openshift-state-metrics-9d44df66c-pb6zc\" (UID: \"ae37004f-672e-407b-a0d3-69378d08f058\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pb6zc"
Apr 20 07:04:02.667435 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.667416 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-qz8vx\""
Apr 20 07:04:02.691196 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.691170 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-pb6zc"]
Apr 20 07:04:02.734543 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.734489 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-whfbx"]
Apr 20 07:04:02.756375 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.756344 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ae37004f-672e-407b-a0d3-69378d08f058-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-pb6zc\" (UID: \"ae37004f-672e-407b-a0d3-69378d08f058\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pb6zc"
Apr 20 07:04:02.756533 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.756384 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ae37004f-672e-407b-a0d3-69378d08f058-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-pb6zc\" (UID: \"ae37004f-672e-407b-a0d3-69378d08f058\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pb6zc"
Apr 20 07:04:02.756533 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.756409 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/45e5b9d7-edf1-4444-bb94-1c69ca165f1a-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-whfbx\" (UID: \"45e5b9d7-edf1-4444-bb94-1c69ca165f1a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-whfbx"
Apr 20 07:04:02.756533 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.756433 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/45e5b9d7-edf1-4444-bb94-1c69ca165f1a-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-whfbx\" (UID: \"45e5b9d7-edf1-4444-bb94-1c69ca165f1a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-whfbx"
Apr 20 07:04:02.756533 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.756484 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName:
\"kubernetes.io/secret/ae37004f-672e-407b-a0d3-69378d08f058-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-pb6zc\" (UID: \"ae37004f-672e-407b-a0d3-69378d08f058\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pb6zc" Apr 20 07:04:02.756533 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.756513 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqkws\" (UniqueName: \"kubernetes.io/projected/45e5b9d7-edf1-4444-bb94-1c69ca165f1a-kube-api-access-kqkws\") pod \"kube-state-metrics-69db897b98-whfbx\" (UID: \"45e5b9d7-edf1-4444-bb94-1c69ca165f1a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-whfbx" Apr 20 07:04:02.756732 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.756536 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/45e5b9d7-edf1-4444-bb94-1c69ca165f1a-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-whfbx\" (UID: \"45e5b9d7-edf1-4444-bb94-1c69ca165f1a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-whfbx" Apr 20 07:04:02.756732 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.756576 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/45e5b9d7-edf1-4444-bb94-1c69ca165f1a-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-whfbx\" (UID: \"45e5b9d7-edf1-4444-bb94-1c69ca165f1a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-whfbx" Apr 20 07:04:02.756732 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.756606 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/45e5b9d7-edf1-4444-bb94-1c69ca165f1a-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-whfbx\" (UID: \"45e5b9d7-edf1-4444-bb94-1c69ca165f1a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-whfbx" Apr 20 07:04:02.756732 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.756653 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v4qm2\" (UniqueName: \"kubernetes.io/projected/ae37004f-672e-407b-a0d3-69378d08f058-kube-api-access-v4qm2\") pod \"openshift-state-metrics-9d44df66c-pb6zc\" (UID: \"ae37004f-672e-407b-a0d3-69378d08f058\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pb6zc" Apr 20 07:04:02.757216 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.757105 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ae37004f-672e-407b-a0d3-69378d08f058-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-pb6zc\" (UID: \"ae37004f-672e-407b-a0d3-69378d08f058\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pb6zc" Apr 20 07:04:02.759123 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.759101 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/ae37004f-672e-407b-a0d3-69378d08f058-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-pb6zc\" (UID: \"ae37004f-672e-407b-a0d3-69378d08f058\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pb6zc" Apr 20 07:04:02.759252 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.759234 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ae37004f-672e-407b-a0d3-69378d08f058-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-pb6zc\" (UID: 
\"ae37004f-672e-407b-a0d3-69378d08f058\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pb6zc" Apr 20 07:04:02.774513 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.774485 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-g4l9z\"" Apr 20 07:04:02.775614 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.775596 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sq8g5" Apr 20 07:04:02.780010 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.779990 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4qm2\" (UniqueName: \"kubernetes.io/projected/ae37004f-672e-407b-a0d3-69378d08f058-kube-api-access-v4qm2\") pod \"openshift-state-metrics-9d44df66c-pb6zc\" (UID: \"ae37004f-672e-407b-a0d3-69378d08f058\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pb6zc" Apr 20 07:04:02.788159 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.788139 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-sktgd"] Apr 20 07:04:02.792305 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.792291 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-sktgd" Apr 20 07:04:02.795778 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.795604 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 07:04:02.799650 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.799626 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-f8v22\"" Apr 20 07:04:02.799753 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.799690 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 07:04:02.802450 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.802431 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 07:04:02.857932 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.857855 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a7829208-2d3d-44ab-b7c6-c79c619a90d2-node-exporter-wtmp\") pod \"node-exporter-sktgd\" (UID: \"a7829208-2d3d-44ab-b7c6-c79c619a90d2\") " pod="openshift-monitoring/node-exporter-sktgd" Apr 20 07:04:02.857932 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.857914 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a7829208-2d3d-44ab-b7c6-c79c619a90d2-sys\") pod \"node-exporter-sktgd\" (UID: \"a7829208-2d3d-44ab-b7c6-c79c619a90d2\") " pod="openshift-monitoring/node-exporter-sktgd" Apr 20 07:04:02.858124 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.857955 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" 
(UniqueName: \"kubernetes.io/secret/45e5b9d7-edf1-4444-bb94-1c69ca165f1a-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-whfbx\" (UID: \"45e5b9d7-edf1-4444-bb94-1c69ca165f1a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-whfbx" Apr 20 07:04:02.858124 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.857986 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxvtc\" (UniqueName: \"kubernetes.io/projected/a7829208-2d3d-44ab-b7c6-c79c619a90d2-kube-api-access-pxvtc\") pod \"node-exporter-sktgd\" (UID: \"a7829208-2d3d-44ab-b7c6-c79c619a90d2\") " pod="openshift-monitoring/node-exporter-sktgd" Apr 20 07:04:02.858124 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.858032 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a7829208-2d3d-44ab-b7c6-c79c619a90d2-root\") pod \"node-exporter-sktgd\" (UID: \"a7829208-2d3d-44ab-b7c6-c79c619a90d2\") " pod="openshift-monitoring/node-exporter-sktgd" Apr 20 07:04:02.858124 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.858063 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a7829208-2d3d-44ab-b7c6-c79c619a90d2-node-exporter-tls\") pod \"node-exporter-sktgd\" (UID: \"a7829208-2d3d-44ab-b7c6-c79c619a90d2\") " pod="openshift-monitoring/node-exporter-sktgd" Apr 20 07:04:02.858124 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.858090 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a7829208-2d3d-44ab-b7c6-c79c619a90d2-metrics-client-ca\") pod \"node-exporter-sktgd\" (UID: \"a7829208-2d3d-44ab-b7c6-c79c619a90d2\") " pod="openshift-monitoring/node-exporter-sktgd" Apr 20 07:04:02.858533 ip-10-0-138-178 
kubenswrapper[2577]: I0420 07:04:02.858152 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a7829208-2d3d-44ab-b7c6-c79c619a90d2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-sktgd\" (UID: \"a7829208-2d3d-44ab-b7c6-c79c619a90d2\") " pod="openshift-monitoring/node-exporter-sktgd" Apr 20 07:04:02.858533 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.858187 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/45e5b9d7-edf1-4444-bb94-1c69ca165f1a-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-whfbx\" (UID: \"45e5b9d7-edf1-4444-bb94-1c69ca165f1a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-whfbx" Apr 20 07:04:02.858533 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.858218 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/45e5b9d7-edf1-4444-bb94-1c69ca165f1a-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-whfbx\" (UID: \"45e5b9d7-edf1-4444-bb94-1c69ca165f1a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-whfbx" Apr 20 07:04:02.858533 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.858249 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kqkws\" (UniqueName: \"kubernetes.io/projected/45e5b9d7-edf1-4444-bb94-1c69ca165f1a-kube-api-access-kqkws\") pod \"kube-state-metrics-69db897b98-whfbx\" (UID: \"45e5b9d7-edf1-4444-bb94-1c69ca165f1a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-whfbx" Apr 20 07:04:02.858533 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.858272 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: 
\"kubernetes.io/empty-dir/45e5b9d7-edf1-4444-bb94-1c69ca165f1a-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-whfbx\" (UID: \"45e5b9d7-edf1-4444-bb94-1c69ca165f1a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-whfbx" Apr 20 07:04:02.858533 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.858301 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a7829208-2d3d-44ab-b7c6-c79c619a90d2-node-exporter-accelerators-collector-config\") pod \"node-exporter-sktgd\" (UID: \"a7829208-2d3d-44ab-b7c6-c79c619a90d2\") " pod="openshift-monitoring/node-exporter-sktgd" Apr 20 07:04:02.858533 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.858361 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a7829208-2d3d-44ab-b7c6-c79c619a90d2-node-exporter-textfile\") pod \"node-exporter-sktgd\" (UID: \"a7829208-2d3d-44ab-b7c6-c79c619a90d2\") " pod="openshift-monitoring/node-exporter-sktgd" Apr 20 07:04:02.858533 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.858411 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/45e5b9d7-edf1-4444-bb94-1c69ca165f1a-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-whfbx\" (UID: \"45e5b9d7-edf1-4444-bb94-1c69ca165f1a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-whfbx" Apr 20 07:04:02.859094 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.859005 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/45e5b9d7-edf1-4444-bb94-1c69ca165f1a-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-whfbx\" 
(UID: \"45e5b9d7-edf1-4444-bb94-1c69ca165f1a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-whfbx" Apr 20 07:04:02.859094 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.859013 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/45e5b9d7-edf1-4444-bb94-1c69ca165f1a-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-whfbx\" (UID: \"45e5b9d7-edf1-4444-bb94-1c69ca165f1a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-whfbx" Apr 20 07:04:02.860033 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.860001 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/45e5b9d7-edf1-4444-bb94-1c69ca165f1a-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-whfbx\" (UID: \"45e5b9d7-edf1-4444-bb94-1c69ca165f1a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-whfbx" Apr 20 07:04:02.860777 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.860755 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/45e5b9d7-edf1-4444-bb94-1c69ca165f1a-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-whfbx\" (UID: \"45e5b9d7-edf1-4444-bb94-1c69ca165f1a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-whfbx" Apr 20 07:04:02.861150 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.861132 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/45e5b9d7-edf1-4444-bb94-1c69ca165f1a-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-whfbx\" (UID: \"45e5b9d7-edf1-4444-bb94-1c69ca165f1a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-whfbx" Apr 20 07:04:02.912148 ip-10-0-138-178 
kubenswrapper[2577]: I0420 07:04:02.912112 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xw95j" event={"ID":"3bdbb47e-e79d-4aaa-9671-3899c229b1a2","Type":"ContainerStarted","Data":"abc09c633f4beabd6ba47c8531850a21cd4ff897c5fb778e652f899117819953"} Apr 20 07:04:02.914724 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.914695 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqkws\" (UniqueName: \"kubernetes.io/projected/45e5b9d7-edf1-4444-bb94-1c69ca165f1a-kube-api-access-kqkws\") pod \"kube-state-metrics-69db897b98-whfbx\" (UID: \"45e5b9d7-edf1-4444-bb94-1c69ca165f1a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-whfbx" Apr 20 07:04:02.921481 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.921463 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pb6zc" Apr 20 07:04:02.959066 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.959029 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a7829208-2d3d-44ab-b7c6-c79c619a90d2-node-exporter-wtmp\") pod \"node-exporter-sktgd\" (UID: \"a7829208-2d3d-44ab-b7c6-c79c619a90d2\") " pod="openshift-monitoring/node-exporter-sktgd" Apr 20 07:04:02.959066 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.959067 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a7829208-2d3d-44ab-b7c6-c79c619a90d2-sys\") pod \"node-exporter-sktgd\" (UID: \"a7829208-2d3d-44ab-b7c6-c79c619a90d2\") " pod="openshift-monitoring/node-exporter-sktgd" Apr 20 07:04:02.959285 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.959099 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxvtc\" (UniqueName: 
\"kubernetes.io/projected/a7829208-2d3d-44ab-b7c6-c79c619a90d2-kube-api-access-pxvtc\") pod \"node-exporter-sktgd\" (UID: \"a7829208-2d3d-44ab-b7c6-c79c619a90d2\") " pod="openshift-monitoring/node-exporter-sktgd" Apr 20 07:04:02.959285 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.959141 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a7829208-2d3d-44ab-b7c6-c79c619a90d2-root\") pod \"node-exporter-sktgd\" (UID: \"a7829208-2d3d-44ab-b7c6-c79c619a90d2\") " pod="openshift-monitoring/node-exporter-sktgd" Apr 20 07:04:02.959285 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.959156 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a7829208-2d3d-44ab-b7c6-c79c619a90d2-sys\") pod \"node-exporter-sktgd\" (UID: \"a7829208-2d3d-44ab-b7c6-c79c619a90d2\") " pod="openshift-monitoring/node-exporter-sktgd" Apr 20 07:04:02.959285 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.959165 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a7829208-2d3d-44ab-b7c6-c79c619a90d2-node-exporter-tls\") pod \"node-exporter-sktgd\" (UID: \"a7829208-2d3d-44ab-b7c6-c79c619a90d2\") " pod="openshift-monitoring/node-exporter-sktgd" Apr 20 07:04:02.959285 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.959216 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a7829208-2d3d-44ab-b7c6-c79c619a90d2-node-exporter-wtmp\") pod \"node-exporter-sktgd\" (UID: \"a7829208-2d3d-44ab-b7c6-c79c619a90d2\") " pod="openshift-monitoring/node-exporter-sktgd" Apr 20 07:04:02.959285 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.959224 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/a7829208-2d3d-44ab-b7c6-c79c619a90d2-metrics-client-ca\") pod \"node-exporter-sktgd\" (UID: \"a7829208-2d3d-44ab-b7c6-c79c619a90d2\") " pod="openshift-monitoring/node-exporter-sktgd" Apr 20 07:04:02.959285 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.959233 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a7829208-2d3d-44ab-b7c6-c79c619a90d2-root\") pod \"node-exporter-sktgd\" (UID: \"a7829208-2d3d-44ab-b7c6-c79c619a90d2\") " pod="openshift-monitoring/node-exporter-sktgd" Apr 20 07:04:02.959285 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.959262 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a7829208-2d3d-44ab-b7c6-c79c619a90d2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-sktgd\" (UID: \"a7829208-2d3d-44ab-b7c6-c79c619a90d2\") " pod="openshift-monitoring/node-exporter-sktgd" Apr 20 07:04:02.959698 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.959309 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a7829208-2d3d-44ab-b7c6-c79c619a90d2-node-exporter-accelerators-collector-config\") pod \"node-exporter-sktgd\" (UID: \"a7829208-2d3d-44ab-b7c6-c79c619a90d2\") " pod="openshift-monitoring/node-exporter-sktgd" Apr 20 07:04:02.959698 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.959371 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a7829208-2d3d-44ab-b7c6-c79c619a90d2-node-exporter-textfile\") pod \"node-exporter-sktgd\" (UID: \"a7829208-2d3d-44ab-b7c6-c79c619a90d2\") " pod="openshift-monitoring/node-exporter-sktgd" Apr 20 07:04:02.960361 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.960039 2577 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-whfbx" Apr 20 07:04:02.960361 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.960133 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a7829208-2d3d-44ab-b7c6-c79c619a90d2-node-exporter-accelerators-collector-config\") pod \"node-exporter-sktgd\" (UID: \"a7829208-2d3d-44ab-b7c6-c79c619a90d2\") " pod="openshift-monitoring/node-exporter-sktgd" Apr 20 07:04:02.960361 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.960149 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a7829208-2d3d-44ab-b7c6-c79c619a90d2-node-exporter-textfile\") pod \"node-exporter-sktgd\" (UID: \"a7829208-2d3d-44ab-b7c6-c79c619a90d2\") " pod="openshift-monitoring/node-exporter-sktgd" Apr 20 07:04:02.960604 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.960400 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a7829208-2d3d-44ab-b7c6-c79c619a90d2-metrics-client-ca\") pod \"node-exporter-sktgd\" (UID: \"a7829208-2d3d-44ab-b7c6-c79c619a90d2\") " pod="openshift-monitoring/node-exporter-sktgd" Apr 20 07:04:02.961749 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.961727 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a7829208-2d3d-44ab-b7c6-c79c619a90d2-node-exporter-tls\") pod \"node-exporter-sktgd\" (UID: \"a7829208-2d3d-44ab-b7c6-c79c619a90d2\") " pod="openshift-monitoring/node-exporter-sktgd" Apr 20 07:04:02.961806 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:02.961790 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/a7829208-2d3d-44ab-b7c6-c79c619a90d2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-sktgd\" (UID: \"a7829208-2d3d-44ab-b7c6-c79c619a90d2\") " pod="openshift-monitoring/node-exporter-sktgd" Apr 20 07:04:03.010938 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.010864 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxvtc\" (UniqueName: \"kubernetes.io/projected/a7829208-2d3d-44ab-b7c6-c79c619a90d2-kube-api-access-pxvtc\") pod \"node-exporter-sktgd\" (UID: \"a7829208-2d3d-44ab-b7c6-c79c619a90d2\") " pod="openshift-monitoring/node-exporter-sktgd" Apr 20 07:04:03.101782 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.101748 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-sktgd" Apr 20 07:04:03.108602 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.108547 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-sq8g5"] Apr 20 07:04:03.109617 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:04:03.109596 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb646649_ea68_4bb4_87df_a2eed82cc86c.slice/crio-d7af811f8cd5057899a4d0303e5b05a6250bc9b06b64992a14ea37742f4d060c WatchSource:0}: Error finding container d7af811f8cd5057899a4d0303e5b05a6250bc9b06b64992a14ea37742f4d060c: Status 404 returned error can't find the container with id d7af811f8cd5057899a4d0303e5b05a6250bc9b06b64992a14ea37742f4d060c Apr 20 07:04:03.110565 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:04:03.110550 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7829208_2d3d_44ab_b7c6_c79c619a90d2.slice/crio-2a5abcd6169f7bb92b1f7cbf2dde962d10afa75f20d77b6c41e0de4b758ce187 WatchSource:0}: Error finding container 
2a5abcd6169f7bb92b1f7cbf2dde962d10afa75f20d77b6c41e0de4b758ce187: Status 404 returned error can't find the container with id 2a5abcd6169f7bb92b1f7cbf2dde962d10afa75f20d77b6c41e0de4b758ce187 Apr 20 07:04:03.152377 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.151436 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-pb6zc"] Apr 20 07:04:03.155508 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:04:03.155481 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae37004f_672e_407b_a0d3_69378d08f058.slice/crio-add35daf1c5459976b058791750228def1773490a0fc317223059b9a9911f521 WatchSource:0}: Error finding container add35daf1c5459976b058791750228def1773490a0fc317223059b9a9911f521: Status 404 returned error can't find the container with id add35daf1c5459976b058791750228def1773490a0fc317223059b9a9911f521 Apr 20 07:04:03.178906 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.178880 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-whfbx"] Apr 20 07:04:03.183908 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:04:03.183880 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45e5b9d7_edf1_4444_bb94_1c69ca165f1a.slice/crio-20736ae37e6148071ec4d2d31d74fd8930992cdb28875a616173d8069986d1fd WatchSource:0}: Error finding container 20736ae37e6148071ec4d2d31d74fd8930992cdb28875a616173d8069986d1fd: Status 404 returned error can't find the container with id 20736ae37e6148071ec4d2d31d74fd8930992cdb28875a616173d8069986d1fd Apr 20 07:04:03.650622 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.650587 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5c798648d8-qlxs5" Apr 20 07:04:03.651072 ip-10-0-138-178 kubenswrapper[2577]: I0420 
07:04:03.650652 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5c798648d8-qlxs5" Apr 20 07:04:03.656627 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.656599 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5c798648d8-qlxs5" Apr 20 07:04:03.859446 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.859310 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 07:04:03.866214 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.865644 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:03.886257 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.886051 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 20 07:04:03.888047 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.887122 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 20 07:04:03.888047 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.887381 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 20 07:04:03.888047 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.887623 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 20 07:04:03.888047 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.887628 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 20 07:04:03.888047 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.887825 2577 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 20 07:04:03.888047 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.887899 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 20 07:04:03.893803 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.893778 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 20 07:04:03.897840 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.897815 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-m6896\"" Apr 20 07:04:03.906847 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.906610 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 20 07:04:03.928773 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.928710 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xw95j" event={"ID":"3bdbb47e-e79d-4aaa-9671-3899c229b1a2","Type":"ContainerStarted","Data":"6c5a88f905178cf1e48689670e1239db361257f2d248b4adf4edb1ee2479c089"} Apr 20 07:04:03.932563 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.932533 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pb6zc" event={"ID":"ae37004f-672e-407b-a0d3-69378d08f058","Type":"ContainerStarted","Data":"45bd707fbddc501be59fca6fa182ad1da70145373cc4286ddfe16a80af580ed6"} Apr 20 07:04:03.932682 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.932568 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pb6zc" 
event={"ID":"ae37004f-672e-407b-a0d3-69378d08f058","Type":"ContainerStarted","Data":"de48c485ef566a02759b329e0c116df1873aac55e803f2dfe0d67a8a0862ac2c"} Apr 20 07:04:03.932682 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.932585 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pb6zc" event={"ID":"ae37004f-672e-407b-a0d3-69378d08f058","Type":"ContainerStarted","Data":"add35daf1c5459976b058791750228def1773490a0fc317223059b9a9911f521"} Apr 20 07:04:03.935300 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.935274 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-sktgd" event={"ID":"a7829208-2d3d-44ab-b7c6-c79c619a90d2","Type":"ContainerStarted","Data":"2a5abcd6169f7bb92b1f7cbf2dde962d10afa75f20d77b6c41e0de4b758ce187"} Apr 20 07:04:03.937250 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.937220 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-whfbx" event={"ID":"45e5b9d7-edf1-4444-bb94-1c69ca165f1a","Type":"ContainerStarted","Data":"20736ae37e6148071ec4d2d31d74fd8930992cdb28875a616173d8069986d1fd"} Apr 20 07:04:03.940522 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.940491 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-sq8g5" event={"ID":"fb646649-ea68-4bb4-87df-a2eed82cc86c","Type":"ContainerStarted","Data":"d7af811f8cd5057899a4d0303e5b05a6250bc9b06b64992a14ea37742f4d060c"} Apr 20 07:04:03.944937 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.944888 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 07:04:03.946499 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.946470 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5c798648d8-qlxs5" Apr 20 07:04:03.965194 ip-10-0-138-178 kubenswrapper[2577]: 
I0420 07:04:03.965157 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:03.965296 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.965205 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:03.965296 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.965235 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/345c627e-322f-49aa-b3a5-eb4fd6b72d47-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:03.965439 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.965366 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/345c627e-322f-49aa-b3a5-eb4fd6b72d47-tls-assets\") pod \"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:03.965439 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.965398 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:03.965439 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.965430 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:03.965587 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.965488 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:03.965587 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.965522 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-config-volume\") pod \"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:03.965587 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.965544 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-web-config\") pod \"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:03.965587 ip-10-0-138-178 
kubenswrapper[2577]: I0420 07:04:03.965578 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/345c627e-322f-49aa-b3a5-eb4fd6b72d47-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:03.965770 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.965619 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/345c627e-322f-49aa-b3a5-eb4fd6b72d47-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:03.965770 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.965665 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/345c627e-322f-49aa-b3a5-eb4fd6b72d47-config-out\") pod \"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:03.965770 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:03.965705 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f6kc\" (UniqueName: \"kubernetes.io/projected/345c627e-322f-49aa-b3a5-eb4fd6b72d47-kube-api-access-6f6kc\") pod \"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:04.066678 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:04.066642 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-secret-alertmanager-kube-rbac-proxy-web\") pod 
\"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:04.066867 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:04.066696 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:04.066867 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:04.066724 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/345c627e-322f-49aa-b3a5-eb4fd6b72d47-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:04.066867 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:04.066773 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/345c627e-322f-49aa-b3a5-eb4fd6b72d47-tls-assets\") pod \"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:04.066867 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:04.066807 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:04.066867 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:04.066840 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" 
(UniqueName: \"kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:04.067267 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:04.067216 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:04.067388 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:04.067300 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-config-volume\") pod \"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:04.067388 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:04.067364 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-web-config\") pod \"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:04.067500 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:04.067413 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/345c627e-322f-49aa-b3a5-eb4fd6b72d47-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:04.067659 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:04.067459 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/345c627e-322f-49aa-b3a5-eb4fd6b72d47-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:04.067726 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:04.067710 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/345c627e-322f-49aa-b3a5-eb4fd6b72d47-config-out\") pod \"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:04.067782 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:04.067748 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6f6kc\" (UniqueName: \"kubernetes.io/projected/345c627e-322f-49aa-b3a5-eb4fd6b72d47-kube-api-access-6f6kc\") pod \"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:04.081170 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:04.081103 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/345c627e-322f-49aa-b3a5-eb4fd6b72d47-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:04.085343 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:04.082251 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/345c627e-322f-49aa-b3a5-eb4fd6b72d47-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:04.087058 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:04.087033 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/345c627e-322f-49aa-b3a5-eb4fd6b72d47-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:04.089314 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:04.088777 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-config-volume\") pod \"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:04.089314 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:04.089255 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:04.089520 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:04.089370 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/345c627e-322f-49aa-b3a5-eb4fd6b72d47-tls-assets\") pod \"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:04.091919 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:04.091679 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:04.091919 ip-10-0-138-178 
kubenswrapper[2577]: I0420 07:04:04.091856 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:04.092629 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:04.092566 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-web-config\") pod \"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:04.092834 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:04.092771 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:04.093767 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:04.093689 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:04.093904 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:04.093884 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/345c627e-322f-49aa-b3a5-eb4fd6b72d47-config-out\") pod \"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:04.150701 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:04.150636 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f6kc\" (UniqueName: \"kubernetes.io/projected/345c627e-322f-49aa-b3a5-eb4fd6b72d47-kube-api-access-6f6kc\") pod \"alertmanager-main-0\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:04.177763 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:04.177646 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:04:04.889833 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:04.889427 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-x528h" Apr 20 07:04:04.947478 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:04.947444 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xw95j" event={"ID":"3bdbb47e-e79d-4aaa-9671-3899c229b1a2","Type":"ContainerStarted","Data":"69015fc2af3830bae45db21adac2c73ca41fb5e820718251abc41aeef899e509"} Apr 20 07:04:04.980736 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:04.980688 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-xw95j" podStartSLOduration=66.908291365 podStartE2EDuration="1m7.98067174s" podCreationTimestamp="2026-04-20 07:02:57 +0000 UTC" firstStartedPulling="2026-04-20 07:04:02.656195195 +0000 UTC m=+65.530920038" lastFinishedPulling="2026-04-20 07:04:03.728575562 +0000 UTC m=+66.603300413" observedRunningTime="2026-04-20 07:04:04.980086176 +0000 UTC m=+67.854811038" watchObservedRunningTime="2026-04-20 07:04:04.98067174 +0000 UTC m=+67.855396602" Apr 20 07:04:05.229941 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:05.229893 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 07:04:05.238556 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:04:05.238526 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod345c627e_322f_49aa_b3a5_eb4fd6b72d47.slice/crio-d5f0c171b3e1d76b2dde918ff36cc75ce977a984d165d7853e5e5cdb6766ef45 WatchSource:0}: Error finding container d5f0c171b3e1d76b2dde918ff36cc75ce977a984d165d7853e5e5cdb6766ef45: Status 404 returned error can't find the container with id d5f0c171b3e1d76b2dde918ff36cc75ce977a984d165d7853e5e5cdb6766ef45 Apr 20 07:04:05.950091 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:05.950055 2577 generic.go:358] "Generic (PLEG): container finished" podID="a7829208-2d3d-44ab-b7c6-c79c619a90d2" containerID="9e1bc1c92cabedce926e0b178d0d1c3b10b2d7e84be1c3c7f3258088f9d0d367" exitCode=0 Apr 20 07:04:05.950584 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:05.950142 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-sktgd" event={"ID":"a7829208-2d3d-44ab-b7c6-c79c619a90d2","Type":"ContainerDied","Data":"9e1bc1c92cabedce926e0b178d0d1c3b10b2d7e84be1c3c7f3258088f9d0d367"} Apr 20 07:04:05.952622 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:05.952595 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-whfbx" event={"ID":"45e5b9d7-edf1-4444-bb94-1c69ca165f1a","Type":"ContainerStarted","Data":"6b291f37bc3b93c70fc42ce8efa81002cb0103d7ef192dce587082170fe70374"} Apr 20 07:04:05.952716 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:05.952631 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-whfbx" event={"ID":"45e5b9d7-edf1-4444-bb94-1c69ca165f1a","Type":"ContainerStarted","Data":"7af97547bf4c80d37edd797478fdf782b729bef19638843aa9a00dd7b0bf02ed"} Apr 20 07:04:05.952716 ip-10-0-138-178 kubenswrapper[2577]: I0420 
07:04:05.952646 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-whfbx" event={"ID":"45e5b9d7-edf1-4444-bb94-1c69ca165f1a","Type":"ContainerStarted","Data":"78f8107b932025a62022c0f95046e9d64f2a08b6048b78ae7ef903a1099fad1a"} Apr 20 07:04:05.953715 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:05.953690 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"345c627e-322f-49aa-b3a5-eb4fd6b72d47","Type":"ContainerStarted","Data":"d5f0c171b3e1d76b2dde918ff36cc75ce977a984d165d7853e5e5cdb6766ef45"} Apr 20 07:04:05.957017 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:05.956532 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pb6zc" event={"ID":"ae37004f-672e-407b-a0d3-69378d08f058","Type":"ContainerStarted","Data":"8b1eeb7c8ae7ee62d414936f3256cf46d4af9dd954ea57c028b4b9750a93ef4b"} Apr 20 07:04:06.019942 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.019885 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-pb6zc" podStartSLOduration=2.239641928 podStartE2EDuration="4.019868785s" podCreationTimestamp="2026-04-20 07:04:02 +0000 UTC" firstStartedPulling="2026-04-20 07:04:03.281823472 +0000 UTC m=+66.156548312" lastFinishedPulling="2026-04-20 07:04:05.062050325 +0000 UTC m=+67.936775169" observedRunningTime="2026-04-20 07:04:06.019851921 +0000 UTC m=+68.894576785" watchObservedRunningTime="2026-04-20 07:04:06.019868785 +0000 UTC m=+68.894593646" Apr 20 07:04:06.071970 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.071909 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-whfbx" podStartSLOduration=2.149744654 podStartE2EDuration="4.071880872s" podCreationTimestamp="2026-04-20 07:04:02 +0000 UTC" 
firstStartedPulling="2026-04-20 07:04:03.186417149 +0000 UTC m=+66.061141989" lastFinishedPulling="2026-04-20 07:04:05.108553328 +0000 UTC m=+67.983278207" observedRunningTime="2026-04-20 07:04:06.065258223 +0000 UTC m=+68.939983086" watchObservedRunningTime="2026-04-20 07:04:06.071880872 +0000 UTC m=+68.946605734" Apr 20 07:04:06.839948 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.839905 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-ccppv"] Apr 20 07:04:06.844640 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.844610 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-ccppv" Apr 20 07:04:06.849188 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.849148 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 20 07:04:06.849339 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.849224 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-5svjt\"" Apr 20 07:04:06.859908 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.859887 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5649bb759-724x4"] Apr 20 07:04:06.864093 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.864073 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-5649bb759-724x4" Apr 20 07:04:06.866730 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.866709 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 20 07:04:06.866865 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.866850 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-pj62p\"" Apr 20 07:04:06.867181 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.867154 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 20 07:04:06.867378 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.867362 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 20 07:04:06.867476 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.867457 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 20 07:04:06.867593 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.867575 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 20 07:04:06.867669 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.867653 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-4p5ivvldia80q\"" Apr 20 07:04:06.878100 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.878080 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-ccppv"] Apr 20 07:04:06.889094 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.889070 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dfbfa478-8079-4924-ab6d-bf9064e82051-metrics-client-ca\") pod \"thanos-querier-5649bb759-724x4\" (UID: \"dfbfa478-8079-4924-ab6d-bf9064e82051\") " pod="openshift-monitoring/thanos-querier-5649bb759-724x4" Apr 20 07:04:06.889209 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.889118 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/dfbfa478-8079-4924-ab6d-bf9064e82051-secret-thanos-querier-tls\") pod \"thanos-querier-5649bb759-724x4\" (UID: \"dfbfa478-8079-4924-ab6d-bf9064e82051\") " pod="openshift-monitoring/thanos-querier-5649bb759-724x4" Apr 20 07:04:06.889209 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.889149 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/dfbfa478-8079-4924-ab6d-bf9064e82051-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5649bb759-724x4\" (UID: \"dfbfa478-8079-4924-ab6d-bf9064e82051\") " pod="openshift-monitoring/thanos-querier-5649bb759-724x4" Apr 20 07:04:06.889209 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.889177 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/dfbfa478-8079-4924-ab6d-bf9064e82051-secret-grpc-tls\") pod \"thanos-querier-5649bb759-724x4\" (UID: \"dfbfa478-8079-4924-ab6d-bf9064e82051\") " pod="openshift-monitoring/thanos-querier-5649bb759-724x4" Apr 20 07:04:06.889360 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.889211 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kxxs\" (UniqueName: 
\"kubernetes.io/projected/dfbfa478-8079-4924-ab6d-bf9064e82051-kube-api-access-5kxxs\") pod \"thanos-querier-5649bb759-724x4\" (UID: \"dfbfa478-8079-4924-ab6d-bf9064e82051\") " pod="openshift-monitoring/thanos-querier-5649bb759-724x4" Apr 20 07:04:06.889360 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.889231 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dfbfa478-8079-4924-ab6d-bf9064e82051-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5649bb759-724x4\" (UID: \"dfbfa478-8079-4924-ab6d-bf9064e82051\") " pod="openshift-monitoring/thanos-querier-5649bb759-724x4" Apr 20 07:04:06.889360 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.889272 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/dfbfa478-8079-4924-ab6d-bf9064e82051-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5649bb759-724x4\" (UID: \"dfbfa478-8079-4924-ab6d-bf9064e82051\") " pod="openshift-monitoring/thanos-querier-5649bb759-724x4" Apr 20 07:04:06.889360 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.889302 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/95ccc011-8001-42ed-acd3-f602333f4f36-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-ccppv\" (UID: \"95ccc011-8001-42ed-acd3-f602333f4f36\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-ccppv" Apr 20 07:04:06.889518 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.889362 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/dfbfa478-8079-4924-ab6d-bf9064e82051-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5649bb759-724x4\" (UID: \"dfbfa478-8079-4924-ab6d-bf9064e82051\") " pod="openshift-monitoring/thanos-querier-5649bb759-724x4" Apr 20 07:04:06.900709 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.900679 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5649bb759-724x4"] Apr 20 07:04:06.989861 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.989830 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5kxxs\" (UniqueName: \"kubernetes.io/projected/dfbfa478-8079-4924-ab6d-bf9064e82051-kube-api-access-5kxxs\") pod \"thanos-querier-5649bb759-724x4\" (UID: \"dfbfa478-8079-4924-ab6d-bf9064e82051\") " pod="openshift-monitoring/thanos-querier-5649bb759-724x4" Apr 20 07:04:06.990264 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.989883 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dfbfa478-8079-4924-ab6d-bf9064e82051-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5649bb759-724x4\" (UID: \"dfbfa478-8079-4924-ab6d-bf9064e82051\") " pod="openshift-monitoring/thanos-querier-5649bb759-724x4" Apr 20 07:04:06.990264 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.989987 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/dfbfa478-8079-4924-ab6d-bf9064e82051-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5649bb759-724x4\" (UID: \"dfbfa478-8079-4924-ab6d-bf9064e82051\") " pod="openshift-monitoring/thanos-querier-5649bb759-724x4" Apr 20 07:04:06.990264 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.990147 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/95ccc011-8001-42ed-acd3-f602333f4f36-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-ccppv\" (UID: \"95ccc011-8001-42ed-acd3-f602333f4f36\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-ccppv" Apr 20 07:04:06.990467 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.990298 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dfbfa478-8079-4924-ab6d-bf9064e82051-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5649bb759-724x4\" (UID: \"dfbfa478-8079-4924-ab6d-bf9064e82051\") " pod="openshift-monitoring/thanos-querier-5649bb759-724x4" Apr 20 07:04:06.990467 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.990430 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dfbfa478-8079-4924-ab6d-bf9064e82051-metrics-client-ca\") pod \"thanos-querier-5649bb759-724x4\" (UID: \"dfbfa478-8079-4924-ab6d-bf9064e82051\") " pod="openshift-monitoring/thanos-querier-5649bb759-724x4" Apr 20 07:04:06.990568 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.990533 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/dfbfa478-8079-4924-ab6d-bf9064e82051-secret-thanos-querier-tls\") pod \"thanos-querier-5649bb759-724x4\" (UID: \"dfbfa478-8079-4924-ab6d-bf9064e82051\") " pod="openshift-monitoring/thanos-querier-5649bb759-724x4" Apr 20 07:04:06.990630 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.990606 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/dfbfa478-8079-4924-ab6d-bf9064e82051-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5649bb759-724x4\" (UID: 
\"dfbfa478-8079-4924-ab6d-bf9064e82051\") " pod="openshift-monitoring/thanos-querier-5649bb759-724x4" Apr 20 07:04:06.990689 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.990676 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/dfbfa478-8079-4924-ab6d-bf9064e82051-secret-grpc-tls\") pod \"thanos-querier-5649bb759-724x4\" (UID: \"dfbfa478-8079-4924-ab6d-bf9064e82051\") " pod="openshift-monitoring/thanos-querier-5649bb759-724x4" Apr 20 07:04:06.992039 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.991993 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dfbfa478-8079-4924-ab6d-bf9064e82051-metrics-client-ca\") pod \"thanos-querier-5649bb759-724x4\" (UID: \"dfbfa478-8079-4924-ab6d-bf9064e82051\") " pod="openshift-monitoring/thanos-querier-5649bb759-724x4" Apr 20 07:04:06.993549 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.993526 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/dfbfa478-8079-4924-ab6d-bf9064e82051-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5649bb759-724x4\" (UID: \"dfbfa478-8079-4924-ab6d-bf9064e82051\") " pod="openshift-monitoring/thanos-querier-5649bb759-724x4" Apr 20 07:04:06.993940 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.993864 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dfbfa478-8079-4924-ab6d-bf9064e82051-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5649bb759-724x4\" (UID: \"dfbfa478-8079-4924-ab6d-bf9064e82051\") " pod="openshift-monitoring/thanos-querier-5649bb759-724x4" Apr 20 07:04:06.993940 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.993918 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dfbfa478-8079-4924-ab6d-bf9064e82051-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5649bb759-724x4\" (UID: \"dfbfa478-8079-4924-ab6d-bf9064e82051\") " pod="openshift-monitoring/thanos-querier-5649bb759-724x4" Apr 20 07:04:06.994093 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.993966 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/dfbfa478-8079-4924-ab6d-bf9064e82051-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5649bb759-724x4\" (UID: \"dfbfa478-8079-4924-ab6d-bf9064e82051\") " pod="openshift-monitoring/thanos-querier-5649bb759-724x4" Apr 20 07:04:06.994273 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.994250 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/dfbfa478-8079-4924-ab6d-bf9064e82051-secret-thanos-querier-tls\") pod \"thanos-querier-5649bb759-724x4\" (UID: \"dfbfa478-8079-4924-ab6d-bf9064e82051\") " pod="openshift-monitoring/thanos-querier-5649bb759-724x4" Apr 20 07:04:06.994408 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.994393 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/dfbfa478-8079-4924-ab6d-bf9064e82051-secret-grpc-tls\") pod \"thanos-querier-5649bb759-724x4\" (UID: \"dfbfa478-8079-4924-ab6d-bf9064e82051\") " pod="openshift-monitoring/thanos-querier-5649bb759-724x4" Apr 20 07:04:06.994781 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:06.994761 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/95ccc011-8001-42ed-acd3-f602333f4f36-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-ccppv\" (UID: 
\"95ccc011-8001-42ed-acd3-f602333f4f36\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-ccppv" Apr 20 07:04:07.007222 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.007201 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kxxs\" (UniqueName: \"kubernetes.io/projected/dfbfa478-8079-4924-ab6d-bf9064e82051-kube-api-access-5kxxs\") pod \"thanos-querier-5649bb759-724x4\" (UID: \"dfbfa478-8079-4924-ab6d-bf9064e82051\") " pod="openshift-monitoring/thanos-querier-5649bb759-724x4" Apr 20 07:04:07.156807 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.156776 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-ccppv" Apr 20 07:04:07.176014 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.175880 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5649bb759-724x4" Apr 20 07:04:07.329533 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.329511 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-ccppv"] Apr 20 07:04:07.331603 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:04:07.331578 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95ccc011_8001_42ed_acd3_f602333f4f36.slice/crio-175f6296e1706ff78b7a434d69a4ee51abc728946e3b126d9729bc87e57be777 WatchSource:0}: Error finding container 175f6296e1706ff78b7a434d69a4ee51abc728946e3b126d9729bc87e57be777: Status 404 returned error can't find the container with id 175f6296e1706ff78b7a434d69a4ee51abc728946e3b126d9729bc87e57be777 Apr 20 07:04:07.343315 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.343298 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5649bb759-724x4"] Apr 20 07:04:07.344962 ip-10-0-138-178 kubenswrapper[2577]: W0420 
07:04:07.344938 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfbfa478_8079_4924_ab6d_bf9064e82051.slice/crio-a2773a659dfa59e640f6bb22a6fa4ff2b7ebe5d92f28a79e937645deb110ddf5 WatchSource:0}: Error finding container a2773a659dfa59e640f6bb22a6fa4ff2b7ebe5d92f28a79e937645deb110ddf5: Status 404 returned error can't find the container with id a2773a659dfa59e640f6bb22a6fa4ff2b7ebe5d92f28a79e937645deb110ddf5 Apr 20 07:04:07.638155 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.638080 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-687974fc55-tvft4"] Apr 20 07:04:07.671025 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.670995 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-687974fc55-tvft4" Apr 20 07:04:07.696254 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.696232 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96cfd43f-379c-4208-b291-1bace41c1a97-console-oauth-config\") pod \"console-687974fc55-tvft4\" (UID: \"96cfd43f-379c-4208-b291-1bace41c1a97\") " pod="openshift-console/console-687974fc55-tvft4" Apr 20 07:04:07.696390 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.696265 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96cfd43f-379c-4208-b291-1bace41c1a97-console-serving-cert\") pod \"console-687974fc55-tvft4\" (UID: \"96cfd43f-379c-4208-b291-1bace41c1a97\") " pod="openshift-console/console-687974fc55-tvft4" Apr 20 07:04:07.696390 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.696284 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/96cfd43f-379c-4208-b291-1bace41c1a97-trusted-ca-bundle\") pod \"console-687974fc55-tvft4\" (UID: \"96cfd43f-379c-4208-b291-1bace41c1a97\") " pod="openshift-console/console-687974fc55-tvft4" Apr 20 07:04:07.696390 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.696310 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96cfd43f-379c-4208-b291-1bace41c1a97-service-ca\") pod \"console-687974fc55-tvft4\" (UID: \"96cfd43f-379c-4208-b291-1bace41c1a97\") " pod="openshift-console/console-687974fc55-tvft4" Apr 20 07:04:07.696390 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.696384 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96cfd43f-379c-4208-b291-1bace41c1a97-oauth-serving-cert\") pod \"console-687974fc55-tvft4\" (UID: \"96cfd43f-379c-4208-b291-1bace41c1a97\") " pod="openshift-console/console-687974fc55-tvft4" Apr 20 07:04:07.696580 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.696407 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96cfd43f-379c-4208-b291-1bace41c1a97-console-config\") pod \"console-687974fc55-tvft4\" (UID: \"96cfd43f-379c-4208-b291-1bace41c1a97\") " pod="openshift-console/console-687974fc55-tvft4" Apr 20 07:04:07.696580 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.696494 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrm5r\" (UniqueName: \"kubernetes.io/projected/96cfd43f-379c-4208-b291-1bace41c1a97-kube-api-access-nrm5r\") pod \"console-687974fc55-tvft4\" (UID: \"96cfd43f-379c-4208-b291-1bace41c1a97\") " pod="openshift-console/console-687974fc55-tvft4" Apr 20 07:04:07.714089 ip-10-0-138-178 kubenswrapper[2577]: 
I0420 07:04:07.714065 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-687974fc55-tvft4"] Apr 20 07:04:07.797151 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.797123 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96cfd43f-379c-4208-b291-1bace41c1a97-oauth-serving-cert\") pod \"console-687974fc55-tvft4\" (UID: \"96cfd43f-379c-4208-b291-1bace41c1a97\") " pod="openshift-console/console-687974fc55-tvft4" Apr 20 07:04:07.797151 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.797158 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96cfd43f-379c-4208-b291-1bace41c1a97-console-config\") pod \"console-687974fc55-tvft4\" (UID: \"96cfd43f-379c-4208-b291-1bace41c1a97\") " pod="openshift-console/console-687974fc55-tvft4" Apr 20 07:04:07.797413 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.797185 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nrm5r\" (UniqueName: \"kubernetes.io/projected/96cfd43f-379c-4208-b291-1bace41c1a97-kube-api-access-nrm5r\") pod \"console-687974fc55-tvft4\" (UID: \"96cfd43f-379c-4208-b291-1bace41c1a97\") " pod="openshift-console/console-687974fc55-tvft4" Apr 20 07:04:07.797413 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.797218 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96cfd43f-379c-4208-b291-1bace41c1a97-console-oauth-config\") pod \"console-687974fc55-tvft4\" (UID: \"96cfd43f-379c-4208-b291-1bace41c1a97\") " pod="openshift-console/console-687974fc55-tvft4" Apr 20 07:04:07.797413 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.797242 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/96cfd43f-379c-4208-b291-1bace41c1a97-console-serving-cert\") pod \"console-687974fc55-tvft4\" (UID: \"96cfd43f-379c-4208-b291-1bace41c1a97\") " pod="openshift-console/console-687974fc55-tvft4" Apr 20 07:04:07.797413 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.797263 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96cfd43f-379c-4208-b291-1bace41c1a97-trusted-ca-bundle\") pod \"console-687974fc55-tvft4\" (UID: \"96cfd43f-379c-4208-b291-1bace41c1a97\") " pod="openshift-console/console-687974fc55-tvft4" Apr 20 07:04:07.797413 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.797286 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96cfd43f-379c-4208-b291-1bace41c1a97-service-ca\") pod \"console-687974fc55-tvft4\" (UID: \"96cfd43f-379c-4208-b291-1bace41c1a97\") " pod="openshift-console/console-687974fc55-tvft4" Apr 20 07:04:07.798082 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.798006 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96cfd43f-379c-4208-b291-1bace41c1a97-console-config\") pod \"console-687974fc55-tvft4\" (UID: \"96cfd43f-379c-4208-b291-1bace41c1a97\") " pod="openshift-console/console-687974fc55-tvft4" Apr 20 07:04:07.798233 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.798137 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96cfd43f-379c-4208-b291-1bace41c1a97-trusted-ca-bundle\") pod \"console-687974fc55-tvft4\" (UID: \"96cfd43f-379c-4208-b291-1bace41c1a97\") " pod="openshift-console/console-687974fc55-tvft4" Apr 20 07:04:07.799798 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.799771 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96cfd43f-379c-4208-b291-1bace41c1a97-console-oauth-config\") pod \"console-687974fc55-tvft4\" (UID: \"96cfd43f-379c-4208-b291-1bace41c1a97\") " pod="openshift-console/console-687974fc55-tvft4" Apr 20 07:04:07.799873 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.799843 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96cfd43f-379c-4208-b291-1bace41c1a97-console-serving-cert\") pod \"console-687974fc55-tvft4\" (UID: \"96cfd43f-379c-4208-b291-1bace41c1a97\") " pod="openshift-console/console-687974fc55-tvft4" Apr 20 07:04:07.807407 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.807386 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96cfd43f-379c-4208-b291-1bace41c1a97-oauth-serving-cert\") pod \"console-687974fc55-tvft4\" (UID: \"96cfd43f-379c-4208-b291-1bace41c1a97\") " pod="openshift-console/console-687974fc55-tvft4" Apr 20 07:04:07.807653 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.807637 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96cfd43f-379c-4208-b291-1bace41c1a97-service-ca\") pod \"console-687974fc55-tvft4\" (UID: \"96cfd43f-379c-4208-b291-1bace41c1a97\") " pod="openshift-console/console-687974fc55-tvft4" Apr 20 07:04:07.847390 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.847366 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrm5r\" (UniqueName: \"kubernetes.io/projected/96cfd43f-379c-4208-b291-1bace41c1a97-kube-api-access-nrm5r\") pod \"console-687974fc55-tvft4\" (UID: \"96cfd43f-379c-4208-b291-1bace41c1a97\") " pod="openshift-console/console-687974fc55-tvft4" Apr 20 07:04:07.872743 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.872712 2577 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openshift-monitoring/metrics-server-58c9f75f5b-9bxz2"] Apr 20 07:04:07.897752 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.897697 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-58c9f75f5b-9bxz2" Apr 20 07:04:07.905004 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.904980 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 20 07:04:07.905097 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.904985 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 20 07:04:07.905143 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.905092 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-ersf0b48e69s\"" Apr 20 07:04:07.905469 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.905454 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 20 07:04:07.905642 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.905629 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-mzctw\"" Apr 20 07:04:07.910520 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.910502 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 20 07:04:07.922842 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.922820 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-58c9f75f5b-9bxz2"] Apr 20 07:04:07.968635 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.968533 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-sq8g5" 
event={"ID":"fb646649-ea68-4bb4-87df-a2eed82cc86c","Type":"ContainerStarted","Data":"ee066bef9ad67838b128a27920c10400c074dbc4499a1eaf55403c9b7c05492d"} Apr 20 07:04:07.968789 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.968678 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-sq8g5" Apr 20 07:04:07.970093 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.970066 2577 generic.go:358] "Generic (PLEG): container finished" podID="345c627e-322f-49aa-b3a5-eb4fd6b72d47" containerID="846de324ba0749c41599c6b3c894ebab944d18353366b5eac3157c521f6e897f" exitCode=0 Apr 20 07:04:07.970217 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.970158 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"345c627e-322f-49aa-b3a5-eb4fd6b72d47","Type":"ContainerDied","Data":"846de324ba0749c41599c6b3c894ebab944d18353366b5eac3157c521f6e897f"} Apr 20 07:04:07.971515 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.971490 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-ccppv" event={"ID":"95ccc011-8001-42ed-acd3-f602333f4f36","Type":"ContainerStarted","Data":"175f6296e1706ff78b7a434d69a4ee51abc728946e3b126d9729bc87e57be777"} Apr 20 07:04:07.974107 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.974085 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-sktgd" event={"ID":"a7829208-2d3d-44ab-b7c6-c79c619a90d2","Type":"ContainerStarted","Data":"1273e458f79ee3154bc7fec1cb76ba51966ef1fae09c55adf613cf32229e8424"} Apr 20 07:04:07.974190 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.974113 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-sktgd" 
event={"ID":"a7829208-2d3d-44ab-b7c6-c79c619a90d2","Type":"ContainerStarted","Data":"a7fa1abf09c47e3ca84514eb16b2643e031c4576cc64a82149e9a250f5301371"} Apr 20 07:04:07.975292 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.975255 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5649bb759-724x4" event={"ID":"dfbfa478-8079-4924-ab6d-bf9064e82051","Type":"ContainerStarted","Data":"a2773a659dfa59e640f6bb22a6fa4ff2b7ebe5d92f28a79e937645deb110ddf5"} Apr 20 07:04:07.979567 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.979545 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-687974fc55-tvft4" Apr 20 07:04:07.999733 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:07.999498 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/22adf421-aa3d-41da-b9d2-894e12e86ca2-metrics-server-audit-profiles\") pod \"metrics-server-58c9f75f5b-9bxz2\" (UID: \"22adf421-aa3d-41da-b9d2-894e12e86ca2\") " pod="openshift-monitoring/metrics-server-58c9f75f5b-9bxz2" Apr 20 07:04:08.000270 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:08.000248 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22adf421-aa3d-41da-b9d2-894e12e86ca2-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-58c9f75f5b-9bxz2\" (UID: \"22adf421-aa3d-41da-b9d2-894e12e86ca2\") " pod="openshift-monitoring/metrics-server-58c9f75f5b-9bxz2" Apr 20 07:04:08.000964 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:08.000939 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/22adf421-aa3d-41da-b9d2-894e12e86ca2-audit-log\") pod \"metrics-server-58c9f75f5b-9bxz2\" (UID: 
\"22adf421-aa3d-41da-b9d2-894e12e86ca2\") " pod="openshift-monitoring/metrics-server-58c9f75f5b-9bxz2" Apr 20 07:04:08.001075 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:08.001031 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/22adf421-aa3d-41da-b9d2-894e12e86ca2-secret-metrics-server-tls\") pod \"metrics-server-58c9f75f5b-9bxz2\" (UID: \"22adf421-aa3d-41da-b9d2-894e12e86ca2\") " pod="openshift-monitoring/metrics-server-58c9f75f5b-9bxz2" Apr 20 07:04:08.004834 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:08.004810 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/22adf421-aa3d-41da-b9d2-894e12e86ca2-secret-metrics-server-client-certs\") pod \"metrics-server-58c9f75f5b-9bxz2\" (UID: \"22adf421-aa3d-41da-b9d2-894e12e86ca2\") " pod="openshift-monitoring/metrics-server-58c9f75f5b-9bxz2" Apr 20 07:04:08.004943 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:08.004847 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6h8q\" (UniqueName: \"kubernetes.io/projected/22adf421-aa3d-41da-b9d2-894e12e86ca2-kube-api-access-b6h8q\") pod \"metrics-server-58c9f75f5b-9bxz2\" (UID: \"22adf421-aa3d-41da-b9d2-894e12e86ca2\") " pod="openshift-monitoring/metrics-server-58c9f75f5b-9bxz2" Apr 20 07:04:08.004943 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:08.004921 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22adf421-aa3d-41da-b9d2-894e12e86ca2-client-ca-bundle\") pod \"metrics-server-58c9f75f5b-9bxz2\" (UID: \"22adf421-aa3d-41da-b9d2-894e12e86ca2\") " pod="openshift-monitoring/metrics-server-58c9f75f5b-9bxz2" Apr 20 07:04:08.075944 ip-10-0-138-178 
kubenswrapper[2577]: I0420 07:04:08.075888 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-sktgd" podStartSLOduration=4.127195578 podStartE2EDuration="6.075868931s" podCreationTimestamp="2026-04-20 07:04:02 +0000 UTC" firstStartedPulling="2026-04-20 07:04:03.112271473 +0000 UTC m=+65.986996314" lastFinishedPulling="2026-04-20 07:04:05.060944816 +0000 UTC m=+67.935669667" observedRunningTime="2026-04-20 07:04:08.074052977 +0000 UTC m=+70.948777844" watchObservedRunningTime="2026-04-20 07:04:08.075868931 +0000 UTC m=+70.950593799" Apr 20 07:04:08.106190 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:08.105704 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/22adf421-aa3d-41da-b9d2-894e12e86ca2-metrics-server-audit-profiles\") pod \"metrics-server-58c9f75f5b-9bxz2\" (UID: \"22adf421-aa3d-41da-b9d2-894e12e86ca2\") " pod="openshift-monitoring/metrics-server-58c9f75f5b-9bxz2" Apr 20 07:04:08.106190 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:08.105755 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22adf421-aa3d-41da-b9d2-894e12e86ca2-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-58c9f75f5b-9bxz2\" (UID: \"22adf421-aa3d-41da-b9d2-894e12e86ca2\") " pod="openshift-monitoring/metrics-server-58c9f75f5b-9bxz2" Apr 20 07:04:08.106190 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:08.105783 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/22adf421-aa3d-41da-b9d2-894e12e86ca2-audit-log\") pod \"metrics-server-58c9f75f5b-9bxz2\" (UID: \"22adf421-aa3d-41da-b9d2-894e12e86ca2\") " pod="openshift-monitoring/metrics-server-58c9f75f5b-9bxz2" Apr 20 07:04:08.106190 ip-10-0-138-178 kubenswrapper[2577]: 
I0420 07:04:08.105826 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/22adf421-aa3d-41da-b9d2-894e12e86ca2-secret-metrics-server-tls\") pod \"metrics-server-58c9f75f5b-9bxz2\" (UID: \"22adf421-aa3d-41da-b9d2-894e12e86ca2\") " pod="openshift-monitoring/metrics-server-58c9f75f5b-9bxz2" Apr 20 07:04:08.106190 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:08.105892 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/22adf421-aa3d-41da-b9d2-894e12e86ca2-secret-metrics-server-client-certs\") pod \"metrics-server-58c9f75f5b-9bxz2\" (UID: \"22adf421-aa3d-41da-b9d2-894e12e86ca2\") " pod="openshift-monitoring/metrics-server-58c9f75f5b-9bxz2" Apr 20 07:04:08.106190 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:08.105917 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b6h8q\" (UniqueName: \"kubernetes.io/projected/22adf421-aa3d-41da-b9d2-894e12e86ca2-kube-api-access-b6h8q\") pod \"metrics-server-58c9f75f5b-9bxz2\" (UID: \"22adf421-aa3d-41da-b9d2-894e12e86ca2\") " pod="openshift-monitoring/metrics-server-58c9f75f5b-9bxz2" Apr 20 07:04:08.106190 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:08.105958 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22adf421-aa3d-41da-b9d2-894e12e86ca2-client-ca-bundle\") pod \"metrics-server-58c9f75f5b-9bxz2\" (UID: \"22adf421-aa3d-41da-b9d2-894e12e86ca2\") " pod="openshift-monitoring/metrics-server-58c9f75f5b-9bxz2" Apr 20 07:04:08.107108 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:08.107083 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: 
\"kubernetes.io/configmap/22adf421-aa3d-41da-b9d2-894e12e86ca2-metrics-server-audit-profiles\") pod \"metrics-server-58c9f75f5b-9bxz2\" (UID: \"22adf421-aa3d-41da-b9d2-894e12e86ca2\") " pod="openshift-monitoring/metrics-server-58c9f75f5b-9bxz2" Apr 20 07:04:08.108575 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:08.108526 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/22adf421-aa3d-41da-b9d2-894e12e86ca2-audit-log\") pod \"metrics-server-58c9f75f5b-9bxz2\" (UID: \"22adf421-aa3d-41da-b9d2-894e12e86ca2\") " pod="openshift-monitoring/metrics-server-58c9f75f5b-9bxz2" Apr 20 07:04:08.110158 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:08.110135 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/22adf421-aa3d-41da-b9d2-894e12e86ca2-secret-metrics-server-tls\") pod \"metrics-server-58c9f75f5b-9bxz2\" (UID: \"22adf421-aa3d-41da-b9d2-894e12e86ca2\") " pod="openshift-monitoring/metrics-server-58c9f75f5b-9bxz2" Apr 20 07:04:08.110636 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:08.110619 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22adf421-aa3d-41da-b9d2-894e12e86ca2-client-ca-bundle\") pod \"metrics-server-58c9f75f5b-9bxz2\" (UID: \"22adf421-aa3d-41da-b9d2-894e12e86ca2\") " pod="openshift-monitoring/metrics-server-58c9f75f5b-9bxz2" Apr 20 07:04:08.110795 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:08.110772 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/22adf421-aa3d-41da-b9d2-894e12e86ca2-secret-metrics-server-client-certs\") pod \"metrics-server-58c9f75f5b-9bxz2\" (UID: \"22adf421-aa3d-41da-b9d2-894e12e86ca2\") " pod="openshift-monitoring/metrics-server-58c9f75f5b-9bxz2" Apr 20 07:04:08.119624 ip-10-0-138-178 
kubenswrapper[2577]: I0420 07:04:08.119578 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6h8q\" (UniqueName: \"kubernetes.io/projected/22adf421-aa3d-41da-b9d2-894e12e86ca2-kube-api-access-b6h8q\") pod \"metrics-server-58c9f75f5b-9bxz2\" (UID: \"22adf421-aa3d-41da-b9d2-894e12e86ca2\") " pod="openshift-monitoring/metrics-server-58c9f75f5b-9bxz2" Apr 20 07:04:08.119902 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:08.119867 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22adf421-aa3d-41da-b9d2-894e12e86ca2-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-58c9f75f5b-9bxz2\" (UID: \"22adf421-aa3d-41da-b9d2-894e12e86ca2\") " pod="openshift-monitoring/metrics-server-58c9f75f5b-9bxz2" Apr 20 07:04:08.197065 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:08.197038 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-687974fc55-tvft4"] Apr 20 07:04:08.207821 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:08.207524 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-58c9f75f5b-9bxz2" Apr 20 07:04:08.273544 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:08.272671 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-sq8g5" podStartSLOduration=67.343017759 podStartE2EDuration="1m11.272649471s" podCreationTimestamp="2026-04-20 07:02:57 +0000 UTC" firstStartedPulling="2026-04-20 07:04:03.111814644 +0000 UTC m=+65.986539488" lastFinishedPulling="2026-04-20 07:04:07.041446361 +0000 UTC m=+69.916171200" observedRunningTime="2026-04-20 07:04:08.26961806 +0000 UTC m=+71.144342923" watchObservedRunningTime="2026-04-20 07:04:08.272649471 +0000 UTC m=+71.147374334" Apr 20 07:04:08.394497 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:08.394447 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-58c9f75f5b-9bxz2"] Apr 20 07:04:08.398773 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:04:08.398743 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22adf421_aa3d_41da_b9d2_894e12e86ca2.slice/crio-8826985e8cd2379da2593db417bbb800c4724480f03e70559ad2a1825ce892ab WatchSource:0}: Error finding container 8826985e8cd2379da2593db417bbb800c4724480f03e70559ad2a1825ce892ab: Status 404 returned error can't find the container with id 8826985e8cd2379da2593db417bbb800c4724480f03e70559ad2a1825ce892ab Apr 20 07:04:08.979704 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:08.979627 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-58c9f75f5b-9bxz2" event={"ID":"22adf421-aa3d-41da-b9d2-894e12e86ca2","Type":"ContainerStarted","Data":"8826985e8cd2379da2593db417bbb800c4724480f03e70559ad2a1825ce892ab"} Apr 20 07:04:08.983477 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:08.982779 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-687974fc55-tvft4" event={"ID":"96cfd43f-379c-4208-b291-1bace41c1a97","Type":"ContainerStarted","Data":"17e1e42f7bf90b3053d45e8f6b21ec32de132dbf2ecff7f61dff04ba1c67103f"} Apr 20 07:04:08.983477 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:08.982812 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-687974fc55-tvft4" event={"ID":"96cfd43f-379c-4208-b291-1bace41c1a97","Type":"ContainerStarted","Data":"31291dff8a353d85146edba1589dd7e8c1b62d70f663f42b110f05c4021322bc"} Apr 20 07:04:09.026720 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:09.026658 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-687974fc55-tvft4" podStartSLOduration=2.026640877 podStartE2EDuration="2.026640877s" podCreationTimestamp="2026-04-20 07:04:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:04:09.026279054 +0000 UTC m=+71.901003917" watchObservedRunningTime="2026-04-20 07:04:09.026640877 +0000 UTC m=+71.901365739" Apr 20 07:04:10.990458 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:10.990377 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"345c627e-322f-49aa-b3a5-eb4fd6b72d47","Type":"ContainerStarted","Data":"fb71ed01c20e0ec7e478ccf206de3be33bca4f762aa866f2c235f7aa123b489f"} Apr 20 07:04:10.990458 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:10.990421 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"345c627e-322f-49aa-b3a5-eb4fd6b72d47","Type":"ContainerStarted","Data":"4b06f7504726f919556348409325d9ca114b27eeedda314b02e4a8e9003fbf1c"} Apr 20 07:04:10.990458 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:10.990439 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"345c627e-322f-49aa-b3a5-eb4fd6b72d47","Type":"ContainerStarted","Data":"c170cb055e5d6e9fa4a3583ca9551d0bfbbcc7ac61bf2bcdb7baa35c8a41c6c9"} Apr 20 07:04:10.991588 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:10.991562 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-ccppv" event={"ID":"95ccc011-8001-42ed-acd3-f602333f4f36","Type":"ContainerStarted","Data":"8d010350052562452641a1ba978fe3aec582c58b5a9356f3cdc8e7d926599f79"} Apr 20 07:04:10.992347 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:10.992303 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-ccppv" Apr 20 07:04:10.994360 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:10.993931 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-58c9f75f5b-9bxz2" event={"ID":"22adf421-aa3d-41da-b9d2-894e12e86ca2","Type":"ContainerStarted","Data":"a6500c6d56bcbba625a5c43c85dc526a5dcd89f2625b8815a573ce2e8bd06be3"} Apr 20 07:04:10.997374 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:10.997219 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5649bb759-724x4" event={"ID":"dfbfa478-8079-4924-ab6d-bf9064e82051","Type":"ContainerStarted","Data":"cf3e05d53e755d9ae4b251b207a755304986b34a4f7721b81220b4b2690ef3d6"} Apr 20 07:04:10.997374 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:10.997251 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5649bb759-724x4" event={"ID":"dfbfa478-8079-4924-ab6d-bf9064e82051","Type":"ContainerStarted","Data":"1cdeb98ac8b96f36cbc9a7624c8fbd30c0deaa9f107ea252203a86c01b9f71e3"} Apr 20 07:04:10.997374 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:10.997266 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5649bb759-724x4" 
event={"ID":"dfbfa478-8079-4924-ab6d-bf9064e82051","Type":"ContainerStarted","Data":"12f651291eb025469e68c819a16ad8cd6a6fa1fda914ed9ff537927c2158d461"} Apr 20 07:04:10.998230 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:10.998208 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-ccppv" Apr 20 07:04:11.015639 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:11.015587 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-ccppv" podStartSLOduration=1.644319259 podStartE2EDuration="5.015570123s" podCreationTimestamp="2026-04-20 07:04:06 +0000 UTC" firstStartedPulling="2026-04-20 07:04:07.333308803 +0000 UTC m=+70.208033646" lastFinishedPulling="2026-04-20 07:04:10.704559667 +0000 UTC m=+73.579284510" observedRunningTime="2026-04-20 07:04:11.013763629 +0000 UTC m=+73.888488491" watchObservedRunningTime="2026-04-20 07:04:11.015570123 +0000 UTC m=+73.890294988" Apr 20 07:04:11.046602 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:11.046414 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-58c9f75f5b-9bxz2" podStartSLOduration=1.6890755739999999 podStartE2EDuration="4.04639675s" podCreationTimestamp="2026-04-20 07:04:07 +0000 UTC" firstStartedPulling="2026-04-20 07:04:08.401234939 +0000 UTC m=+71.275959782" lastFinishedPulling="2026-04-20 07:04:10.758556103 +0000 UTC m=+73.633280958" observedRunningTime="2026-04-20 07:04:11.042432922 +0000 UTC m=+73.917157784" watchObservedRunningTime="2026-04-20 07:04:11.04639675 +0000 UTC m=+73.921121613" Apr 20 07:04:12.002515 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:12.002486 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5649bb759-724x4" 
event={"ID":"dfbfa478-8079-4924-ab6d-bf9064e82051","Type":"ContainerStarted","Data":"218f7c87291ff861cf2dec3b59c792153b53caf8a2c1db108e4f628476f1526c"} Apr 20 07:04:12.005433 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:12.005409 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"345c627e-322f-49aa-b3a5-eb4fd6b72d47","Type":"ContainerStarted","Data":"c3d5076487ca03a85e8bb868772c1ee8e48a3db522bf043d7d7f59bd3444dbc1"} Apr 20 07:04:12.005544 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:12.005440 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"345c627e-322f-49aa-b3a5-eb4fd6b72d47","Type":"ContainerStarted","Data":"684660db980e935bdebb63351e3ebd8668e025ceaed9acf60bd640789e3b633c"} Apr 20 07:04:12.005544 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:12.005455 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"345c627e-322f-49aa-b3a5-eb4fd6b72d47","Type":"ContainerStarted","Data":"a83461c6944c6c1b3b0a8b7b278a71751e760b47e9c73f37ec44a8cd999fee0c"} Apr 20 07:04:12.061141 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:12.061099 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.4229184249999998 podStartE2EDuration="9.061084913s" podCreationTimestamp="2026-04-20 07:04:03 +0000 UTC" firstStartedPulling="2026-04-20 07:04:05.243205338 +0000 UTC m=+68.117930193" lastFinishedPulling="2026-04-20 07:04:11.881371827 +0000 UTC m=+74.756096681" observedRunningTime="2026-04-20 07:04:12.058600598 +0000 UTC m=+74.933325460" watchObservedRunningTime="2026-04-20 07:04:12.061084913 +0000 UTC m=+74.935809774" Apr 20 07:04:13.012841 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:13.012798 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/thanos-querier-5649bb759-724x4" event={"ID":"dfbfa478-8079-4924-ab6d-bf9064e82051","Type":"ContainerStarted","Data":"076f8c0ac374d284426b1bbde1fc587fe0af79aee6444d6195bed534eead5105"} Apr 20 07:04:13.012841 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:13.012839 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5649bb759-724x4" event={"ID":"dfbfa478-8079-4924-ab6d-bf9064e82051","Type":"ContainerStarted","Data":"1b98220f94f91c94d96f5e8c65d888e1944d5cbb6a8c3329a84ef698f990aaae"} Apr 20 07:04:13.013295 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:13.013224 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-5649bb759-724x4" Apr 20 07:04:13.086341 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:13.086264 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5649bb759-724x4" podStartSLOduration=2.572348973 podStartE2EDuration="7.086248728s" podCreationTimestamp="2026-04-20 07:04:06 +0000 UTC" firstStartedPulling="2026-04-20 07:04:07.346738391 +0000 UTC m=+70.221463231" lastFinishedPulling="2026-04-20 07:04:11.860638146 +0000 UTC m=+74.735362986" observedRunningTime="2026-04-20 07:04:13.083179019 +0000 UTC m=+75.957903893" watchObservedRunningTime="2026-04-20 07:04:13.086248728 +0000 UTC m=+75.960973591" Apr 20 07:04:17.980456 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:17.980418 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-687974fc55-tvft4" Apr 20 07:04:17.980930 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:17.980510 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-687974fc55-tvft4" Apr 20 07:04:17.985206 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:17.985185 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-console/console-687974fc55-tvft4" Apr 20 07:04:18.032826 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:18.032798 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-687974fc55-tvft4" Apr 20 07:04:18.151101 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:18.151070 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c798648d8-qlxs5"] Apr 20 07:04:19.022173 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:19.022138 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5649bb759-724x4" Apr 20 07:04:28.208855 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:28.208817 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-58c9f75f5b-9bxz2" Apr 20 07:04:28.209450 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:28.208867 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-58c9f75f5b-9bxz2" Apr 20 07:04:38.984975 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:38.984934 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-sq8g5" Apr 20 07:04:43.170550 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:43.170505 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5c798648d8-qlxs5" podUID="91c62464-c965-4578-b487-d17860e017d3" containerName="console" containerID="cri-o://fe97ce0051b8fd2a814688ada71ab3d9cb836d50149b7043a285ff8d1ee51df0" gracePeriod=15 Apr 20 07:04:43.461464 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:43.461435 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c798648d8-qlxs5_91c62464-c965-4578-b487-d17860e017d3/console/0.log" Apr 20 07:04:43.461574 ip-10-0-138-178 kubenswrapper[2577]: I0420 
07:04:43.461507 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c798648d8-qlxs5" Apr 20 07:04:43.511510 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:43.511474 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/91c62464-c965-4578-b487-d17860e017d3-console-serving-cert\") pod \"91c62464-c965-4578-b487-d17860e017d3\" (UID: \"91c62464-c965-4578-b487-d17860e017d3\") " Apr 20 07:04:43.511685 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:43.511521 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/91c62464-c965-4578-b487-d17860e017d3-console-oauth-config\") pod \"91c62464-c965-4578-b487-d17860e017d3\" (UID: \"91c62464-c965-4578-b487-d17860e017d3\") " Apr 20 07:04:43.511685 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:43.511538 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9szc\" (UniqueName: \"kubernetes.io/projected/91c62464-c965-4578-b487-d17860e017d3-kube-api-access-q9szc\") pod \"91c62464-c965-4578-b487-d17860e017d3\" (UID: \"91c62464-c965-4578-b487-d17860e017d3\") " Apr 20 07:04:43.511685 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:43.511561 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91c62464-c965-4578-b487-d17860e017d3-trusted-ca-bundle\") pod \"91c62464-c965-4578-b487-d17860e017d3\" (UID: \"91c62464-c965-4578-b487-d17860e017d3\") " Apr 20 07:04:43.511685 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:43.511652 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/91c62464-c965-4578-b487-d17860e017d3-service-ca\") pod \"91c62464-c965-4578-b487-d17860e017d3\" (UID: 
\"91c62464-c965-4578-b487-d17860e017d3\") " Apr 20 07:04:43.511890 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:43.511704 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/91c62464-c965-4578-b487-d17860e017d3-console-config\") pod \"91c62464-c965-4578-b487-d17860e017d3\" (UID: \"91c62464-c965-4578-b487-d17860e017d3\") " Apr 20 07:04:43.511890 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:43.511750 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/91c62464-c965-4578-b487-d17860e017d3-oauth-serving-cert\") pod \"91c62464-c965-4578-b487-d17860e017d3\" (UID: \"91c62464-c965-4578-b487-d17860e017d3\") " Apr 20 07:04:43.512007 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:43.511983 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91c62464-c965-4578-b487-d17860e017d3-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "91c62464-c965-4578-b487-d17860e017d3" (UID: "91c62464-c965-4578-b487-d17860e017d3"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:04:43.512007 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:43.511994 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91c62464-c965-4578-b487-d17860e017d3-service-ca" (OuterVolumeSpecName: "service-ca") pod "91c62464-c965-4578-b487-d17860e017d3" (UID: "91c62464-c965-4578-b487-d17860e017d3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:04:43.512115 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:43.512062 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91c62464-c965-4578-b487-d17860e017d3-console-config" (OuterVolumeSpecName: "console-config") pod "91c62464-c965-4578-b487-d17860e017d3" (UID: "91c62464-c965-4578-b487-d17860e017d3"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:04:43.512308 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:43.512288 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91c62464-c965-4578-b487-d17860e017d3-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "91c62464-c965-4578-b487-d17860e017d3" (UID: "91c62464-c965-4578-b487-d17860e017d3"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:04:43.513953 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:43.513934 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91c62464-c965-4578-b487-d17860e017d3-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "91c62464-c965-4578-b487-d17860e017d3" (UID: "91c62464-c965-4578-b487-d17860e017d3"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:04:43.514249 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:43.514228 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91c62464-c965-4578-b487-d17860e017d3-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "91c62464-c965-4578-b487-d17860e017d3" (UID: "91c62464-c965-4578-b487-d17860e017d3"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:04:43.514305 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:43.514240 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91c62464-c965-4578-b487-d17860e017d3-kube-api-access-q9szc" (OuterVolumeSpecName: "kube-api-access-q9szc") pod "91c62464-c965-4578-b487-d17860e017d3" (UID: "91c62464-c965-4578-b487-d17860e017d3"). InnerVolumeSpecName "kube-api-access-q9szc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:04:43.613163 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:43.613128 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/91c62464-c965-4578-b487-d17860e017d3-console-serving-cert\") on node \"ip-10-0-138-178.ec2.internal\" DevicePath \"\"" Apr 20 07:04:43.613163 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:43.613155 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/91c62464-c965-4578-b487-d17860e017d3-console-oauth-config\") on node \"ip-10-0-138-178.ec2.internal\" DevicePath \"\"" Apr 20 07:04:43.613163 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:43.613166 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q9szc\" (UniqueName: \"kubernetes.io/projected/91c62464-c965-4578-b487-d17860e017d3-kube-api-access-q9szc\") on node \"ip-10-0-138-178.ec2.internal\" DevicePath \"\"" Apr 20 07:04:43.613418 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:43.613175 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91c62464-c965-4578-b487-d17860e017d3-trusted-ca-bundle\") on node \"ip-10-0-138-178.ec2.internal\" DevicePath \"\"" Apr 20 07:04:43.613418 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:43.613186 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/91c62464-c965-4578-b487-d17860e017d3-service-ca\") on node \"ip-10-0-138-178.ec2.internal\" DevicePath \"\"" Apr 20 07:04:43.613418 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:43.613195 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/91c62464-c965-4578-b487-d17860e017d3-console-config\") on node \"ip-10-0-138-178.ec2.internal\" DevicePath \"\"" Apr 20 07:04:43.613418 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:43.613203 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/91c62464-c965-4578-b487-d17860e017d3-oauth-serving-cert\") on node \"ip-10-0-138-178.ec2.internal\" DevicePath \"\"" Apr 20 07:04:44.106433 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:44.106398 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c798648d8-qlxs5_91c62464-c965-4578-b487-d17860e017d3/console/0.log" Apr 20 07:04:44.106623 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:44.106447 2577 generic.go:358] "Generic (PLEG): container finished" podID="91c62464-c965-4578-b487-d17860e017d3" containerID="fe97ce0051b8fd2a814688ada71ab3d9cb836d50149b7043a285ff8d1ee51df0" exitCode=2 Apr 20 07:04:44.106623 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:44.106532 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c798648d8-qlxs5" Apr 20 07:04:44.106623 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:44.106545 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c798648d8-qlxs5" event={"ID":"91c62464-c965-4578-b487-d17860e017d3","Type":"ContainerDied","Data":"fe97ce0051b8fd2a814688ada71ab3d9cb836d50149b7043a285ff8d1ee51df0"} Apr 20 07:04:44.106623 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:44.106582 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c798648d8-qlxs5" event={"ID":"91c62464-c965-4578-b487-d17860e017d3","Type":"ContainerDied","Data":"f6a7b1386b3a5f3cdc3a613b39c85060b4c44a6f29366ee36877f416ae861938"} Apr 20 07:04:44.106623 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:44.106598 2577 scope.go:117] "RemoveContainer" containerID="fe97ce0051b8fd2a814688ada71ab3d9cb836d50149b7043a285ff8d1ee51df0" Apr 20 07:04:44.114494 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:44.114476 2577 scope.go:117] "RemoveContainer" containerID="fe97ce0051b8fd2a814688ada71ab3d9cb836d50149b7043a285ff8d1ee51df0" Apr 20 07:04:44.114775 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:04:44.114756 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe97ce0051b8fd2a814688ada71ab3d9cb836d50149b7043a285ff8d1ee51df0\": container with ID starting with fe97ce0051b8fd2a814688ada71ab3d9cb836d50149b7043a285ff8d1ee51df0 not found: ID does not exist" containerID="fe97ce0051b8fd2a814688ada71ab3d9cb836d50149b7043a285ff8d1ee51df0" Apr 20 07:04:44.114837 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:44.114786 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe97ce0051b8fd2a814688ada71ab3d9cb836d50149b7043a285ff8d1ee51df0"} err="failed to get container status \"fe97ce0051b8fd2a814688ada71ab3d9cb836d50149b7043a285ff8d1ee51df0\": rpc error: code = 
NotFound desc = could not find container \"fe97ce0051b8fd2a814688ada71ab3d9cb836d50149b7043a285ff8d1ee51df0\": container with ID starting with fe97ce0051b8fd2a814688ada71ab3d9cb836d50149b7043a285ff8d1ee51df0 not found: ID does not exist" Apr 20 07:04:44.128124 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:44.128098 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c798648d8-qlxs5"] Apr 20 07:04:44.141176 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:44.141155 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5c798648d8-qlxs5"] Apr 20 07:04:45.643447 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:45.643413 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91c62464-c965-4578-b487-d17860e017d3" path="/var/lib/kubelet/pods/91c62464-c965-4578-b487-d17860e017d3/volumes" Apr 20 07:04:48.213448 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:48.213419 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-58c9f75f5b-9bxz2" Apr 20 07:04:48.217263 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:04:48.217237 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-58c9f75f5b-9bxz2" Apr 20 07:07:57.593665 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:07:57.593634 2577 kubelet.go:1628] "Image garbage collection succeeded" Apr 20 07:09:08.393583 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.393546 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 07:09:08.394188 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.394024 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="345c627e-322f-49aa-b3a5-eb4fd6b72d47" containerName="alertmanager" 
containerID="cri-o://c170cb055e5d6e9fa4a3583ca9551d0bfbbcc7ac61bf2bcdb7baa35c8a41c6c9" gracePeriod=120 Apr 20 07:09:08.394188 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.394070 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="345c627e-322f-49aa-b3a5-eb4fd6b72d47" containerName="kube-rbac-proxy-metric" containerID="cri-o://684660db980e935bdebb63351e3ebd8668e025ceaed9acf60bd640789e3b633c" gracePeriod=120 Apr 20 07:09:08.394188 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.394104 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="345c627e-322f-49aa-b3a5-eb4fd6b72d47" containerName="kube-rbac-proxy-web" containerID="cri-o://fb71ed01c20e0ec7e478ccf206de3be33bca4f762aa866f2c235f7aa123b489f" gracePeriod=120 Apr 20 07:09:08.394188 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.394133 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="345c627e-322f-49aa-b3a5-eb4fd6b72d47" containerName="prom-label-proxy" containerID="cri-o://c3d5076487ca03a85e8bb868772c1ee8e48a3db522bf043d7d7f59bd3444dbc1" gracePeriod=120 Apr 20 07:09:08.394188 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.394155 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="345c627e-322f-49aa-b3a5-eb4fd6b72d47" containerName="kube-rbac-proxy" containerID="cri-o://a83461c6944c6c1b3b0a8b7b278a71751e760b47e9c73f37ec44a8cd999fee0c" gracePeriod=120 Apr 20 07:09:08.394509 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.394413 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="345c627e-322f-49aa-b3a5-eb4fd6b72d47" containerName="config-reloader" 
containerID="cri-o://4b06f7504726f919556348409325d9ca114b27eeedda314b02e4a8e9003fbf1c" gracePeriod=120 Apr 20 07:09:08.650422 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.650356 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:09:08.738693 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.738664 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/345c627e-322f-49aa-b3a5-eb4fd6b72d47-metrics-client-ca\") pod \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " Apr 20 07:09:08.738871 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.738704 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6f6kc\" (UniqueName: \"kubernetes.io/projected/345c627e-322f-49aa-b3a5-eb4fd6b72d47-kube-api-access-6f6kc\") pod \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " Apr 20 07:09:08.738871 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.738722 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-secret-alertmanager-kube-rbac-proxy\") pod \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " Apr 20 07:09:08.738871 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.738753 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/345c627e-322f-49aa-b3a5-eb4fd6b72d47-alertmanager-trusted-ca-bundle\") pod \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " Apr 20 07:09:08.739033 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.738929 2577 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-secret-alertmanager-kube-rbac-proxy-web\") pod \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " Apr 20 07:09:08.739033 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.738967 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/345c627e-322f-49aa-b3a5-eb4fd6b72d47-config-out\") pod \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " Apr 20 07:09:08.739157 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.739130 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/345c627e-322f-49aa-b3a5-eb4fd6b72d47-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "345c627e-322f-49aa-b3a5-eb4fd6b72d47" (UID: "345c627e-322f-49aa-b3a5-eb4fd6b72d47"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:09:08.739211 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.739134 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/345c627e-322f-49aa-b3a5-eb4fd6b72d47-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "345c627e-322f-49aa-b3a5-eb4fd6b72d47" (UID: "345c627e-322f-49aa-b3a5-eb4fd6b72d47"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:09:08.739211 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.739141 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/345c627e-322f-49aa-b3a5-eb4fd6b72d47-tls-assets\") pod \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " Apr 20 07:09:08.739310 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.739213 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-config-volume\") pod \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " Apr 20 07:09:08.739310 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.739256 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-secret-alertmanager-main-tls\") pod \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " Apr 20 07:09:08.739310 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.739297 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-secret-alertmanager-kube-rbac-proxy-metric\") pod \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " Apr 20 07:09:08.739520 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.739364 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-web-config\") pod \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " Apr 20 07:09:08.739520 
ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.739411 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/345c627e-322f-49aa-b3a5-eb4fd6b72d47-alertmanager-main-db\") pod \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " Apr 20 07:09:08.739520 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.739446 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-cluster-tls-config\") pod \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\" (UID: \"345c627e-322f-49aa-b3a5-eb4fd6b72d47\") " Apr 20 07:09:08.739710 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.739683 2577 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/345c627e-322f-49aa-b3a5-eb4fd6b72d47-metrics-client-ca\") on node \"ip-10-0-138-178.ec2.internal\" DevicePath \"\"" Apr 20 07:09:08.739710 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.739704 2577 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/345c627e-322f-49aa-b3a5-eb4fd6b72d47-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-138-178.ec2.internal\" DevicePath \"\"" Apr 20 07:09:08.740433 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.740381 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/345c627e-322f-49aa-b3a5-eb4fd6b72d47-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "345c627e-322f-49aa-b3a5-eb4fd6b72d47" (UID: "345c627e-322f-49aa-b3a5-eb4fd6b72d47"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 07:09:08.742046 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.742004 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-config-volume" (OuterVolumeSpecName: "config-volume") pod "345c627e-322f-49aa-b3a5-eb4fd6b72d47" (UID: "345c627e-322f-49aa-b3a5-eb4fd6b72d47"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:09:08.742046 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.742020 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/345c627e-322f-49aa-b3a5-eb4fd6b72d47-kube-api-access-6f6kc" (OuterVolumeSpecName: "kube-api-access-6f6kc") pod "345c627e-322f-49aa-b3a5-eb4fd6b72d47" (UID: "345c627e-322f-49aa-b3a5-eb4fd6b72d47"). InnerVolumeSpecName "kube-api-access-6f6kc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:09:08.742209 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.742173 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/345c627e-322f-49aa-b3a5-eb4fd6b72d47-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "345c627e-322f-49aa-b3a5-eb4fd6b72d47" (UID: "345c627e-322f-49aa-b3a5-eb4fd6b72d47"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:09:08.742262 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.742245 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/345c627e-322f-49aa-b3a5-eb4fd6b72d47-config-out" (OuterVolumeSpecName: "config-out") pod "345c627e-322f-49aa-b3a5-eb4fd6b72d47" (UID: "345c627e-322f-49aa-b3a5-eb4fd6b72d47"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 07:09:08.742554 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.742531 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "345c627e-322f-49aa-b3a5-eb4fd6b72d47" (UID: "345c627e-322f-49aa-b3a5-eb4fd6b72d47"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:09:08.743107 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.743083 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "345c627e-322f-49aa-b3a5-eb4fd6b72d47" (UID: "345c627e-322f-49aa-b3a5-eb4fd6b72d47"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:09:08.743498 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.743477 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "345c627e-322f-49aa-b3a5-eb4fd6b72d47" (UID: "345c627e-322f-49aa-b3a5-eb4fd6b72d47"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:09:08.743744 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.743728 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "345c627e-322f-49aa-b3a5-eb4fd6b72d47" (UID: "345c627e-322f-49aa-b3a5-eb4fd6b72d47"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:09:08.746776 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.746734 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "345c627e-322f-49aa-b3a5-eb4fd6b72d47" (UID: "345c627e-322f-49aa-b3a5-eb4fd6b72d47"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:09:08.753147 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.753128 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-web-config" (OuterVolumeSpecName: "web-config") pod "345c627e-322f-49aa-b3a5-eb4fd6b72d47" (UID: "345c627e-322f-49aa-b3a5-eb4fd6b72d47"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:09:08.834491 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.834460 2577 generic.go:358] "Generic (PLEG): container finished" podID="345c627e-322f-49aa-b3a5-eb4fd6b72d47" containerID="c3d5076487ca03a85e8bb868772c1ee8e48a3db522bf043d7d7f59bd3444dbc1" exitCode=0 Apr 20 07:09:08.834491 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.834485 2577 generic.go:358] "Generic (PLEG): container finished" podID="345c627e-322f-49aa-b3a5-eb4fd6b72d47" containerID="684660db980e935bdebb63351e3ebd8668e025ceaed9acf60bd640789e3b633c" exitCode=0 Apr 20 07:09:08.834491 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.834491 2577 generic.go:358] "Generic (PLEG): container finished" podID="345c627e-322f-49aa-b3a5-eb4fd6b72d47" containerID="a83461c6944c6c1b3b0a8b7b278a71751e760b47e9c73f37ec44a8cd999fee0c" exitCode=0 Apr 20 07:09:08.834491 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.834496 2577 generic.go:358] "Generic (PLEG): container finished" podID="345c627e-322f-49aa-b3a5-eb4fd6b72d47" containerID="fb71ed01c20e0ec7e478ccf206de3be33bca4f762aa866f2c235f7aa123b489f" exitCode=0 Apr 20 07:09:08.834702 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.834502 2577 generic.go:358] "Generic (PLEG): container finished" podID="345c627e-322f-49aa-b3a5-eb4fd6b72d47" containerID="4b06f7504726f919556348409325d9ca114b27eeedda314b02e4a8e9003fbf1c" exitCode=0 Apr 20 07:09:08.834702 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.834507 2577 generic.go:358] "Generic (PLEG): container finished" podID="345c627e-322f-49aa-b3a5-eb4fd6b72d47" containerID="c170cb055e5d6e9fa4a3583ca9551d0bfbbcc7ac61bf2bcdb7baa35c8a41c6c9" exitCode=0 Apr 20 07:09:08.834702 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.834543 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"345c627e-322f-49aa-b3a5-eb4fd6b72d47","Type":"ContainerDied","Data":"c3d5076487ca03a85e8bb868772c1ee8e48a3db522bf043d7d7f59bd3444dbc1"} Apr 20 07:09:08.834702 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.834563 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:09:08.834702 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.834583 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"345c627e-322f-49aa-b3a5-eb4fd6b72d47","Type":"ContainerDied","Data":"684660db980e935bdebb63351e3ebd8668e025ceaed9acf60bd640789e3b633c"} Apr 20 07:09:08.834702 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.834594 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"345c627e-322f-49aa-b3a5-eb4fd6b72d47","Type":"ContainerDied","Data":"a83461c6944c6c1b3b0a8b7b278a71751e760b47e9c73f37ec44a8cd999fee0c"} Apr 20 07:09:08.834702 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.834604 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"345c627e-322f-49aa-b3a5-eb4fd6b72d47","Type":"ContainerDied","Data":"fb71ed01c20e0ec7e478ccf206de3be33bca4f762aa866f2c235f7aa123b489f"} Apr 20 07:09:08.834702 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.834613 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"345c627e-322f-49aa-b3a5-eb4fd6b72d47","Type":"ContainerDied","Data":"4b06f7504726f919556348409325d9ca114b27eeedda314b02e4a8e9003fbf1c"} Apr 20 07:09:08.834702 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.834624 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"345c627e-322f-49aa-b3a5-eb4fd6b72d47","Type":"ContainerDied","Data":"c170cb055e5d6e9fa4a3583ca9551d0bfbbcc7ac61bf2bcdb7baa35c8a41c6c9"} Apr 20 07:09:08.834702 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.834634 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"345c627e-322f-49aa-b3a5-eb4fd6b72d47","Type":"ContainerDied","Data":"d5f0c171b3e1d76b2dde918ff36cc75ce977a984d165d7853e5e5cdb6766ef45"} Apr 20 07:09:08.834702 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.834647 2577 scope.go:117] "RemoveContainer" containerID="c3d5076487ca03a85e8bb868772c1ee8e48a3db522bf043d7d7f59bd3444dbc1" Apr 20 07:09:08.840047 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.840025 2577 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-web-config\") on node \"ip-10-0-138-178.ec2.internal\" DevicePath \"\"" Apr 20 07:09:08.840047 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.840046 2577 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/345c627e-322f-49aa-b3a5-eb4fd6b72d47-alertmanager-main-db\") on node \"ip-10-0-138-178.ec2.internal\" DevicePath \"\"" Apr 20 07:09:08.840225 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.840057 2577 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-cluster-tls-config\") on node \"ip-10-0-138-178.ec2.internal\" DevicePath \"\"" Apr 20 07:09:08.840225 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.840066 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6f6kc\" (UniqueName: \"kubernetes.io/projected/345c627e-322f-49aa-b3a5-eb4fd6b72d47-kube-api-access-6f6kc\") on node \"ip-10-0-138-178.ec2.internal\" DevicePath \"\"" Apr 20 07:09:08.840225 ip-10-0-138-178 
kubenswrapper[2577]: I0420 07:09:08.840075 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-138-178.ec2.internal\" DevicePath \"\"" Apr 20 07:09:08.840225 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.840084 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-138-178.ec2.internal\" DevicePath \"\"" Apr 20 07:09:08.840225 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.840093 2577 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/345c627e-322f-49aa-b3a5-eb4fd6b72d47-config-out\") on node \"ip-10-0-138-178.ec2.internal\" DevicePath \"\"" Apr 20 07:09:08.840225 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.840101 2577 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/345c627e-322f-49aa-b3a5-eb4fd6b72d47-tls-assets\") on node \"ip-10-0-138-178.ec2.internal\" DevicePath \"\"" Apr 20 07:09:08.840225 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.840109 2577 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-config-volume\") on node \"ip-10-0-138-178.ec2.internal\" DevicePath \"\"" Apr 20 07:09:08.840225 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.840117 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-secret-alertmanager-main-tls\") on node \"ip-10-0-138-178.ec2.internal\" DevicePath \"\"" Apr 20 07:09:08.840225 ip-10-0-138-178 kubenswrapper[2577]: I0420 
07:09:08.840125 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/345c627e-322f-49aa-b3a5-eb4fd6b72d47-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-138-178.ec2.internal\" DevicePath \"\"" Apr 20 07:09:08.842080 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.842063 2577 scope.go:117] "RemoveContainer" containerID="684660db980e935bdebb63351e3ebd8668e025ceaed9acf60bd640789e3b633c" Apr 20 07:09:08.848957 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.848939 2577 scope.go:117] "RemoveContainer" containerID="a83461c6944c6c1b3b0a8b7b278a71751e760b47e9c73f37ec44a8cd999fee0c" Apr 20 07:09:08.855573 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.855555 2577 scope.go:117] "RemoveContainer" containerID="fb71ed01c20e0ec7e478ccf206de3be33bca4f762aa866f2c235f7aa123b489f" Apr 20 07:09:08.858766 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.858744 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 07:09:08.862478 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.862455 2577 scope.go:117] "RemoveContainer" containerID="4b06f7504726f919556348409325d9ca114b27eeedda314b02e4a8e9003fbf1c" Apr 20 07:09:08.864709 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.864689 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 07:09:08.869246 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.869232 2577 scope.go:117] "RemoveContainer" containerID="c170cb055e5d6e9fa4a3583ca9551d0bfbbcc7ac61bf2bcdb7baa35c8a41c6c9" Apr 20 07:09:08.875782 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.875764 2577 scope.go:117] "RemoveContainer" containerID="846de324ba0749c41599c6b3c894ebab944d18353366b5eac3157c521f6e897f" Apr 20 07:09:08.882065 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.882049 2577 scope.go:117] "RemoveContainer" 
containerID="c3d5076487ca03a85e8bb868772c1ee8e48a3db522bf043d7d7f59bd3444dbc1" Apr 20 07:09:08.882298 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:09:08.882281 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3d5076487ca03a85e8bb868772c1ee8e48a3db522bf043d7d7f59bd3444dbc1\": container with ID starting with c3d5076487ca03a85e8bb868772c1ee8e48a3db522bf043d7d7f59bd3444dbc1 not found: ID does not exist" containerID="c3d5076487ca03a85e8bb868772c1ee8e48a3db522bf043d7d7f59bd3444dbc1" Apr 20 07:09:08.882360 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.882307 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3d5076487ca03a85e8bb868772c1ee8e48a3db522bf043d7d7f59bd3444dbc1"} err="failed to get container status \"c3d5076487ca03a85e8bb868772c1ee8e48a3db522bf043d7d7f59bd3444dbc1\": rpc error: code = NotFound desc = could not find container \"c3d5076487ca03a85e8bb868772c1ee8e48a3db522bf043d7d7f59bd3444dbc1\": container with ID starting with c3d5076487ca03a85e8bb868772c1ee8e48a3db522bf043d7d7f59bd3444dbc1 not found: ID does not exist" Apr 20 07:09:08.882360 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.882347 2577 scope.go:117] "RemoveContainer" containerID="684660db980e935bdebb63351e3ebd8668e025ceaed9acf60bd640789e3b633c" Apr 20 07:09:08.882593 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:09:08.882576 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"684660db980e935bdebb63351e3ebd8668e025ceaed9acf60bd640789e3b633c\": container with ID starting with 684660db980e935bdebb63351e3ebd8668e025ceaed9acf60bd640789e3b633c not found: ID does not exist" containerID="684660db980e935bdebb63351e3ebd8668e025ceaed9acf60bd640789e3b633c" Apr 20 07:09:08.882632 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.882599 2577 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"684660db980e935bdebb63351e3ebd8668e025ceaed9acf60bd640789e3b633c"} err="failed to get container status \"684660db980e935bdebb63351e3ebd8668e025ceaed9acf60bd640789e3b633c\": rpc error: code = NotFound desc = could not find container \"684660db980e935bdebb63351e3ebd8668e025ceaed9acf60bd640789e3b633c\": container with ID starting with 684660db980e935bdebb63351e3ebd8668e025ceaed9acf60bd640789e3b633c not found: ID does not exist" Apr 20 07:09:08.882632 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.882616 2577 scope.go:117] "RemoveContainer" containerID="a83461c6944c6c1b3b0a8b7b278a71751e760b47e9c73f37ec44a8cd999fee0c" Apr 20 07:09:08.882818 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:09:08.882803 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a83461c6944c6c1b3b0a8b7b278a71751e760b47e9c73f37ec44a8cd999fee0c\": container with ID starting with a83461c6944c6c1b3b0a8b7b278a71751e760b47e9c73f37ec44a8cd999fee0c not found: ID does not exist" containerID="a83461c6944c6c1b3b0a8b7b278a71751e760b47e9c73f37ec44a8cd999fee0c" Apr 20 07:09:08.882856 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.882824 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a83461c6944c6c1b3b0a8b7b278a71751e760b47e9c73f37ec44a8cd999fee0c"} err="failed to get container status \"a83461c6944c6c1b3b0a8b7b278a71751e760b47e9c73f37ec44a8cd999fee0c\": rpc error: code = NotFound desc = could not find container \"a83461c6944c6c1b3b0a8b7b278a71751e760b47e9c73f37ec44a8cd999fee0c\": container with ID starting with a83461c6944c6c1b3b0a8b7b278a71751e760b47e9c73f37ec44a8cd999fee0c not found: ID does not exist" Apr 20 07:09:08.882856 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.882840 2577 scope.go:117] "RemoveContainer" containerID="fb71ed01c20e0ec7e478ccf206de3be33bca4f762aa866f2c235f7aa123b489f" Apr 20 07:09:08.883038 
ip-10-0-138-178 kubenswrapper[2577]: E0420 07:09:08.883023 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb71ed01c20e0ec7e478ccf206de3be33bca4f762aa866f2c235f7aa123b489f\": container with ID starting with fb71ed01c20e0ec7e478ccf206de3be33bca4f762aa866f2c235f7aa123b489f not found: ID does not exist" containerID="fb71ed01c20e0ec7e478ccf206de3be33bca4f762aa866f2c235f7aa123b489f" Apr 20 07:09:08.883073 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.883042 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb71ed01c20e0ec7e478ccf206de3be33bca4f762aa866f2c235f7aa123b489f"} err="failed to get container status \"fb71ed01c20e0ec7e478ccf206de3be33bca4f762aa866f2c235f7aa123b489f\": rpc error: code = NotFound desc = could not find container \"fb71ed01c20e0ec7e478ccf206de3be33bca4f762aa866f2c235f7aa123b489f\": container with ID starting with fb71ed01c20e0ec7e478ccf206de3be33bca4f762aa866f2c235f7aa123b489f not found: ID does not exist" Apr 20 07:09:08.883073 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.883056 2577 scope.go:117] "RemoveContainer" containerID="4b06f7504726f919556348409325d9ca114b27eeedda314b02e4a8e9003fbf1c" Apr 20 07:09:08.883256 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:09:08.883239 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b06f7504726f919556348409325d9ca114b27eeedda314b02e4a8e9003fbf1c\": container with ID starting with 4b06f7504726f919556348409325d9ca114b27eeedda314b02e4a8e9003fbf1c not found: ID does not exist" containerID="4b06f7504726f919556348409325d9ca114b27eeedda314b02e4a8e9003fbf1c" Apr 20 07:09:08.883294 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.883261 2577 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4b06f7504726f919556348409325d9ca114b27eeedda314b02e4a8e9003fbf1c"} err="failed to get container status \"4b06f7504726f919556348409325d9ca114b27eeedda314b02e4a8e9003fbf1c\": rpc error: code = NotFound desc = could not find container \"4b06f7504726f919556348409325d9ca114b27eeedda314b02e4a8e9003fbf1c\": container with ID starting with 4b06f7504726f919556348409325d9ca114b27eeedda314b02e4a8e9003fbf1c not found: ID does not exist" Apr 20 07:09:08.883294 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.883273 2577 scope.go:117] "RemoveContainer" containerID="c170cb055e5d6e9fa4a3583ca9551d0bfbbcc7ac61bf2bcdb7baa35c8a41c6c9" Apr 20 07:09:08.883502 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:09:08.883483 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c170cb055e5d6e9fa4a3583ca9551d0bfbbcc7ac61bf2bcdb7baa35c8a41c6c9\": container with ID starting with c170cb055e5d6e9fa4a3583ca9551d0bfbbcc7ac61bf2bcdb7baa35c8a41c6c9 not found: ID does not exist" containerID="c170cb055e5d6e9fa4a3583ca9551d0bfbbcc7ac61bf2bcdb7baa35c8a41c6c9" Apr 20 07:09:08.883545 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.883506 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c170cb055e5d6e9fa4a3583ca9551d0bfbbcc7ac61bf2bcdb7baa35c8a41c6c9"} err="failed to get container status \"c170cb055e5d6e9fa4a3583ca9551d0bfbbcc7ac61bf2bcdb7baa35c8a41c6c9\": rpc error: code = NotFound desc = could not find container \"c170cb055e5d6e9fa4a3583ca9551d0bfbbcc7ac61bf2bcdb7baa35c8a41c6c9\": container with ID starting with c170cb055e5d6e9fa4a3583ca9551d0bfbbcc7ac61bf2bcdb7baa35c8a41c6c9 not found: ID does not exist" Apr 20 07:09:08.883545 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.883520 2577 scope.go:117] "RemoveContainer" containerID="846de324ba0749c41599c6b3c894ebab944d18353366b5eac3157c521f6e897f" Apr 20 07:09:08.883749 ip-10-0-138-178 
kubenswrapper[2577]: E0420 07:09:08.883733 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"846de324ba0749c41599c6b3c894ebab944d18353366b5eac3157c521f6e897f\": container with ID starting with 846de324ba0749c41599c6b3c894ebab944d18353366b5eac3157c521f6e897f not found: ID does not exist" containerID="846de324ba0749c41599c6b3c894ebab944d18353366b5eac3157c521f6e897f" Apr 20 07:09:08.883796 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.883752 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"846de324ba0749c41599c6b3c894ebab944d18353366b5eac3157c521f6e897f"} err="failed to get container status \"846de324ba0749c41599c6b3c894ebab944d18353366b5eac3157c521f6e897f\": rpc error: code = NotFound desc = could not find container \"846de324ba0749c41599c6b3c894ebab944d18353366b5eac3157c521f6e897f\": container with ID starting with 846de324ba0749c41599c6b3c894ebab944d18353366b5eac3157c521f6e897f not found: ID does not exist" Apr 20 07:09:08.883796 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.883766 2577 scope.go:117] "RemoveContainer" containerID="c3d5076487ca03a85e8bb868772c1ee8e48a3db522bf043d7d7f59bd3444dbc1" Apr 20 07:09:08.883971 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.883955 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3d5076487ca03a85e8bb868772c1ee8e48a3db522bf043d7d7f59bd3444dbc1"} err="failed to get container status \"c3d5076487ca03a85e8bb868772c1ee8e48a3db522bf043d7d7f59bd3444dbc1\": rpc error: code = NotFound desc = could not find container \"c3d5076487ca03a85e8bb868772c1ee8e48a3db522bf043d7d7f59bd3444dbc1\": container with ID starting with c3d5076487ca03a85e8bb868772c1ee8e48a3db522bf043d7d7f59bd3444dbc1 not found: ID does not exist" Apr 20 07:09:08.884011 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.883971 2577 scope.go:117] "RemoveContainer" 
containerID="684660db980e935bdebb63351e3ebd8668e025ceaed9acf60bd640789e3b633c" Apr 20 07:09:08.884176 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.884158 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"684660db980e935bdebb63351e3ebd8668e025ceaed9acf60bd640789e3b633c"} err="failed to get container status \"684660db980e935bdebb63351e3ebd8668e025ceaed9acf60bd640789e3b633c\": rpc error: code = NotFound desc = could not find container \"684660db980e935bdebb63351e3ebd8668e025ceaed9acf60bd640789e3b633c\": container with ID starting with 684660db980e935bdebb63351e3ebd8668e025ceaed9acf60bd640789e3b633c not found: ID does not exist" Apr 20 07:09:08.884217 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.884177 2577 scope.go:117] "RemoveContainer" containerID="a83461c6944c6c1b3b0a8b7b278a71751e760b47e9c73f37ec44a8cd999fee0c" Apr 20 07:09:08.884398 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.884382 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a83461c6944c6c1b3b0a8b7b278a71751e760b47e9c73f37ec44a8cd999fee0c"} err="failed to get container status \"a83461c6944c6c1b3b0a8b7b278a71751e760b47e9c73f37ec44a8cd999fee0c\": rpc error: code = NotFound desc = could not find container \"a83461c6944c6c1b3b0a8b7b278a71751e760b47e9c73f37ec44a8cd999fee0c\": container with ID starting with a83461c6944c6c1b3b0a8b7b278a71751e760b47e9c73f37ec44a8cd999fee0c not found: ID does not exist" Apr 20 07:09:08.884456 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.884400 2577 scope.go:117] "RemoveContainer" containerID="fb71ed01c20e0ec7e478ccf206de3be33bca4f762aa866f2c235f7aa123b489f" Apr 20 07:09:08.884631 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.884614 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb71ed01c20e0ec7e478ccf206de3be33bca4f762aa866f2c235f7aa123b489f"} err="failed to get container status 
\"fb71ed01c20e0ec7e478ccf206de3be33bca4f762aa866f2c235f7aa123b489f\": rpc error: code = NotFound desc = could not find container \"fb71ed01c20e0ec7e478ccf206de3be33bca4f762aa866f2c235f7aa123b489f\": container with ID starting with fb71ed01c20e0ec7e478ccf206de3be33bca4f762aa866f2c235f7aa123b489f not found: ID does not exist" Apr 20 07:09:08.884679 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.884632 2577 scope.go:117] "RemoveContainer" containerID="4b06f7504726f919556348409325d9ca114b27eeedda314b02e4a8e9003fbf1c" Apr 20 07:09:08.884850 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.884832 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b06f7504726f919556348409325d9ca114b27eeedda314b02e4a8e9003fbf1c"} err="failed to get container status \"4b06f7504726f919556348409325d9ca114b27eeedda314b02e4a8e9003fbf1c\": rpc error: code = NotFound desc = could not find container \"4b06f7504726f919556348409325d9ca114b27eeedda314b02e4a8e9003fbf1c\": container with ID starting with 4b06f7504726f919556348409325d9ca114b27eeedda314b02e4a8e9003fbf1c not found: ID does not exist" Apr 20 07:09:08.884895 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.884850 2577 scope.go:117] "RemoveContainer" containerID="c170cb055e5d6e9fa4a3583ca9551d0bfbbcc7ac61bf2bcdb7baa35c8a41c6c9" Apr 20 07:09:08.885059 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.885034 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c170cb055e5d6e9fa4a3583ca9551d0bfbbcc7ac61bf2bcdb7baa35c8a41c6c9"} err="failed to get container status \"c170cb055e5d6e9fa4a3583ca9551d0bfbbcc7ac61bf2bcdb7baa35c8a41c6c9\": rpc error: code = NotFound desc = could not find container \"c170cb055e5d6e9fa4a3583ca9551d0bfbbcc7ac61bf2bcdb7baa35c8a41c6c9\": container with ID starting with c170cb055e5d6e9fa4a3583ca9551d0bfbbcc7ac61bf2bcdb7baa35c8a41c6c9 not found: ID does not exist" Apr 20 07:09:08.885059 ip-10-0-138-178 
kubenswrapper[2577]: I0420 07:09:08.885059 2577 scope.go:117] "RemoveContainer" containerID="846de324ba0749c41599c6b3c894ebab944d18353366b5eac3157c521f6e897f"
Apr 20 07:09:08.885263 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.885246 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"846de324ba0749c41599c6b3c894ebab944d18353366b5eac3157c521f6e897f"} err="failed to get container status \"846de324ba0749c41599c6b3c894ebab944d18353366b5eac3157c521f6e897f\": rpc error: code = NotFound desc = could not find container \"846de324ba0749c41599c6b3c894ebab944d18353366b5eac3157c521f6e897f\": container with ID starting with 846de324ba0749c41599c6b3c894ebab944d18353366b5eac3157c521f6e897f not found: ID does not exist"
Apr 20 07:09:08.885310 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.885265 2577 scope.go:117] "RemoveContainer" containerID="c3d5076487ca03a85e8bb868772c1ee8e48a3db522bf043d7d7f59bd3444dbc1"
Apr 20 07:09:08.885500 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.885485 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3d5076487ca03a85e8bb868772c1ee8e48a3db522bf043d7d7f59bd3444dbc1"} err="failed to get container status \"c3d5076487ca03a85e8bb868772c1ee8e48a3db522bf043d7d7f59bd3444dbc1\": rpc error: code = NotFound desc = could not find container \"c3d5076487ca03a85e8bb868772c1ee8e48a3db522bf043d7d7f59bd3444dbc1\": container with ID starting with c3d5076487ca03a85e8bb868772c1ee8e48a3db522bf043d7d7f59bd3444dbc1 not found: ID does not exist"
Apr 20 07:09:08.885560 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.885502 2577 scope.go:117] "RemoveContainer" containerID="684660db980e935bdebb63351e3ebd8668e025ceaed9acf60bd640789e3b633c"
Apr 20 07:09:08.885671 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.885656 2577 pod_container_deletor.go:53] "DeleteContainer returned error"
containerID={"Type":"cri-o","ID":"684660db980e935bdebb63351e3ebd8668e025ceaed9acf60bd640789e3b633c"} err="failed to get container status \"684660db980e935bdebb63351e3ebd8668e025ceaed9acf60bd640789e3b633c\": rpc error: code = NotFound desc = could not find container \"684660db980e935bdebb63351e3ebd8668e025ceaed9acf60bd640789e3b633c\": container with ID starting with 684660db980e935bdebb63351e3ebd8668e025ceaed9acf60bd640789e3b633c not found: ID does not exist"
Apr 20 07:09:08.885725 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.885671 2577 scope.go:117] "RemoveContainer" containerID="a83461c6944c6c1b3b0a8b7b278a71751e760b47e9c73f37ec44a8cd999fee0c"
Apr 20 07:09:08.885861 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.885847 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a83461c6944c6c1b3b0a8b7b278a71751e760b47e9c73f37ec44a8cd999fee0c"} err="failed to get container status \"a83461c6944c6c1b3b0a8b7b278a71751e760b47e9c73f37ec44a8cd999fee0c\": rpc error: code = NotFound desc = could not find container \"a83461c6944c6c1b3b0a8b7b278a71751e760b47e9c73f37ec44a8cd999fee0c\": container with ID starting with a83461c6944c6c1b3b0a8b7b278a71751e760b47e9c73f37ec44a8cd999fee0c not found: ID does not exist"
Apr 20 07:09:08.885908 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.885861 2577 scope.go:117] "RemoveContainer" containerID="fb71ed01c20e0ec7e478ccf206de3be33bca4f762aa866f2c235f7aa123b489f"
Apr 20 07:09:08.886040 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.886025 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb71ed01c20e0ec7e478ccf206de3be33bca4f762aa866f2c235f7aa123b489f"} err="failed to get container status \"fb71ed01c20e0ec7e478ccf206de3be33bca4f762aa866f2c235f7aa123b489f\": rpc error: code = NotFound desc = could not find container \"fb71ed01c20e0ec7e478ccf206de3be33bca4f762aa866f2c235f7aa123b489f\": container with ID starting with
fb71ed01c20e0ec7e478ccf206de3be33bca4f762aa866f2c235f7aa123b489f not found: ID does not exist"
Apr 20 07:09:08.886080 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.886041 2577 scope.go:117] "RemoveContainer" containerID="4b06f7504726f919556348409325d9ca114b27eeedda314b02e4a8e9003fbf1c"
Apr 20 07:09:08.886207 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.886192 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b06f7504726f919556348409325d9ca114b27eeedda314b02e4a8e9003fbf1c"} err="failed to get container status \"4b06f7504726f919556348409325d9ca114b27eeedda314b02e4a8e9003fbf1c\": rpc error: code = NotFound desc = could not find container \"4b06f7504726f919556348409325d9ca114b27eeedda314b02e4a8e9003fbf1c\": container with ID starting with 4b06f7504726f919556348409325d9ca114b27eeedda314b02e4a8e9003fbf1c not found: ID does not exist"
Apr 20 07:09:08.886244 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.886207 2577 scope.go:117] "RemoveContainer" containerID="c170cb055e5d6e9fa4a3583ca9551d0bfbbcc7ac61bf2bcdb7baa35c8a41c6c9"
Apr 20 07:09:08.886428 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.886408 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c170cb055e5d6e9fa4a3583ca9551d0bfbbcc7ac61bf2bcdb7baa35c8a41c6c9"} err="failed to get container status \"c170cb055e5d6e9fa4a3583ca9551d0bfbbcc7ac61bf2bcdb7baa35c8a41c6c9\": rpc error: code = NotFound desc = could not find container \"c170cb055e5d6e9fa4a3583ca9551d0bfbbcc7ac61bf2bcdb7baa35c8a41c6c9\": container with ID starting with c170cb055e5d6e9fa4a3583ca9551d0bfbbcc7ac61bf2bcdb7baa35c8a41c6c9 not found: ID does not exist"
Apr 20 07:09:08.886428 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.886427 2577 scope.go:117] "RemoveContainer" containerID="846de324ba0749c41599c6b3c894ebab944d18353366b5eac3157c521f6e897f"
Apr 20 07:09:08.886715 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.886699 2577
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"846de324ba0749c41599c6b3c894ebab944d18353366b5eac3157c521f6e897f"} err="failed to get container status \"846de324ba0749c41599c6b3c894ebab944d18353366b5eac3157c521f6e897f\": rpc error: code = NotFound desc = could not find container \"846de324ba0749c41599c6b3c894ebab944d18353366b5eac3157c521f6e897f\": container with ID starting with 846de324ba0749c41599c6b3c894ebab944d18353366b5eac3157c521f6e897f not found: ID does not exist"
Apr 20 07:09:08.886758 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.886716 2577 scope.go:117] "RemoveContainer" containerID="c3d5076487ca03a85e8bb868772c1ee8e48a3db522bf043d7d7f59bd3444dbc1"
Apr 20 07:09:08.886929 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.886915 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3d5076487ca03a85e8bb868772c1ee8e48a3db522bf043d7d7f59bd3444dbc1"} err="failed to get container status \"c3d5076487ca03a85e8bb868772c1ee8e48a3db522bf043d7d7f59bd3444dbc1\": rpc error: code = NotFound desc = could not find container \"c3d5076487ca03a85e8bb868772c1ee8e48a3db522bf043d7d7f59bd3444dbc1\": container with ID starting with c3d5076487ca03a85e8bb868772c1ee8e48a3db522bf043d7d7f59bd3444dbc1 not found: ID does not exist"
Apr 20 07:09:08.886976 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.886929 2577 scope.go:117] "RemoveContainer" containerID="684660db980e935bdebb63351e3ebd8668e025ceaed9acf60bd640789e3b633c"
Apr 20 07:09:08.887137 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.887119 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"684660db980e935bdebb63351e3ebd8668e025ceaed9acf60bd640789e3b633c"} err="failed to get container status \"684660db980e935bdebb63351e3ebd8668e025ceaed9acf60bd640789e3b633c\": rpc error: code = NotFound desc = could not find container
\"684660db980e935bdebb63351e3ebd8668e025ceaed9acf60bd640789e3b633c\": container with ID starting with 684660db980e935bdebb63351e3ebd8668e025ceaed9acf60bd640789e3b633c not found: ID does not exist"
Apr 20 07:09:08.887183 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.887138 2577 scope.go:117] "RemoveContainer" containerID="a83461c6944c6c1b3b0a8b7b278a71751e760b47e9c73f37ec44a8cd999fee0c"
Apr 20 07:09:08.887313 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.887299 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a83461c6944c6c1b3b0a8b7b278a71751e760b47e9c73f37ec44a8cd999fee0c"} err="failed to get container status \"a83461c6944c6c1b3b0a8b7b278a71751e760b47e9c73f37ec44a8cd999fee0c\": rpc error: code = NotFound desc = could not find container \"a83461c6944c6c1b3b0a8b7b278a71751e760b47e9c73f37ec44a8cd999fee0c\": container with ID starting with a83461c6944c6c1b3b0a8b7b278a71751e760b47e9c73f37ec44a8cd999fee0c not found: ID does not exist"
Apr 20 07:09:08.887313 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.887313 2577 scope.go:117] "RemoveContainer" containerID="fb71ed01c20e0ec7e478ccf206de3be33bca4f762aa866f2c235f7aa123b489f"
Apr 20 07:09:08.887513 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.887498 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb71ed01c20e0ec7e478ccf206de3be33bca4f762aa866f2c235f7aa123b489f"} err="failed to get container status \"fb71ed01c20e0ec7e478ccf206de3be33bca4f762aa866f2c235f7aa123b489f\": rpc error: code = NotFound desc = could not find container \"fb71ed01c20e0ec7e478ccf206de3be33bca4f762aa866f2c235f7aa123b489f\": container with ID starting with fb71ed01c20e0ec7e478ccf206de3be33bca4f762aa866f2c235f7aa123b489f not found: ID does not exist"
Apr 20 07:09:08.887562 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.887513 2577 scope.go:117] "RemoveContainer"
containerID="4b06f7504726f919556348409325d9ca114b27eeedda314b02e4a8e9003fbf1c"
Apr 20 07:09:08.887716 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.887693 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b06f7504726f919556348409325d9ca114b27eeedda314b02e4a8e9003fbf1c"} err="failed to get container status \"4b06f7504726f919556348409325d9ca114b27eeedda314b02e4a8e9003fbf1c\": rpc error: code = NotFound desc = could not find container \"4b06f7504726f919556348409325d9ca114b27eeedda314b02e4a8e9003fbf1c\": container with ID starting with 4b06f7504726f919556348409325d9ca114b27eeedda314b02e4a8e9003fbf1c not found: ID does not exist"
Apr 20 07:09:08.887782 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.887718 2577 scope.go:117] "RemoveContainer" containerID="c170cb055e5d6e9fa4a3583ca9551d0bfbbcc7ac61bf2bcdb7baa35c8a41c6c9"
Apr 20 07:09:08.887938 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.887922 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c170cb055e5d6e9fa4a3583ca9551d0bfbbcc7ac61bf2bcdb7baa35c8a41c6c9"} err="failed to get container status \"c170cb055e5d6e9fa4a3583ca9551d0bfbbcc7ac61bf2bcdb7baa35c8a41c6c9\": rpc error: code = NotFound desc = could not find container \"c170cb055e5d6e9fa4a3583ca9551d0bfbbcc7ac61bf2bcdb7baa35c8a41c6c9\": container with ID starting with c170cb055e5d6e9fa4a3583ca9551d0bfbbcc7ac61bf2bcdb7baa35c8a41c6c9 not found: ID does not exist"
Apr 20 07:09:08.887986 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.887939 2577 scope.go:117] "RemoveContainer" containerID="846de324ba0749c41599c6b3c894ebab944d18353366b5eac3157c521f6e897f"
Apr 20 07:09:08.888126 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.888108 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"846de324ba0749c41599c6b3c894ebab944d18353366b5eac3157c521f6e897f"} err="failed to get container status
\"846de324ba0749c41599c6b3c894ebab944d18353366b5eac3157c521f6e897f\": rpc error: code = NotFound desc = could not find container \"846de324ba0749c41599c6b3c894ebab944d18353366b5eac3157c521f6e897f\": container with ID starting with 846de324ba0749c41599c6b3c894ebab944d18353366b5eac3157c521f6e897f not found: ID does not exist"
Apr 20 07:09:08.888194 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.888127 2577 scope.go:117] "RemoveContainer" containerID="c3d5076487ca03a85e8bb868772c1ee8e48a3db522bf043d7d7f59bd3444dbc1"
Apr 20 07:09:08.888369 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.888348 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3d5076487ca03a85e8bb868772c1ee8e48a3db522bf043d7d7f59bd3444dbc1"} err="failed to get container status \"c3d5076487ca03a85e8bb868772c1ee8e48a3db522bf043d7d7f59bd3444dbc1\": rpc error: code = NotFound desc = could not find container \"c3d5076487ca03a85e8bb868772c1ee8e48a3db522bf043d7d7f59bd3444dbc1\": container with ID starting with c3d5076487ca03a85e8bb868772c1ee8e48a3db522bf043d7d7f59bd3444dbc1 not found: ID does not exist"
Apr 20 07:09:08.888369 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.888369 2577 scope.go:117] "RemoveContainer" containerID="684660db980e935bdebb63351e3ebd8668e025ceaed9acf60bd640789e3b633c"
Apr 20 07:09:08.888631 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.888614 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"684660db980e935bdebb63351e3ebd8668e025ceaed9acf60bd640789e3b633c"} err="failed to get container status \"684660db980e935bdebb63351e3ebd8668e025ceaed9acf60bd640789e3b633c\": rpc error: code = NotFound desc = could not find container \"684660db980e935bdebb63351e3ebd8668e025ceaed9acf60bd640789e3b633c\": container with ID starting with 684660db980e935bdebb63351e3ebd8668e025ceaed9acf60bd640789e3b633c not found: ID does not exist"
Apr 20 07:09:08.888683 ip-10-0-138-178
kubenswrapper[2577]: I0420 07:09:08.888633 2577 scope.go:117] "RemoveContainer" containerID="a83461c6944c6c1b3b0a8b7b278a71751e760b47e9c73f37ec44a8cd999fee0c"
Apr 20 07:09:08.888849 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.888828 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a83461c6944c6c1b3b0a8b7b278a71751e760b47e9c73f37ec44a8cd999fee0c"} err="failed to get container status \"a83461c6944c6c1b3b0a8b7b278a71751e760b47e9c73f37ec44a8cd999fee0c\": rpc error: code = NotFound desc = could not find container \"a83461c6944c6c1b3b0a8b7b278a71751e760b47e9c73f37ec44a8cd999fee0c\": container with ID starting with a83461c6944c6c1b3b0a8b7b278a71751e760b47e9c73f37ec44a8cd999fee0c not found: ID does not exist"
Apr 20 07:09:08.888849 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.888849 2577 scope.go:117] "RemoveContainer" containerID="fb71ed01c20e0ec7e478ccf206de3be33bca4f762aa866f2c235f7aa123b489f"
Apr 20 07:09:08.889099 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.889068 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb71ed01c20e0ec7e478ccf206de3be33bca4f762aa866f2c235f7aa123b489f"} err="failed to get container status \"fb71ed01c20e0ec7e478ccf206de3be33bca4f762aa866f2c235f7aa123b489f\": rpc error: code = NotFound desc = could not find container \"fb71ed01c20e0ec7e478ccf206de3be33bca4f762aa866f2c235f7aa123b489f\": container with ID starting with fb71ed01c20e0ec7e478ccf206de3be33bca4f762aa866f2c235f7aa123b489f not found: ID does not exist"
Apr 20 07:09:08.889099 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.889089 2577 scope.go:117] "RemoveContainer" containerID="4b06f7504726f919556348409325d9ca114b27eeedda314b02e4a8e9003fbf1c"
Apr 20 07:09:08.889448 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.889411 2577 pod_container_deletor.go:53] "DeleteContainer returned error"
containerID={"Type":"cri-o","ID":"4b06f7504726f919556348409325d9ca114b27eeedda314b02e4a8e9003fbf1c"} err="failed to get container status \"4b06f7504726f919556348409325d9ca114b27eeedda314b02e4a8e9003fbf1c\": rpc error: code = NotFound desc = could not find container \"4b06f7504726f919556348409325d9ca114b27eeedda314b02e4a8e9003fbf1c\": container with ID starting with 4b06f7504726f919556348409325d9ca114b27eeedda314b02e4a8e9003fbf1c not found: ID does not exist"
Apr 20 07:09:08.889554 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.889449 2577 scope.go:117] "RemoveContainer" containerID="c170cb055e5d6e9fa4a3583ca9551d0bfbbcc7ac61bf2bcdb7baa35c8a41c6c9"
Apr 20 07:09:08.889675 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.889656 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c170cb055e5d6e9fa4a3583ca9551d0bfbbcc7ac61bf2bcdb7baa35c8a41c6c9"} err="failed to get container status \"c170cb055e5d6e9fa4a3583ca9551d0bfbbcc7ac61bf2bcdb7baa35c8a41c6c9\": rpc error: code = NotFound desc = could not find container \"c170cb055e5d6e9fa4a3583ca9551d0bfbbcc7ac61bf2bcdb7baa35c8a41c6c9\": container with ID starting with c170cb055e5d6e9fa4a3583ca9551d0bfbbcc7ac61bf2bcdb7baa35c8a41c6c9 not found: ID does not exist"
Apr 20 07:09:08.889720 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.889677 2577 scope.go:117] "RemoveContainer" containerID="846de324ba0749c41599c6b3c894ebab944d18353366b5eac3157c521f6e897f"
Apr 20 07:09:08.889914 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.889894 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"846de324ba0749c41599c6b3c894ebab944d18353366b5eac3157c521f6e897f"} err="failed to get container status \"846de324ba0749c41599c6b3c894ebab944d18353366b5eac3157c521f6e897f\": rpc error: code = NotFound desc = could not find container \"846de324ba0749c41599c6b3c894ebab944d18353366b5eac3157c521f6e897f\": container with ID starting with
846de324ba0749c41599c6b3c894ebab944d18353366b5eac3157c521f6e897f not found: ID does not exist"
Apr 20 07:09:08.889956 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.889916 2577 scope.go:117] "RemoveContainer" containerID="c3d5076487ca03a85e8bb868772c1ee8e48a3db522bf043d7d7f59bd3444dbc1"
Apr 20 07:09:08.890116 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.890100 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3d5076487ca03a85e8bb868772c1ee8e48a3db522bf043d7d7f59bd3444dbc1"} err="failed to get container status \"c3d5076487ca03a85e8bb868772c1ee8e48a3db522bf043d7d7f59bd3444dbc1\": rpc error: code = NotFound desc = could not find container \"c3d5076487ca03a85e8bb868772c1ee8e48a3db522bf043d7d7f59bd3444dbc1\": container with ID starting with c3d5076487ca03a85e8bb868772c1ee8e48a3db522bf043d7d7f59bd3444dbc1 not found: ID does not exist"
Apr 20 07:09:08.890166 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.890116 2577 scope.go:117] "RemoveContainer" containerID="684660db980e935bdebb63351e3ebd8668e025ceaed9acf60bd640789e3b633c"
Apr 20 07:09:08.890296 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.890278 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"684660db980e935bdebb63351e3ebd8668e025ceaed9acf60bd640789e3b633c"} err="failed to get container status \"684660db980e935bdebb63351e3ebd8668e025ceaed9acf60bd640789e3b633c\": rpc error: code = NotFound desc = could not find container \"684660db980e935bdebb63351e3ebd8668e025ceaed9acf60bd640789e3b633c\": container with ID starting with 684660db980e935bdebb63351e3ebd8668e025ceaed9acf60bd640789e3b633c not found: ID does not exist"
Apr 20 07:09:08.890378 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.890299 2577 scope.go:117] "RemoveContainer" containerID="a83461c6944c6c1b3b0a8b7b278a71751e760b47e9c73f37ec44a8cd999fee0c"
Apr 20 07:09:08.890571 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.890550 2577
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a83461c6944c6c1b3b0a8b7b278a71751e760b47e9c73f37ec44a8cd999fee0c"} err="failed to get container status \"a83461c6944c6c1b3b0a8b7b278a71751e760b47e9c73f37ec44a8cd999fee0c\": rpc error: code = NotFound desc = could not find container \"a83461c6944c6c1b3b0a8b7b278a71751e760b47e9c73f37ec44a8cd999fee0c\": container with ID starting with a83461c6944c6c1b3b0a8b7b278a71751e760b47e9c73f37ec44a8cd999fee0c not found: ID does not exist"
Apr 20 07:09:08.890641 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.890573 2577 scope.go:117] "RemoveContainer" containerID="fb71ed01c20e0ec7e478ccf206de3be33bca4f762aa866f2c235f7aa123b489f"
Apr 20 07:09:08.890803 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.890786 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb71ed01c20e0ec7e478ccf206de3be33bca4f762aa866f2c235f7aa123b489f"} err="failed to get container status \"fb71ed01c20e0ec7e478ccf206de3be33bca4f762aa866f2c235f7aa123b489f\": rpc error: code = NotFound desc = could not find container \"fb71ed01c20e0ec7e478ccf206de3be33bca4f762aa866f2c235f7aa123b489f\": container with ID starting with fb71ed01c20e0ec7e478ccf206de3be33bca4f762aa866f2c235f7aa123b489f not found: ID does not exist"
Apr 20 07:09:08.890865 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.890804 2577 scope.go:117] "RemoveContainer" containerID="4b06f7504726f919556348409325d9ca114b27eeedda314b02e4a8e9003fbf1c"
Apr 20 07:09:08.891021 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.891004 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b06f7504726f919556348409325d9ca114b27eeedda314b02e4a8e9003fbf1c"} err="failed to get container status \"4b06f7504726f919556348409325d9ca114b27eeedda314b02e4a8e9003fbf1c\": rpc error: code = NotFound desc = could not find container
\"4b06f7504726f919556348409325d9ca114b27eeedda314b02e4a8e9003fbf1c\": container with ID starting with 4b06f7504726f919556348409325d9ca114b27eeedda314b02e4a8e9003fbf1c not found: ID does not exist"
Apr 20 07:09:08.891070 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.891024 2577 scope.go:117] "RemoveContainer" containerID="c170cb055e5d6e9fa4a3583ca9551d0bfbbcc7ac61bf2bcdb7baa35c8a41c6c9"
Apr 20 07:09:08.891239 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.891222 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c170cb055e5d6e9fa4a3583ca9551d0bfbbcc7ac61bf2bcdb7baa35c8a41c6c9"} err="failed to get container status \"c170cb055e5d6e9fa4a3583ca9551d0bfbbcc7ac61bf2bcdb7baa35c8a41c6c9\": rpc error: code = NotFound desc = could not find container \"c170cb055e5d6e9fa4a3583ca9551d0bfbbcc7ac61bf2bcdb7baa35c8a41c6c9\": container with ID starting with c170cb055e5d6e9fa4a3583ca9551d0bfbbcc7ac61bf2bcdb7baa35c8a41c6c9 not found: ID does not exist"
Apr 20 07:09:08.891285 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.891240 2577 scope.go:117] "RemoveContainer" containerID="846de324ba0749c41599c6b3c894ebab944d18353366b5eac3157c521f6e897f"
Apr 20 07:09:08.891470 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.891455 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"846de324ba0749c41599c6b3c894ebab944d18353366b5eac3157c521f6e897f"} err="failed to get container status \"846de324ba0749c41599c6b3c894ebab944d18353366b5eac3157c521f6e897f\": rpc error: code = NotFound desc = could not find container \"846de324ba0749c41599c6b3c894ebab944d18353366b5eac3157c521f6e897f\": container with ID starting with 846de324ba0749c41599c6b3c894ebab944d18353366b5eac3157c521f6e897f not found: ID does not exist"
Apr 20 07:09:08.901005 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.900961 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 07:09:08.901217 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.901206 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="345c627e-322f-49aa-b3a5-eb4fd6b72d47" containerName="kube-rbac-proxy-metric"
Apr 20 07:09:08.901254 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.901218 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="345c627e-322f-49aa-b3a5-eb4fd6b72d47" containerName="kube-rbac-proxy-metric"
Apr 20 07:09:08.901254 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.901230 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="345c627e-322f-49aa-b3a5-eb4fd6b72d47" containerName="init-config-reloader"
Apr 20 07:09:08.901254 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.901235 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="345c627e-322f-49aa-b3a5-eb4fd6b72d47" containerName="init-config-reloader"
Apr 20 07:09:08.901254 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.901242 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="345c627e-322f-49aa-b3a5-eb4fd6b72d47" containerName="prom-label-proxy"
Apr 20 07:09:08.901254 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.901248 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="345c627e-322f-49aa-b3a5-eb4fd6b72d47" containerName="prom-label-proxy"
Apr 20 07:09:08.901254 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.901255 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="91c62464-c965-4578-b487-d17860e017d3" containerName="console"
Apr 20 07:09:08.901451 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.901261 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="91c62464-c965-4578-b487-d17860e017d3" containerName="console"
Apr 20 07:09:08.901451 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.901268 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container"
podUID="345c627e-322f-49aa-b3a5-eb4fd6b72d47" containerName="kube-rbac-proxy"
Apr 20 07:09:08.901451 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.901273 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="345c627e-322f-49aa-b3a5-eb4fd6b72d47" containerName="kube-rbac-proxy"
Apr 20 07:09:08.901451 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.901282 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="345c627e-322f-49aa-b3a5-eb4fd6b72d47" containerName="config-reloader"
Apr 20 07:09:08.901451 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.901288 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="345c627e-322f-49aa-b3a5-eb4fd6b72d47" containerName="config-reloader"
Apr 20 07:09:08.901451 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.901295 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="345c627e-322f-49aa-b3a5-eb4fd6b72d47" containerName="kube-rbac-proxy-web"
Apr 20 07:09:08.901451 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.901300 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="345c627e-322f-49aa-b3a5-eb4fd6b72d47" containerName="kube-rbac-proxy-web"
Apr 20 07:09:08.901451 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.901310 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="345c627e-322f-49aa-b3a5-eb4fd6b72d47" containerName="alertmanager"
Apr 20 07:09:08.901451 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.901314 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="345c627e-322f-49aa-b3a5-eb4fd6b72d47" containerName="alertmanager"
Apr 20 07:09:08.901451 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.901382 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="345c627e-322f-49aa-b3a5-eb4fd6b72d47" containerName="alertmanager"
Apr 20 07:09:08.901451 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.901389 2577 memory_manager.go:356] "RemoveStaleState removing
state" podUID="345c627e-322f-49aa-b3a5-eb4fd6b72d47" containerName="kube-rbac-proxy-web"
Apr 20 07:09:08.901451 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.901395 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="345c627e-322f-49aa-b3a5-eb4fd6b72d47" containerName="kube-rbac-proxy-metric"
Apr 20 07:09:08.901451 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.901401 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="91c62464-c965-4578-b487-d17860e017d3" containerName="console"
Apr 20 07:09:08.901451 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.901406 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="345c627e-322f-49aa-b3a5-eb4fd6b72d47" containerName="config-reloader"
Apr 20 07:09:08.901451 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.901412 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="345c627e-322f-49aa-b3a5-eb4fd6b72d47" containerName="prom-label-proxy"
Apr 20 07:09:08.901451 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.901419 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="345c627e-322f-49aa-b3a5-eb4fd6b72d47" containerName="kube-rbac-proxy"
Apr 20 07:09:08.907306 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.907287 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 07:09:08.909988 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.909911 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 20 07:09:08.909988 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.909917 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 20 07:09:08.909988 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.909982 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 20 07:09:08.910217 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.909983 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 20 07:09:08.910217 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.910053 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 20 07:09:08.910217 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.910161 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 20 07:09:08.910217 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.910173 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 20 07:09:08.910476 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.910269 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-m6896\""
Apr 20 07:09:08.910476 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.910443 2577 reflector.go:430] "Caches populated" type="*v1.Secret"
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 20 07:09:08.919576 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.917818 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 20 07:09:08.923217 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.923197 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 07:09:08.940631 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.940611 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1bac0ecf-cc26-4474-8e34-da870cf45d49-config-out\") pod \"alertmanager-main-0\" (UID: \"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 07:09:08.940737 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.940653 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1bac0ecf-cc26-4474-8e34-da870cf45d49-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 07:09:08.940737 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.940675 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1bac0ecf-cc26-4474-8e34-da870cf45d49-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 07:09:08.940737 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.940698 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName:
\"kubernetes.io/secret/1bac0ecf-cc26-4474-8e34-da870cf45d49-web-config\") pod \"alertmanager-main-0\" (UID: \"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:09:08.940737 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.940722 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1bac0ecf-cc26-4474-8e34-da870cf45d49-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:09:08.940873 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.940743 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1bac0ecf-cc26-4474-8e34-da870cf45d49-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:09:08.940873 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.940763 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bac0ecf-cc26-4474-8e34-da870cf45d49-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:09:08.940873 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.940783 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1bac0ecf-cc26-4474-8e34-da870cf45d49-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:09:08.940873 ip-10-0-138-178 kubenswrapper[2577]: I0420 
07:09:08.940815 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1bac0ecf-cc26-4474-8e34-da870cf45d49-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:09:08.940873 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.940839 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1bac0ecf-cc26-4474-8e34-da870cf45d49-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:09:08.940873 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.940868 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1bac0ecf-cc26-4474-8e34-da870cf45d49-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:09:08.941062 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.940886 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1bac0ecf-cc26-4474-8e34-da870cf45d49-config-volume\") pod \"alertmanager-main-0\" (UID: \"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:09:08.941062 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:08.940920 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtcs5\" (UniqueName: 
\"kubernetes.io/projected/1bac0ecf-cc26-4474-8e34-da870cf45d49-kube-api-access-dtcs5\") pod \"alertmanager-main-0\" (UID: \"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:09:09.042028 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:09.042000 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bac0ecf-cc26-4474-8e34-da870cf45d49-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:09:09.042028 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:09.042030 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1bac0ecf-cc26-4474-8e34-da870cf45d49-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:09:09.042224 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:09.042049 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1bac0ecf-cc26-4474-8e34-da870cf45d49-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:09:09.042224 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:09.042163 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1bac0ecf-cc26-4474-8e34-da870cf45d49-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:09:09.042224 ip-10-0-138-178 kubenswrapper[2577]: I0420 
07:09:09.042209 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1bac0ecf-cc26-4474-8e34-da870cf45d49-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:09:09.042401 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:09.042235 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1bac0ecf-cc26-4474-8e34-da870cf45d49-config-volume\") pod \"alertmanager-main-0\" (UID: \"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:09:09.042401 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:09.042264 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dtcs5\" (UniqueName: \"kubernetes.io/projected/1bac0ecf-cc26-4474-8e34-da870cf45d49-kube-api-access-dtcs5\") pod \"alertmanager-main-0\" (UID: \"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:09:09.042401 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:09.042343 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1bac0ecf-cc26-4474-8e34-da870cf45d49-config-out\") pod \"alertmanager-main-0\" (UID: \"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:09:09.042552 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:09.042414 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1bac0ecf-cc26-4474-8e34-da870cf45d49-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0" 
Apr 20 07:09:09.042552 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:09.042443 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1bac0ecf-cc26-4474-8e34-da870cf45d49-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:09:09.042552 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:09.042463 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1bac0ecf-cc26-4474-8e34-da870cf45d49-web-config\") pod \"alertmanager-main-0\" (UID: \"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:09:09.042552 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:09.042484 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1bac0ecf-cc26-4474-8e34-da870cf45d49-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:09:09.042552 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:09.042509 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1bac0ecf-cc26-4474-8e34-da870cf45d49-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:09:09.043086 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:09.042970 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bac0ecf-cc26-4474-8e34-da870cf45d49-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: 
\"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:09:09.045340 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:09.045242 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1bac0ecf-cc26-4474-8e34-da870cf45d49-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:09:09.045340 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:09.045284 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1bac0ecf-cc26-4474-8e34-da870cf45d49-config-volume\") pod \"alertmanager-main-0\" (UID: \"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:09:09.045340 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:09.045305 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1bac0ecf-cc26-4474-8e34-da870cf45d49-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:09:09.046132 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:09.045526 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1bac0ecf-cc26-4474-8e34-da870cf45d49-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:09:09.046132 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:09.045604 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1bac0ecf-cc26-4474-8e34-da870cf45d49-web-config\") pod \"alertmanager-main-0\" (UID: 
\"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:09:09.046132 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:09.045604 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1bac0ecf-cc26-4474-8e34-da870cf45d49-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:09:09.046132 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:09.045703 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1bac0ecf-cc26-4474-8e34-da870cf45d49-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:09:09.046132 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:09.045930 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1bac0ecf-cc26-4474-8e34-da870cf45d49-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:09:09.046132 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:09.045951 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1bac0ecf-cc26-4474-8e34-da870cf45d49-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:09:09.046132 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:09.046026 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/1bac0ecf-cc26-4474-8e34-da870cf45d49-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:09:09.047207 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:09.047190 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1bac0ecf-cc26-4474-8e34-da870cf45d49-config-out\") pod \"alertmanager-main-0\" (UID: \"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:09:09.051738 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:09.051720 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtcs5\" (UniqueName: \"kubernetes.io/projected/1bac0ecf-cc26-4474-8e34-da870cf45d49-kube-api-access-dtcs5\") pod \"alertmanager-main-0\" (UID: \"1bac0ecf-cc26-4474-8e34-da870cf45d49\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:09:09.221309 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:09.221219 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:09:09.352486 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:09.352456 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 07:09:09.356739 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:09:09.356714 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bac0ecf_cc26_4474_8e34_da870cf45d49.slice/crio-b3e0b77e5a1cf80b8822e999b49fbacbb63be86fd782660cdfdf4c886cb14ef5 WatchSource:0}: Error finding container b3e0b77e5a1cf80b8822e999b49fbacbb63be86fd782660cdfdf4c886cb14ef5: Status 404 returned error can't find the container with id b3e0b77e5a1cf80b8822e999b49fbacbb63be86fd782660cdfdf4c886cb14ef5 Apr 20 07:09:09.358620 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:09.358603 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 07:09:09.643726 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:09.643691 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="345c627e-322f-49aa-b3a5-eb4fd6b72d47" path="/var/lib/kubelet/pods/345c627e-322f-49aa-b3a5-eb4fd6b72d47/volumes" Apr 20 07:09:09.838072 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:09.838038 2577 generic.go:358] "Generic (PLEG): container finished" podID="1bac0ecf-cc26-4474-8e34-da870cf45d49" containerID="03a9300b687e0feae8cc8e0fdee696ecb8dcbdc5b4e7bd9de70570251151a456" exitCode=0 Apr 20 07:09:09.838246 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:09.838128 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1bac0ecf-cc26-4474-8e34-da870cf45d49","Type":"ContainerDied","Data":"03a9300b687e0feae8cc8e0fdee696ecb8dcbdc5b4e7bd9de70570251151a456"} Apr 20 07:09:09.838246 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:09.838166 2577 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1bac0ecf-cc26-4474-8e34-da870cf45d49","Type":"ContainerStarted","Data":"b3e0b77e5a1cf80b8822e999b49fbacbb63be86fd782660cdfdf4c886cb14ef5"} Apr 20 07:09:10.844775 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:10.844741 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1bac0ecf-cc26-4474-8e34-da870cf45d49","Type":"ContainerStarted","Data":"65197d1dca6e9377e0d666bff9210ae881051e0e6390f6c828802e7c90e68cc5"} Apr 20 07:09:10.844775 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:10.844781 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1bac0ecf-cc26-4474-8e34-da870cf45d49","Type":"ContainerStarted","Data":"f58306739b877d7caf03eab25413756b9386f423d7886df3ed72011065a71481"} Apr 20 07:09:10.845168 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:10.844795 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1bac0ecf-cc26-4474-8e34-da870cf45d49","Type":"ContainerStarted","Data":"a4448b8434abd00cfbb9433582be08b99a2ae7667e64861b31cfbc8067313a02"} Apr 20 07:09:10.845168 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:10.844807 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1bac0ecf-cc26-4474-8e34-da870cf45d49","Type":"ContainerStarted","Data":"503e6424780ebfb6d6e2513d7cd318a82ecac00aeb1f9d33f763bded42e59687"} Apr 20 07:09:10.845168 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:10.844819 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1bac0ecf-cc26-4474-8e34-da870cf45d49","Type":"ContainerStarted","Data":"bec9d185d23d118159231f197f566f6da960e2ef45721b0f388d404944c91b00"} Apr 20 07:09:10.845168 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:10.844830 2577 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1bac0ecf-cc26-4474-8e34-da870cf45d49","Type":"ContainerStarted","Data":"ee1a81a1893d9475522fc3364a780072ed227483ea24f4247277ab759d3cd07f"} Apr 20 07:09:10.874430 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:10.874370 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.874352086 podStartE2EDuration="2.874352086s" podCreationTimestamp="2026-04-20 07:09:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:09:10.872065319 +0000 UTC m=+373.746790181" watchObservedRunningTime="2026-04-20 07:09:10.874352086 +0000 UTC m=+373.749076949" Apr 20 07:09:12.381718 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:12.381681 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-d86b6fdf5-hbb4j"] Apr 20 07:09:12.385255 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:12.385234 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-d86b6fdf5-hbb4j" Apr 20 07:09:12.387857 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:12.387839 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 20 07:09:12.388132 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:12.388116 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 20 07:09:12.388132 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:12.388128 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 20 07:09:12.388270 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:12.388160 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 20 07:09:12.388270 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:12.388215 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 20 07:09:12.388580 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:12.388565 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-sbzjb\"" Apr 20 07:09:12.396213 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:12.396196 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 20 07:09:12.399163 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:12.399143 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-d86b6fdf5-hbb4j"] Apr 20 07:09:12.472616 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:12.472578 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/432d1ef5-166a-4475-a17c-1cdb5b648cdc-metrics-client-ca\") pod \"telemeter-client-d86b6fdf5-hbb4j\" (UID: \"432d1ef5-166a-4475-a17c-1cdb5b648cdc\") " pod="openshift-monitoring/telemeter-client-d86b6fdf5-hbb4j" Apr 20 07:09:12.472782 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:12.472623 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/432d1ef5-166a-4475-a17c-1cdb5b648cdc-federate-client-tls\") pod \"telemeter-client-d86b6fdf5-hbb4j\" (UID: \"432d1ef5-166a-4475-a17c-1cdb5b648cdc\") " pod="openshift-monitoring/telemeter-client-d86b6fdf5-hbb4j" Apr 20 07:09:12.472782 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:12.472671 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/432d1ef5-166a-4475-a17c-1cdb5b648cdc-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-d86b6fdf5-hbb4j\" (UID: \"432d1ef5-166a-4475-a17c-1cdb5b648cdc\") " pod="openshift-monitoring/telemeter-client-d86b6fdf5-hbb4j" Apr 20 07:09:12.472782 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:12.472711 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/432d1ef5-166a-4475-a17c-1cdb5b648cdc-telemeter-trusted-ca-bundle\") pod \"telemeter-client-d86b6fdf5-hbb4j\" (UID: \"432d1ef5-166a-4475-a17c-1cdb5b648cdc\") " pod="openshift-monitoring/telemeter-client-d86b6fdf5-hbb4j" Apr 20 07:09:12.472782 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:12.472764 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6229\" (UniqueName: 
\"kubernetes.io/projected/432d1ef5-166a-4475-a17c-1cdb5b648cdc-kube-api-access-n6229\") pod \"telemeter-client-d86b6fdf5-hbb4j\" (UID: \"432d1ef5-166a-4475-a17c-1cdb5b648cdc\") " pod="openshift-monitoring/telemeter-client-d86b6fdf5-hbb4j" Apr 20 07:09:12.472943 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:12.472819 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/432d1ef5-166a-4475-a17c-1cdb5b648cdc-serving-certs-ca-bundle\") pod \"telemeter-client-d86b6fdf5-hbb4j\" (UID: \"432d1ef5-166a-4475-a17c-1cdb5b648cdc\") " pod="openshift-monitoring/telemeter-client-d86b6fdf5-hbb4j" Apr 20 07:09:12.472943 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:12.472856 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/432d1ef5-166a-4475-a17c-1cdb5b648cdc-secret-telemeter-client\") pod \"telemeter-client-d86b6fdf5-hbb4j\" (UID: \"432d1ef5-166a-4475-a17c-1cdb5b648cdc\") " pod="openshift-monitoring/telemeter-client-d86b6fdf5-hbb4j" Apr 20 07:09:12.472943 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:12.472876 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/432d1ef5-166a-4475-a17c-1cdb5b648cdc-telemeter-client-tls\") pod \"telemeter-client-d86b6fdf5-hbb4j\" (UID: \"432d1ef5-166a-4475-a17c-1cdb5b648cdc\") " pod="openshift-monitoring/telemeter-client-d86b6fdf5-hbb4j" Apr 20 07:09:12.573513 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:12.573478 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/432d1ef5-166a-4475-a17c-1cdb5b648cdc-telemeter-trusted-ca-bundle\") pod \"telemeter-client-d86b6fdf5-hbb4j\" (UID: 
\"432d1ef5-166a-4475-a17c-1cdb5b648cdc\") " pod="openshift-monitoring/telemeter-client-d86b6fdf5-hbb4j" Apr 20 07:09:12.573513 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:12.573515 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n6229\" (UniqueName: \"kubernetes.io/projected/432d1ef5-166a-4475-a17c-1cdb5b648cdc-kube-api-access-n6229\") pod \"telemeter-client-d86b6fdf5-hbb4j\" (UID: \"432d1ef5-166a-4475-a17c-1cdb5b648cdc\") " pod="openshift-monitoring/telemeter-client-d86b6fdf5-hbb4j" Apr 20 07:09:12.573729 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:12.573536 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/432d1ef5-166a-4475-a17c-1cdb5b648cdc-serving-certs-ca-bundle\") pod \"telemeter-client-d86b6fdf5-hbb4j\" (UID: \"432d1ef5-166a-4475-a17c-1cdb5b648cdc\") " pod="openshift-monitoring/telemeter-client-d86b6fdf5-hbb4j" Apr 20 07:09:12.573729 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:12.573558 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/432d1ef5-166a-4475-a17c-1cdb5b648cdc-secret-telemeter-client\") pod \"telemeter-client-d86b6fdf5-hbb4j\" (UID: \"432d1ef5-166a-4475-a17c-1cdb5b648cdc\") " pod="openshift-monitoring/telemeter-client-d86b6fdf5-hbb4j" Apr 20 07:09:12.573729 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:12.573691 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/432d1ef5-166a-4475-a17c-1cdb5b648cdc-telemeter-client-tls\") pod \"telemeter-client-d86b6fdf5-hbb4j\" (UID: \"432d1ef5-166a-4475-a17c-1cdb5b648cdc\") " pod="openshift-monitoring/telemeter-client-d86b6fdf5-hbb4j" Apr 20 07:09:12.573883 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:12.573771 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/432d1ef5-166a-4475-a17c-1cdb5b648cdc-metrics-client-ca\") pod \"telemeter-client-d86b6fdf5-hbb4j\" (UID: \"432d1ef5-166a-4475-a17c-1cdb5b648cdc\") " pod="openshift-monitoring/telemeter-client-d86b6fdf5-hbb4j" Apr 20 07:09:12.573883 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:12.573810 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/432d1ef5-166a-4475-a17c-1cdb5b648cdc-federate-client-tls\") pod \"telemeter-client-d86b6fdf5-hbb4j\" (UID: \"432d1ef5-166a-4475-a17c-1cdb5b648cdc\") " pod="openshift-monitoring/telemeter-client-d86b6fdf5-hbb4j" Apr 20 07:09:12.573883 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:12.573839 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/432d1ef5-166a-4475-a17c-1cdb5b648cdc-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-d86b6fdf5-hbb4j\" (UID: \"432d1ef5-166a-4475-a17c-1cdb5b648cdc\") " pod="openshift-monitoring/telemeter-client-d86b6fdf5-hbb4j" Apr 20 07:09:12.574582 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:12.574495 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/432d1ef5-166a-4475-a17c-1cdb5b648cdc-serving-certs-ca-bundle\") pod \"telemeter-client-d86b6fdf5-hbb4j\" (UID: \"432d1ef5-166a-4475-a17c-1cdb5b648cdc\") " pod="openshift-monitoring/telemeter-client-d86b6fdf5-hbb4j" Apr 20 07:09:12.574582 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:12.574495 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/432d1ef5-166a-4475-a17c-1cdb5b648cdc-metrics-client-ca\") pod \"telemeter-client-d86b6fdf5-hbb4j\" (UID: 
\"432d1ef5-166a-4475-a17c-1cdb5b648cdc\") " pod="openshift-monitoring/telemeter-client-d86b6fdf5-hbb4j" Apr 20 07:09:12.574768 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:12.574593 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/432d1ef5-166a-4475-a17c-1cdb5b648cdc-telemeter-trusted-ca-bundle\") pod \"telemeter-client-d86b6fdf5-hbb4j\" (UID: \"432d1ef5-166a-4475-a17c-1cdb5b648cdc\") " pod="openshift-monitoring/telemeter-client-d86b6fdf5-hbb4j" Apr 20 07:09:12.576377 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:12.576348 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/432d1ef5-166a-4475-a17c-1cdb5b648cdc-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-d86b6fdf5-hbb4j\" (UID: \"432d1ef5-166a-4475-a17c-1cdb5b648cdc\") " pod="openshift-monitoring/telemeter-client-d86b6fdf5-hbb4j" Apr 20 07:09:12.576471 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:12.576445 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/432d1ef5-166a-4475-a17c-1cdb5b648cdc-secret-telemeter-client\") pod \"telemeter-client-d86b6fdf5-hbb4j\" (UID: \"432d1ef5-166a-4475-a17c-1cdb5b648cdc\") " pod="openshift-monitoring/telemeter-client-d86b6fdf5-hbb4j" Apr 20 07:09:12.576507 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:12.576488 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/432d1ef5-166a-4475-a17c-1cdb5b648cdc-federate-client-tls\") pod \"telemeter-client-d86b6fdf5-hbb4j\" (UID: \"432d1ef5-166a-4475-a17c-1cdb5b648cdc\") " pod="openshift-monitoring/telemeter-client-d86b6fdf5-hbb4j" Apr 20 07:09:12.576575 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:12.576557 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/432d1ef5-166a-4475-a17c-1cdb5b648cdc-telemeter-client-tls\") pod \"telemeter-client-d86b6fdf5-hbb4j\" (UID: \"432d1ef5-166a-4475-a17c-1cdb5b648cdc\") " pod="openshift-monitoring/telemeter-client-d86b6fdf5-hbb4j" Apr 20 07:09:12.581632 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:12.581610 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6229\" (UniqueName: \"kubernetes.io/projected/432d1ef5-166a-4475-a17c-1cdb5b648cdc-kube-api-access-n6229\") pod \"telemeter-client-d86b6fdf5-hbb4j\" (UID: \"432d1ef5-166a-4475-a17c-1cdb5b648cdc\") " pod="openshift-monitoring/telemeter-client-d86b6fdf5-hbb4j" Apr 20 07:09:12.696902 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:12.696815 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-d86b6fdf5-hbb4j" Apr 20 07:09:12.828183 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:12.828152 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-d86b6fdf5-hbb4j"] Apr 20 07:09:12.831997 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:09:12.831967 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod432d1ef5_166a_4475_a17c_1cdb5b648cdc.slice/crio-5b27de5eb88e26dbb9fff974474e8a589f502705956f65c29bf5f33a3901b508 WatchSource:0}: Error finding container 5b27de5eb88e26dbb9fff974474e8a589f502705956f65c29bf5f33a3901b508: Status 404 returned error can't find the container with id 5b27de5eb88e26dbb9fff974474e8a589f502705956f65c29bf5f33a3901b508 Apr 20 07:09:12.852412 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:12.852385 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-d86b6fdf5-hbb4j" 
event={"ID":"432d1ef5-166a-4475-a17c-1cdb5b648cdc","Type":"ContainerStarted","Data":"5b27de5eb88e26dbb9fff974474e8a589f502705956f65c29bf5f33a3901b508"} Apr 20 07:09:13.711359 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.711119 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 07:09:13.716482 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.716460 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.719199 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.719176 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-jpqxm\"" Apr 20 07:09:13.720034 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.719872 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 20 07:09:13.720034 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.719928 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 20 07:09:13.720034 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.719988 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 20 07:09:13.720220 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.720205 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 20 07:09:13.720443 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.720425 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 20 07:09:13.720668 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.720652 2577 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 20 07:09:13.720777 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.720655 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 20 07:09:13.720934 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.720919 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 20 07:09:13.721115 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.721099 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 20 07:09:13.721216 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.721154 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 20 07:09:13.721526 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.721507 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-652nvdta61qo3\"" Apr 20 07:09:13.724970 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.724883 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 20 07:09:13.726045 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.726025 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 20 07:09:13.740805 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.740775 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 07:09:13.785585 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.785546 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" 
(UniqueName: \"kubernetes.io/projected/d6b57012-62e2-4cd4-9106-f4d360314967-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.785768 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.785624 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d6b57012-62e2-4cd4-9106-f4d360314967-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.785768 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.785688 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d6b57012-62e2-4cd4-9106-f4d360314967-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.785768 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.785719 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d6b57012-62e2-4cd4-9106-f4d360314967-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.785768 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.785742 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d6b57012-62e2-4cd4-9106-f4d360314967-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.785768 ip-10-0-138-178 
kubenswrapper[2577]: I0420 07:09:13.785763 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l4fj\" (UniqueName: \"kubernetes.io/projected/d6b57012-62e2-4cd4-9106-f4d360314967-kube-api-access-5l4fj\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.786018 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.785805 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d6b57012-62e2-4cd4-9106-f4d360314967-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.786018 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.785859 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6b57012-62e2-4cd4-9106-f4d360314967-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.786018 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.785894 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d6b57012-62e2-4cd4-9106-f4d360314967-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.786018 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.785925 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/d6b57012-62e2-4cd4-9106-f4d360314967-config\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.786018 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.785965 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d6b57012-62e2-4cd4-9106-f4d360314967-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.786261 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.786048 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6b57012-62e2-4cd4-9106-f4d360314967-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.786261 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.786121 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d6b57012-62e2-4cd4-9106-f4d360314967-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.786261 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.786177 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d6b57012-62e2-4cd4-9106-f4d360314967-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.786261 ip-10-0-138-178 kubenswrapper[2577]: 
I0420 07:09:13.786209 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d6b57012-62e2-4cd4-9106-f4d360314967-config-out\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.786261 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.786234 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d6b57012-62e2-4cd4-9106-f4d360314967-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.786261 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.786258 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d6b57012-62e2-4cd4-9106-f4d360314967-web-config\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.786538 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.786289 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6b57012-62e2-4cd4-9106-f4d360314967-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.887335 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.887293 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d6b57012-62e2-4cd4-9106-f4d360314967-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: 
\"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.887522 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.887373 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d6b57012-62e2-4cd4-9106-f4d360314967-config-out\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.887522 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.887401 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d6b57012-62e2-4cd4-9106-f4d360314967-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.887522 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.887429 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d6b57012-62e2-4cd4-9106-f4d360314967-web-config\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.887522 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.887463 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6b57012-62e2-4cd4-9106-f4d360314967-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.887522 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.887501 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d6b57012-62e2-4cd4-9106-f4d360314967-tls-assets\") pod 
\"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.887780 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.887527 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d6b57012-62e2-4cd4-9106-f4d360314967-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.887780 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.887560 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d6b57012-62e2-4cd4-9106-f4d360314967-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.887780 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.887586 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d6b57012-62e2-4cd4-9106-f4d360314967-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.887780 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.887609 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d6b57012-62e2-4cd4-9106-f4d360314967-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.887780 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.887631 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5l4fj\" (UniqueName: 
\"kubernetes.io/projected/d6b57012-62e2-4cd4-9106-f4d360314967-kube-api-access-5l4fj\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.888480 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.888440 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d6b57012-62e2-4cd4-9106-f4d360314967-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.888614 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.888519 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6b57012-62e2-4cd4-9106-f4d360314967-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.888614 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.888560 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d6b57012-62e2-4cd4-9106-f4d360314967-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.888614 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.888566 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d6b57012-62e2-4cd4-9106-f4d360314967-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.888614 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.888592 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d6b57012-62e2-4cd4-9106-f4d360314967-config\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.888844 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.888634 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d6b57012-62e2-4cd4-9106-f4d360314967-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.888844 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.888665 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6b57012-62e2-4cd4-9106-f4d360314967-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.888844 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.888702 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d6b57012-62e2-4cd4-9106-f4d360314967-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.888844 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.888826 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d6b57012-62e2-4cd4-9106-f4d360314967-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.889041 ip-10-0-138-178 
kubenswrapper[2577]: I0420 07:09:13.888843 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6b57012-62e2-4cd4-9106-f4d360314967-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.891122 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.891101 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d6b57012-62e2-4cd4-9106-f4d360314967-config-out\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.892173 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.891501 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6b57012-62e2-4cd4-9106-f4d360314967-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.892173 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.891798 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d6b57012-62e2-4cd4-9106-f4d360314967-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.892173 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.891941 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d6b57012-62e2-4cd4-9106-f4d360314967-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 
20 07:09:13.892429 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.892175 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d6b57012-62e2-4cd4-9106-f4d360314967-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.892579 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.892514 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d6b57012-62e2-4cd4-9106-f4d360314967-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.893099 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.893067 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d6b57012-62e2-4cd4-9106-f4d360314967-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.893190 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.893154 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d6b57012-62e2-4cd4-9106-f4d360314967-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.893190 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.893160 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6b57012-62e2-4cd4-9106-f4d360314967-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.893190 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.893173 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d6b57012-62e2-4cd4-9106-f4d360314967-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.893490 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.893458 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d6b57012-62e2-4cd4-9106-f4d360314967-config\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.894211 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.894188 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d6b57012-62e2-4cd4-9106-f4d360314967-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.894763 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.894732 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d6b57012-62e2-4cd4-9106-f4d360314967-web-config\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.895091 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.895070 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d6b57012-62e2-4cd4-9106-f4d360314967-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: 
\"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:13.901526 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:13.901507 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l4fj\" (UniqueName: \"kubernetes.io/projected/d6b57012-62e2-4cd4-9106-f4d360314967-kube-api-access-5l4fj\") pod \"prometheus-k8s-0\" (UID: \"d6b57012-62e2-4cd4-9106-f4d360314967\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:14.033700 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:14.033616 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:14.179690 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:14.179584 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 07:09:14.182680 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:09:14.182648 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6b57012_62e2_4cd4_9106_f4d360314967.slice/crio-c7f863ac0a09fd856363ef49491ead3d0cba06757be6038c06ef9cbec3d5502f WatchSource:0}: Error finding container c7f863ac0a09fd856363ef49491ead3d0cba06757be6038c06ef9cbec3d5502f: Status 404 returned error can't find the container with id c7f863ac0a09fd856363ef49491ead3d0cba06757be6038c06ef9cbec3d5502f Apr 20 07:09:14.860119 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:14.860091 2577 generic.go:358] "Generic (PLEG): container finished" podID="d6b57012-62e2-4cd4-9106-f4d360314967" containerID="0297dadec94942c87e4f261ce987fab3ee3b658aa276c6b65e8236972a589d9a" exitCode=0 Apr 20 07:09:14.860507 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:14.860182 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"d6b57012-62e2-4cd4-9106-f4d360314967","Type":"ContainerDied","Data":"0297dadec94942c87e4f261ce987fab3ee3b658aa276c6b65e8236972a589d9a"} Apr 20 07:09:14.860507 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:14.860220 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d6b57012-62e2-4cd4-9106-f4d360314967","Type":"ContainerStarted","Data":"c7f863ac0a09fd856363ef49491ead3d0cba06757be6038c06ef9cbec3d5502f"} Apr 20 07:09:14.861796 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:14.861770 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-d86b6fdf5-hbb4j" event={"ID":"432d1ef5-166a-4475-a17c-1cdb5b648cdc","Type":"ContainerStarted","Data":"e9e4409c8ceba12af7b39b91551bb30b5a0d0a3d67736d2555f42fa85b45670e"} Apr 20 07:09:15.866413 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:15.866372 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-d86b6fdf5-hbb4j" event={"ID":"432d1ef5-166a-4475-a17c-1cdb5b648cdc","Type":"ContainerStarted","Data":"3bc5cd7b710c0fc86ed60368bcc0c82d41d23b47a736d53ef942cc9c2bd04537"} Apr 20 07:09:15.866413 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:15.866414 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-d86b6fdf5-hbb4j" event={"ID":"432d1ef5-166a-4475-a17c-1cdb5b648cdc","Type":"ContainerStarted","Data":"152828146ac6be575c668cfb0fc79d1f07d8c2db0d24fae174191fc362624404"} Apr 20 07:09:15.890872 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:15.890729 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-d86b6fdf5-hbb4j" podStartSLOduration=1.93489382 podStartE2EDuration="3.890710219s" podCreationTimestamp="2026-04-20 07:09:12 +0000 UTC" firstStartedPulling="2026-04-20 07:09:12.833864264 +0000 UTC m=+375.708589105" lastFinishedPulling="2026-04-20 07:09:14.789680657 +0000 
UTC m=+377.664405504" observedRunningTime="2026-04-20 07:09:15.889900601 +0000 UTC m=+378.764625491" watchObservedRunningTime="2026-04-20 07:09:15.890710219 +0000 UTC m=+378.765435082" Apr 20 07:09:17.878814 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:17.878657 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d6b57012-62e2-4cd4-9106-f4d360314967","Type":"ContainerStarted","Data":"d9380acd79159fc1b02f166667cfd2b7b9e31c0a202e0fa5cb2291b27897b9ad"} Apr 20 07:09:17.878814 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:17.878704 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d6b57012-62e2-4cd4-9106-f4d360314967","Type":"ContainerStarted","Data":"3d15eb50f349eec2bae6ef0c58c0e3707451611120a1f14beeedc206567e0cf1"} Apr 20 07:09:17.878814 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:17.878717 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d6b57012-62e2-4cd4-9106-f4d360314967","Type":"ContainerStarted","Data":"9a3d000641d733cc83e1d86fb0c79918dc76a0f79eb1fab5c92895d546ed8821"} Apr 20 07:09:18.885343 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:18.885298 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d6b57012-62e2-4cd4-9106-f4d360314967","Type":"ContainerStarted","Data":"2f2ca5e5125f33fff3f4f8c219b9ff99bfdc8e9a4807b38c34f96b47916bc1e0"} Apr 20 07:09:18.885343 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:18.885349 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d6b57012-62e2-4cd4-9106-f4d360314967","Type":"ContainerStarted","Data":"040d7425de0e84f37c87f6bbf2543bad2619f7693ecd975dd5a5bd82029ea1bb"} Apr 20 07:09:18.885768 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:18.885360 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d6b57012-62e2-4cd4-9106-f4d360314967","Type":"ContainerStarted","Data":"5ae9b2bcf20dd7f94725eb843f298cccc3f0ef5edce6226c02b45cfef12d7e02"} Apr 20 07:09:18.914726 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:18.914678 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.108737161 podStartE2EDuration="5.914659424s" podCreationTimestamp="2026-04-20 07:09:13 +0000 UTC" firstStartedPulling="2026-04-20 07:09:14.861583923 +0000 UTC m=+377.736308770" lastFinishedPulling="2026-04-20 07:09:17.667506188 +0000 UTC m=+380.542231033" observedRunningTime="2026-04-20 07:09:18.912584782 +0000 UTC m=+381.787309644" watchObservedRunningTime="2026-04-20 07:09:18.914659424 +0000 UTC m=+381.789384286" Apr 20 07:09:19.034086 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:19.034057 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:09:28.320354 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:28.320301 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-687974fc55-tvft4"] Apr 20 07:09:53.344180 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:53.344115 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-687974fc55-tvft4" podUID="96cfd43f-379c-4208-b291-1bace41c1a97" containerName="console" containerID="cri-o://17e1e42f7bf90b3053d45e8f6b21ec32de132dbf2ecff7f61dff04ba1c67103f" gracePeriod=15 Apr 20 07:09:53.608813 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:53.608790 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-687974fc55-tvft4_96cfd43f-379c-4208-b291-1bace41c1a97/console/0.log" Apr 20 07:09:53.608913 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:53.608851 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-687974fc55-tvft4" Apr 20 07:09:53.719655 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:53.719618 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96cfd43f-379c-4208-b291-1bace41c1a97-console-oauth-config\") pod \"96cfd43f-379c-4208-b291-1bace41c1a97\" (UID: \"96cfd43f-379c-4208-b291-1bace41c1a97\") " Apr 20 07:09:53.719831 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:53.719667 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96cfd43f-379c-4208-b291-1bace41c1a97-oauth-serving-cert\") pod \"96cfd43f-379c-4208-b291-1bace41c1a97\" (UID: \"96cfd43f-379c-4208-b291-1bace41c1a97\") " Apr 20 07:09:53.719831 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:53.719690 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96cfd43f-379c-4208-b291-1bace41c1a97-service-ca\") pod \"96cfd43f-379c-4208-b291-1bace41c1a97\" (UID: \"96cfd43f-379c-4208-b291-1bace41c1a97\") " Apr 20 07:09:53.719831 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:53.719722 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96cfd43f-379c-4208-b291-1bace41c1a97-trusted-ca-bundle\") pod \"96cfd43f-379c-4208-b291-1bace41c1a97\" (UID: \"96cfd43f-379c-4208-b291-1bace41c1a97\") " Apr 20 07:09:53.719831 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:53.719736 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96cfd43f-379c-4208-b291-1bace41c1a97-console-config\") pod \"96cfd43f-379c-4208-b291-1bace41c1a97\" (UID: \"96cfd43f-379c-4208-b291-1bace41c1a97\") " Apr 20 07:09:53.719831 ip-10-0-138-178 
kubenswrapper[2577]: I0420 07:09:53.719751 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrm5r\" (UniqueName: \"kubernetes.io/projected/96cfd43f-379c-4208-b291-1bace41c1a97-kube-api-access-nrm5r\") pod \"96cfd43f-379c-4208-b291-1bace41c1a97\" (UID: \"96cfd43f-379c-4208-b291-1bace41c1a97\") " Apr 20 07:09:53.719831 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:53.719804 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96cfd43f-379c-4208-b291-1bace41c1a97-console-serving-cert\") pod \"96cfd43f-379c-4208-b291-1bace41c1a97\" (UID: \"96cfd43f-379c-4208-b291-1bace41c1a97\") " Apr 20 07:09:53.720116 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:53.720050 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96cfd43f-379c-4208-b291-1bace41c1a97-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "96cfd43f-379c-4208-b291-1bace41c1a97" (UID: "96cfd43f-379c-4208-b291-1bace41c1a97"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:09:53.720188 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:53.720146 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96cfd43f-379c-4208-b291-1bace41c1a97-service-ca" (OuterVolumeSpecName: "service-ca") pod "96cfd43f-379c-4208-b291-1bace41c1a97" (UID: "96cfd43f-379c-4208-b291-1bace41c1a97"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:09:53.720316 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:53.720296 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96cfd43f-379c-4208-b291-1bace41c1a97-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "96cfd43f-379c-4208-b291-1bace41c1a97" (UID: "96cfd43f-379c-4208-b291-1bace41c1a97"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:09:53.720417 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:53.720294 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96cfd43f-379c-4208-b291-1bace41c1a97-console-config" (OuterVolumeSpecName: "console-config") pod "96cfd43f-379c-4208-b291-1bace41c1a97" (UID: "96cfd43f-379c-4208-b291-1bace41c1a97"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:09:53.722063 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:53.722039 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96cfd43f-379c-4208-b291-1bace41c1a97-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "96cfd43f-379c-4208-b291-1bace41c1a97" (UID: "96cfd43f-379c-4208-b291-1bace41c1a97"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:09:53.722063 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:53.722043 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96cfd43f-379c-4208-b291-1bace41c1a97-kube-api-access-nrm5r" (OuterVolumeSpecName: "kube-api-access-nrm5r") pod "96cfd43f-379c-4208-b291-1bace41c1a97" (UID: "96cfd43f-379c-4208-b291-1bace41c1a97"). InnerVolumeSpecName "kube-api-access-nrm5r". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:09:53.722211 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:53.722080 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96cfd43f-379c-4208-b291-1bace41c1a97-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "96cfd43f-379c-4208-b291-1bace41c1a97" (UID: "96cfd43f-379c-4208-b291-1bace41c1a97"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:09:53.821181 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:53.821143 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96cfd43f-379c-4208-b291-1bace41c1a97-oauth-serving-cert\") on node \"ip-10-0-138-178.ec2.internal\" DevicePath \"\"" Apr 20 07:09:53.821181 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:53.821175 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96cfd43f-379c-4208-b291-1bace41c1a97-service-ca\") on node \"ip-10-0-138-178.ec2.internal\" DevicePath \"\"" Apr 20 07:09:53.821181 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:53.821184 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96cfd43f-379c-4208-b291-1bace41c1a97-trusted-ca-bundle\") on node \"ip-10-0-138-178.ec2.internal\" DevicePath \"\"" Apr 20 07:09:53.821418 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:53.821216 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96cfd43f-379c-4208-b291-1bace41c1a97-console-config\") on node \"ip-10-0-138-178.ec2.internal\" DevicePath \"\"" Apr 20 07:09:53.821418 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:53.821226 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nrm5r\" (UniqueName: 
\"kubernetes.io/projected/96cfd43f-379c-4208-b291-1bace41c1a97-kube-api-access-nrm5r\") on node \"ip-10-0-138-178.ec2.internal\" DevicePath \"\"" Apr 20 07:09:53.821418 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:53.821235 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96cfd43f-379c-4208-b291-1bace41c1a97-console-serving-cert\") on node \"ip-10-0-138-178.ec2.internal\" DevicePath \"\"" Apr 20 07:09:53.821418 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:53.821245 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96cfd43f-379c-4208-b291-1bace41c1a97-console-oauth-config\") on node \"ip-10-0-138-178.ec2.internal\" DevicePath \"\"" Apr 20 07:09:54.003046 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:54.002963 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-687974fc55-tvft4_96cfd43f-379c-4208-b291-1bace41c1a97/console/0.log" Apr 20 07:09:54.003046 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:54.003006 2577 generic.go:358] "Generic (PLEG): container finished" podID="96cfd43f-379c-4208-b291-1bace41c1a97" containerID="17e1e42f7bf90b3053d45e8f6b21ec32de132dbf2ecff7f61dff04ba1c67103f" exitCode=2 Apr 20 07:09:54.003228 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:54.003077 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-687974fc55-tvft4" Apr 20 07:09:54.003228 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:54.003097 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-687974fc55-tvft4" event={"ID":"96cfd43f-379c-4208-b291-1bace41c1a97","Type":"ContainerDied","Data":"17e1e42f7bf90b3053d45e8f6b21ec32de132dbf2ecff7f61dff04ba1c67103f"} Apr 20 07:09:54.003228 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:54.003135 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-687974fc55-tvft4" event={"ID":"96cfd43f-379c-4208-b291-1bace41c1a97","Type":"ContainerDied","Data":"31291dff8a353d85146edba1589dd7e8c1b62d70f663f42b110f05c4021322bc"} Apr 20 07:09:54.003228 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:54.003150 2577 scope.go:117] "RemoveContainer" containerID="17e1e42f7bf90b3053d45e8f6b21ec32de132dbf2ecff7f61dff04ba1c67103f" Apr 20 07:09:54.012128 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:54.012108 2577 scope.go:117] "RemoveContainer" containerID="17e1e42f7bf90b3053d45e8f6b21ec32de132dbf2ecff7f61dff04ba1c67103f" Apr 20 07:09:54.012415 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:09:54.012394 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17e1e42f7bf90b3053d45e8f6b21ec32de132dbf2ecff7f61dff04ba1c67103f\": container with ID starting with 17e1e42f7bf90b3053d45e8f6b21ec32de132dbf2ecff7f61dff04ba1c67103f not found: ID does not exist" containerID="17e1e42f7bf90b3053d45e8f6b21ec32de132dbf2ecff7f61dff04ba1c67103f" Apr 20 07:09:54.012477 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:54.012429 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17e1e42f7bf90b3053d45e8f6b21ec32de132dbf2ecff7f61dff04ba1c67103f"} err="failed to get container status \"17e1e42f7bf90b3053d45e8f6b21ec32de132dbf2ecff7f61dff04ba1c67103f\": rpc error: code = 
NotFound desc = could not find container \"17e1e42f7bf90b3053d45e8f6b21ec32de132dbf2ecff7f61dff04ba1c67103f\": container with ID starting with 17e1e42f7bf90b3053d45e8f6b21ec32de132dbf2ecff7f61dff04ba1c67103f not found: ID does not exist" Apr 20 07:09:54.024385 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:54.024356 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-687974fc55-tvft4"] Apr 20 07:09:54.028204 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:54.028179 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-687974fc55-tvft4"] Apr 20 07:09:55.643337 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:09:55.643295 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96cfd43f-379c-4208-b291-1bace41c1a97" path="/var/lib/kubelet/pods/96cfd43f-379c-4208-b291-1bace41c1a97/volumes" Apr 20 07:10:14.034123 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:10:14.034046 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:10:14.053453 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:10:14.053431 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:10:14.075075 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:10:14.075052 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 07:11:11.180999 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:11:11.180964 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-87wvx"] Apr 20 07:11:11.181490 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:11:11.181284 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96cfd43f-379c-4208-b291-1bace41c1a97" containerName="console" Apr 20 07:11:11.181490 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:11:11.181295 2577 
state_mem.go:107] "Deleted CPUSet assignment" podUID="96cfd43f-379c-4208-b291-1bace41c1a97" containerName="console" Apr 20 07:11:11.181490 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:11:11.181393 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="96cfd43f-379c-4208-b291-1bace41c1a97" containerName="console" Apr 20 07:11:11.184127 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:11:11.184111 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-87wvx" Apr 20 07:11:11.189650 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:11:11.189628 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 07:11:11.198251 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:11:11.198227 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-87wvx"] Apr 20 07:11:11.249583 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:11:11.249561 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e578f271-ddce-475b-ba48-2e815f1f88f9-kubelet-config\") pod \"global-pull-secret-syncer-87wvx\" (UID: \"e578f271-ddce-475b-ba48-2e815f1f88f9\") " pod="kube-system/global-pull-secret-syncer-87wvx" Apr 20 07:11:11.249710 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:11:11.249614 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e578f271-ddce-475b-ba48-2e815f1f88f9-dbus\") pod \"global-pull-secret-syncer-87wvx\" (UID: \"e578f271-ddce-475b-ba48-2e815f1f88f9\") " pod="kube-system/global-pull-secret-syncer-87wvx" Apr 20 07:11:11.249710 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:11:11.249633 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/e578f271-ddce-475b-ba48-2e815f1f88f9-original-pull-secret\") pod \"global-pull-secret-syncer-87wvx\" (UID: \"e578f271-ddce-475b-ba48-2e815f1f88f9\") " pod="kube-system/global-pull-secret-syncer-87wvx" Apr 20 07:11:11.350532 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:11:11.350501 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e578f271-ddce-475b-ba48-2e815f1f88f9-kubelet-config\") pod \"global-pull-secret-syncer-87wvx\" (UID: \"e578f271-ddce-475b-ba48-2e815f1f88f9\") " pod="kube-system/global-pull-secret-syncer-87wvx" Apr 20 07:11:11.350684 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:11:11.350563 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e578f271-ddce-475b-ba48-2e815f1f88f9-dbus\") pod \"global-pull-secret-syncer-87wvx\" (UID: \"e578f271-ddce-475b-ba48-2e815f1f88f9\") " pod="kube-system/global-pull-secret-syncer-87wvx" Apr 20 07:11:11.350684 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:11:11.350585 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e578f271-ddce-475b-ba48-2e815f1f88f9-original-pull-secret\") pod \"global-pull-secret-syncer-87wvx\" (UID: \"e578f271-ddce-475b-ba48-2e815f1f88f9\") " pod="kube-system/global-pull-secret-syncer-87wvx" Apr 20 07:11:11.350684 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:11:11.350628 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e578f271-ddce-475b-ba48-2e815f1f88f9-kubelet-config\") pod \"global-pull-secret-syncer-87wvx\" (UID: \"e578f271-ddce-475b-ba48-2e815f1f88f9\") " pod="kube-system/global-pull-secret-syncer-87wvx" Apr 20 07:11:11.350843 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:11:11.350792 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e578f271-ddce-475b-ba48-2e815f1f88f9-dbus\") pod \"global-pull-secret-syncer-87wvx\" (UID: \"e578f271-ddce-475b-ba48-2e815f1f88f9\") " pod="kube-system/global-pull-secret-syncer-87wvx" Apr 20 07:11:11.352886 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:11:11.352861 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e578f271-ddce-475b-ba48-2e815f1f88f9-original-pull-secret\") pod \"global-pull-secret-syncer-87wvx\" (UID: \"e578f271-ddce-475b-ba48-2e815f1f88f9\") " pod="kube-system/global-pull-secret-syncer-87wvx" Apr 20 07:11:11.493843 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:11:11.493748 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-87wvx" Apr 20 07:11:11.619749 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:11:11.619667 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-87wvx"] Apr 20 07:11:11.622646 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:11:11.622603 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode578f271_ddce_475b_ba48_2e815f1f88f9.slice/crio-5d20eb7050f1087d5a79209bd8895d899d6d86dcc90b171e0d3702f19e2a90be WatchSource:0}: Error finding container 5d20eb7050f1087d5a79209bd8895d899d6d86dcc90b171e0d3702f19e2a90be: Status 404 returned error can't find the container with id 5d20eb7050f1087d5a79209bd8895d899d6d86dcc90b171e0d3702f19e2a90be Apr 20 07:11:12.236405 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:11:12.236360 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-87wvx" event={"ID":"e578f271-ddce-475b-ba48-2e815f1f88f9","Type":"ContainerStarted","Data":"5d20eb7050f1087d5a79209bd8895d899d6d86dcc90b171e0d3702f19e2a90be"} Apr 20 
07:11:17.253073 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:11:17.252963 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-87wvx" event={"ID":"e578f271-ddce-475b-ba48-2e815f1f88f9","Type":"ContainerStarted","Data":"ffb10646d4a26a78a9640685943c59d121c5559d061cd1353df9a09e0f5d76c4"} Apr 20 07:11:17.273006 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:11:17.272961 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-87wvx" podStartSLOduration=0.953071836 podStartE2EDuration="6.272946619s" podCreationTimestamp="2026-04-20 07:11:11 +0000 UTC" firstStartedPulling="2026-04-20 07:11:11.62449547 +0000 UTC m=+494.499220311" lastFinishedPulling="2026-04-20 07:11:16.944370249 +0000 UTC m=+499.819095094" observedRunningTime="2026-04-20 07:11:17.271705395 +0000 UTC m=+500.146430255" watchObservedRunningTime="2026-04-20 07:11:17.272946619 +0000 UTC m=+500.147671481" Apr 20 07:12:09.645609 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:09.645579 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-vwvpk"] Apr 20 07:12:09.648639 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:09.648622 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-vwvpk" Apr 20 07:12:09.651951 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:09.651934 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-bkmzf\"" Apr 20 07:12:09.652660 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:09.652639 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 20 07:12:09.652748 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:09.652736 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 20 07:12:09.669442 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:09.669408 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-vwvpk"] Apr 20 07:12:09.740451 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:09.740415 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v56g5\" (UniqueName: \"kubernetes.io/projected/569a60d7-d923-42aa-bc00-e9eae76a6566-kube-api-access-v56g5\") pod \"cert-manager-webhook-597b96b99b-vwvpk\" (UID: \"569a60d7-d923-42aa-bc00-e9eae76a6566\") " pod="cert-manager/cert-manager-webhook-597b96b99b-vwvpk" Apr 20 07:12:09.740606 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:09.740558 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/569a60d7-d923-42aa-bc00-e9eae76a6566-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-vwvpk\" (UID: \"569a60d7-d923-42aa-bc00-e9eae76a6566\") " pod="cert-manager/cert-manager-webhook-597b96b99b-vwvpk" Apr 20 07:12:09.841631 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:09.841601 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/569a60d7-d923-42aa-bc00-e9eae76a6566-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-vwvpk\" (UID: \"569a60d7-d923-42aa-bc00-e9eae76a6566\") " pod="cert-manager/cert-manager-webhook-597b96b99b-vwvpk" Apr 20 07:12:09.841758 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:09.841662 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v56g5\" (UniqueName: \"kubernetes.io/projected/569a60d7-d923-42aa-bc00-e9eae76a6566-kube-api-access-v56g5\") pod \"cert-manager-webhook-597b96b99b-vwvpk\" (UID: \"569a60d7-d923-42aa-bc00-e9eae76a6566\") " pod="cert-manager/cert-manager-webhook-597b96b99b-vwvpk" Apr 20 07:12:09.850919 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:09.850884 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/569a60d7-d923-42aa-bc00-e9eae76a6566-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-vwvpk\" (UID: \"569a60d7-d923-42aa-bc00-e9eae76a6566\") " pod="cert-manager/cert-manager-webhook-597b96b99b-vwvpk" Apr 20 07:12:09.851039 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:09.850966 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v56g5\" (UniqueName: \"kubernetes.io/projected/569a60d7-d923-42aa-bc00-e9eae76a6566-kube-api-access-v56g5\") pod \"cert-manager-webhook-597b96b99b-vwvpk\" (UID: \"569a60d7-d923-42aa-bc00-e9eae76a6566\") " pod="cert-manager/cert-manager-webhook-597b96b99b-vwvpk" Apr 20 07:12:09.976087 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:09.975999 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-vwvpk" Apr 20 07:12:10.095876 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:10.095843 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-vwvpk"] Apr 20 07:12:10.098762 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:12:10.098734 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod569a60d7_d923_42aa_bc00_e9eae76a6566.slice/crio-51c2a43a97a71a36963bdffa71b4ba3a3c2845d4a80c9c4c9b06190542e6aeec WatchSource:0}: Error finding container 51c2a43a97a71a36963bdffa71b4ba3a3c2845d4a80c9c4c9b06190542e6aeec: Status 404 returned error can't find the container with id 51c2a43a97a71a36963bdffa71b4ba3a3c2845d4a80c9c4c9b06190542e6aeec Apr 20 07:12:10.415917 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:10.415883 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-vwvpk" event={"ID":"569a60d7-d923-42aa-bc00-e9eae76a6566","Type":"ContainerStarted","Data":"51c2a43a97a71a36963bdffa71b4ba3a3c2845d4a80c9c4c9b06190542e6aeec"} Apr 20 07:12:13.427582 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:13.427554 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-vwvpk" event={"ID":"569a60d7-d923-42aa-bc00-e9eae76a6566","Type":"ContainerStarted","Data":"0ad35ddacacc0cdf9e97fbffd400fb84ff4866c41fe3ca0a21fe2a892c4be037"} Apr 20 07:12:13.427926 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:13.427680 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-vwvpk" Apr 20 07:12:13.445026 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:13.444862 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-vwvpk" podStartSLOduration=1.1956460309999999 
podStartE2EDuration="4.444841935s" podCreationTimestamp="2026-04-20 07:12:09 +0000 UTC" firstStartedPulling="2026-04-20 07:12:10.100541821 +0000 UTC m=+552.975266660" lastFinishedPulling="2026-04-20 07:12:13.349737721 +0000 UTC m=+556.224462564" observedRunningTime="2026-04-20 07:12:13.443627793 +0000 UTC m=+556.318352657" watchObservedRunningTime="2026-04-20 07:12:13.444841935 +0000 UTC m=+556.319566800" Apr 20 07:12:19.433475 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:19.433442 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-vwvpk" Apr 20 07:12:24.098859 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:24.098827 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-25nr2"] Apr 20 07:12:24.102306 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:24.102286 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-25nr2" Apr 20 07:12:24.105496 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:24.105481 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-mhlp7\"" Apr 20 07:12:24.106492 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:24.106471 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 20 07:12:24.106609 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:24.106526 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 20 07:12:24.114090 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:24.114070 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-25nr2"] Apr 20 07:12:24.175657 ip-10-0-138-178 kubenswrapper[2577]: I0420 
07:12:24.175629 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpnmd\" (UniqueName: \"kubernetes.io/projected/31f5c69d-ef1d-4760-a5c5-b55a243c2ba5-kube-api-access-fpnmd\") pod \"openshift-lws-operator-bfc7f696d-25nr2\" (UID: \"31f5c69d-ef1d-4760-a5c5-b55a243c2ba5\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-25nr2"
Apr 20 07:12:24.175797 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:24.175683 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/31f5c69d-ef1d-4760-a5c5-b55a243c2ba5-tmp\") pod \"openshift-lws-operator-bfc7f696d-25nr2\" (UID: \"31f5c69d-ef1d-4760-a5c5-b55a243c2ba5\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-25nr2"
Apr 20 07:12:24.276762 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:24.276723 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fpnmd\" (UniqueName: \"kubernetes.io/projected/31f5c69d-ef1d-4760-a5c5-b55a243c2ba5-kube-api-access-fpnmd\") pod \"openshift-lws-operator-bfc7f696d-25nr2\" (UID: \"31f5c69d-ef1d-4760-a5c5-b55a243c2ba5\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-25nr2"
Apr 20 07:12:24.276953 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:24.276781 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/31f5c69d-ef1d-4760-a5c5-b55a243c2ba5-tmp\") pod \"openshift-lws-operator-bfc7f696d-25nr2\" (UID: \"31f5c69d-ef1d-4760-a5c5-b55a243c2ba5\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-25nr2"
Apr 20 07:12:24.277126 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:24.277108 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/31f5c69d-ef1d-4760-a5c5-b55a243c2ba5-tmp\") pod \"openshift-lws-operator-bfc7f696d-25nr2\" (UID: \"31f5c69d-ef1d-4760-a5c5-b55a243c2ba5\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-25nr2"
Apr 20 07:12:24.286954 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:24.286936 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpnmd\" (UniqueName: \"kubernetes.io/projected/31f5c69d-ef1d-4760-a5c5-b55a243c2ba5-kube-api-access-fpnmd\") pod \"openshift-lws-operator-bfc7f696d-25nr2\" (UID: \"31f5c69d-ef1d-4760-a5c5-b55a243c2ba5\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-25nr2"
Apr 20 07:12:24.411088 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:24.410998 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-25nr2"
Apr 20 07:12:24.532758 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:24.532724 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-25nr2"]
Apr 20 07:12:24.535639 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:12:24.535612 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31f5c69d_ef1d_4760_a5c5_b55a243c2ba5.slice/crio-ed02d7e56136a0c0fbbf33c1cbdd23dd258ac3ba9066ce21fdcf1075d7645c56 WatchSource:0}: Error finding container ed02d7e56136a0c0fbbf33c1cbdd23dd258ac3ba9066ce21fdcf1075d7645c56: Status 404 returned error can't find the container with id ed02d7e56136a0c0fbbf33c1cbdd23dd258ac3ba9066ce21fdcf1075d7645c56
Apr 20 07:12:25.471753 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:25.471711 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-25nr2" event={"ID":"31f5c69d-ef1d-4760-a5c5-b55a243c2ba5","Type":"ContainerStarted","Data":"ed02d7e56136a0c0fbbf33c1cbdd23dd258ac3ba9066ce21fdcf1075d7645c56"}
Apr 20 07:12:27.479815 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:27.479779 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-25nr2" event={"ID":"31f5c69d-ef1d-4760-a5c5-b55a243c2ba5","Type":"ContainerStarted","Data":"4917832d2f1a965ba171782c499d021426b16cd9d8d867e4d2484bba746acde6"}
Apr 20 07:12:27.496558 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:27.496504 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-25nr2" podStartSLOduration=1.304399872 podStartE2EDuration="3.49648816s" podCreationTimestamp="2026-04-20 07:12:24 +0000 UTC" firstStartedPulling="2026-04-20 07:12:24.537083474 +0000 UTC m=+567.411808314" lastFinishedPulling="2026-04-20 07:12:26.729171758 +0000 UTC m=+569.603896602" observedRunningTime="2026-04-20 07:12:27.496139637 +0000 UTC m=+570.370864498" watchObservedRunningTime="2026-04-20 07:12:27.49648816 +0000 UTC m=+570.371213021"
Apr 20 07:12:34.668437 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:34.668402 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-7fd89bcbc4-zqq4d"]
Apr 20 07:12:34.671853 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:34.671837 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7fd89bcbc4-zqq4d"
Apr 20 07:12:34.674712 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:34.674677 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 20 07:12:34.675678 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:34.675660 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 20 07:12:34.675678 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:34.675668 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-nn82n\""
Apr 20 07:12:34.675825 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:34.675708 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 20 07:12:34.686665 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:34.686643 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7fd89bcbc4-zqq4d"]
Apr 20 07:12:34.752927 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:34.752896 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/755fe1ea-f3fe-4f2f-8d1e-bc985dd3a8ca-metrics-cert\") pod \"lws-controller-manager-7fd89bcbc4-zqq4d\" (UID: \"755fe1ea-f3fe-4f2f-8d1e-bc985dd3a8ca\") " pod="openshift-lws-operator/lws-controller-manager-7fd89bcbc4-zqq4d"
Apr 20 07:12:34.753110 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:34.752935 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/755fe1ea-f3fe-4f2f-8d1e-bc985dd3a8ca-cert\") pod \"lws-controller-manager-7fd89bcbc4-zqq4d\" (UID: \"755fe1ea-f3fe-4f2f-8d1e-bc985dd3a8ca\") " pod="openshift-lws-operator/lws-controller-manager-7fd89bcbc4-zqq4d"
Apr 20 07:12:34.753110 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:34.752956 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttp24\" (UniqueName: \"kubernetes.io/projected/755fe1ea-f3fe-4f2f-8d1e-bc985dd3a8ca-kube-api-access-ttp24\") pod \"lws-controller-manager-7fd89bcbc4-zqq4d\" (UID: \"755fe1ea-f3fe-4f2f-8d1e-bc985dd3a8ca\") " pod="openshift-lws-operator/lws-controller-manager-7fd89bcbc4-zqq4d"
Apr 20 07:12:34.753110 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:34.753085 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/755fe1ea-f3fe-4f2f-8d1e-bc985dd3a8ca-manager-config\") pod \"lws-controller-manager-7fd89bcbc4-zqq4d\" (UID: \"755fe1ea-f3fe-4f2f-8d1e-bc985dd3a8ca\") " pod="openshift-lws-operator/lws-controller-manager-7fd89bcbc4-zqq4d"
Apr 20 07:12:34.854174 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:34.854144 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/755fe1ea-f3fe-4f2f-8d1e-bc985dd3a8ca-manager-config\") pod \"lws-controller-manager-7fd89bcbc4-zqq4d\" (UID: \"755fe1ea-f3fe-4f2f-8d1e-bc985dd3a8ca\") " pod="openshift-lws-operator/lws-controller-manager-7fd89bcbc4-zqq4d"
Apr 20 07:12:34.854312 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:34.854218 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/755fe1ea-f3fe-4f2f-8d1e-bc985dd3a8ca-metrics-cert\") pod \"lws-controller-manager-7fd89bcbc4-zqq4d\" (UID: \"755fe1ea-f3fe-4f2f-8d1e-bc985dd3a8ca\") " pod="openshift-lws-operator/lws-controller-manager-7fd89bcbc4-zqq4d"
Apr 20 07:12:34.854312 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:34.854253 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/755fe1ea-f3fe-4f2f-8d1e-bc985dd3a8ca-cert\") pod \"lws-controller-manager-7fd89bcbc4-zqq4d\" (UID: \"755fe1ea-f3fe-4f2f-8d1e-bc985dd3a8ca\") " pod="openshift-lws-operator/lws-controller-manager-7fd89bcbc4-zqq4d"
Apr 20 07:12:34.854312 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:34.854285 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ttp24\" (UniqueName: \"kubernetes.io/projected/755fe1ea-f3fe-4f2f-8d1e-bc985dd3a8ca-kube-api-access-ttp24\") pod \"lws-controller-manager-7fd89bcbc4-zqq4d\" (UID: \"755fe1ea-f3fe-4f2f-8d1e-bc985dd3a8ca\") " pod="openshift-lws-operator/lws-controller-manager-7fd89bcbc4-zqq4d"
Apr 20 07:12:34.854830 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:34.854807 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/755fe1ea-f3fe-4f2f-8d1e-bc985dd3a8ca-manager-config\") pod \"lws-controller-manager-7fd89bcbc4-zqq4d\" (UID: \"755fe1ea-f3fe-4f2f-8d1e-bc985dd3a8ca\") " pod="openshift-lws-operator/lws-controller-manager-7fd89bcbc4-zqq4d"
Apr 20 07:12:34.856646 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:34.856625 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/755fe1ea-f3fe-4f2f-8d1e-bc985dd3a8ca-metrics-cert\") pod \"lws-controller-manager-7fd89bcbc4-zqq4d\" (UID: \"755fe1ea-f3fe-4f2f-8d1e-bc985dd3a8ca\") " pod="openshift-lws-operator/lws-controller-manager-7fd89bcbc4-zqq4d"
Apr 20 07:12:34.856745 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:34.856671 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/755fe1ea-f3fe-4f2f-8d1e-bc985dd3a8ca-cert\") pod \"lws-controller-manager-7fd89bcbc4-zqq4d\" (UID: \"755fe1ea-f3fe-4f2f-8d1e-bc985dd3a8ca\") " pod="openshift-lws-operator/lws-controller-manager-7fd89bcbc4-zqq4d"
Apr 20 07:12:34.880781 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:34.880751 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttp24\" (UniqueName: \"kubernetes.io/projected/755fe1ea-f3fe-4f2f-8d1e-bc985dd3a8ca-kube-api-access-ttp24\") pod \"lws-controller-manager-7fd89bcbc4-zqq4d\" (UID: \"755fe1ea-f3fe-4f2f-8d1e-bc985dd3a8ca\") " pod="openshift-lws-operator/lws-controller-manager-7fd89bcbc4-zqq4d"
Apr 20 07:12:34.981919 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:34.981834 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7fd89bcbc4-zqq4d"
Apr 20 07:12:35.114560 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:35.114513 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7fd89bcbc4-zqq4d"]
Apr 20 07:12:35.117014 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:12:35.116986 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod755fe1ea_f3fe_4f2f_8d1e_bc985dd3a8ca.slice/crio-fc9bb99f3cfb1bbdd41083b88c1a15df59432bc1158d30bbcb71161d076a5a4c WatchSource:0}: Error finding container fc9bb99f3cfb1bbdd41083b88c1a15df59432bc1158d30bbcb71161d076a5a4c: Status 404 returned error can't find the container with id fc9bb99f3cfb1bbdd41083b88c1a15df59432bc1158d30bbcb71161d076a5a4c
Apr 20 07:12:35.504740 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:35.504700 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7fd89bcbc4-zqq4d" event={"ID":"755fe1ea-f3fe-4f2f-8d1e-bc985dd3a8ca","Type":"ContainerStarted","Data":"fc9bb99f3cfb1bbdd41083b88c1a15df59432bc1158d30bbcb71161d076a5a4c"}
Apr 20 07:12:37.512504 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:37.512472 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7fd89bcbc4-zqq4d" event={"ID":"755fe1ea-f3fe-4f2f-8d1e-bc985dd3a8ca","Type":"ContainerStarted","Data":"5c17959ad8a06ea6698d33f6e9b3705560abe4796c22f297522b7913d8b0eddf"}
Apr 20 07:12:37.512842 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:37.512519 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-7fd89bcbc4-zqq4d"
Apr 20 07:12:37.531340 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:37.531281 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-7fd89bcbc4-zqq4d" podStartSLOduration=1.32735281 podStartE2EDuration="3.531265895s" podCreationTimestamp="2026-04-20 07:12:34 +0000 UTC" firstStartedPulling="2026-04-20 07:12:35.119150558 +0000 UTC m=+577.993875402" lastFinishedPulling="2026-04-20 07:12:37.323063638 +0000 UTC m=+580.197788487" observedRunningTime="2026-04-20 07:12:37.530107618 +0000 UTC m=+580.404832479" watchObservedRunningTime="2026-04-20 07:12:37.531265895 +0000 UTC m=+580.405990757"
Apr 20 07:12:43.983591 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:43.983556 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6d65d76454-blhjt"]
Apr 20 07:12:43.992023 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:43.992006 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-blhjt"
Apr 20 07:12:43.995628 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:43.995607 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 20 07:12:43.996156 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:43.996139 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-tcmrx\""
Apr 20 07:12:43.996211 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:43.996154 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 20 07:12:43.996525 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:43.996512 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 20 07:12:43.996941 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:43.996927 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 20 07:12:44.007802 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:44.007778 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6d65d76454-blhjt"]
Apr 20 07:12:44.032843 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:44.032803 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5d1d7a4d-2b39-44a1-9240-848919e4bd62-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6d65d76454-blhjt\" (UID: \"5d1d7a4d-2b39-44a1-9240-848919e4bd62\") " pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-blhjt"
Apr 20 07:12:44.032983 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:44.032865 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgm4v\" (UniqueName: \"kubernetes.io/projected/5d1d7a4d-2b39-44a1-9240-848919e4bd62-kube-api-access-pgm4v\") pod \"opendatahub-operator-controller-manager-6d65d76454-blhjt\" (UID: \"5d1d7a4d-2b39-44a1-9240-848919e4bd62\") " pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-blhjt"
Apr 20 07:12:44.032983 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:44.032951 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5d1d7a4d-2b39-44a1-9240-848919e4bd62-webhook-cert\") pod \"opendatahub-operator-controller-manager-6d65d76454-blhjt\" (UID: \"5d1d7a4d-2b39-44a1-9240-848919e4bd62\") " pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-blhjt"
Apr 20 07:12:44.134186 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:44.134151 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5d1d7a4d-2b39-44a1-9240-848919e4bd62-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6d65d76454-blhjt\" (UID: \"5d1d7a4d-2b39-44a1-9240-848919e4bd62\") " pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-blhjt"
Apr 20 07:12:44.134395 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:44.134198 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pgm4v\" (UniqueName: \"kubernetes.io/projected/5d1d7a4d-2b39-44a1-9240-848919e4bd62-kube-api-access-pgm4v\") pod \"opendatahub-operator-controller-manager-6d65d76454-blhjt\" (UID: \"5d1d7a4d-2b39-44a1-9240-848919e4bd62\") " pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-blhjt"
Apr 20 07:12:44.134395 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:44.134242 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5d1d7a4d-2b39-44a1-9240-848919e4bd62-webhook-cert\") pod \"opendatahub-operator-controller-manager-6d65d76454-blhjt\" (UID: \"5d1d7a4d-2b39-44a1-9240-848919e4bd62\") " pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-blhjt"
Apr 20 07:12:44.136687 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:44.136660 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5d1d7a4d-2b39-44a1-9240-848919e4bd62-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6d65d76454-blhjt\" (UID: \"5d1d7a4d-2b39-44a1-9240-848919e4bd62\") " pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-blhjt"
Apr 20 07:12:44.136817 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:44.136794 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5d1d7a4d-2b39-44a1-9240-848919e4bd62-webhook-cert\") pod \"opendatahub-operator-controller-manager-6d65d76454-blhjt\" (UID: \"5d1d7a4d-2b39-44a1-9240-848919e4bd62\") " pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-blhjt"
Apr 20 07:12:44.148365 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:44.148338 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgm4v\" (UniqueName: \"kubernetes.io/projected/5d1d7a4d-2b39-44a1-9240-848919e4bd62-kube-api-access-pgm4v\") pod \"opendatahub-operator-controller-manager-6d65d76454-blhjt\" (UID: \"5d1d7a4d-2b39-44a1-9240-848919e4bd62\") " pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-blhjt"
Apr 20 07:12:44.302598 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:44.302501 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-blhjt"
Apr 20 07:12:44.441842 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:44.441817 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6d65d76454-blhjt"]
Apr 20 07:12:44.444434 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:12:44.444403 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d1d7a4d_2b39_44a1_9240_848919e4bd62.slice/crio-8975301ffa0e4cdb368cbc4f99002e1f04de81a9f2fe3d3f6f4f4956bdb65de7 WatchSource:0}: Error finding container 8975301ffa0e4cdb368cbc4f99002e1f04de81a9f2fe3d3f6f4f4956bdb65de7: Status 404 returned error can't find the container with id 8975301ffa0e4cdb368cbc4f99002e1f04de81a9f2fe3d3f6f4f4956bdb65de7
Apr 20 07:12:44.533439 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:44.533398 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-blhjt" event={"ID":"5d1d7a4d-2b39-44a1-9240-848919e4bd62","Type":"ContainerStarted","Data":"8975301ffa0e4cdb368cbc4f99002e1f04de81a9f2fe3d3f6f4f4956bdb65de7"}
Apr 20 07:12:47.545190 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:47.545157 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-blhjt" event={"ID":"5d1d7a4d-2b39-44a1-9240-848919e4bd62","Type":"ContainerStarted","Data":"cbd32460310ba74af27fcab6512c487213c5f5189a76aa519fd45f82cf1d5b94"}
Apr 20 07:12:47.545571 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:47.545284 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-blhjt"
Apr 20 07:12:47.570584 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:47.570541 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-blhjt" podStartSLOduration=2.085406918 podStartE2EDuration="4.570527957s" podCreationTimestamp="2026-04-20 07:12:43 +0000 UTC" firstStartedPulling="2026-04-20 07:12:44.446217835 +0000 UTC m=+587.320942678" lastFinishedPulling="2026-04-20 07:12:46.931338864 +0000 UTC m=+589.806063717" observedRunningTime="2026-04-20 07:12:47.569092153 +0000 UTC m=+590.443817016" watchObservedRunningTime="2026-04-20 07:12:47.570527957 +0000 UTC m=+590.445252819"
Apr 20 07:12:48.518242 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:48.518214 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-7fd89bcbc4-zqq4d"
Apr 20 07:12:58.551771 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:12:58.551738 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-blhjt"
Apr 20 07:13:04.255015 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:13:04.254970 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-6bbb6d54d8-b2nj2"]
Apr 20 07:13:04.259500 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:13:04.259480 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-6bbb6d54d8-b2nj2"
Apr 20 07:13:04.262600 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:13:04.262576 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 20 07:13:04.262710 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:13:04.262673 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 20 07:13:04.262775 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:13:04.262757 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 20 07:13:04.263407 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:13:04.263392 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 20 07:13:04.263619 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:13:04.263606 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-fqh7x\""
Apr 20 07:13:04.293550 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:13:04.293523 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-6bbb6d54d8-b2nj2"]
Apr 20 07:13:04.426680 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:13:04.426648 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0b9d6fbe-d32b-4e92-9bf1-19eea864a8d9-tls-certs\") pod \"kube-auth-proxy-6bbb6d54d8-b2nj2\" (UID: \"0b9d6fbe-d32b-4e92-9bf1-19eea864a8d9\") " pod="openshift-ingress/kube-auth-proxy-6bbb6d54d8-b2nj2"
Apr 20 07:13:04.426680 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:13:04.426684 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0b9d6fbe-d32b-4e92-9bf1-19eea864a8d9-tmp\") pod \"kube-auth-proxy-6bbb6d54d8-b2nj2\" (UID: \"0b9d6fbe-d32b-4e92-9bf1-19eea864a8d9\") " pod="openshift-ingress/kube-auth-proxy-6bbb6d54d8-b2nj2"
Apr 20 07:13:04.426864 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:13:04.426818 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d55xg\" (UniqueName: \"kubernetes.io/projected/0b9d6fbe-d32b-4e92-9bf1-19eea864a8d9-kube-api-access-d55xg\") pod \"kube-auth-proxy-6bbb6d54d8-b2nj2\" (UID: \"0b9d6fbe-d32b-4e92-9bf1-19eea864a8d9\") " pod="openshift-ingress/kube-auth-proxy-6bbb6d54d8-b2nj2"
Apr 20 07:13:04.527823 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:13:04.527735 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d55xg\" (UniqueName: \"kubernetes.io/projected/0b9d6fbe-d32b-4e92-9bf1-19eea864a8d9-kube-api-access-d55xg\") pod \"kube-auth-proxy-6bbb6d54d8-b2nj2\" (UID: \"0b9d6fbe-d32b-4e92-9bf1-19eea864a8d9\") " pod="openshift-ingress/kube-auth-proxy-6bbb6d54d8-b2nj2"
Apr 20 07:13:04.527823 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:13:04.527779 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0b9d6fbe-d32b-4e92-9bf1-19eea864a8d9-tls-certs\") pod \"kube-auth-proxy-6bbb6d54d8-b2nj2\" (UID: \"0b9d6fbe-d32b-4e92-9bf1-19eea864a8d9\") " pod="openshift-ingress/kube-auth-proxy-6bbb6d54d8-b2nj2"
Apr 20 07:13:04.527823 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:13:04.527798 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0b9d6fbe-d32b-4e92-9bf1-19eea864a8d9-tmp\") pod \"kube-auth-proxy-6bbb6d54d8-b2nj2\" (UID: \"0b9d6fbe-d32b-4e92-9bf1-19eea864a8d9\") " pod="openshift-ingress/kube-auth-proxy-6bbb6d54d8-b2nj2"
Apr 20 07:13:04.530098 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:13:04.530075 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0b9d6fbe-d32b-4e92-9bf1-19eea864a8d9-tmp\") pod \"kube-auth-proxy-6bbb6d54d8-b2nj2\" (UID: \"0b9d6fbe-d32b-4e92-9bf1-19eea864a8d9\") " pod="openshift-ingress/kube-auth-proxy-6bbb6d54d8-b2nj2"
Apr 20 07:13:04.530462 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:13:04.530441 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0b9d6fbe-d32b-4e92-9bf1-19eea864a8d9-tls-certs\") pod \"kube-auth-proxy-6bbb6d54d8-b2nj2\" (UID: \"0b9d6fbe-d32b-4e92-9bf1-19eea864a8d9\") " pod="openshift-ingress/kube-auth-proxy-6bbb6d54d8-b2nj2"
Apr 20 07:13:04.540889 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:13:04.540866 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d55xg\" (UniqueName: \"kubernetes.io/projected/0b9d6fbe-d32b-4e92-9bf1-19eea864a8d9-kube-api-access-d55xg\") pod \"kube-auth-proxy-6bbb6d54d8-b2nj2\" (UID: \"0b9d6fbe-d32b-4e92-9bf1-19eea864a8d9\") " pod="openshift-ingress/kube-auth-proxy-6bbb6d54d8-b2nj2"
Apr 20 07:13:04.569035 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:13:04.569004 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-6bbb6d54d8-b2nj2"
Apr 20 07:13:04.695462 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:13:04.695435 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-6bbb6d54d8-b2nj2"]
Apr 20 07:13:04.696542 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:13:04.696513 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b9d6fbe_d32b_4e92_9bf1_19eea864a8d9.slice/crio-e9cb8b78614961656401b7aedf4fff65f59a9f61ca4971ffd0bccdd913021443 WatchSource:0}: Error finding container e9cb8b78614961656401b7aedf4fff65f59a9f61ca4971ffd0bccdd913021443: Status 404 returned error can't find the container with id e9cb8b78614961656401b7aedf4fff65f59a9f61ca4971ffd0bccdd913021443
Apr 20 07:13:05.607516 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:13:05.607413 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-6bbb6d54d8-b2nj2" event={"ID":"0b9d6fbe-d32b-4e92-9bf1-19eea864a8d9","Type":"ContainerStarted","Data":"e9cb8b78614961656401b7aedf4fff65f59a9f61ca4971ffd0bccdd913021443"}
Apr 20 07:13:08.619775 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:13:08.619740 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-6bbb6d54d8-b2nj2" event={"ID":"0b9d6fbe-d32b-4e92-9bf1-19eea864a8d9","Type":"ContainerStarted","Data":"bfec87d457fb5f5dab9cc40c55254f3c602765a0d46b11c9ed82846a8deb4152"}
Apr 20 07:13:08.651560 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:13:08.651506 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-6bbb6d54d8-b2nj2" podStartSLOduration=1.089044175 podStartE2EDuration="4.651489277s" podCreationTimestamp="2026-04-20 07:13:04 +0000 UTC" firstStartedPulling="2026-04-20 07:13:04.698502528 +0000 UTC m=+607.573227372" lastFinishedPulling="2026-04-20 07:13:08.260947634 +0000 UTC m=+611.135672474" observedRunningTime="2026-04-20 07:13:08.649290713 +0000 UTC m=+611.524015575" watchObservedRunningTime="2026-04-20 07:13:08.651489277 +0000 UTC m=+611.526214140"
Apr 20 07:14:39.789966 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:14:39.789931 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tz82b"]
Apr 20 07:14:39.793206 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:14:39.793189 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tz82b"
Apr 20 07:14:39.795607 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:14:39.795585 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 20 07:14:39.795720 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:14:39.795588 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-4sdjw\""
Apr 20 07:14:39.796432 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:14:39.796412 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\""
Apr 20 07:14:39.796518 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:14:39.796418 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 20 07:14:39.796518 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:14:39.796419 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\""
Apr 20 07:14:39.801235 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:14:39.801207 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tz82b"]
Apr 20 07:14:39.959383 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:14:39.959349 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9n9n\" (UniqueName: \"kubernetes.io/projected/d5c54bf7-12b2-4457-829e-37a45bfe4574-kube-api-access-g9n9n\") pod \"kuadrant-console-plugin-6cb54b5c86-tz82b\" (UID: \"d5c54bf7-12b2-4457-829e-37a45bfe4574\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tz82b"
Apr 20 07:14:39.959549 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:14:39.959402 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d5c54bf7-12b2-4457-829e-37a45bfe4574-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-tz82b\" (UID: \"d5c54bf7-12b2-4457-829e-37a45bfe4574\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tz82b"
Apr 20 07:14:39.959549 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:14:39.959425 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5c54bf7-12b2-4457-829e-37a45bfe4574-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-tz82b\" (UID: \"d5c54bf7-12b2-4457-829e-37a45bfe4574\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tz82b"
Apr 20 07:14:40.060939 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:14:40.060855 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9n9n\" (UniqueName: \"kubernetes.io/projected/d5c54bf7-12b2-4457-829e-37a45bfe4574-kube-api-access-g9n9n\") pod \"kuadrant-console-plugin-6cb54b5c86-tz82b\" (UID: \"d5c54bf7-12b2-4457-829e-37a45bfe4574\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tz82b"
Apr 20 07:14:40.060939 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:14:40.060907 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d5c54bf7-12b2-4457-829e-37a45bfe4574-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-tz82b\" (UID: \"d5c54bf7-12b2-4457-829e-37a45bfe4574\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tz82b"
Apr 20 07:14:40.060939 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:14:40.060936 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5c54bf7-12b2-4457-829e-37a45bfe4574-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-tz82b\" (UID: \"d5c54bf7-12b2-4457-829e-37a45bfe4574\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tz82b"
Apr 20 07:14:40.061635 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:14:40.061611 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d5c54bf7-12b2-4457-829e-37a45bfe4574-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-tz82b\" (UID: \"d5c54bf7-12b2-4457-829e-37a45bfe4574\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tz82b"
Apr 20 07:14:40.063615 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:14:40.063593 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d5c54bf7-12b2-4457-829e-37a45bfe4574-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-tz82b\" (UID: \"d5c54bf7-12b2-4457-829e-37a45bfe4574\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tz82b"
Apr 20 07:14:40.069460 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:14:40.069436 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9n9n\" (UniqueName: \"kubernetes.io/projected/d5c54bf7-12b2-4457-829e-37a45bfe4574-kube-api-access-g9n9n\") pod \"kuadrant-console-plugin-6cb54b5c86-tz82b\" (UID: \"d5c54bf7-12b2-4457-829e-37a45bfe4574\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tz82b"
Apr 20 07:14:40.102924 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:14:40.102889 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tz82b"
Apr 20 07:14:40.224873 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:14:40.224834 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tz82b"]
Apr 20 07:14:40.227062 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:14:40.227030 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5c54bf7_12b2_4457_829e_37a45bfe4574.slice/crio-6258b19620122c883b651f26b7f6006de845719bfe974ac30179c3bc5765cb68 WatchSource:0}: Error finding container 6258b19620122c883b651f26b7f6006de845719bfe974ac30179c3bc5765cb68: Status 404 returned error can't find the container with id 6258b19620122c883b651f26b7f6006de845719bfe974ac30179c3bc5765cb68
Apr 20 07:14:40.228339 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:14:40.228301 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 07:14:40.921691 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:14:40.921642 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tz82b" event={"ID":"d5c54bf7-12b2-4457-829e-37a45bfe4574","Type":"ContainerStarted","Data":"6258b19620122c883b651f26b7f6006de845719bfe974ac30179c3bc5765cb68"}
Apr 20 07:15:05.019776 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:15:05.019740 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tz82b" event={"ID":"d5c54bf7-12b2-4457-829e-37a45bfe4574","Type":"ContainerStarted","Data":"66faaa9793e521ce05e13dde811fd9c3296e40b6b2d926a38818952a394708f6"}
Apr 20 07:15:05.043375 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:15:05.043312 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tz82b"
podStartSLOduration=2.112774996 podStartE2EDuration="26.043296511s" podCreationTimestamp="2026-04-20 07:14:39 +0000 UTC" firstStartedPulling="2026-04-20 07:14:40.228505257 +0000 UTC m=+703.103230111" lastFinishedPulling="2026-04-20 07:15:04.159026783 +0000 UTC m=+727.033751626" observedRunningTime="2026-04-20 07:15:05.041791018 +0000 UTC m=+727.916515880" watchObservedRunningTime="2026-04-20 07:15:05.043296511 +0000 UTC m=+727.918021374" Apr 20 07:15:24.493499 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:15:24.493462 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"] Apr 20 07:15:24.512956 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:15:24.512921 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"] Apr 20 07:15:24.513105 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:15:24.513040 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-k9j9p" Apr 20 07:15:24.515806 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:15:24.515778 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 20 07:15:24.534065 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:15:24.534029 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"] Apr 20 07:15:24.570674 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:15:24.570644 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/df21d0cd-de21-4cda-85b8-7aa133837268-config-file\") pod \"limitador-limitador-78c99df468-k9j9p\" (UID: \"df21d0cd-de21-4cda-85b8-7aa133837268\") " pod="kuadrant-system/limitador-limitador-78c99df468-k9j9p" Apr 20 07:15:24.570798 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:15:24.570767 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4mtx\" (UniqueName: \"kubernetes.io/projected/df21d0cd-de21-4cda-85b8-7aa133837268-kube-api-access-b4mtx\") pod \"limitador-limitador-78c99df468-k9j9p\" (UID: \"df21d0cd-de21-4cda-85b8-7aa133837268\") " pod="kuadrant-system/limitador-limitador-78c99df468-k9j9p" Apr 20 07:15:24.671646 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:15:24.671607 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b4mtx\" (UniqueName: \"kubernetes.io/projected/df21d0cd-de21-4cda-85b8-7aa133837268-kube-api-access-b4mtx\") pod \"limitador-limitador-78c99df468-k9j9p\" (UID: \"df21d0cd-de21-4cda-85b8-7aa133837268\") " pod="kuadrant-system/limitador-limitador-78c99df468-k9j9p" Apr 20 07:15:24.671791 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:15:24.671671 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/df21d0cd-de21-4cda-85b8-7aa133837268-config-file\") pod \"limitador-limitador-78c99df468-k9j9p\" (UID: \"df21d0cd-de21-4cda-85b8-7aa133837268\") " pod="kuadrant-system/limitador-limitador-78c99df468-k9j9p" Apr 20 07:15:24.672226 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:15:24.672208 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/df21d0cd-de21-4cda-85b8-7aa133837268-config-file\") pod \"limitador-limitador-78c99df468-k9j9p\" (UID: \"df21d0cd-de21-4cda-85b8-7aa133837268\") " pod="kuadrant-system/limitador-limitador-78c99df468-k9j9p" Apr 20 07:15:24.680489 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:15:24.680467 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4mtx\" (UniqueName: \"kubernetes.io/projected/df21d0cd-de21-4cda-85b8-7aa133837268-kube-api-access-b4mtx\") pod \"limitador-limitador-78c99df468-k9j9p\" (UID: 
\"df21d0cd-de21-4cda-85b8-7aa133837268\") " pod="kuadrant-system/limitador-limitador-78c99df468-k9j9p" Apr 20 07:15:24.823881 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:15:24.823801 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-k9j9p" Apr 20 07:15:24.960822 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:15:24.960797 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"] Apr 20 07:15:24.964293 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:15:24.964262 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf21d0cd_de21_4cda_85b8_7aa133837268.slice/crio-c23bdd19fe790658532ccd92857fc06ee2d758b56e1887f3e74f583610ea6f4f WatchSource:0}: Error finding container c23bdd19fe790658532ccd92857fc06ee2d758b56e1887f3e74f583610ea6f4f: Status 404 returned error can't find the container with id c23bdd19fe790658532ccd92857fc06ee2d758b56e1887f3e74f583610ea6f4f Apr 20 07:15:25.095588 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:15:25.095549 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-k9j9p" event={"ID":"df21d0cd-de21-4cda-85b8-7aa133837268","Type":"ContainerStarted","Data":"c23bdd19fe790658532ccd92857fc06ee2d758b56e1887f3e74f583610ea6f4f"} Apr 20 07:15:25.176712 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:15:25.176680 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-z7mcm"] Apr 20 07:15:25.180960 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:15:25.180944 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-z7mcm" Apr 20 07:15:25.183710 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:15:25.183691 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-6clll\"" Apr 20 07:15:25.188125 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:15:25.188103 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-z7mcm"] Apr 20 07:15:25.277262 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:15:25.277232 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwzf8\" (UniqueName: \"kubernetes.io/projected/31cc4262-7db2-4244-bdc3-0860ad56b440-kube-api-access-hwzf8\") pod \"authorino-7498df8756-z7mcm\" (UID: \"31cc4262-7db2-4244-bdc3-0860ad56b440\") " pod="kuadrant-system/authorino-7498df8756-z7mcm" Apr 20 07:15:25.378004 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:15:25.377924 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hwzf8\" (UniqueName: \"kubernetes.io/projected/31cc4262-7db2-4244-bdc3-0860ad56b440-kube-api-access-hwzf8\") pod \"authorino-7498df8756-z7mcm\" (UID: \"31cc4262-7db2-4244-bdc3-0860ad56b440\") " pod="kuadrant-system/authorino-7498df8756-z7mcm" Apr 20 07:15:25.387002 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:15:25.386941 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwzf8\" (UniqueName: \"kubernetes.io/projected/31cc4262-7db2-4244-bdc3-0860ad56b440-kube-api-access-hwzf8\") pod \"authorino-7498df8756-z7mcm\" (UID: \"31cc4262-7db2-4244-bdc3-0860ad56b440\") " pod="kuadrant-system/authorino-7498df8756-z7mcm" Apr 20 07:15:25.492141 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:15:25.491698 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-z7mcm" Apr 20 07:15:25.660267 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:15:25.660181 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-z7mcm"] Apr 20 07:15:25.663230 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:15:25.663201 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31cc4262_7db2_4244_bdc3_0860ad56b440.slice/crio-4c0a9328a4e0be0676fdf1207aedab8a67db685bbb9ca31d788e1d116ec0b828 WatchSource:0}: Error finding container 4c0a9328a4e0be0676fdf1207aedab8a67db685bbb9ca31d788e1d116ec0b828: Status 404 returned error can't find the container with id 4c0a9328a4e0be0676fdf1207aedab8a67db685bbb9ca31d788e1d116ec0b828 Apr 20 07:15:26.100138 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:15:26.100107 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-z7mcm" event={"ID":"31cc4262-7db2-4244-bdc3-0860ad56b440","Type":"ContainerStarted","Data":"4c0a9328a4e0be0676fdf1207aedab8a67db685bbb9ca31d788e1d116ec0b828"} Apr 20 07:15:29.113485 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:15:29.113450 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-k9j9p" event={"ID":"df21d0cd-de21-4cda-85b8-7aa133837268","Type":"ContainerStarted","Data":"dc877895d84a3d01db617b3b2cdbc71cc9b867f2ee92b808e4ba36df2097896a"} Apr 20 07:15:29.113913 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:15:29.113554 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-k9j9p" Apr 20 07:15:29.114918 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:15:29.114893 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-z7mcm" 
event={"ID":"31cc4262-7db2-4244-bdc3-0860ad56b440","Type":"ContainerStarted","Data":"0800eed9543d6177220b57e9773ab40228e55d796c32b5ff513a52b93ba6876b"} Apr 20 07:15:29.165738 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:15:29.165685 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-k9j9p" podStartSLOduration=1.585373066 podStartE2EDuration="5.165670649s" podCreationTimestamp="2026-04-20 07:15:24 +0000 UTC" firstStartedPulling="2026-04-20 07:15:24.966110782 +0000 UTC m=+747.840835621" lastFinishedPulling="2026-04-20 07:15:28.546408348 +0000 UTC m=+751.421133204" observedRunningTime="2026-04-20 07:15:29.144749099 +0000 UTC m=+752.019473961" watchObservedRunningTime="2026-04-20 07:15:29.165670649 +0000 UTC m=+752.040395530" Apr 20 07:15:29.167303 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:15:29.167272 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-z7mcm" podStartSLOduration=1.339132217 podStartE2EDuration="4.167258541s" podCreationTimestamp="2026-04-20 07:15:25 +0000 UTC" firstStartedPulling="2026-04-20 07:15:25.664711436 +0000 UTC m=+748.539436276" lastFinishedPulling="2026-04-20 07:15:28.492837758 +0000 UTC m=+751.367562600" observedRunningTime="2026-04-20 07:15:29.164011901 +0000 UTC m=+752.038736786" watchObservedRunningTime="2026-04-20 07:15:29.167258541 +0000 UTC m=+752.041983405" Apr 20 07:15:40.118870 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:15:40.118841 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-k9j9p" Apr 20 07:15:59.959781 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:15:59.959750 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-z7mcm"] Apr 20 07:15:59.960306 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:15:59.959962 2577 kuberuntime_container.go:864] "Killing container with a 
grace period" pod="kuadrant-system/authorino-7498df8756-z7mcm" podUID="31cc4262-7db2-4244-bdc3-0860ad56b440" containerName="authorino" containerID="cri-o://0800eed9543d6177220b57e9773ab40228e55d796c32b5ff513a52b93ba6876b" gracePeriod=30 Apr 20 07:16:00.196722 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:16:00.196683 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-z7mcm" Apr 20 07:16:00.233087 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:16:00.233003 2577 generic.go:358] "Generic (PLEG): container finished" podID="31cc4262-7db2-4244-bdc3-0860ad56b440" containerID="0800eed9543d6177220b57e9773ab40228e55d796c32b5ff513a52b93ba6876b" exitCode=0 Apr 20 07:16:00.233087 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:16:00.233045 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-z7mcm" Apr 20 07:16:00.233270 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:16:00.233099 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-z7mcm" event={"ID":"31cc4262-7db2-4244-bdc3-0860ad56b440","Type":"ContainerDied","Data":"0800eed9543d6177220b57e9773ab40228e55d796c32b5ff513a52b93ba6876b"} Apr 20 07:16:00.233270 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:16:00.233134 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-z7mcm" event={"ID":"31cc4262-7db2-4244-bdc3-0860ad56b440","Type":"ContainerDied","Data":"4c0a9328a4e0be0676fdf1207aedab8a67db685bbb9ca31d788e1d116ec0b828"} Apr 20 07:16:00.233270 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:16:00.233150 2577 scope.go:117] "RemoveContainer" containerID="0800eed9543d6177220b57e9773ab40228e55d796c32b5ff513a52b93ba6876b" Apr 20 07:16:00.241835 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:16:00.241811 2577 scope.go:117] "RemoveContainer" 
containerID="0800eed9543d6177220b57e9773ab40228e55d796c32b5ff513a52b93ba6876b" Apr 20 07:16:00.242096 ip-10-0-138-178 kubenswrapper[2577]: E0420 07:16:00.242080 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0800eed9543d6177220b57e9773ab40228e55d796c32b5ff513a52b93ba6876b\": container with ID starting with 0800eed9543d6177220b57e9773ab40228e55d796c32b5ff513a52b93ba6876b not found: ID does not exist" containerID="0800eed9543d6177220b57e9773ab40228e55d796c32b5ff513a52b93ba6876b" Apr 20 07:16:00.242158 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:16:00.242109 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0800eed9543d6177220b57e9773ab40228e55d796c32b5ff513a52b93ba6876b"} err="failed to get container status \"0800eed9543d6177220b57e9773ab40228e55d796c32b5ff513a52b93ba6876b\": rpc error: code = NotFound desc = could not find container \"0800eed9543d6177220b57e9773ab40228e55d796c32b5ff513a52b93ba6876b\": container with ID starting with 0800eed9543d6177220b57e9773ab40228e55d796c32b5ff513a52b93ba6876b not found: ID does not exist" Apr 20 07:16:00.284042 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:16:00.284014 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwzf8\" (UniqueName: \"kubernetes.io/projected/31cc4262-7db2-4244-bdc3-0860ad56b440-kube-api-access-hwzf8\") pod \"31cc4262-7db2-4244-bdc3-0860ad56b440\" (UID: \"31cc4262-7db2-4244-bdc3-0860ad56b440\") " Apr 20 07:16:00.286256 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:16:00.286229 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31cc4262-7db2-4244-bdc3-0860ad56b440-kube-api-access-hwzf8" (OuterVolumeSpecName: "kube-api-access-hwzf8") pod "31cc4262-7db2-4244-bdc3-0860ad56b440" (UID: "31cc4262-7db2-4244-bdc3-0860ad56b440"). InnerVolumeSpecName "kube-api-access-hwzf8". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:16:00.384787 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:16:00.384756 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hwzf8\" (UniqueName: \"kubernetes.io/projected/31cc4262-7db2-4244-bdc3-0860ad56b440-kube-api-access-hwzf8\") on node \"ip-10-0-138-178.ec2.internal\" DevicePath \"\"" Apr 20 07:16:00.554470 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:16:00.554403 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-z7mcm"] Apr 20 07:16:00.556296 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:16:00.556276 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-z7mcm"] Apr 20 07:16:01.644662 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:16:01.644628 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31cc4262-7db2-4244-bdc3-0860ad56b440" path="/var/lib/kubelet/pods/31cc4262-7db2-4244-bdc3-0860ad56b440/volumes" Apr 20 07:16:06.859101 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:16:06.859021 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"] Apr 20 07:16:40.383522 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:16:40.383484 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"] Apr 20 07:16:49.879188 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:16:49.879141 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"] Apr 20 07:17:23.582151 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:17:23.582114 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"] Apr 20 07:17:26.776000 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:17:26.775967 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"] Apr 20 07:17:33.676225 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:17:33.676187 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"] Apr 20 07:17:40.780442 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:17:40.780403 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"] Apr 20 07:18:14.368773 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:18:14.368740 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7ccdb9c7df-m7fn8"] Apr 20 07:18:14.369241 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:18:14.369108 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="31cc4262-7db2-4244-bdc3-0860ad56b440" containerName="authorino" Apr 20 07:18:14.369241 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:18:14.369119 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="31cc4262-7db2-4244-bdc3-0860ad56b440" containerName="authorino" Apr 20 07:18:14.369241 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:18:14.369191 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="31cc4262-7db2-4244-bdc3-0860ad56b440" containerName="authorino" Apr 20 07:18:14.372402 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:18:14.372384 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7ccdb9c7df-m7fn8" Apr 20 07:18:14.375771 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:18:14.375746 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 20 07:18:14.375879 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:18:14.375785 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-6clll\"" Apr 20 07:18:14.378882 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:18:14.378860 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7ccdb9c7df-m7fn8"] Apr 20 07:18:14.447650 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:18:14.447619 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr65d\" (UniqueName: \"kubernetes.io/projected/7d83aa7a-38e4-4154-a8fe-018e4fb578e9-kube-api-access-nr65d\") pod \"authorino-7ccdb9c7df-m7fn8\" (UID: \"7d83aa7a-38e4-4154-a8fe-018e4fb578e9\") " pod="kuadrant-system/authorino-7ccdb9c7df-m7fn8" Apr 20 07:18:14.447802 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:18:14.447683 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/7d83aa7a-38e4-4154-a8fe-018e4fb578e9-tls-cert\") pod \"authorino-7ccdb9c7df-m7fn8\" (UID: \"7d83aa7a-38e4-4154-a8fe-018e4fb578e9\") " pod="kuadrant-system/authorino-7ccdb9c7df-m7fn8" Apr 20 07:18:14.549059 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:18:14.549026 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/7d83aa7a-38e4-4154-a8fe-018e4fb578e9-tls-cert\") pod \"authorino-7ccdb9c7df-m7fn8\" (UID: \"7d83aa7a-38e4-4154-a8fe-018e4fb578e9\") " pod="kuadrant-system/authorino-7ccdb9c7df-m7fn8" Apr 20 07:18:14.549239 ip-10-0-138-178 
kubenswrapper[2577]: I0420 07:18:14.549076 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nr65d\" (UniqueName: \"kubernetes.io/projected/7d83aa7a-38e4-4154-a8fe-018e4fb578e9-kube-api-access-nr65d\") pod \"authorino-7ccdb9c7df-m7fn8\" (UID: \"7d83aa7a-38e4-4154-a8fe-018e4fb578e9\") " pod="kuadrant-system/authorino-7ccdb9c7df-m7fn8" Apr 20 07:18:14.551814 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:18:14.551787 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/7d83aa7a-38e4-4154-a8fe-018e4fb578e9-tls-cert\") pod \"authorino-7ccdb9c7df-m7fn8\" (UID: \"7d83aa7a-38e4-4154-a8fe-018e4fb578e9\") " pod="kuadrant-system/authorino-7ccdb9c7df-m7fn8" Apr 20 07:18:14.557003 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:18:14.556981 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr65d\" (UniqueName: \"kubernetes.io/projected/7d83aa7a-38e4-4154-a8fe-018e4fb578e9-kube-api-access-nr65d\") pod \"authorino-7ccdb9c7df-m7fn8\" (UID: \"7d83aa7a-38e4-4154-a8fe-018e4fb578e9\") " pod="kuadrant-system/authorino-7ccdb9c7df-m7fn8" Apr 20 07:18:14.683342 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:18:14.683247 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7ccdb9c7df-m7fn8" Apr 20 07:18:14.809969 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:18:14.809945 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7ccdb9c7df-m7fn8"] Apr 20 07:18:14.813578 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:18:14.813542 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d83aa7a_38e4_4154_a8fe_018e4fb578e9.slice/crio-0d88757939b2fd96fad32413b8cba22ab4a00cc66e1546f83eb9379c501e2390 WatchSource:0}: Error finding container 0d88757939b2fd96fad32413b8cba22ab4a00cc66e1546f83eb9379c501e2390: Status 404 returned error can't find the container with id 0d88757939b2fd96fad32413b8cba22ab4a00cc66e1546f83eb9379c501e2390 Apr 20 07:18:15.707881 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:18:15.707840 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7ccdb9c7df-m7fn8" event={"ID":"7d83aa7a-38e4-4154-a8fe-018e4fb578e9","Type":"ContainerStarted","Data":"da455213d47a921b584da019e766c9e37939f0aa4ec111eb9b561243f65adc7a"} Apr 20 07:18:15.707881 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:18:15.707885 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7ccdb9c7df-m7fn8" event={"ID":"7d83aa7a-38e4-4154-a8fe-018e4fb578e9","Type":"ContainerStarted","Data":"0d88757939b2fd96fad32413b8cba22ab4a00cc66e1546f83eb9379c501e2390"} Apr 20 07:18:15.732151 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:18:15.732094 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7ccdb9c7df-m7fn8" podStartSLOduration=1.248004892 podStartE2EDuration="1.732078309s" podCreationTimestamp="2026-04-20 07:18:14 +0000 UTC" firstStartedPulling="2026-04-20 07:18:14.815418729 +0000 UTC m=+917.690143582" lastFinishedPulling="2026-04-20 07:18:15.299492138 +0000 UTC m=+918.174216999" 
observedRunningTime="2026-04-20 07:18:15.731131333 +0000 UTC m=+918.605856200" watchObservedRunningTime="2026-04-20 07:18:15.732078309 +0000 UTC m=+918.606803170" Apr 20 07:18:33.477190 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:18:33.477151 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"] Apr 20 07:18:38.279199 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:18:38.279162 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"] Apr 20 07:18:44.377357 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:18:44.377285 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"] Apr 20 07:18:54.384386 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:18:54.384308 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"] Apr 20 07:19:03.377563 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:19:03.377529 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"] Apr 20 07:19:14.278273 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:19:14.278191 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"] Apr 20 07:19:22.975985 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:19:22.975952 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"] Apr 20 07:19:33.209076 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:19:33.209033 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"] Apr 20 07:20:35.676146 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:20:35.676057 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"] Apr 20 07:20:51.373975 ip-10-0-138-178 kubenswrapper[2577]: 
I0420 07:20:51.373940 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:21:29.686008 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:21:29.685973 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:21:45.581090 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:21:45.581052 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:22:00.301818 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:22:00.301784 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:22:16.470476 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:22:16.470383 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:23:10.713083 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:23:10.713047 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:23:20.293040 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:23:20.293008 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:23:36.776100 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:23:36.776010 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:23:44.679625 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:23:44.679593 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:24:02.179235 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:24:02.179196 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:24:10.074980 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:24:10.074944 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:24:43.178218 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:24:43.178183 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:24:51.081234 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:24:51.081200 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:24:59.669623 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:24:59.669589 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:25:08.678075 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:25:08.677988 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:25:16.771572 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:25:16.771531 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:25:33.880745 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:25:33.880710 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:25:44.374967 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:25:44.374926 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:26:32.786003 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:26:32.785969 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:26:40.179106 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:26:40.179023 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:26:50.077267 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:26:50.077224 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:26:57.309547 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:26:57.309509 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:27:06.780870 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:27:06.780833 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:27:15.074366 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:27:15.074308 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:27:24.385231 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:27:24.385191 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:27:33.376681 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:27:33.376645 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:27:42.374552 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:27:42.374513 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:27:50.379435 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:27:50.379395 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:27:59.475480 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:27:59.475441 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:28:07.787610 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:28:07.787518 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:28:16.696006 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:28:16.695968 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:28:25.076043 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:28:25.076005 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:28:34.181868 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:28:34.181829 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:28:41.975484 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:28:41.975452 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:28:51.278057 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:28:51.278019 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:28:59.382942 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:28:59.382903 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:31:15.877905 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:31:15.877820 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:31:22.172614 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:31:22.172577 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:31:48.376268 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:31:48.376231 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:31:55.779261 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:31:55.779222 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:32:04.767198 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:32:04.767159 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:32:14.168343 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:32:14.168295 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:32:23.076219 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:32:23.076181 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:32:33.887337 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:32:33.887282 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:32:43.379925 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:32:43.379889 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:32:53.877030 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:32:53.876989 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:33:01.771817 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:33:01.771780 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:33:12.178518 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:33:12.178478 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:33:21.483908 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:33:21.483870 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:33:27.276582 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:33:27.276543 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:33:54.287306 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:33:54.287264 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:34:36.857455 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:34:36.857375 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:34:46.082576 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:34:46.082535 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:34:54.078935 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:34:54.078900 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:35:02.582868 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:35:02.582827 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:35:11.778018 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:35:11.777979 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:35:24.188027 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:35:24.187992 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:35:33.674973 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:35:33.674934 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:35:41.871798 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:35:41.871719 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:35:49.777873 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:35:49.777844 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:35:58.179597 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:35:58.179557 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:36:06.578997 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:36:06.578962 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:36:17.084840 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:36:17.084796 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:36:34.981926 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:36:34.981892 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:36:42.788409 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:36:42.788370 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:36:51.781445 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:36:51.781413 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:36:59.871387 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:36:59.871350 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:37:17.274396 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:37:17.274295 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:37:25.482726 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:37:25.482688 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:37:33.872082 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:37:33.872047 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:37:42.264735 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:37:42.264702 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:37:51.475782 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:37:51.475743 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:38:00.272113 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:38:00.272076 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:38:08.966426 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:38:08.966381 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:38:22.270460 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:38:22.270417 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:38:31.275044 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:38:31.274990 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:38:44.670009 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:38:44.669922 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:38:54.874067 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:38:54.874031 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:39:00.572740 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:39:00.572695 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:39:09.778434 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:39:09.778382 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:39:17.471222 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:39:17.471187 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:39:34.668205 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:39:34.668166 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:39:42.980426 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:39:42.980381 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:39:50.966256 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:39:50.966216 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:39:59.065627 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:39:59.065595 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:40:23.668162 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:23.668087 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:40:35.083439 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:35.083399 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-k9j9p"]
Apr 20 07:40:37.315700 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:37.315659 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-7ccdb9c7df-m7fn8_7d83aa7a-38e4-4154-a8fe-018e4fb578e9/authorino/0.log"
Apr 20 07:40:41.329514 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:41.329480 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6d65d76454-blhjt_5d1d7a4d-2b39-44a1-9240-848919e4bd62/manager/0.log"
Apr 20 07:40:42.858203 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:42.858170 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-7ccdb9c7df-m7fn8_7d83aa7a-38e4-4154-a8fe-018e4fb578e9/authorino/0.log"
Apr 20 07:40:43.175835 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:43.175760 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-tz82b_d5c54bf7-12b2-4457-829e-37a45bfe4574/kuadrant-console-plugin/0.log"
Apr 20 07:40:43.504527 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:43.504455 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-k9j9p_df21d0cd-de21-4cda-85b8-7aa133837268/limitador/0.log"
Apr 20 07:40:44.245519 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:44.245491 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-6bbb6d54d8-b2nj2_0b9d6fbe-d32b-4e92-9bf1-19eea864a8d9/kube-auth-proxy/0.log"
Apr 20 07:40:48.941173 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:48.941131 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xfwg2/must-gather-qt7xg"]
Apr 20 07:40:48.944819 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:48.944798 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xfwg2/must-gather-qt7xg"
Apr 20 07:40:48.947246 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:48.947221 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-xfwg2\"/\"default-dockercfg-df242\""
Apr 20 07:40:48.947376 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:48.947275 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xfwg2\"/\"openshift-service-ca.crt\""
Apr 20 07:40:48.947376 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:48.947279 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xfwg2\"/\"kube-root-ca.crt\""
Apr 20 07:40:48.961416 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:48.961395 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xfwg2/must-gather-qt7xg"]
Apr 20 07:40:49.097836 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:49.097804 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1ede0b38-81a2-4ba2-ac4b-744037120bc7-must-gather-output\") pod \"must-gather-qt7xg\" (UID: \"1ede0b38-81a2-4ba2-ac4b-744037120bc7\") " pod="openshift-must-gather-xfwg2/must-gather-qt7xg"
Apr 20 07:40:49.097996 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:49.097878 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kttbw\" (UniqueName: \"kubernetes.io/projected/1ede0b38-81a2-4ba2-ac4b-744037120bc7-kube-api-access-kttbw\") pod \"must-gather-qt7xg\" (UID: \"1ede0b38-81a2-4ba2-ac4b-744037120bc7\") " pod="openshift-must-gather-xfwg2/must-gather-qt7xg"
Apr 20 07:40:49.199343 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:49.199255 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1ede0b38-81a2-4ba2-ac4b-744037120bc7-must-gather-output\") pod \"must-gather-qt7xg\" (UID: \"1ede0b38-81a2-4ba2-ac4b-744037120bc7\") " pod="openshift-must-gather-xfwg2/must-gather-qt7xg"
Apr 20 07:40:49.199343 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:49.199316 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kttbw\" (UniqueName: \"kubernetes.io/projected/1ede0b38-81a2-4ba2-ac4b-744037120bc7-kube-api-access-kttbw\") pod \"must-gather-qt7xg\" (UID: \"1ede0b38-81a2-4ba2-ac4b-744037120bc7\") " pod="openshift-must-gather-xfwg2/must-gather-qt7xg"
Apr 20 07:40:49.199627 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:49.199605 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1ede0b38-81a2-4ba2-ac4b-744037120bc7-must-gather-output\") pod \"must-gather-qt7xg\" (UID: \"1ede0b38-81a2-4ba2-ac4b-744037120bc7\") " pod="openshift-must-gather-xfwg2/must-gather-qt7xg"
Apr 20 07:40:49.207104 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:49.207076 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kttbw\" (UniqueName: \"kubernetes.io/projected/1ede0b38-81a2-4ba2-ac4b-744037120bc7-kube-api-access-kttbw\") pod \"must-gather-qt7xg\" (UID: \"1ede0b38-81a2-4ba2-ac4b-744037120bc7\") " pod="openshift-must-gather-xfwg2/must-gather-qt7xg"
Apr 20 07:40:49.253568 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:49.253542 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xfwg2/must-gather-qt7xg"
Apr 20 07:40:49.373000 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:49.372825 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xfwg2/must-gather-qt7xg"]
Apr 20 07:40:49.375313 ip-10-0-138-178 kubenswrapper[2577]: W0420 07:40:49.375280 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ede0b38_81a2_4ba2_ac4b_744037120bc7.slice/crio-9069bee435bd5b3482f07994239a58f1994bab4f3fd7280ea14fd9071e83553e WatchSource:0}: Error finding container 9069bee435bd5b3482f07994239a58f1994bab4f3fd7280ea14fd9071e83553e: Status 404 returned error can't find the container with id 9069bee435bd5b3482f07994239a58f1994bab4f3fd7280ea14fd9071e83553e
Apr 20 07:40:49.377014 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:49.376996 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 07:40:50.353773 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:50.353739 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xfwg2/must-gather-qt7xg" event={"ID":"1ede0b38-81a2-4ba2-ac4b-744037120bc7","Type":"ContainerStarted","Data":"9069bee435bd5b3482f07994239a58f1994bab4f3fd7280ea14fd9071e83553e"}
Apr 20 07:40:51.365892 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:51.365849 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xfwg2/must-gather-qt7xg" event={"ID":"1ede0b38-81a2-4ba2-ac4b-744037120bc7","Type":"ContainerStarted","Data":"ec28846058fc0f84277eb76e8b6d667c49037619e647a2d4a84e8de19881946f"}
Apr 20 07:40:51.366388 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:51.365901 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xfwg2/must-gather-qt7xg" event={"ID":"1ede0b38-81a2-4ba2-ac4b-744037120bc7","Type":"ContainerStarted","Data":"15c8197b30889b38176ef5af92910befcf4a6b726b589cf0e3d3ed492f4e7e7e"}
Apr 20 07:40:51.382910 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:51.382857 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xfwg2/must-gather-qt7xg" podStartSLOduration=2.515753767 podStartE2EDuration="3.382840198s" podCreationTimestamp="2026-04-20 07:40:48 +0000 UTC" firstStartedPulling="2026-04-20 07:40:49.377154547 +0000 UTC m=+2272.251879390" lastFinishedPulling="2026-04-20 07:40:50.244240979 +0000 UTC m=+2273.118965821" observedRunningTime="2026-04-20 07:40:51.380432963 +0000 UTC m=+2274.255157820" watchObservedRunningTime="2026-04-20 07:40:51.382840198 +0000 UTC m=+2274.257565062"
Apr 20 07:40:51.874276 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:51.874231 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-87wvx_e578f271-ddce-475b-ba48-2e815f1f88f9/global-pull-secret-syncer/0.log"
Apr 20 07:40:52.084336 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:52.084279 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-p77qf_72490a6e-b0ae-4620-b092-0da4bc44739a/konnectivity-agent/0.log"
Apr 20 07:40:52.129180 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:52.129098 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-178.ec2.internal_e599c436f00e4332b7e91bed4b9b4c68/haproxy/0.log"
Apr 20 07:40:56.111098 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:56.110764 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-7ccdb9c7df-m7fn8_7d83aa7a-38e4-4154-a8fe-018e4fb578e9/authorino/0.log"
Apr 20 07:40:56.210715 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:56.210682 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-tz82b_d5c54bf7-12b2-4457-829e-37a45bfe4574/kuadrant-console-plugin/0.log"
Apr 20 07:40:56.377432 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:56.377307 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-k9j9p_df21d0cd-de21-4cda-85b8-7aa133837268/limitador/0.log"
Apr 20 07:40:57.981516 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:57.981471 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1bac0ecf-cc26-4474-8e34-da870cf45d49/alertmanager/0.log"
Apr 20 07:40:58.007364 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:58.007332 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1bac0ecf-cc26-4474-8e34-da870cf45d49/config-reloader/0.log"
Apr 20 07:40:58.028149 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:58.028121 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1bac0ecf-cc26-4474-8e34-da870cf45d49/kube-rbac-proxy-web/0.log"
Apr 20 07:40:58.049374 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:58.049280 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1bac0ecf-cc26-4474-8e34-da870cf45d49/kube-rbac-proxy/0.log"
Apr 20 07:40:58.071353 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:58.071304 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1bac0ecf-cc26-4474-8e34-da870cf45d49/kube-rbac-proxy-metric/0.log"
Apr 20 07:40:58.092856 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:58.092826 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1bac0ecf-cc26-4474-8e34-da870cf45d49/prom-label-proxy/0.log"
Apr 20 07:40:58.113082 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:58.113046 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1bac0ecf-cc26-4474-8e34-da870cf45d49/init-config-reloader/0.log"
Apr 20 07:40:58.199289 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:58.199262 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-whfbx_45e5b9d7-edf1-4444-bb94-1c69ca165f1a/kube-state-metrics/0.log"
Apr 20 07:40:58.221051 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:58.221018 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-whfbx_45e5b9d7-edf1-4444-bb94-1c69ca165f1a/kube-rbac-proxy-main/0.log"
Apr 20 07:40:58.241228 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:58.241193 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-whfbx_45e5b9d7-edf1-4444-bb94-1c69ca165f1a/kube-rbac-proxy-self/0.log"
Apr 20 07:40:58.280478 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:58.280427 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-58c9f75f5b-9bxz2_22adf421-aa3d-41da-b9d2-894e12e86ca2/metrics-server/0.log"
Apr 20 07:40:58.305131 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:58.305052 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-ccppv_95ccc011-8001-42ed-acd3-f602333f4f36/monitoring-plugin/0.log"
Apr 20 07:40:58.478588 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:58.478556 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-sktgd_a7829208-2d3d-44ab-b7c6-c79c619a90d2/node-exporter/0.log"
Apr 20 07:40:58.505787 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:58.505758 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-sktgd_a7829208-2d3d-44ab-b7c6-c79c619a90d2/kube-rbac-proxy/0.log"
Apr 20 07:40:58.525398 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:58.525372 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-sktgd_a7829208-2d3d-44ab-b7c6-c79c619a90d2/init-textfile/0.log"
Apr 20 07:40:58.554667 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:58.554625 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-pb6zc_ae37004f-672e-407b-a0d3-69378d08f058/kube-rbac-proxy-main/0.log"
Apr 20 07:40:58.575887 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:58.575813 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-pb6zc_ae37004f-672e-407b-a0d3-69378d08f058/kube-rbac-proxy-self/0.log"
Apr 20 07:40:58.596534 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:58.596499 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-pb6zc_ae37004f-672e-407b-a0d3-69378d08f058/openshift-state-metrics/0.log"
Apr 20 07:40:58.630448 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:58.630409 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d6b57012-62e2-4cd4-9106-f4d360314967/prometheus/0.log"
Apr 20 07:40:58.650294 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:58.650260 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d6b57012-62e2-4cd4-9106-f4d360314967/config-reloader/0.log"
Apr 20 07:40:58.669776 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:58.669743 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d6b57012-62e2-4cd4-9106-f4d360314967/thanos-sidecar/0.log"
Apr 20 07:40:58.691058 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:58.691021 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d6b57012-62e2-4cd4-9106-f4d360314967/kube-rbac-proxy-web/0.log"
Apr 20 07:40:58.716669 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:58.716635 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d6b57012-62e2-4cd4-9106-f4d360314967/kube-rbac-proxy/0.log"
Apr 20 07:40:58.740669 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:58.740633 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d6b57012-62e2-4cd4-9106-f4d360314967/kube-rbac-proxy-thanos/0.log"
Apr 20 07:40:58.763879 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:58.763803 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d6b57012-62e2-4cd4-9106-f4d360314967/init-config-reloader/0.log"
Apr 20 07:40:58.793135 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:58.793104 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-rjzn8_5ea4232c-c32a-4b9d-9d64-bcbd9a892ef5/prometheus-operator/0.log"
Apr 20 07:40:58.815670 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:58.815636 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-rjzn8_5ea4232c-c32a-4b9d-9d64-bcbd9a892ef5/kube-rbac-proxy/0.log"
Apr 20 07:40:58.880268 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:58.880221 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-d86b6fdf5-hbb4j_432d1ef5-166a-4475-a17c-1cdb5b648cdc/telemeter-client/0.log"
Apr 20 07:40:58.899587 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:58.899560 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-d86b6fdf5-hbb4j_432d1ef5-166a-4475-a17c-1cdb5b648cdc/reload/0.log"
Apr 20 07:40:58.921669 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:58.921629 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-d86b6fdf5-hbb4j_432d1ef5-166a-4475-a17c-1cdb5b648cdc/kube-rbac-proxy/0.log"
Apr 20 07:40:58.949763 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:58.949697 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5649bb759-724x4_dfbfa478-8079-4924-ab6d-bf9064e82051/thanos-query/0.log"
Apr 20 07:40:58.969500 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:58.969475 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5649bb759-724x4_dfbfa478-8079-4924-ab6d-bf9064e82051/kube-rbac-proxy-web/0.log"
Apr 20 07:40:58.990350 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:58.990299 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5649bb759-724x4_dfbfa478-8079-4924-ab6d-bf9064e82051/kube-rbac-proxy/0.log"
Apr 20 07:40:59.010833 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:59.010805 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5649bb759-724x4_dfbfa478-8079-4924-ab6d-bf9064e82051/prom-label-proxy/0.log"
Apr 20 07:40:59.052265 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:59.052137 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5649bb759-724x4_dfbfa478-8079-4924-ab6d-bf9064e82051/kube-rbac-proxy-rules/0.log"
Apr 20 07:40:59.074066 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:40:59.073975 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5649bb759-724x4_dfbfa478-8079-4924-ab6d-bf9064e82051/kube-rbac-proxy-metrics/0.log"
Apr 20 07:41:00.754006 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:00.753970 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xfwg2/perf-node-gather-daemonset-v7g9q"]
Apr 20 07:41:00.760637 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:00.760610 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-v7g9q"
Apr 20 07:41:00.772449 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:00.772423 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xfwg2/perf-node-gather-daemonset-v7g9q"]
Apr 20 07:41:00.815667 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:00.815608 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9161d116-298b-4f13-af22-f25fa0159d19-proc\") pod \"perf-node-gather-daemonset-v7g9q\" (UID: \"9161d116-298b-4f13-af22-f25fa0159d19\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-v7g9q"
Apr 20 07:41:00.815840 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:00.815684 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9161d116-298b-4f13-af22-f25fa0159d19-lib-modules\") pod \"perf-node-gather-daemonset-v7g9q\" (UID: \"9161d116-298b-4f13-af22-f25fa0159d19\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-v7g9q"
Apr 20 07:41:00.815840 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:00.815775 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9161d116-298b-4f13-af22-f25fa0159d19-podres\") pod \"perf-node-gather-daemonset-v7g9q\" (UID: \"9161d116-298b-4f13-af22-f25fa0159d19\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-v7g9q"
Apr 20 07:41:00.815840 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:00.815813 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9161d116-298b-4f13-af22-f25fa0159d19-sys\") pod \"perf-node-gather-daemonset-v7g9q\" (UID: \"9161d116-298b-4f13-af22-f25fa0159d19\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-v7g9q"
Apr 20 07:41:00.816012 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:00.815863 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z25bz\" (UniqueName: \"kubernetes.io/projected/9161d116-298b-4f13-af22-f25fa0159d19-kube-api-access-z25bz\") pod \"perf-node-gather-daemonset-v7g9q\" (UID: \"9161d116-298b-4f13-af22-f25fa0159d19\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-v7g9q"
Apr 20 07:41:00.917164 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:00.917122 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9161d116-298b-4f13-af22-f25fa0159d19-lib-modules\") pod \"perf-node-gather-daemonset-v7g9q\" (UID: \"9161d116-298b-4f13-af22-f25fa0159d19\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-v7g9q"
Apr 20 07:41:00.917384 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:00.917225 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9161d116-298b-4f13-af22-f25fa0159d19-podres\") pod \"perf-node-gather-daemonset-v7g9q\" (UID: \"9161d116-298b-4f13-af22-f25fa0159d19\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-v7g9q"
Apr 20 07:41:00.917384 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:00.917262 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9161d116-298b-4f13-af22-f25fa0159d19-sys\") pod \"perf-node-gather-daemonset-v7g9q\" (UID: \"9161d116-298b-4f13-af22-f25fa0159d19\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-v7g9q"
Apr 20 07:41:00.917384 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:00.917312 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-z25bz\" (UniqueName: \"kubernetes.io/projected/9161d116-298b-4f13-af22-f25fa0159d19-kube-api-access-z25bz\") pod \"perf-node-gather-daemonset-v7g9q\" (UID: \"9161d116-298b-4f13-af22-f25fa0159d19\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-v7g9q" Apr 20 07:41:00.917384 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:00.917343 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9161d116-298b-4f13-af22-f25fa0159d19-lib-modules\") pod \"perf-node-gather-daemonset-v7g9q\" (UID: \"9161d116-298b-4f13-af22-f25fa0159d19\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-v7g9q" Apr 20 07:41:00.917384 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:00.917364 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9161d116-298b-4f13-af22-f25fa0159d19-proc\") pod \"perf-node-gather-daemonset-v7g9q\" (UID: \"9161d116-298b-4f13-af22-f25fa0159d19\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-v7g9q" Apr 20 07:41:00.917656 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:00.917469 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9161d116-298b-4f13-af22-f25fa0159d19-proc\") pod \"perf-node-gather-daemonset-v7g9q\" (UID: \"9161d116-298b-4f13-af22-f25fa0159d19\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-v7g9q" Apr 20 07:41:00.917656 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:00.917512 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9161d116-298b-4f13-af22-f25fa0159d19-podres\") pod \"perf-node-gather-daemonset-v7g9q\" (UID: \"9161d116-298b-4f13-af22-f25fa0159d19\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-v7g9q" Apr 20 07:41:00.917656 ip-10-0-138-178 kubenswrapper[2577]: 
I0420 07:41:00.917572 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9161d116-298b-4f13-af22-f25fa0159d19-sys\") pod \"perf-node-gather-daemonset-v7g9q\" (UID: \"9161d116-298b-4f13-af22-f25fa0159d19\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-v7g9q" Apr 20 07:41:00.926450 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:00.926415 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z25bz\" (UniqueName: \"kubernetes.io/projected/9161d116-298b-4f13-af22-f25fa0159d19-kube-api-access-z25bz\") pod \"perf-node-gather-daemonset-v7g9q\" (UID: \"9161d116-298b-4f13-af22-f25fa0159d19\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-v7g9q" Apr 20 07:41:01.075293 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:01.075218 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-v7g9q" Apr 20 07:41:01.241520 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:01.237894 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xfwg2/perf-node-gather-daemonset-v7g9q"] Apr 20 07:41:01.424655 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:01.424616 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-v7g9q" event={"ID":"9161d116-298b-4f13-af22-f25fa0159d19","Type":"ContainerStarted","Data":"89f2675a9f842f492ae3ceaaa3c6b0ec839ee7f2dc078b1e4cf1db5c8a5d0ee6"} Apr 20 07:41:01.424655 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:01.424657 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-v7g9q" event={"ID":"9161d116-298b-4f13-af22-f25fa0159d19","Type":"ContainerStarted","Data":"93d0ceac76302caecc5720ea05afe678e4f8f6b5900e5811b0284e5ca5889950"} Apr 20 07:41:01.424880 ip-10-0-138-178 kubenswrapper[2577]: I0420 
07:41:01.424697 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-v7g9q" Apr 20 07:41:01.441501 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:01.441456 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-v7g9q" podStartSLOduration=1.441440693 podStartE2EDuration="1.441440693s" podCreationTimestamp="2026-04-20 07:41:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:41:01.439294532 +0000 UTC m=+2284.314019396" watchObservedRunningTime="2026-04-20 07:41:01.441440693 +0000 UTC m=+2284.316165555" Apr 20 07:41:02.847442 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:02.847409 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-x528h_d38a5ffa-c2cb-44b1-823f-627443e996e0/dns/0.log" Apr 20 07:41:02.867765 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:02.867735 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-x528h_d38a5ffa-c2cb-44b1-823f-627443e996e0/kube-rbac-proxy/0.log" Apr 20 07:41:02.888106 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:02.888079 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-cz5pm_c340eac2-744d-4e2d-a3d3-b064aeed4bde/dns-node-resolver/0.log" Apr 20 07:41:03.397784 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:03.397755 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-956b6_9a4f5c92-f99d-423c-a0fa-9f9d79abbd7c/node-ca/0.log" Apr 20 07:41:04.461280 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:04.461254 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-6bbb6d54d8-b2nj2_0b9d6fbe-d32b-4e92-9bf1-19eea864a8d9/kube-auth-proxy/0.log" Apr 20 07:41:05.062909 
ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:05.062877 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-gtfrg_810eac21-fbc0-4eb1-b7bc-e6cb6a6b694c/serve-healthcheck-canary/0.log" Apr 20 07:41:05.634391 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:05.634363 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fr8mx_e21e2393-ce5e-4856-84ba-7b921e943d5f/kube-rbac-proxy/0.log" Apr 20 07:41:05.652637 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:05.652613 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fr8mx_e21e2393-ce5e-4856-84ba-7b921e943d5f/exporter/0.log" Apr 20 07:41:05.672491 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:05.672469 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fr8mx_e21e2393-ce5e-4856-84ba-7b921e943d5f/extractor/0.log" Apr 20 07:41:07.440196 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:07.440165 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-v7g9q" Apr 20 07:41:07.832871 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:07.832829 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6d65d76454-blhjt_5d1d7a4d-2b39-44a1-9240-848919e4bd62/manager/0.log" Apr 20 07:41:09.109347 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:09.109300 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-7fd89bcbc4-zqq4d_755fe1ea-f3fe-4f2f-8d1e-bc985dd3a8ca/manager/0.log" Apr 20 07:41:09.136929 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:09.136900 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-25nr2_31f5c69d-ef1d-4760-a5c5-b55a243c2ba5/openshift-lws-operator/0.log" Apr 20 07:41:14.862235 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:14.862201 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j8rhj_d333a39d-cbf2-4ee7-b67a-a1c9fa762779/kube-multus-additional-cni-plugins/0.log" Apr 20 07:41:14.884011 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:14.883980 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j8rhj_d333a39d-cbf2-4ee7-b67a-a1c9fa762779/egress-router-binary-copy/0.log" Apr 20 07:41:14.909367 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:14.909341 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j8rhj_d333a39d-cbf2-4ee7-b67a-a1c9fa762779/cni-plugins/0.log" Apr 20 07:41:14.932288 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:14.932263 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j8rhj_d333a39d-cbf2-4ee7-b67a-a1c9fa762779/bond-cni-plugin/0.log" Apr 20 07:41:14.952437 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:14.952411 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j8rhj_d333a39d-cbf2-4ee7-b67a-a1c9fa762779/routeoverride-cni/0.log" Apr 20 07:41:14.973897 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:14.973850 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j8rhj_d333a39d-cbf2-4ee7-b67a-a1c9fa762779/whereabouts-cni-bincopy/0.log" Apr 20 07:41:14.992888 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:14.992862 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j8rhj_d333a39d-cbf2-4ee7-b67a-a1c9fa762779/whereabouts-cni/0.log" 
Apr 20 07:41:15.349249 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:15.349225 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-j55n5_6176a82b-f332-4179-a624-af63e267945f/kube-multus/0.log" Apr 20 07:41:15.477520 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:15.477488 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-xw95j_3bdbb47e-e79d-4aaa-9671-3899c229b1a2/network-metrics-daemon/0.log" Apr 20 07:41:15.494583 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:15.494561 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-xw95j_3bdbb47e-e79d-4aaa-9671-3899c229b1a2/kube-rbac-proxy/0.log" Apr 20 07:41:16.512028 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:16.511979 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qvqrn_8105e34c-977b-4426-8e5f-aacd3adbb4c9/ovn-controller/0.log" Apr 20 07:41:16.547486 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:16.547442 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qvqrn_8105e34c-977b-4426-8e5f-aacd3adbb4c9/ovn-acl-logging/0.log" Apr 20 07:41:16.564633 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:16.564603 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qvqrn_8105e34c-977b-4426-8e5f-aacd3adbb4c9/kube-rbac-proxy-node/0.log" Apr 20 07:41:16.583174 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:16.583151 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qvqrn_8105e34c-977b-4426-8e5f-aacd3adbb4c9/kube-rbac-proxy-ovn-metrics/0.log" Apr 20 07:41:16.601211 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:16.601171 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qvqrn_8105e34c-977b-4426-8e5f-aacd3adbb4c9/northd/0.log" 
Apr 20 07:41:16.619254 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:16.619233 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qvqrn_8105e34c-977b-4426-8e5f-aacd3adbb4c9/nbdb/0.log" Apr 20 07:41:16.637619 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:16.637598 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qvqrn_8105e34c-977b-4426-8e5f-aacd3adbb4c9/sbdb/0.log" Apr 20 07:41:16.814841 ip-10-0-138-178 kubenswrapper[2577]: I0420 07:41:16.814761 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qvqrn_8105e34c-977b-4426-8e5f-aacd3adbb4c9/ovnkube-controller/0.log"