Apr 16 14:29:44.052831 ip-10-0-128-173 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 14:29:44.500266 ip-10-0-128-173 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 14:29:44.500266 ip-10-0-128-173 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 14:29:44.500266 ip-10-0-128-173 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 14:29:44.500266 ip-10-0-128-173 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 14:29:44.500266 ip-10-0-128-173 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 14:29:44.502652 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.502555 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 14:29:44.509677 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509649 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:29:44.509677 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509671 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:29:44.509677 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509675 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:29:44.509677 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509679 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:29:44.509677 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509682 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:29:44.509897 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509688 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:29:44.509897 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509692 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:29:44.509897 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509696 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:29:44.509897 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509699 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:29:44.509897 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509702 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:29:44.509897 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509705 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:29:44.509897 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509708 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:29:44.509897 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509710 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:29:44.509897 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509713 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:29:44.509897 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509716 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:29:44.509897 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509719 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:29:44.509897 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509722 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:29:44.509897 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509724 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:29:44.509897 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509727 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:29:44.509897 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509730 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:29:44.509897 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509732 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:29:44.509897 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509735 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:29:44.509897 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509737 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:29:44.509897 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509740 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:29:44.509897 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509743 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:29:44.510379 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509746 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:29:44.510379 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509757 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:29:44.510379 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509760 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:29:44.510379 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509762 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:29:44.510379 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509765 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:29:44.510379 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509768 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:29:44.510379 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509770 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:29:44.510379 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509773 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:29:44.510379 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509776 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:29:44.510379 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509779 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:29:44.510379 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509781 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:29:44.510379 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509784 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:29:44.510379 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509787 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:29:44.510379 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509791 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:29:44.510379 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509794 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:29:44.510379 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509796 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:29:44.510379 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509799 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:29:44.510379 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509802 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:29:44.510379 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509804 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:29:44.510379 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509807 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:29:44.510890 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509810 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:29:44.510890 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509812 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:29:44.510890 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509815 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:29:44.510890 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509818 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:29:44.510890 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509823 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:29:44.510890 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509827 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:29:44.510890 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509830 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:29:44.510890 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509833 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:29:44.510890 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509836 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:29:44.510890 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509838 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:29:44.510890 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509841 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:29:44.510890 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509844 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:29:44.510890 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509847 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:29:44.510890 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509850 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:29:44.510890 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509852 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:29:44.510890 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509855 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:29:44.510890 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509858 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:29:44.510890 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509861 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:29:44.510890 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509863 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:29:44.510890 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509866 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:29:44.511374 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509868 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:29:44.511374 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509871 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:29:44.511374 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509874 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:29:44.511374 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509877 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:29:44.511374 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509879 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:29:44.511374 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509885 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:29:44.511374 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509887 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:29:44.511374 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509890 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:29:44.511374 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509893 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:29:44.511374 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509895 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:29:44.511374 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509898 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:29:44.511374 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509901 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:29:44.511374 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509905 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:29:44.511374 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509909 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:29:44.511374 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509912 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:29:44.511374 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509915 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:29:44.511374 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509917 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:29:44.511374 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509920 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:29:44.511374 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509923 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:29:44.511845 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509925 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:29:44.511845 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.509928 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:29:44.511845 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510351 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:29:44.511845 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510356 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:29:44.511845 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510359 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:29:44.511845 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510361 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:29:44.511845 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510364 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:29:44.511845 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510367 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:29:44.511845 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510370 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:29:44.511845 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510372 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:29:44.511845 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510375 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:29:44.511845 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510378 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:29:44.511845 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510380 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:29:44.511845 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510383 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:29:44.511845 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510386 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:29:44.511845 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510389 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:29:44.511845 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510392 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:29:44.511845 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510397 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:29:44.511845 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510401 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:29:44.511845 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510404 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:29:44.512345 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510407 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:29:44.512345 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510410 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:29:44.512345 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510413 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:29:44.512345 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510416 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:29:44.512345 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510419 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:29:44.512345 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510422 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:29:44.512345 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510426 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:29:44.512345 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510429 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:29:44.512345 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510433 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:29:44.512345 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510436 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:29:44.512345 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510439 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:29:44.512345 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510441 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:29:44.512345 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510444 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:29:44.512345 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510447 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:29:44.512345 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510450 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:29:44.512345 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510452 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:29:44.512345 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510454 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:29:44.512345 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510457 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:29:44.512345 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510460 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:29:44.512345 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510462 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:29:44.512931 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510465 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:29:44.512931 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510467 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:29:44.512931 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510470 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:29:44.512931 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510472 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:29:44.512931 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510475 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:29:44.512931 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510478 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:29:44.512931 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510480 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:29:44.512931 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510483 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:29:44.512931 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510486 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:29:44.512931 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510489 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:29:44.512931 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510492 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:29:44.512931 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510494 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:29:44.512931 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510497 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:29:44.512931 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510501 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:29:44.512931 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510504 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:29:44.512931 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510507 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:29:44.512931 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510509 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:29:44.512931 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510512 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:29:44.512931 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510514 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:29:44.513391 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510518 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:29:44.513391 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510520 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:29:44.513391 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510523 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:29:44.513391 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510525 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:29:44.513391 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510528 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:29:44.513391 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510547 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:29:44.513391 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510550 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:29:44.513391 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510552 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:29:44.513391 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510555 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:29:44.513391 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510558 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:29:44.513391 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510561 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:29:44.513391 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510563 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:29:44.513391 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510566 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:29:44.513391 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510569 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:29:44.513391 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510572 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:29:44.513391 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510575 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:29:44.513391 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510578 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:29:44.513391 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510580 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:29:44.513391 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510583 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:29:44.513877 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510586 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:29:44.513877 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510588 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:29:44.513877 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510591 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:29:44.513877 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510596 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:29:44.513877 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510598 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:29:44.513877 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510601 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:29:44.513877 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510603 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:29:44.513877 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510606 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:29:44.513877 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510609 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:29:44.513877 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.510611 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:29:44.513877 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.511887 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 14:29:44.513877 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.511903 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 14:29:44.513877 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.511910 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 14:29:44.513877 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.511915 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 14:29:44.513877 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.511920 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 14:29:44.513877 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.511923 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 14:29:44.513877 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.511928 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 14:29:44.513877 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.511932 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 14:29:44.513877 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.511936 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 14:29:44.513877 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.511939 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 14:29:44.513877 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.511942 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 14:29:44.513877 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.511945 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 14:29:44.514405 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.511949 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 14:29:44.514405 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.511952 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 16 14:29:44.514405 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.511955 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 14:29:44.514405 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.511958 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 16 14:29:44.514405 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.511961 2576 flags.go:64] FLAG: --cloud-config=""
Apr 16 14:29:44.514405 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.511964 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 14:29:44.514405 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.511967 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 14:29:44.514405 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.511972 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 16 14:29:44.514405 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.511974 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 14:29:44.514405 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.511978 2576 flags.go:64] FLAG: --config-dir=""
Apr 16 14:29:44.514405 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.511980 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 14:29:44.514405 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.511984 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 14:29:44.514405 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.511988 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 14:29:44.514405 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.511992 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 14:29:44.514405 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.511995 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 14:29:44.514405 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.511999 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 14:29:44.514405 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512002 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 14:29:44.514405 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512005 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 14:29:44.514405 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512008 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 14:29:44.514405 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512012 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 14:29:44.514405 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512015 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 14:29:44.514405 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512019 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 14:29:44.514405 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512022 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 14:29:44.514405 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512025 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 14:29:44.514405 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512028 2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 14:29:44.514405 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512032 2576 flags.go:64] FLAG: --enable-server="true"
Apr 16 14:29:44.515047 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512035 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 14:29:44.515047 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512040 2576 flags.go:64] FLAG: --event-burst="100"
Apr 16 14:29:44.515047 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512043 2576 flags.go:64] FLAG: --event-qps="50"
Apr 16 14:29:44.515047 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512046 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 14:29:44.515047 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512050 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 14:29:44.515047 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512053 2576 flags.go:64] FLAG: --eviction-hard=""
Apr 16 14:29:44.515047 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512057 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 14:29:44.515047 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512061 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 14:29:44.515047 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512064 2576 flags.go:64] FLAG: 
--eviction-pressure-transition-period="5m0s" Apr 16 14:29:44.515047 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512067 2576 flags.go:64] FLAG: --eviction-soft="" Apr 16 14:29:44.515047 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512070 2576 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 16 14:29:44.515047 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512073 2576 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 16 14:29:44.515047 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512076 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 14:29:44.515047 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512079 2576 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 14:29:44.515047 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512082 2576 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 14:29:44.515047 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512085 2576 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 14:29:44.515047 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512088 2576 flags.go:64] FLAG: --feature-gates="" Apr 16 14:29:44.515047 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512093 2576 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 14:29:44.515047 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512096 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 14:29:44.515047 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512099 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 14:29:44.515047 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512102 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 14:29:44.515047 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512105 2576 flags.go:64] FLAG: --healthz-port="10248" Apr 16 14:29:44.515047 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512109 2576 flags.go:64] FLAG: --help="false" Apr 16 14:29:44.515047 ip-10-0-128-173 kubenswrapper[2576]: 
I0416 14:29:44.512112 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-128-173.ec2.internal" Apr 16 14:29:44.515047 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512115 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 14:29:44.515685 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512118 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 14:29:44.515685 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512121 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 14:29:44.515685 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512124 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 14:29:44.515685 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512128 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 14:29:44.515685 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512131 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 14:29:44.515685 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512134 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 14:29:44.515685 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512137 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 14:29:44.515685 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512140 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 14:29:44.515685 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512144 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 14:29:44.515685 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512147 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 14:29:44.515685 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512157 2576 flags.go:64] FLAG: --kube-reserved="" Apr 16 14:29:44.515685 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512160 2576 flags.go:64] FLAG: 
--kube-reserved-cgroup="" Apr 16 14:29:44.515685 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512163 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 14:29:44.515685 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512166 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 14:29:44.515685 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512169 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 14:29:44.515685 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512172 2576 flags.go:64] FLAG: --lock-file="" Apr 16 14:29:44.515685 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512176 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 14:29:44.515685 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512180 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 14:29:44.515685 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512183 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 14:29:44.515685 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512189 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 14:29:44.515685 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512192 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 14:29:44.515685 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512194 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 14:29:44.515685 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512197 2576 flags.go:64] FLAG: --logging-format="text" Apr 16 14:29:44.515685 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512200 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 14:29:44.516272 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512204 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 14:29:44.516272 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512207 2576 flags.go:64] FLAG: --manifest-url="" Apr 16 14:29:44.516272 ip-10-0-128-173 
kubenswrapper[2576]: I0416 14:29:44.512210 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 16 14:29:44.516272 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512215 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 14:29:44.516272 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512218 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 14:29:44.516272 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512222 2576 flags.go:64] FLAG: --max-pods="110" Apr 16 14:29:44.516272 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512225 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 14:29:44.516272 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512228 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 14:29:44.516272 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512231 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 14:29:44.516272 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512234 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 14:29:44.516272 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512238 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 14:29:44.516272 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512241 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 14:29:44.516272 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512244 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 14:29:44.516272 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512253 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 14:29:44.516272 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512256 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 14:29:44.516272 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512259 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 14:29:44.516272 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512264 
2576 flags.go:64] FLAG: --pod-cidr="" Apr 16 14:29:44.516272 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512267 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 14:29:44.516272 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512272 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 14:29:44.516272 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512275 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 14:29:44.516272 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512278 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 16 14:29:44.516272 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512282 2576 flags.go:64] FLAG: --port="10250" Apr 16 14:29:44.516272 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512285 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 14:29:44.516272 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512288 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-01711b8726f6738db" Apr 16 14:29:44.516899 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512291 2576 flags.go:64] FLAG: --qos-reserved="" Apr 16 14:29:44.516899 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512294 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 16 14:29:44.516899 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512297 2576 flags.go:64] FLAG: --register-node="true" Apr 16 14:29:44.516899 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512300 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 16 14:29:44.516899 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512303 2576 flags.go:64] FLAG: --register-with-taints="" Apr 16 14:29:44.516899 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512307 2576 flags.go:64] FLAG: --registry-burst="10" Apr 16 14:29:44.516899 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512309 2576 flags.go:64] FLAG: --registry-qps="5" 
Apr 16 14:29:44.516899 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512312 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 16 14:29:44.516899 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512315 2576 flags.go:64] FLAG: --reserved-memory="" Apr 16 14:29:44.516899 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512319 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 14:29:44.516899 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512322 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 14:29:44.516899 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512325 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 14:29:44.516899 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512328 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 14:29:44.516899 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512330 2576 flags.go:64] FLAG: --runonce="false" Apr 16 14:29:44.516899 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512333 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 14:29:44.516899 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512336 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 14:29:44.516899 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512339 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 16 14:29:44.516899 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512342 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 14:29:44.516899 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512344 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 14:29:44.516899 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512348 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 14:29:44.516899 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512352 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 14:29:44.516899 ip-10-0-128-173 kubenswrapper[2576]: I0416 
14:29:44.512355 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 14:29:44.516899 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512358 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 14:29:44.516899 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512361 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 14:29:44.516899 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512364 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 14:29:44.516899 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512367 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 14:29:44.517511 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512370 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 14:29:44.517511 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512373 2576 flags.go:64] FLAG: --system-cgroups="" Apr 16 14:29:44.517511 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512376 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 14:29:44.517511 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512382 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 14:29:44.517511 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512388 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 16 14:29:44.517511 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512391 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 14:29:44.517511 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512395 2576 flags.go:64] FLAG: --tls-min-version="" Apr 16 14:29:44.517511 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512398 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 14:29:44.517511 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512401 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 14:29:44.517511 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512405 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 
14:29:44.517511 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512408 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 14:29:44.517511 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512411 2576 flags.go:64] FLAG: --v="2" Apr 16 14:29:44.517511 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512415 2576 flags.go:64] FLAG: --version="false" Apr 16 14:29:44.517511 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512424 2576 flags.go:64] FLAG: --vmodule="" Apr 16 14:29:44.517511 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512429 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 14:29:44.517511 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.512432 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 14:29:44.517511 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512552 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 14:29:44.517511 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512556 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 14:29:44.517511 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512559 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 14:29:44.517511 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512562 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 14:29:44.517511 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512565 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 14:29:44.517511 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512568 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 14:29:44.517511 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512571 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 14:29:44.517511 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512573 2576 feature_gate.go:328] unrecognized feature 
gate: AdminNetworkPolicy Apr 16 14:29:44.518128 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512576 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 14:29:44.518128 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512579 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 14:29:44.518128 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512582 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 14:29:44.518128 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512585 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 14:29:44.518128 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512587 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 14:29:44.518128 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512590 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 14:29:44.518128 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512593 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 14:29:44.518128 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512596 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 14:29:44.518128 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512599 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 14:29:44.518128 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512601 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 14:29:44.518128 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512604 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 14:29:44.518128 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512607 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 14:29:44.518128 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512611 2576 
feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 14:29:44.518128 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512614 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 14:29:44.518128 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512616 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 14:29:44.518128 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512619 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 14:29:44.518128 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512622 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 14:29:44.518128 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512624 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 14:29:44.518128 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512627 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 14:29:44.518128 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512631 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 16 14:29:44.518664 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512634 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 14:29:44.518664 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512636 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 14:29:44.518664 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512639 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 14:29:44.518664 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512641 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 14:29:44.518664 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512644 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 14:29:44.518664 
ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512647 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 14:29:44.518664 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512649 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 14:29:44.518664 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512652 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 14:29:44.518664 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512656 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 14:29:44.518664 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512658 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 14:29:44.518664 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512661 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 14:29:44.518664 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512664 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 14:29:44.518664 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512666 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 14:29:44.518664 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512669 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 14:29:44.518664 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512671 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 14:29:44.518664 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512674 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 14:29:44.518664 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512677 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 14:29:44.518664 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512680 2576 
feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 14:29:44.518664 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512682 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 14:29:44.519126 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512685 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 14:29:44.519126 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512687 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 14:29:44.519126 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512690 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 14:29:44.519126 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512693 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 14:29:44.519126 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512696 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 14:29:44.519126 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512700 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 14:29:44.519126 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512703 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 14:29:44.519126 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512705 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 14:29:44.519126 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512708 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 14:29:44.519126 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512710 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 14:29:44.519126 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512713 2576 feature_gate.go:328] unrecognized feature gate: 
VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 14:29:44.519126 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512716 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 14:29:44.519126 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512719 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 14:29:44.519126 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512721 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 14:29:44.519126 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512723 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 14:29:44.519126 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512726 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 14:29:44.519126 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512729 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 14:29:44.519126 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512731 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 14:29:44.519126 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512734 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 14:29:44.519623 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512736 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 14:29:44.519623 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512739 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 14:29:44.519623 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512741 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 14:29:44.519623 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512744 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 14:29:44.519623 
ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512746 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 14:29:44.519623 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512749 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 14:29:44.519623 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512751 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 14:29:44.519623 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512754 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 14:29:44.519623 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512757 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 14:29:44.519623 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512759 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 14:29:44.519623 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512762 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 14:29:44.519623 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512766 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 14:29:44.519623 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512770 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 14:29:44.519623 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512773 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 14:29:44.519623 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512776 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 14:29:44.519623 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512779 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 14:29:44.519623 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512782 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 14:29:44.519623 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512788 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 14:29:44.519623 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512793 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 14:29:44.520083 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.512797 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 14:29:44.520083 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.513476 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 14:29:44.520197 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.520173 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 14:29:44.520228 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.520198 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 14:29:44.520264 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520255 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 14:29:44.520264 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520264 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 14:29:44.520317 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520267 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 14:29:44.520317 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520271 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 14:29:44.520317 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520275 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 14:29:44.520317 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520279 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:29:44.520317 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520283 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:29:44.520317 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520286 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:29:44.520317 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520289 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:29:44.520317 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520291 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:29:44.520317 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520294 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:29:44.520317 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520297 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:29:44.520317 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520300 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:29:44.520317 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520302 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:29:44.520317 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520305 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:29:44.520317 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520308 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:29:44.520317 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520310 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:29:44.520317 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520313 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:29:44.520317 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520316 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:29:44.520317 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520319 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:29:44.520317 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520322 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:29:44.520317 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520324 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:29:44.520836 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520327 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:29:44.520836 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520330 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:29:44.520836 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520333 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:29:44.520836 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520336 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:29:44.520836 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520338 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:29:44.520836 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520341 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:29:44.520836 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520344 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:29:44.520836 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520346 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:29:44.520836 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520349 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:29:44.520836 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520358 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:29:44.520836 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520360 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:29:44.520836 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520363 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:29:44.520836 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520366 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:29:44.520836 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520368 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:29:44.520836 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520371 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:29:44.520836 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520373 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:29:44.520836 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520376 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:29:44.520836 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520378 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:29:44.520836 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520381 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:29:44.520836 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520384 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:29:44.521338 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520387 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:29:44.521338 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520389 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:29:44.521338 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520392 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:29:44.521338 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520394 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:29:44.521338 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520397 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:29:44.521338 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520399 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:29:44.521338 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520402 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:29:44.521338 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520404 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:29:44.521338 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520407 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:29:44.521338 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520410 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:29:44.521338 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520412 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:29:44.521338 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520415 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:29:44.521338 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520418 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:29:44.521338 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520421 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:29:44.521338 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520423 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:29:44.521338 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520426 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:29:44.521338 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520429 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:29:44.521338 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520432 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:29:44.521338 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520434 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:29:44.521338 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520437 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:29:44.521872 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520439 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:29:44.521872 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520442 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:29:44.521872 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520445 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:29:44.521872 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520448 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:29:44.521872 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520452 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:29:44.521872 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520456 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:29:44.521872 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520459 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:29:44.521872 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520462 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:29:44.521872 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520465 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:29:44.521872 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520468 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:29:44.521872 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520471 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:29:44.521872 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520474 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:29:44.521872 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520476 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:29:44.521872 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520479 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:29:44.521872 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520481 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:29:44.521872 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520484 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:29:44.521872 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520487 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:29:44.521872 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520490 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:29:44.521872 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520493 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:29:44.521872 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520496 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:29:44.522351 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520498 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:29:44.522351 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520501 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:29:44.522351 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520504 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:29:44.522351 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520507 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:29:44.522351 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.520513 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 14:29:44.522351 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520636 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 14:29:44.522351 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520642 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 14:29:44.522351 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520646 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 14:29:44.522351 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520649 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 14:29:44.522351 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520652 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 14:29:44.522351 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520655 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 14:29:44.522351 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520657 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 14:29:44.522351 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520660 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 14:29:44.522351 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520662 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 14:29:44.522351 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520665 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 14:29:44.522739 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520668 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 14:29:44.522739 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520671 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 14:29:44.522739 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520674 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 14:29:44.522739 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520676 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 14:29:44.522739 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520679 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 14:29:44.522739 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520681 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 14:29:44.522739 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520684 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 14:29:44.522739 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520686 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 14:29:44.522739 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520689 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 14:29:44.522739 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520692 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 14:29:44.522739 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520694 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 14:29:44.522739 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520697 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 14:29:44.522739 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520699 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 14:29:44.522739 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520702 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 14:29:44.522739 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520704 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 14:29:44.522739 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520707 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 14:29:44.522739 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520710 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 14:29:44.522739 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520713 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 14:29:44.522739 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520715 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 14:29:44.522739 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520719 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 14:29:44.523228 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520722 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 14:29:44.523228 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520725 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 14:29:44.523228 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520727 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 14:29:44.523228 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520730 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 14:29:44.523228 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520732 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 14:29:44.523228 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520735 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 14:29:44.523228 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520738 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 14:29:44.523228 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520740 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 14:29:44.523228 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520743 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 14:29:44.523228 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520746 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 14:29:44.523228 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520748 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 14:29:44.523228 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520751 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 14:29:44.523228 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520753 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 14:29:44.523228 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520756 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 14:29:44.523228 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520759 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 14:29:44.523228 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520761 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 14:29:44.523228 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520764 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 14:29:44.523228 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520767 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 14:29:44.523228 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520769 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 14:29:44.523228 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520771 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 14:29:44.523718 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520774 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 14:29:44.523718 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520777 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 14:29:44.523718 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520779 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 14:29:44.523718 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520783 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 14:29:44.523718 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520786 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 14:29:44.523718 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520790 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 14:29:44.523718 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520793 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 14:29:44.523718 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520797 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 14:29:44.523718 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520800 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 14:29:44.523718 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520803 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 14:29:44.523718 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520806 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 14:29:44.523718 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520809 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 14:29:44.523718 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520811 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 14:29:44.523718 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520814 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 14:29:44.523718 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520817 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 14:29:44.523718 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520819 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 14:29:44.523718 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520822 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 14:29:44.523718 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520825 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 14:29:44.523718 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520827 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 14:29:44.524196 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520830 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 14:29:44.524196 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520832 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 14:29:44.524196 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520835 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 14:29:44.524196 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520838 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 14:29:44.524196 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520840 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 14:29:44.524196 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520843 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 14:29:44.524196 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520846 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 14:29:44.524196 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520849 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 14:29:44.524196 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520852 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 14:29:44.524196 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520854 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 14:29:44.524196 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520857 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 14:29:44.524196 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520859 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 14:29:44.524196 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520862 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 14:29:44.524196 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520864 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 14:29:44.524196 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520867 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 14:29:44.524196 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520869 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 14:29:44.524196 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:44.520872 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 14:29:44.524666 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.520877 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 14:29:44.524666 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.521671 2576 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 14:29:44.524666 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.523731 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 14:29:44.524754 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.524683 2576 server.go:1019] "Starting client certificate rotation"
Apr 16 14:29:44.524802 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.524782 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 14:29:44.525508 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.525496 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 14:29:44.553009 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.552976 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 14:29:44.557326 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.557171 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 14:29:44.571249 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.571230 2576 log.go:25] "Validated CRI v1 runtime API"
Apr 16 14:29:44.576225 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.576209 2576 log.go:25] "Validated CRI v1 image API"
Apr 16 14:29:44.577483 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.577466 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 14:29:44.580706 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.580685 2576 fs.go:135] Filesystem UUIDs: map[03ed3bd7-e9d6-4987-b93a-8be29a5ed06f:/dev/nvme0n1p4 4594b96f-6044-46eb-b7eb-47b3deb3ca27:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 16 14:29:44.580776 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.580705 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 14:29:44.586254 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.586130 2576 manager.go:217] Machine: {Timestamp:2026-04-16 14:29:44.584954219 +0000 UTC m=+0.411664633 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098088 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec20b07527a5ddd6bd641f06c94bcf0d SystemUUID:ec20b075-27a5-ddd6-bd64-1f06c94bcf0d BootID:0c025ab8-5150-4f81-8cc0-0193b02467c2 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:07:3d:b6:88:63 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:07:3d:b6:88:63 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:5a:c6:07:b0:c4:39 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 14:29:44.586254 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.586243 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 14:29:44.586391 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.586331 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 14:29:44.587998 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.587966 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 14:29:44.588065 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.588049 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 14:29:44.588156 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.588002 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-173.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Perce
ntage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 14:29:44.588212 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.588170 2576 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 14:29:44.588212 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.588179 2576 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 14:29:44.588212 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.588192 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 14:29:44.588807 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.588797 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 14:29:44.590549 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.590521 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 16 14:29:44.590662 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.590653 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 14:29:44.593026 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.593014 2576 kubelet.go:491] "Attempting to sync node with API server" Apr 16 14:29:44.593071 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.593039 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 14:29:44.593071 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.593053 2576 file.go:69] "Watching path" 
path="/etc/kubernetes/manifests" Apr 16 14:29:44.593071 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.593064 2576 kubelet.go:397] "Adding apiserver pod source" Apr 16 14:29:44.593193 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.593077 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 14:29:44.594091 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.594079 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 14:29:44.594136 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.594098 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 14:29:44.596877 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.596850 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 14:29:44.601077 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.601051 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 14:29:44.602660 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.602644 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 14:29:44.602660 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.602664 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 14:29:44.602766 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.602673 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 14:29:44.602766 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.602700 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 14:29:44.602766 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.602709 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 
14:29:44.602766 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.602718 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 14:29:44.602766 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.602725 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 14:29:44.602766 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.602731 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 14:29:44.602766 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.602740 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 14:29:44.602766 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.602746 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 14:29:44.602766 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.602755 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 14:29:44.603028 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.602773 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 14:29:44.603936 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.603919 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 14:29:44.604938 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.604921 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 14:29:44.605158 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:44.605133 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 14:29:44.605229 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:44.605190 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes 
\"ip-10-0-128-173.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 14:29:44.607225 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.607208 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-173.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 14:29:44.607428 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.607410 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vn4qz" Apr 16 14:29:44.609166 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.609151 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 14:29:44.609236 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.609190 2576 server.go:1295] "Started kubelet" Apr 16 14:29:44.609322 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.609296 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 14:29:44.609375 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.609310 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 14:29:44.609417 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.609379 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 14:29:44.610103 ip-10-0-128-173 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 14:29:44.610451 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.610426 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 14:29:44.612438 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.612420 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 16 14:29:44.616454 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.616437 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vn4qz" Apr 16 14:29:44.616831 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.616803 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 14:29:44.617271 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.617256 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 14:29:44.618221 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:44.618194 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-173.ec2.internal\" not found" Apr 16 14:29:44.618221 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:44.616105 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-173.ec2.internal.18a6dcb7b4788d46 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-173.ec2.internal,UID:ip-10-0-128-173.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-128-173.ec2.internal,},FirstTimestamp:2026-04-16 14:29:44.609164614 +0000 UTC m=+0.435875021,LastTimestamp:2026-04-16 14:29:44.609164614 +0000 UTC m=+0.435875021,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-173.ec2.internal,}" Apr 16 14:29:44.618379 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.618249 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 14:29:44.618379 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.618265 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 14:29:44.618379 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.618271 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 14:29:44.618379 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.618363 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 16 14:29:44.618379 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.618373 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 16 14:29:44.619695 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.619678 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 14:29:44.619817 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.619799 2576 factory.go:55] Registering systemd factory Apr 16 14:29:44.619817 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.619817 2576 factory.go:223] Registration of the systemd container factory successfully Apr 16 14:29:44.620047 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.620030 2576 factory.go:153] Registering CRI-O factory Apr 16 14:29:44.620047 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.620047 2576 factory.go:223] Registration of the crio container factory successfully Apr 16 14:29:44.620187 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.620071 2576 factory.go:103] Registering Raw factory Apr 16 14:29:44.620187 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.620085 2576 manager.go:1196] 
Started watching for new ooms in manager Apr 16 14:29:44.620563 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.620549 2576 manager.go:319] Starting recovery of all containers Apr 16 14:29:44.621262 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:44.621232 2576 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 14:29:44.627415 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.627394 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:29:44.632921 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:44.632705 2576 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-128-173.ec2.internal\" not found" node="ip-10-0-128-173.ec2.internal" Apr 16 14:29:44.634012 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.633998 2576 manager.go:324] Recovery completed Apr 16 14:29:44.638405 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.638392 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:29:44.644050 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.644034 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-173.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:29:44.644120 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.644065 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-173.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:29:44.644120 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.644075 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-173.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:29:44.644549 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.644523 2576 cpu_manager.go:222] "Starting CPU manager" 
policy="none" Apr 16 14:29:44.644549 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.644546 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 14:29:44.644681 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.644568 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 16 14:29:44.648272 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.648258 2576 policy_none.go:49] "None policy: Start" Apr 16 14:29:44.648348 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.648278 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 14:29:44.648348 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.648293 2576 state_mem.go:35] "Initializing new in-memory state store" Apr 16 14:29:44.685139 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.685119 2576 manager.go:341] "Starting Device Plugin manager" Apr 16 14:29:44.706822 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:44.685154 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 14:29:44.706822 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.685164 2576 server.go:85] "Starting device plugin registration server" Apr 16 14:29:44.706822 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.685451 2576 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 14:29:44.706822 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.685464 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 14:29:44.706822 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.685565 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 14:29:44.706822 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.685668 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 14:29:44.706822 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.685683 2576 
plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 14:29:44.706822 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:44.686238 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 14:29:44.706822 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:44.686301 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-173.ec2.internal\" not found" Apr 16 14:29:44.775157 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.775059 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 14:29:44.776479 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.776447 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 14:29:44.776603 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.776489 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 14:29:44.776603 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.776516 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 14:29:44.776603 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.776524 2576 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 14:29:44.776603 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:44.776581 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 14:29:44.779857 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.779838 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:29:44.785823 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.785808 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:29:44.786677 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.786662 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-173.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:29:44.786764 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.786697 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-173.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:29:44.786764 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.786712 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-173.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:29:44.786764 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.786744 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-173.ec2.internal" Apr 16 14:29:44.796590 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.796572 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-173.ec2.internal" Apr 16 14:29:44.796681 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:44.796596 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-173.ec2.internal\": node \"ip-10-0-128-173.ec2.internal\" not found" Apr 16 
14:29:44.809609 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:44.809589 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-173.ec2.internal\" not found" Apr 16 14:29:44.877570 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.877516 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-173.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-128-173.ec2.internal"] Apr 16 14:29:44.877681 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.877628 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:29:44.879066 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.879049 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-173.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:29:44.879139 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.879081 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-173.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:29:44.879139 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.879091 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-173.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:29:44.881368 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.881356 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:29:44.881560 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.881547 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-173.ec2.internal" Apr 16 14:29:44.881604 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.881577 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:29:44.882299 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.882281 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-173.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:29:44.882364 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.882310 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-173.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:29:44.882364 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.882328 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-173.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:29:44.882364 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.882281 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-173.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:29:44.882475 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.882387 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-173.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:29:44.882475 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.882399 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-173.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:29:44.885044 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.885028 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-173.ec2.internal" Apr 16 14:29:44.885126 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.885063 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 14:29:44.885756 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.885741 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-173.ec2.internal" event="NodeHasSufficientMemory" Apr 16 14:29:44.885822 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.885772 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-173.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 14:29:44.885822 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.885781 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-173.ec2.internal" event="NodeHasSufficientPID" Apr 16 14:29:44.910111 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:44.910086 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-173.ec2.internal\" not found" Apr 16 14:29:44.914376 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:44.914360 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-173.ec2.internal\" not found" node="ip-10-0-128-173.ec2.internal" Apr 16 14:29:44.918893 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:44.918876 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-173.ec2.internal\" not found" node="ip-10-0-128-173.ec2.internal" Apr 16 14:29:44.919728 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.919708 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a454133692b7d59775381b8452362b38-config\") pod 
\"kube-apiserver-proxy-ip-10-0-128-173.ec2.internal\" (UID: \"a454133692b7d59775381b8452362b38\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-173.ec2.internal" Apr 16 14:29:44.919787 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.919737 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b93a143e50d6e55a1c399c4b395a32e8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-173.ec2.internal\" (UID: \"b93a143e50d6e55a1c399c4b395a32e8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-173.ec2.internal" Apr 16 14:29:44.919787 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:44.919752 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b93a143e50d6e55a1c399c4b395a32e8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-173.ec2.internal\" (UID: \"b93a143e50d6e55a1c399c4b395a32e8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-173.ec2.internal" Apr 16 14:29:45.010584 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:45.010550 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-173.ec2.internal\" not found" Apr 16 14:29:45.019866 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:45.019837 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a454133692b7d59775381b8452362b38-config\") pod \"kube-apiserver-proxy-ip-10-0-128-173.ec2.internal\" (UID: \"a454133692b7d59775381b8452362b38\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-173.ec2.internal" Apr 16 14:29:45.019935 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:45.019872 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/b93a143e50d6e55a1c399c4b395a32e8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-173.ec2.internal\" (UID: \"b93a143e50d6e55a1c399c4b395a32e8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-173.ec2.internal" Apr 16 14:29:45.019935 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:45.019890 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b93a143e50d6e55a1c399c4b395a32e8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-173.ec2.internal\" (UID: \"b93a143e50d6e55a1c399c4b395a32e8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-173.ec2.internal" Apr 16 14:29:45.019996 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:45.019941 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/a454133692b7d59775381b8452362b38-config\") pod \"kube-apiserver-proxy-ip-10-0-128-173.ec2.internal\" (UID: \"a454133692b7d59775381b8452362b38\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-173.ec2.internal" Apr 16 14:29:45.020033 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:45.019945 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b93a143e50d6e55a1c399c4b395a32e8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-173.ec2.internal\" (UID: \"b93a143e50d6e55a1c399c4b395a32e8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-173.ec2.internal" Apr 16 14:29:45.020033 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:45.019948 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b93a143e50d6e55a1c399c4b395a32e8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-173.ec2.internal\" (UID: \"b93a143e50d6e55a1c399c4b395a32e8\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-173.ec2.internal" Apr 16 14:29:45.111388 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:45.111301 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-173.ec2.internal\" not found" Apr 16 14:29:45.211809 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:45.211763 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-173.ec2.internal\" not found" Apr 16 14:29:45.217963 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:45.217944 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-173.ec2.internal" Apr 16 14:29:45.222057 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:45.222036 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-173.ec2.internal" Apr 16 14:29:45.312200 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:45.312158 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-173.ec2.internal\" not found" Apr 16 14:29:45.412711 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:45.412644 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-173.ec2.internal\" not found" Apr 16 14:29:45.513159 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:45.513130 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-173.ec2.internal\" not found" Apr 16 14:29:45.524685 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:45.524660 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 14:29:45.524816 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:45.524786 2576 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 14:29:45.524855 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:45.524813 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 14:29:45.614186 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:45.614164 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-173.ec2.internal\" not found" Apr 16 14:29:45.617441 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:45.617421 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 14:29:45.618565 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:45.618515 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 14:24:44 +0000 UTC" deadline="2028-01-01 14:46:35.852077476 +0000 UTC" Apr 16 14:29:45.618565 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:45.618562 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15000h16m50.233518576s" Apr 16 14:29:45.629748 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:45.629725 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 14:29:45.648228 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:45.648195 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" 
csr="csr-n27r2" Apr 16 14:29:45.656903 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:45.656879 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-n27r2" Apr 16 14:29:45.715290 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:45.715206 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-173.ec2.internal\" not found" Apr 16 14:29:45.734504 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:45.734457 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda454133692b7d59775381b8452362b38.slice/crio-67621ff7f22a0f15bb77e845fcdc9fb0f0f047ed2689d3e78e6f38226164a68b WatchSource:0}: Error finding container 67621ff7f22a0f15bb77e845fcdc9fb0f0f047ed2689d3e78e6f38226164a68b: Status 404 returned error can't find the container with id 67621ff7f22a0f15bb77e845fcdc9fb0f0f047ed2689d3e78e6f38226164a68b Apr 16 14:29:45.735038 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:45.735017 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb93a143e50d6e55a1c399c4b395a32e8.slice/crio-6f38dd55cd0777fa8522e29ccf3d6625c368e3848d3f8bd7422896085b4889fd WatchSource:0}: Error finding container 6f38dd55cd0777fa8522e29ccf3d6625c368e3848d3f8bd7422896085b4889fd: Status 404 returned error can't find the container with id 6f38dd55cd0777fa8522e29ccf3d6625c368e3848d3f8bd7422896085b4889fd Apr 16 14:29:45.739869 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:45.739844 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:29:45.779469 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:45.779413 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-173.ec2.internal" 
event={"ID":"a454133692b7d59775381b8452362b38","Type":"ContainerStarted","Data":"67621ff7f22a0f15bb77e845fcdc9fb0f0f047ed2689d3e78e6f38226164a68b"} Apr 16 14:29:45.780850 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:45.780824 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-173.ec2.internal" event={"ID":"b93a143e50d6e55a1c399c4b395a32e8","Type":"ContainerStarted","Data":"6f38dd55cd0777fa8522e29ccf3d6625c368e3848d3f8bd7422896085b4889fd"} Apr 16 14:29:45.785414 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:45.785396 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:29:45.818386 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:45.818360 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-173.ec2.internal" Apr 16 14:29:45.830325 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:45.830306 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 14:29:45.832249 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:45.832237 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-173.ec2.internal" Apr 16 14:29:45.840459 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:45.840442 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 14:29:46.141338 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.141303 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:29:46.453112 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.453027 2576 reflector.go:430] "Caches populated" 
type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:29:46.594492 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.594462 2576 apiserver.go:52] "Watching apiserver" Apr 16 14:29:46.599851 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.599826 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 14:29:46.601211 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.601181 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-cn9zb","openshift-ovn-kubernetes/ovnkube-node-p7qpv","kube-system/konnectivity-agent-mn4h5","kube-system/kube-apiserver-proxy-ip-10-0-128-173.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pp5jb","openshift-cluster-node-tuning-operator/tuned-wg4rx","openshift-dns/node-resolver-x7zbw","openshift-image-registry/node-ca-29nkz","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-173.ec2.internal","openshift-multus/multus-7cn8b","openshift-multus/multus-additional-cni-plugins-vnzc4","openshift-multus/network-metrics-daemon-kbtb7","openshift-network-diagnostics/network-check-target-4bcxv"] Apr 16 14:29:46.604343 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.604319 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7cn8b" Apr 16 14:29:46.606701 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.606678 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" Apr 16 14:29:46.606992 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.606972 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 14:29:46.607169 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.607140 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 14:29:46.607275 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.607215 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 14:29:46.607275 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.607227 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 14:29:46.607387 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.607312 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-84lv2\"" Apr 16 14:29:46.608793 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.608771 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-mn4h5" Apr 16 14:29:46.609387 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.609265 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 14:29:46.609387 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.609369 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 14:29:46.609557 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.609457 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 14:29:46.609626 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.609570 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 14:29:46.609700 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.609682 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 14:29:46.609926 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.609906 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-c7xln\"" Apr 16 14:29:46.609926 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.609922 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 14:29:46.611193 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.611174 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pp5jb" Apr 16 14:29:46.611291 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.611213 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 14:29:46.611347 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.611318 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 14:29:46.611347 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.611328 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-wfqmd\"" Apr 16 14:29:46.613475 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.613431 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-vd8z2\"" Apr 16 14:29:46.613920 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.613604 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 14:29:46.613920 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.613662 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 14:29:46.613920 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.613729 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-wg4rx" Apr 16 14:29:46.615203 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.615186 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 14:29:46.615760 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.615736 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:29:46.615866 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.615778 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-x9kfh\"" Apr 16 14:29:46.615866 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.615796 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 14:29:46.616064 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.616045 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-x7zbw" Apr 16 14:29:46.618125 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.618096 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 14:29:46.618267 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.618169 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 14:29:46.618267 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.618106 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-d74dj\"" Apr 16 14:29:46.618446 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.618426 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-29nkz" Apr 16 14:29:46.620835 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.620812 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-cn9zb" Apr 16 14:29:46.623309 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.623289 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 14:29:46.623403 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.623361 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-ddxcq\"" Apr 16 14:29:46.623587 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.623563 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 14:29:46.623704 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.623689 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 14:29:46.624145 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.624050 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vnzc4" Apr 16 14:29:46.625348 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.625256 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 14:29:46.625348 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.625268 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:29:46.625594 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.625577 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 14:29:46.625885 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.625853 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-ktzjx\"" Apr 16 14:29:46.626727 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.626707 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 14:29:46.626810 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.626729 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-hs6n7\"" Apr 16 14:29:46.626991 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.626971 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 14:29:46.627267 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.627248 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbtb7" Apr 16 14:29:46.627362 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:46.627335 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbtb7" podUID="bfa07533-d734-4829-bde0-6c0327bd79a9" Apr 16 14:29:46.628737 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.628716 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-684pl\" (UniqueName: \"kubernetes.io/projected/1491249e-0f9d-4121-abb3-d99d4022d023-kube-api-access-684pl\") pod \"node-ca-29nkz\" (UID: \"1491249e-0f9d-4121-abb3-d99d4022d023\") " pod="openshift-image-registry/node-ca-29nkz" Apr 16 14:29:46.628857 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.628752 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j24ws\" (UniqueName: \"kubernetes.io/projected/38b0864f-2448-4948-b5f1-5231888bbbac-kube-api-access-j24ws\") pod \"iptables-alerter-cn9zb\" (UID: \"38b0864f-2448-4948-b5f1-5231888bbbac\") " pod="openshift-network-operator/iptables-alerter-cn9zb" Apr 16 14:29:46.628857 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.628775 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/efcece22-1582-4835-82c7-1489ad265dca-cni-binary-copy\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b" Apr 16 14:29:46.628857 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.628798 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-host-var-lib-cni-bin\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b" Apr 16 14:29:46.628857 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.628841 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/65679206-3d4d-45b3-a88a-b812c2e49d08-sys-fs\") pod \"aws-ebs-csi-driver-node-pp5jb\" (UID: \"65679206-3d4d-45b3-a88a-b812c2e49d08\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pp5jb" Apr 16 14:29:46.629063 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.628873 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-etc-openvswitch\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" Apr 16 14:29:46.629063 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.628901 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-run-ovn\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" Apr 16 14:29:46.629063 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.628929 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3cf0a81b-ae4c-46b8-b16c-e93bd4e87102-konnectivity-ca\") pod \"konnectivity-agent-mn4h5\" (UID: \"3cf0a81b-ae4c-46b8-b16c-e93bd4e87102\") " pod="kube-system/konnectivity-agent-mn4h5" Apr 16 14:29:46.629063 ip-10-0-128-173 kubenswrapper[2576]: 
I0416 14:29:46.628958 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-run-systemd\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" Apr 16 14:29:46.629063 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.628984 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-host-run-ovn-kubernetes\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" Apr 16 14:29:46.629063 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.629004 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhn7l\" (UniqueName: \"kubernetes.io/projected/6079e191-6467-414e-9381-0ffd91e44ab4-kube-api-access-xhn7l\") pod \"node-resolver-x7zbw\" (UID: \"6079e191-6467-414e-9381-0ffd91e44ab4\") " pod="openshift-dns/node-resolver-x7zbw" Apr 16 14:29:46.629063 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.629019 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pff2r\" (UniqueName: \"kubernetes.io/projected/efcece22-1582-4835-82c7-1489ad265dca-kube-api-access-pff2r\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b" Apr 16 14:29:46.629063 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.629049 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/65679206-3d4d-45b3-a88a-b812c2e49d08-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pp5jb\" (UID: 
\"65679206-3d4d-45b3-a88a-b812c2e49d08\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pp5jb" Apr 16 14:29:46.629440 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.629085 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-run-openvswitch\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" Apr 16 14:29:46.629440 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.629119 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b102258d-1ae2-4f45-910d-aa49b03d1a3b-etc-kubernetes\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx" Apr 16 14:29:46.629440 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.629139 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b102258d-1ae2-4f45-910d-aa49b03d1a3b-tmp\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx" Apr 16 14:29:46.629440 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.629155 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/65679206-3d4d-45b3-a88a-b812c2e49d08-registration-dir\") pod \"aws-ebs-csi-driver-node-pp5jb\" (UID: \"65679206-3d4d-45b3-a88a-b812c2e49d08\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pp5jb" Apr 16 14:29:46.629440 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.629170 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-host-var-lib-cni-multus\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b" Apr 16 14:29:46.629440 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.629183 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-log-socket\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" Apr 16 14:29:46.629440 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.629199 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/312033f4-d50f-4d5d-a1ca-6e77e0428786-ovnkube-config\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" Apr 16 14:29:46.629440 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.629215 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b102258d-1ae2-4f45-910d-aa49b03d1a3b-etc-sysctl-conf\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx" Apr 16 14:29:46.629440 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.629235 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr2js\" (UniqueName: \"kubernetes.io/projected/312033f4-d50f-4d5d-a1ca-6e77e0428786-kube-api-access-fr2js\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" Apr 16 14:29:46.629440 ip-10-0-128-173 kubenswrapper[2576]: I0416 
14:29:46.629248 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b102258d-1ae2-4f45-910d-aa49b03d1a3b-var-lib-kubelet\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx" Apr 16 14:29:46.629440 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.629272 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6079e191-6467-414e-9381-0ffd91e44ab4-tmp-dir\") pod \"node-resolver-x7zbw\" (UID: \"6079e191-6467-414e-9381-0ffd91e44ab4\") " pod="openshift-dns/node-resolver-x7zbw" Apr 16 14:29:46.629440 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.629295 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/65679206-3d4d-45b3-a88a-b812c2e49d08-etc-selinux\") pod \"aws-ebs-csi-driver-node-pp5jb\" (UID: \"65679206-3d4d-45b3-a88a-b812c2e49d08\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pp5jb" Apr 16 14:29:46.629440 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.629329 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b102258d-1ae2-4f45-910d-aa49b03d1a3b-etc-sysconfig\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx" Apr 16 14:29:46.629440 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.629350 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b102258d-1ae2-4f45-910d-aa49b03d1a3b-etc-systemd\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " 
pod="openshift-cluster-node-tuning-operator/tuned-wg4rx" Apr 16 14:29:46.629440 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.629394 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-os-release\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b" Apr 16 14:29:46.629440 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.629430 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-host-run-netns\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b" Apr 16 14:29:46.630227 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.629470 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b102258d-1ae2-4f45-910d-aa49b03d1a3b-sys\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx" Apr 16 14:29:46.630227 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.629548 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-cnibin\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b" Apr 16 14:29:46.630227 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.629591 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-host-run-k8s-cni-cncf-io\") pod \"multus-7cn8b\" (UID: 
\"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b" Apr 16 14:29:46.630227 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.629616 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-etc-kubernetes\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b" Apr 16 14:29:46.630227 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.629640 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/38b0864f-2448-4948-b5f1-5231888bbbac-host-slash\") pod \"iptables-alerter-cn9zb\" (UID: \"38b0864f-2448-4948-b5f1-5231888bbbac\") " pod="openshift-network-operator/iptables-alerter-cn9zb" Apr 16 14:29:46.630227 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.629663 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-system-cni-dir\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b" Apr 16 14:29:46.630227 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.629698 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/65679206-3d4d-45b3-a88a-b812c2e49d08-device-dir\") pod \"aws-ebs-csi-driver-node-pp5jb\" (UID: \"65679206-3d4d-45b3-a88a-b812c2e49d08\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pp5jb" Apr 16 14:29:46.630227 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.629734 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/b102258d-1ae2-4f45-910d-aa49b03d1a3b-etc-sysctl-d\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx" Apr 16 14:29:46.630227 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.629757 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1491249e-0f9d-4121-abb3-d99d4022d023-serviceca\") pod \"node-ca-29nkz\" (UID: \"1491249e-0f9d-4121-abb3-d99d4022d023\") " pod="openshift-image-registry/node-ca-29nkz" Apr 16 14:29:46.630227 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.629778 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-multus-cni-dir\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b" Apr 16 14:29:46.630227 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.629800 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-host-slash\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" Apr 16 14:29:46.630227 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.629823 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/312033f4-d50f-4d5d-a1ca-6e77e0428786-ovnkube-script-lib\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" Apr 16 14:29:46.630227 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.629863 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3cf0a81b-ae4c-46b8-b16c-e93bd4e87102-agent-certs\") pod \"konnectivity-agent-mn4h5\" (UID: \"3cf0a81b-ae4c-46b8-b16c-e93bd4e87102\") " pod="kube-system/konnectivity-agent-mn4h5" Apr 16 14:29:46.630227 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.629889 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-hostroot\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b" Apr 16 14:29:46.630227 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.629912 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-node-log\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" Apr 16 14:29:46.630227 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.629954 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1491249e-0f9d-4121-abb3-d99d4022d023-host\") pod \"node-ca-29nkz\" (UID: \"1491249e-0f9d-4121-abb3-d99d4022d023\") " pod="openshift-image-registry/node-ca-29nkz" Apr 16 14:29:46.630227 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.629957 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4bcxv" Apr 16 14:29:46.630227 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.629987 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/38b0864f-2448-4948-b5f1-5231888bbbac-iptables-alerter-script\") pod \"iptables-alerter-cn9zb\" (UID: \"38b0864f-2448-4948-b5f1-5231888bbbac\") " pod="openshift-network-operator/iptables-alerter-cn9zb" Apr 16 14:29:46.631082 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.630013 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-multus-socket-dir-parent\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b" Apr 16 14:29:46.631082 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.630037 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-host-var-lib-kubelet\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b" Apr 16 14:29:46.631082 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:46.630044 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4bcxv" podUID="da437e36-c272-4ed1-b496-1ae5412f861e" Apr 16 14:29:46.631082 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.630062 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b102258d-1ae2-4f45-910d-aa49b03d1a3b-run\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx" Apr 16 14:29:46.631082 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.630087 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b102258d-1ae2-4f45-910d-aa49b03d1a3b-etc-tuned\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx" Apr 16 14:29:46.631082 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.630125 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx2s7\" (UniqueName: \"kubernetes.io/projected/b102258d-1ae2-4f45-910d-aa49b03d1a3b-kube-api-access-bx2s7\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx" Apr 16 14:29:46.631082 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.630155 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-multus-conf-dir\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b" Apr 16 14:29:46.631082 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.630192 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkgdh\" (UniqueName: 
\"kubernetes.io/projected/65679206-3d4d-45b3-a88a-b812c2e49d08-kube-api-access-bkgdh\") pod \"aws-ebs-csi-driver-node-pp5jb\" (UID: \"65679206-3d4d-45b3-a88a-b812c2e49d08\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pp5jb" Apr 16 14:29:46.631082 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.630220 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-host-kubelet\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" Apr 16 14:29:46.631082 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.630244 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-host-run-netns\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" Apr 16 14:29:46.631082 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.630269 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-host-cni-netd\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" Apr 16 14:29:46.631082 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.630299 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" Apr 16 14:29:46.631082 
ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.630326 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/312033f4-d50f-4d5d-a1ca-6e77e0428786-env-overrides\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" Apr 16 14:29:46.631082 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.630350 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b102258d-1ae2-4f45-910d-aa49b03d1a3b-host\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx" Apr 16 14:29:46.631082 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.630374 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/efcece22-1582-4835-82c7-1489ad265dca-multus-daemon-config\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b" Apr 16 14:29:46.631082 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.630411 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-host-run-multus-certs\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b" Apr 16 14:29:46.631983 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.630443 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-systemd-units\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" Apr 16 14:29:46.631983 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.630468 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-host-cni-bin\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" Apr 16 14:29:46.631983 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.630495 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/312033f4-d50f-4d5d-a1ca-6e77e0428786-ovn-node-metrics-cert\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" Apr 16 14:29:46.631983 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.630521 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b102258d-1ae2-4f45-910d-aa49b03d1a3b-etc-modprobe-d\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx" Apr 16 14:29:46.631983 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.630564 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6079e191-6467-414e-9381-0ffd91e44ab4-hosts-file\") pod \"node-resolver-x7zbw\" (UID: \"6079e191-6467-414e-9381-0ffd91e44ab4\") " pod="openshift-dns/node-resolver-x7zbw" Apr 16 14:29:46.631983 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.630586 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/65679206-3d4d-45b3-a88a-b812c2e49d08-socket-dir\") pod \"aws-ebs-csi-driver-node-pp5jb\" (UID: \"65679206-3d4d-45b3-a88a-b812c2e49d08\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pp5jb" Apr 16 14:29:46.631983 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.630608 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-var-lib-openvswitch\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" Apr 16 14:29:46.631983 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.630635 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b102258d-1ae2-4f45-910d-aa49b03d1a3b-lib-modules\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx" Apr 16 14:29:46.658347 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.658313 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 14:24:45 +0000 UTC" deadline="2028-01-23 17:27:55.993790926 +0000 UTC" Apr 16 14:29:46.658347 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.658340 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15530h58m9.335454949s" Apr 16 14:29:46.719259 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.719187 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 14:29:46.730792 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.730762 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-host-slash\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" Apr 16 14:29:46.730950 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.730819 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/312033f4-d50f-4d5d-a1ca-6e77e0428786-ovnkube-script-lib\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" Apr 16 14:29:46.730950 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.730828 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-host-slash\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" Apr 16 14:29:46.731065 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.730952 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3cf0a81b-ae4c-46b8-b16c-e93bd4e87102-agent-certs\") pod \"konnectivity-agent-mn4h5\" (UID: \"3cf0a81b-ae4c-46b8-b16c-e93bd4e87102\") " pod="kube-system/konnectivity-agent-mn4h5" Apr 16 14:29:46.731065 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.730983 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-hostroot\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b" Apr 16 14:29:46.731065 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.731034 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-hostroot\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b" Apr 16 14:29:46.731209 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.731113 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0a9bf7c4-4e80-4445-84c8-06bf12cc01b8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vnzc4\" (UID: \"0a9bf7c4-4e80-4445-84c8-06bf12cc01b8\") " pod="openshift-multus/multus-additional-cni-plugins-vnzc4" Apr 16 14:29:46.731209 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.731152 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqb7x\" (UniqueName: \"kubernetes.io/projected/0a9bf7c4-4e80-4445-84c8-06bf12cc01b8-kube-api-access-rqb7x\") pod \"multus-additional-cni-plugins-vnzc4\" (UID: \"0a9bf7c4-4e80-4445-84c8-06bf12cc01b8\") " pod="openshift-multus/multus-additional-cni-plugins-vnzc4" Apr 16 14:29:46.731209 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.731186 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-node-log\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" Apr 16 14:29:46.731350 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.731230 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1491249e-0f9d-4121-abb3-d99d4022d023-host\") pod \"node-ca-29nkz\" (UID: \"1491249e-0f9d-4121-abb3-d99d4022d023\") " pod="openshift-image-registry/node-ca-29nkz" Apr 16 14:29:46.731350 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.731273 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/38b0864f-2448-4948-b5f1-5231888bbbac-iptables-alerter-script\") pod \"iptables-alerter-cn9zb\" (UID: \"38b0864f-2448-4948-b5f1-5231888bbbac\") " pod="openshift-network-operator/iptables-alerter-cn9zb" Apr 16 14:29:46.731350 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.731294 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-node-log\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" Apr 16 14:29:46.731350 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.731305 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-multus-socket-dir-parent\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b" Apr 16 14:29:46.731350 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.731330 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-host-var-lib-kubelet\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b" Apr 16 14:29:46.731593 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.731319 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1491249e-0f9d-4121-abb3-d99d4022d023-host\") pod \"node-ca-29nkz\" (UID: \"1491249e-0f9d-4121-abb3-d99d4022d023\") " pod="openshift-image-registry/node-ca-29nkz" Apr 16 14:29:46.731674 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.731646 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0a9bf7c4-4e80-4445-84c8-06bf12cc01b8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vnzc4\" (UID: \"0a9bf7c4-4e80-4445-84c8-06bf12cc01b8\") " pod="openshift-multus/multus-additional-cni-plugins-vnzc4" Apr 16 14:29:46.731788 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.731771 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-host-var-lib-kubelet\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b" Apr 16 14:29:46.731834 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.731815 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b102258d-1ae2-4f45-910d-aa49b03d1a3b-run\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx" Apr 16 14:29:46.731879 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.731848 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-multus-socket-dir-parent\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b" Apr 16 14:29:46.731982 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.731948 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 14:29:46.732100 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.731993 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b102258d-1ae2-4f45-910d-aa49b03d1a3b-etc-tuned\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx" Apr 16 14:29:46.732100 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.732039 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b102258d-1ae2-4f45-910d-aa49b03d1a3b-run\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx" Apr 16 14:29:46.732100 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.732055 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bx2s7\" (UniqueName: \"kubernetes.io/projected/b102258d-1ae2-4f45-910d-aa49b03d1a3b-kube-api-access-bx2s7\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx" Apr 16 14:29:46.732237 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.732225 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-multus-conf-dir\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b" Apr 16 14:29:46.732607 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.732469 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/38b0864f-2448-4948-b5f1-5231888bbbac-iptables-alerter-script\") pod 
\"iptables-alerter-cn9zb\" (UID: \"38b0864f-2448-4948-b5f1-5231888bbbac\") " pod="openshift-network-operator/iptables-alerter-cn9zb" Apr 16 14:29:46.732607 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.732525 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/312033f4-d50f-4d5d-a1ca-6e77e0428786-ovnkube-script-lib\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" Apr 16 14:29:46.732745 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.732130 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-multus-conf-dir\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b" Apr 16 14:29:46.732745 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.732674 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bkgdh\" (UniqueName: \"kubernetes.io/projected/65679206-3d4d-45b3-a88a-b812c2e49d08-kube-api-access-bkgdh\") pod \"aws-ebs-csi-driver-node-pp5jb\" (UID: \"65679206-3d4d-45b3-a88a-b812c2e49d08\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pp5jb" Apr 16 14:29:46.732745 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.732713 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0a9bf7c4-4e80-4445-84c8-06bf12cc01b8-cnibin\") pod \"multus-additional-cni-plugins-vnzc4\" (UID: \"0a9bf7c4-4e80-4445-84c8-06bf12cc01b8\") " pod="openshift-multus/multus-additional-cni-plugins-vnzc4" Apr 16 14:29:46.732878 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.732747 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-host-kubelet\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" Apr 16 14:29:46.732878 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.732781 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-host-run-netns\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" Apr 16 14:29:46.732878 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.732814 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-host-cni-netd\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" Apr 16 14:29:46.732878 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.732860 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" Apr 16 14:29:46.733038 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.732891 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/312033f4-d50f-4d5d-a1ca-6e77e0428786-env-overrides\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" Apr 16 14:29:46.733038 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.732930 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b102258d-1ae2-4f45-910d-aa49b03d1a3b-host\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx" Apr 16 14:29:46.733038 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.732963 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/efcece22-1582-4835-82c7-1489ad265dca-multus-daemon-config\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b" Apr 16 14:29:46.733038 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.732990 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-host-run-multus-certs\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b" Apr 16 14:29:46.733038 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.733019 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-systemd-units\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" Apr 16 14:29:46.733229 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.733049 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-host-cni-bin\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" Apr 16 14:29:46.733229 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.733079 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/312033f4-d50f-4d5d-a1ca-6e77e0428786-ovn-node-metrics-cert\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv"
Apr 16 14:29:46.733229 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.733108 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b102258d-1ae2-4f45-910d-aa49b03d1a3b-etc-modprobe-d\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx"
Apr 16 14:29:46.733229 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.733139 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6079e191-6467-414e-9381-0ffd91e44ab4-hosts-file\") pod \"node-resolver-x7zbw\" (UID: \"6079e191-6467-414e-9381-0ffd91e44ab4\") " pod="openshift-dns/node-resolver-x7zbw"
Apr 16 14:29:46.733229 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.733164 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/65679206-3d4d-45b3-a88a-b812c2e49d08-socket-dir\") pod \"aws-ebs-csi-driver-node-pp5jb\" (UID: \"65679206-3d4d-45b3-a88a-b812c2e49d08\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pp5jb"
Apr 16 14:29:46.733229 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.733199 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75flw\" (UniqueName: \"kubernetes.io/projected/bfa07533-d734-4829-bde0-6c0327bd79a9-kube-api-access-75flw\") pod \"network-metrics-daemon-kbtb7\" (UID: \"bfa07533-d734-4829-bde0-6c0327bd79a9\") " pod="openshift-multus/network-metrics-daemon-kbtb7"
Apr 16 14:29:46.733229 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.733222 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b102258d-1ae2-4f45-910d-aa49b03d1a3b-host\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx"
Apr 16 14:29:46.733503 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.733230 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0a9bf7c4-4e80-4445-84c8-06bf12cc01b8-system-cni-dir\") pod \"multus-additional-cni-plugins-vnzc4\" (UID: \"0a9bf7c4-4e80-4445-84c8-06bf12cc01b8\") " pod="openshift-multus/multus-additional-cni-plugins-vnzc4"
Apr 16 14:29:46.733503 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.733266 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-var-lib-openvswitch\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv"
Apr 16 14:29:46.733503 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.733299 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b102258d-1ae2-4f45-910d-aa49b03d1a3b-lib-modules\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx"
Apr 16 14:29:46.733503 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.733327 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-684pl\" (UniqueName: \"kubernetes.io/projected/1491249e-0f9d-4121-abb3-d99d4022d023-kube-api-access-684pl\") pod \"node-ca-29nkz\" (UID: \"1491249e-0f9d-4121-abb3-d99d4022d023\") " pod="openshift-image-registry/node-ca-29nkz"
Apr 16 14:29:46.733503 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.733327 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-host-kubelet\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv"
Apr 16 14:29:46.733503 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.733363 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j24ws\" (UniqueName: \"kubernetes.io/projected/38b0864f-2448-4948-b5f1-5231888bbbac-kube-api-access-j24ws\") pod \"iptables-alerter-cn9zb\" (UID: \"38b0864f-2448-4948-b5f1-5231888bbbac\") " pod="openshift-network-operator/iptables-alerter-cn9zb"
Apr 16 14:29:46.733503 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.733393 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-host-run-netns\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv"
Apr 16 14:29:46.733503 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.733396 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/efcece22-1582-4835-82c7-1489ad265dca-cni-binary-copy\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b"
Apr 16 14:29:46.733503 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.733435 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-host-var-lib-cni-bin\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b"
Apr 16 14:29:46.733503 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.733469 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/65679206-3d4d-45b3-a88a-b812c2e49d08-sys-fs\") pod \"aws-ebs-csi-driver-node-pp5jb\" (UID: \"65679206-3d4d-45b3-a88a-b812c2e49d08\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pp5jb"
Apr 16 14:29:46.733503 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.733502 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-etc-openvswitch\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv"
Apr 16 14:29:46.733943 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.733552 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-run-ovn\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv"
Apr 16 14:29:46.733943 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.733588 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3cf0a81b-ae4c-46b8-b16c-e93bd4e87102-konnectivity-ca\") pod \"konnectivity-agent-mn4h5\" (UID: \"3cf0a81b-ae4c-46b8-b16c-e93bd4e87102\") " pod="kube-system/konnectivity-agent-mn4h5"
Apr 16 14:29:46.733943 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.733621 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfa07533-d734-4829-bde0-6c0327bd79a9-metrics-certs\") pod \"network-metrics-daemon-kbtb7\" (UID: \"bfa07533-d734-4829-bde0-6c0327bd79a9\") " pod="openshift-multus/network-metrics-daemon-kbtb7"
Apr 16 14:29:46.733943 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.733657 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0a9bf7c4-4e80-4445-84c8-06bf12cc01b8-os-release\") pod \"multus-additional-cni-plugins-vnzc4\" (UID: \"0a9bf7c4-4e80-4445-84c8-06bf12cc01b8\") " pod="openshift-multus/multus-additional-cni-plugins-vnzc4"
Apr 16 14:29:46.733943 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.733693 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0a9bf7c4-4e80-4445-84c8-06bf12cc01b8-cni-binary-copy\") pod \"multus-additional-cni-plugins-vnzc4\" (UID: \"0a9bf7c4-4e80-4445-84c8-06bf12cc01b8\") " pod="openshift-multus/multus-additional-cni-plugins-vnzc4"
Apr 16 14:29:46.733943 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.733729 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-run-systemd\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv"
Apr 16 14:29:46.733943 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.733767 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-host-run-ovn-kubernetes\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv"
Apr 16 14:29:46.733943 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.733797 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xhn7l\" (UniqueName: \"kubernetes.io/projected/6079e191-6467-414e-9381-0ffd91e44ab4-kube-api-access-xhn7l\") pod \"node-resolver-x7zbw\" (UID: \"6079e191-6467-414e-9381-0ffd91e44ab4\") " pod="openshift-dns/node-resolver-x7zbw"
Apr 16 14:29:46.733943 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.733848 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pff2r\" (UniqueName: \"kubernetes.io/projected/efcece22-1582-4835-82c7-1489ad265dca-kube-api-access-pff2r\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b"
Apr 16 14:29:46.733943 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.733887 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/65679206-3d4d-45b3-a88a-b812c2e49d08-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pp5jb\" (UID: \"65679206-3d4d-45b3-a88a-b812c2e49d08\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pp5jb"
Apr 16 14:29:46.733943 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.733921 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lldq\" (UniqueName: \"kubernetes.io/projected/da437e36-c272-4ed1-b496-1ae5412f861e-kube-api-access-7lldq\") pod \"network-check-target-4bcxv\" (UID: \"da437e36-c272-4ed1-b496-1ae5412f861e\") " pod="openshift-network-diagnostics/network-check-target-4bcxv"
Apr 16 14:29:46.733943 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.733931 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/efcece22-1582-4835-82c7-1489ad265dca-cni-binary-copy\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b"
Apr 16 14:29:46.734388 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.733958 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-run-openvswitch\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv"
Apr 16 14:29:46.734388 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.733993 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b102258d-1ae2-4f45-910d-aa49b03d1a3b-etc-kubernetes\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx"
Apr 16 14:29:46.734388 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.734024 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b102258d-1ae2-4f45-910d-aa49b03d1a3b-tmp\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx"
Apr 16 14:29:46.734388 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.734051 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/65679206-3d4d-45b3-a88a-b812c2e49d08-registration-dir\") pod \"aws-ebs-csi-driver-node-pp5jb\" (UID: \"65679206-3d4d-45b3-a88a-b812c2e49d08\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pp5jb"
Apr 16 14:29:46.734388 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.734084 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-host-var-lib-cni-multus\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b"
Apr 16 14:29:46.734388 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.734116 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-log-socket\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv"
Apr 16 14:29:46.734388 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.734155 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/312033f4-d50f-4d5d-a1ca-6e77e0428786-ovnkube-config\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv"
Apr 16 14:29:46.734388 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.734187 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b102258d-1ae2-4f45-910d-aa49b03d1a3b-etc-sysctl-conf\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx"
Apr 16 14:29:46.734388 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.734215 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fr2js\" (UniqueName: \"kubernetes.io/projected/312033f4-d50f-4d5d-a1ca-6e77e0428786-kube-api-access-fr2js\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv"
Apr 16 14:29:46.734388 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.734248 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b102258d-1ae2-4f45-910d-aa49b03d1a3b-var-lib-kubelet\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx"
Apr 16 14:29:46.734388 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.734297 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6079e191-6467-414e-9381-0ffd91e44ab4-tmp-dir\") pod \"node-resolver-x7zbw\" (UID: \"6079e191-6467-414e-9381-0ffd91e44ab4\") " pod="openshift-dns/node-resolver-x7zbw"
Apr 16 14:29:46.734388 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.734327 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/65679206-3d4d-45b3-a88a-b812c2e49d08-etc-selinux\") pod \"aws-ebs-csi-driver-node-pp5jb\" (UID: \"65679206-3d4d-45b3-a88a-b812c2e49d08\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pp5jb"
Apr 16 14:29:46.734388 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.734357 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b102258d-1ae2-4f45-910d-aa49b03d1a3b-etc-sysconfig\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx"
Apr 16 14:29:46.734388 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.734361 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/efcece22-1582-4835-82c7-1489ad265dca-multus-daemon-config\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b"
Apr 16 14:29:46.734388 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.734379 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b102258d-1ae2-4f45-910d-aa49b03d1a3b-etc-systemd\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx"
Apr 16 14:29:46.734969 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.734411 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-os-release\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b"
Apr 16 14:29:46.734969 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.734431 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-host-run-multus-certs\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b"
Apr 16 14:29:46.734969 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.734446 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-host-run-netns\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b"
Apr 16 14:29:46.734969 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.734483 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-systemd-units\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv"
Apr 16 14:29:46.734969 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.734504 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-host-run-netns\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b"
Apr 16 14:29:46.734969 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.734518 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b102258d-1ae2-4f45-910d-aa49b03d1a3b-sys\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx"
Apr 16 14:29:46.734969 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.734572 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-cnibin\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b"
Apr 16 14:29:46.734969 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.734594 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-host-cni-netd\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv"
Apr 16 14:29:46.734969 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.734604 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-host-run-k8s-cni-cncf-io\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b"
Apr 16 14:29:46.734969 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.734633 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-etc-kubernetes\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b"
Apr 16 14:29:46.734969 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.734668 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0a9bf7c4-4e80-4445-84c8-06bf12cc01b8-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vnzc4\" (UID: \"0a9bf7c4-4e80-4445-84c8-06bf12cc01b8\") " pod="openshift-multus/multus-additional-cni-plugins-vnzc4"
Apr 16 14:29:46.734969 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.734704 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/38b0864f-2448-4948-b5f1-5231888bbbac-host-slash\") pod \"iptables-alerter-cn9zb\" (UID: \"38b0864f-2448-4948-b5f1-5231888bbbac\") " pod="openshift-network-operator/iptables-alerter-cn9zb"
Apr 16 14:29:46.734969 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.734736 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-system-cni-dir\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b"
Apr 16 14:29:46.734969 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.734766 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/65679206-3d4d-45b3-a88a-b812c2e49d08-device-dir\") pod \"aws-ebs-csi-driver-node-pp5jb\" (UID: \"65679206-3d4d-45b3-a88a-b812c2e49d08\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pp5jb"
Apr 16 14:29:46.734969 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.734791 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b102258d-1ae2-4f45-910d-aa49b03d1a3b-etc-sysctl-d\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx"
Apr 16 14:29:46.734969 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.734822 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1491249e-0f9d-4121-abb3-d99d4022d023-serviceca\") pod \"node-ca-29nkz\" (UID: \"1491249e-0f9d-4121-abb3-d99d4022d023\") " pod="openshift-image-registry/node-ca-29nkz"
Apr 16 14:29:46.734969 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.734854 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-multus-cni-dir\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b"
Apr 16 14:29:46.737863 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.734979 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-multus-cni-dir\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b"
Apr 16 14:29:46.737863 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.735032 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-host-cni-bin\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv"
Apr 16 14:29:46.737863 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.735035 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/312033f4-d50f-4d5d-a1ca-6e77e0428786-env-overrides\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv"
Apr 16 14:29:46.737863 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.735117 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/65679206-3d4d-45b3-a88a-b812c2e49d08-sys-fs\") pod \"aws-ebs-csi-driver-node-pp5jb\" (UID: \"65679206-3d4d-45b3-a88a-b812c2e49d08\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pp5jb"
Apr 16 14:29:46.737863 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.735166 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-etc-openvswitch\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv"
Apr 16 14:29:46.737863 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.735214 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-run-ovn\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv"
Apr 16 14:29:46.737863 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.735522 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/65679206-3d4d-45b3-a88a-b812c2e49d08-registration-dir\") pod \"aws-ebs-csi-driver-node-pp5jb\" (UID: \"65679206-3d4d-45b3-a88a-b812c2e49d08\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pp5jb"
Apr 16 14:29:46.737863 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.735600 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-host-var-lib-cni-multus\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b"
Apr 16 14:29:46.737863 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.735629 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b102258d-1ae2-4f45-910d-aa49b03d1a3b-var-lib-kubelet\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx"
Apr 16 14:29:46.737863 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.735689 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-log-socket\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv"
Apr 16 14:29:46.737863 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.735869 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6079e191-6467-414e-9381-0ffd91e44ab4-tmp-dir\") pod \"node-resolver-x7zbw\" (UID: \"6079e191-6467-414e-9381-0ffd91e44ab4\") " pod="openshift-dns/node-resolver-x7zbw"
Apr 16 14:29:46.737863 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.735911 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b102258d-1ae2-4f45-910d-aa49b03d1a3b-etc-modprobe-d\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx"
Apr 16 14:29:46.737863 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.736017 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6079e191-6467-414e-9381-0ffd91e44ab4-hosts-file\") pod \"node-resolver-x7zbw\" (UID: \"6079e191-6467-414e-9381-0ffd91e44ab4\") " pod="openshift-dns/node-resolver-x7zbw"
Apr 16 14:29:46.737863 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.736057 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/65679206-3d4d-45b3-a88a-b812c2e49d08-etc-selinux\") pod \"aws-ebs-csi-driver-node-pp5jb\" (UID: \"65679206-3d4d-45b3-a88a-b812c2e49d08\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pp5jb"
Apr 16 14:29:46.737863 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.736099 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3cf0a81b-ae4c-46b8-b16c-e93bd4e87102-agent-certs\") pod \"konnectivity-agent-mn4h5\" (UID: \"3cf0a81b-ae4c-46b8-b16c-e93bd4e87102\") " pod="kube-system/konnectivity-agent-mn4h5"
Apr 16 14:29:46.737863 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.736147 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/65679206-3d4d-45b3-a88a-b812c2e49d08-socket-dir\") pod \"aws-ebs-csi-driver-node-pp5jb\" (UID: \"65679206-3d4d-45b3-a88a-b812c2e49d08\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pp5jb"
Apr 16 14:29:46.737863 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.736190 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/312033f4-d50f-4d5d-a1ca-6e77e0428786-ovnkube-config\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv"
Apr 16 14:29:46.737863 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.736234 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-os-release\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b"
Apr 16 14:29:46.738709 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.736564 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-host-var-lib-cni-bin\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b"
Apr 16 14:29:46.738709 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.736627 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b102258d-1ae2-4f45-910d-aa49b03d1a3b-etc-sysctl-d\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx"
Apr 16 14:29:46.738709 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.736627 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b102258d-1ae2-4f45-910d-aa49b03d1a3b-etc-sysconfig\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx"
Apr 16 14:29:46.738709 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.736678 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b102258d-1ae2-4f45-910d-aa49b03d1a3b-etc-sysctl-conf\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx"
Apr 16 14:29:46.738709 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.736762 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b102258d-1ae2-4f45-910d-aa49b03d1a3b-etc-systemd\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx"
Apr 16 14:29:46.738709 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.736831 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/38b0864f-2448-4948-b5f1-5231888bbbac-host-slash\") pod \"iptables-alerter-cn9zb\" (UID: \"38b0864f-2448-4948-b5f1-5231888bbbac\") " pod="openshift-network-operator/iptables-alerter-cn9zb"
Apr 16 14:29:46.738709 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.736853 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv"
Apr 16 14:29:46.738709 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.736877 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-host-run-ovn-kubernetes\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv"
Apr 16 14:29:46.738709 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.736917 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-cnibin\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b"
Apr 16 14:29:46.738709 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.736928 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-run-systemd\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv"
Apr 16 14:29:46.738709 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.736991 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3cf0a81b-ae4c-46b8-b16c-e93bd4e87102-konnectivity-ca\") pod \"konnectivity-agent-mn4h5\" (UID: \"3cf0a81b-ae4c-46b8-b16c-e93bd4e87102\") " pod="kube-system/konnectivity-agent-mn4h5"
Apr 16 14:29:46.738709 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.737037 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b102258d-1ae2-4f45-910d-aa49b03d1a3b-lib-modules\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx"
Apr 16 14:29:46.738709 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.737047 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-var-lib-openvswitch\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv"
Apr 16 14:29:46.738709 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.737049 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-host-run-k8s-cni-cncf-io\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b"
Apr 16 14:29:46.738709 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.737077 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/312033f4-d50f-4d5d-a1ca-6e77e0428786-run-openvswitch\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv"
Apr 16 14:29:46.738709 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.737105 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-system-cni-dir\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b"
Apr 16 14:29:46.738709 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.737126 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b102258d-1ae2-4f45-910d-aa49b03d1a3b-etc-kubernetes\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx"
Apr 16 14:29:46.738709 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.737159 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/65679206-3d4d-45b3-a88a-b812c2e49d08-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pp5jb\" (UID: \"65679206-3d4d-45b3-a88a-b812c2e49d08\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pp5jb"
Apr 16 14:29:46.739549 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.737169 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/efcece22-1582-4835-82c7-1489ad265dca-etc-kubernetes\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b"
Apr 16 14:29:46.739549 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.737360 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b102258d-1ae2-4f45-910d-aa49b03d1a3b-sys\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx"
Apr 16 14:29:46.739549 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.737512 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName:
\"kubernetes.io/host-path/65679206-3d4d-45b3-a88a-b812c2e49d08-device-dir\") pod \"aws-ebs-csi-driver-node-pp5jb\" (UID: \"65679206-3d4d-45b3-a88a-b812c2e49d08\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pp5jb" Apr 16 14:29:46.739549 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.737676 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1491249e-0f9d-4121-abb3-d99d4022d023-serviceca\") pod \"node-ca-29nkz\" (UID: \"1491249e-0f9d-4121-abb3-d99d4022d023\") " pod="openshift-image-registry/node-ca-29nkz" Apr 16 14:29:46.739549 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.737698 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/312033f4-d50f-4d5d-a1ca-6e77e0428786-ovn-node-metrics-cert\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" Apr 16 14:29:46.739549 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.738493 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b102258d-1ae2-4f45-910d-aa49b03d1a3b-tmp\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx" Apr 16 14:29:46.739549 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.739287 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b102258d-1ae2-4f45-910d-aa49b03d1a3b-etc-tuned\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx" Apr 16 14:29:46.743647 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.743623 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx2s7\" (UniqueName: 
\"kubernetes.io/projected/b102258d-1ae2-4f45-910d-aa49b03d1a3b-kube-api-access-bx2s7\") pod \"tuned-wg4rx\" (UID: \"b102258d-1ae2-4f45-910d-aa49b03d1a3b\") " pod="openshift-cluster-node-tuning-operator/tuned-wg4rx" Apr 16 14:29:46.745324 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.745296 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkgdh\" (UniqueName: \"kubernetes.io/projected/65679206-3d4d-45b3-a88a-b812c2e49d08-kube-api-access-bkgdh\") pod \"aws-ebs-csi-driver-node-pp5jb\" (UID: \"65679206-3d4d-45b3-a88a-b812c2e49d08\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pp5jb" Apr 16 14:29:46.745498 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.745476 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pff2r\" (UniqueName: \"kubernetes.io/projected/efcece22-1582-4835-82c7-1489ad265dca-kube-api-access-pff2r\") pod \"multus-7cn8b\" (UID: \"efcece22-1582-4835-82c7-1489ad265dca\") " pod="openshift-multus/multus-7cn8b" Apr 16 14:29:46.746186 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.746136 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhn7l\" (UniqueName: \"kubernetes.io/projected/6079e191-6467-414e-9381-0ffd91e44ab4-kube-api-access-xhn7l\") pod \"node-resolver-x7zbw\" (UID: \"6079e191-6467-414e-9381-0ffd91e44ab4\") " pod="openshift-dns/node-resolver-x7zbw" Apr 16 14:29:46.746458 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.746438 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-684pl\" (UniqueName: \"kubernetes.io/projected/1491249e-0f9d-4121-abb3-d99d4022d023-kube-api-access-684pl\") pod \"node-ca-29nkz\" (UID: \"1491249e-0f9d-4121-abb3-d99d4022d023\") " pod="openshift-image-registry/node-ca-29nkz" Apr 16 14:29:46.746743 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.746720 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-j24ws\" (UniqueName: \"kubernetes.io/projected/38b0864f-2448-4948-b5f1-5231888bbbac-kube-api-access-j24ws\") pod \"iptables-alerter-cn9zb\" (UID: \"38b0864f-2448-4948-b5f1-5231888bbbac\") " pod="openshift-network-operator/iptables-alerter-cn9zb" Apr 16 14:29:46.746953 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.746933 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr2js\" (UniqueName: \"kubernetes.io/projected/312033f4-d50f-4d5d-a1ca-6e77e0428786-kube-api-access-fr2js\") pod \"ovnkube-node-p7qpv\" (UID: \"312033f4-d50f-4d5d-a1ca-6e77e0428786\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" Apr 16 14:29:46.755619 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.755597 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 14:29:46.835450 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.835414 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfa07533-d734-4829-bde0-6c0327bd79a9-metrics-certs\") pod \"network-metrics-daemon-kbtb7\" (UID: \"bfa07533-d734-4829-bde0-6c0327bd79a9\") " pod="openshift-multus/network-metrics-daemon-kbtb7" Apr 16 14:29:46.835450 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.835456 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0a9bf7c4-4e80-4445-84c8-06bf12cc01b8-os-release\") pod \"multus-additional-cni-plugins-vnzc4\" (UID: \"0a9bf7c4-4e80-4445-84c8-06bf12cc01b8\") " pod="openshift-multus/multus-additional-cni-plugins-vnzc4" Apr 16 14:29:46.835706 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.835479 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0a9bf7c4-4e80-4445-84c8-06bf12cc01b8-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-vnzc4\" (UID: \"0a9bf7c4-4e80-4445-84c8-06bf12cc01b8\") " pod="openshift-multus/multus-additional-cni-plugins-vnzc4" Apr 16 14:29:46.835706 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.835524 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lldq\" (UniqueName: \"kubernetes.io/projected/da437e36-c272-4ed1-b496-1ae5412f861e-kube-api-access-7lldq\") pod \"network-check-target-4bcxv\" (UID: \"da437e36-c272-4ed1-b496-1ae5412f861e\") " pod="openshift-network-diagnostics/network-check-target-4bcxv" Apr 16 14:29:46.835706 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.835579 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0a9bf7c4-4e80-4445-84c8-06bf12cc01b8-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vnzc4\" (UID: \"0a9bf7c4-4e80-4445-84c8-06bf12cc01b8\") " pod="openshift-multus/multus-additional-cni-plugins-vnzc4" Apr 16 14:29:46.835706 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.835585 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0a9bf7c4-4e80-4445-84c8-06bf12cc01b8-os-release\") pod \"multus-additional-cni-plugins-vnzc4\" (UID: \"0a9bf7c4-4e80-4445-84c8-06bf12cc01b8\") " pod="openshift-multus/multus-additional-cni-plugins-vnzc4" Apr 16 14:29:46.835706 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:46.835611 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:29:46.835706 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.835636 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0a9bf7c4-4e80-4445-84c8-06bf12cc01b8-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-vnzc4\" (UID: \"0a9bf7c4-4e80-4445-84c8-06bf12cc01b8\") " pod="openshift-multus/multus-additional-cni-plugins-vnzc4" Apr 16 14:29:46.835706 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:46.835708 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfa07533-d734-4829-bde0-6c0327bd79a9-metrics-certs podName:bfa07533-d734-4829-bde0-6c0327bd79a9 nodeName:}" failed. No retries permitted until 2026-04-16 14:29:47.335658887 +0000 UTC m=+3.162369285 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bfa07533-d734-4829-bde0-6c0327bd79a9-metrics-certs") pod "network-metrics-daemon-kbtb7" (UID: "bfa07533-d734-4829-bde0-6c0327bd79a9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:29:46.836026 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.835726 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rqb7x\" (UniqueName: \"kubernetes.io/projected/0a9bf7c4-4e80-4445-84c8-06bf12cc01b8-kube-api-access-rqb7x\") pod \"multus-additional-cni-plugins-vnzc4\" (UID: \"0a9bf7c4-4e80-4445-84c8-06bf12cc01b8\") " pod="openshift-multus/multus-additional-cni-plugins-vnzc4" Apr 16 14:29:46.836026 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.835761 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0a9bf7c4-4e80-4445-84c8-06bf12cc01b8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vnzc4\" (UID: \"0a9bf7c4-4e80-4445-84c8-06bf12cc01b8\") " pod="openshift-multus/multus-additional-cni-plugins-vnzc4" Apr 16 14:29:46.836026 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.835784 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0a9bf7c4-4e80-4445-84c8-06bf12cc01b8-cnibin\") pod 
\"multus-additional-cni-plugins-vnzc4\" (UID: \"0a9bf7c4-4e80-4445-84c8-06bf12cc01b8\") " pod="openshift-multus/multus-additional-cni-plugins-vnzc4" Apr 16 14:29:46.836026 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.835820 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75flw\" (UniqueName: \"kubernetes.io/projected/bfa07533-d734-4829-bde0-6c0327bd79a9-kube-api-access-75flw\") pod \"network-metrics-daemon-kbtb7\" (UID: \"bfa07533-d734-4829-bde0-6c0327bd79a9\") " pod="openshift-multus/network-metrics-daemon-kbtb7" Apr 16 14:29:46.836026 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.835836 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0a9bf7c4-4e80-4445-84c8-06bf12cc01b8-system-cni-dir\") pod \"multus-additional-cni-plugins-vnzc4\" (UID: \"0a9bf7c4-4e80-4445-84c8-06bf12cc01b8\") " pod="openshift-multus/multus-additional-cni-plugins-vnzc4" Apr 16 14:29:46.836026 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.835885 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0a9bf7c4-4e80-4445-84c8-06bf12cc01b8-system-cni-dir\") pod \"multus-additional-cni-plugins-vnzc4\" (UID: \"0a9bf7c4-4e80-4445-84c8-06bf12cc01b8\") " pod="openshift-multus/multus-additional-cni-plugins-vnzc4" Apr 16 14:29:46.836315 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.836119 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0a9bf7c4-4e80-4445-84c8-06bf12cc01b8-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vnzc4\" (UID: \"0a9bf7c4-4e80-4445-84c8-06bf12cc01b8\") " pod="openshift-multus/multus-additional-cni-plugins-vnzc4" Apr 16 14:29:46.836315 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.836139 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0a9bf7c4-4e80-4445-84c8-06bf12cc01b8-cnibin\") pod \"multus-additional-cni-plugins-vnzc4\" (UID: \"0a9bf7c4-4e80-4445-84c8-06bf12cc01b8\") " pod="openshift-multus/multus-additional-cni-plugins-vnzc4" Apr 16 14:29:46.836315 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.836167 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0a9bf7c4-4e80-4445-84c8-06bf12cc01b8-cni-binary-copy\") pod \"multus-additional-cni-plugins-vnzc4\" (UID: \"0a9bf7c4-4e80-4445-84c8-06bf12cc01b8\") " pod="openshift-multus/multus-additional-cni-plugins-vnzc4" Apr 16 14:29:46.836471 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.836371 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0a9bf7c4-4e80-4445-84c8-06bf12cc01b8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vnzc4\" (UID: \"0a9bf7c4-4e80-4445-84c8-06bf12cc01b8\") " pod="openshift-multus/multus-additional-cni-plugins-vnzc4" Apr 16 14:29:46.836471 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.836428 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0a9bf7c4-4e80-4445-84c8-06bf12cc01b8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vnzc4\" (UID: \"0a9bf7c4-4e80-4445-84c8-06bf12cc01b8\") " pod="openshift-multus/multus-additional-cni-plugins-vnzc4" Apr 16 14:29:46.843674 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:46.843646 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:29:46.843674 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:46.843674 2576 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:29:46.843853 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:46.843690 2576 projected.go:194] Error preparing data for projected volume kube-api-access-7lldq for pod openshift-network-diagnostics/network-check-target-4bcxv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:29:46.843853 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:46.843773 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/da437e36-c272-4ed1-b496-1ae5412f861e-kube-api-access-7lldq podName:da437e36-c272-4ed1-b496-1ae5412f861e nodeName:}" failed. No retries permitted until 2026-04-16 14:29:47.343753819 +0000 UTC m=+3.170464238 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-7lldq" (UniqueName: "kubernetes.io/projected/da437e36-c272-4ed1-b496-1ae5412f861e-kube-api-access-7lldq") pod "network-check-target-4bcxv" (UID: "da437e36-c272-4ed1-b496-1ae5412f861e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:29:46.846130 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.846105 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqb7x\" (UniqueName: \"kubernetes.io/projected/0a9bf7c4-4e80-4445-84c8-06bf12cc01b8-kube-api-access-rqb7x\") pod \"multus-additional-cni-plugins-vnzc4\" (UID: \"0a9bf7c4-4e80-4445-84c8-06bf12cc01b8\") " pod="openshift-multus/multus-additional-cni-plugins-vnzc4" Apr 16 14:29:46.846277 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.846244 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-75flw\" (UniqueName: 
\"kubernetes.io/projected/bfa07533-d734-4829-bde0-6c0327bd79a9-kube-api-access-75flw\") pod \"network-metrics-daemon-kbtb7\" (UID: \"bfa07533-d734-4829-bde0-6c0327bd79a9\") " pod="openshift-multus/network-metrics-daemon-kbtb7" Apr 16 14:29:46.916026 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.915992 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7cn8b" Apr 16 14:29:46.925073 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.925038 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" Apr 16 14:29:46.935860 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.935838 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-mn4h5" Apr 16 14:29:46.940502 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.940481 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pp5jb" Apr 16 14:29:46.947156 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.947134 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-wg4rx" Apr 16 14:29:46.952778 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.952756 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-x7zbw" Apr 16 14:29:46.959356 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.959330 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-29nkz" Apr 16 14:29:46.965935 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.965917 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-cn9zb" Apr 16 14:29:46.971588 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:46.971493 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vnzc4" Apr 16 14:29:47.338332 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:47.338312 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfa07533-d734-4829-bde0-6c0327bd79a9-metrics-certs\") pod \"network-metrics-daemon-kbtb7\" (UID: \"bfa07533-d734-4829-bde0-6c0327bd79a9\") " pod="openshift-multus/network-metrics-daemon-kbtb7" Apr 16 14:29:47.338482 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:47.338462 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:29:47.338561 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:47.338544 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfa07533-d734-4829-bde0-6c0327bd79a9-metrics-certs podName:bfa07533-d734-4829-bde0-6c0327bd79a9 nodeName:}" failed. No retries permitted until 2026-04-16 14:29:48.338512836 +0000 UTC m=+4.165223244 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bfa07533-d734-4829-bde0-6c0327bd79a9-metrics-certs") pod "network-metrics-daemon-kbtb7" (UID: "bfa07533-d734-4829-bde0-6c0327bd79a9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:29:47.350724 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:47.350691 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cf0a81b_ae4c_46b8_b16c_e93bd4e87102.slice/crio-380e7760f18f9718c3de70d055efc9b6744cf16469f99055d45773f9342ecaa0 WatchSource:0}: Error finding container 380e7760f18f9718c3de70d055efc9b6744cf16469f99055d45773f9342ecaa0: Status 404 returned error can't find the container with id 380e7760f18f9718c3de70d055efc9b6744cf16469f99055d45773f9342ecaa0 Apr 16 14:29:47.351492 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:47.351460 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefcece22_1582_4835_82c7_1489ad265dca.slice/crio-3e47234e94f311cd97e7e7edb20ca0384fdf971d00858146d96225ef9134d699 WatchSource:0}: Error finding container 3e47234e94f311cd97e7e7edb20ca0384fdf971d00858146d96225ef9134d699: Status 404 returned error can't find the container with id 3e47234e94f311cd97e7e7edb20ca0384fdf971d00858146d96225ef9134d699 Apr 16 14:29:47.352674 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:47.352637 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6079e191_6467_414e_9381_0ffd91e44ab4.slice/crio-4f3136dce32a12b78f0f9c244ff4e620a38e4fe54f87342f3b4bce81ae036f46 WatchSource:0}: Error finding container 4f3136dce32a12b78f0f9c244ff4e620a38e4fe54f87342f3b4bce81ae036f46: Status 404 returned error can't find the container with id 4f3136dce32a12b78f0f9c244ff4e620a38e4fe54f87342f3b4bce81ae036f46 Apr 16 14:29:47.353687 
ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:47.353662 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod312033f4_d50f_4d5d_a1ca_6e77e0428786.slice/crio-40954c4982d76d3f704bb99168485cc97a3aefa386f4b9de4267538984491141 WatchSource:0}: Error finding container 40954c4982d76d3f704bb99168485cc97a3aefa386f4b9de4267538984491141: Status 404 returned error can't find the container with id 40954c4982d76d3f704bb99168485cc97a3aefa386f4b9de4267538984491141 Apr 16 14:29:47.354713 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:47.354609 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a9bf7c4_4e80_4445_84c8_06bf12cc01b8.slice/crio-44f05422f7274bb609d825f2a9229223c160ec9e3dd081c0710055d74920e7b0 WatchSource:0}: Error finding container 44f05422f7274bb609d825f2a9229223c160ec9e3dd081c0710055d74920e7b0: Status 404 returned error can't find the container with id 44f05422f7274bb609d825f2a9229223c160ec9e3dd081c0710055d74920e7b0 Apr 16 14:29:47.358567 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:47.358505 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65679206_3d4d_45b3_a88a_b812c2e49d08.slice/crio-1aa53fda70162ab839e14402a09375c57f1fbd7e1ab4e741f6a85417e8343328 WatchSource:0}: Error finding container 1aa53fda70162ab839e14402a09375c57f1fbd7e1ab4e741f6a85417e8343328: Status 404 returned error can't find the container with id 1aa53fda70162ab839e14402a09375c57f1fbd7e1ab4e741f6a85417e8343328 Apr 16 14:29:47.359553 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:47.359499 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb102258d_1ae2_4f45_910d_aa49b03d1a3b.slice/crio-aa34ba27e6de26150b5fb1f8cc36244e084a57560e4e657521a90d22ff365de0 WatchSource:0}: 
Error finding container aa34ba27e6de26150b5fb1f8cc36244e084a57560e4e657521a90d22ff365de0: Status 404 returned error can't find the container with id aa34ba27e6de26150b5fb1f8cc36244e084a57560e4e657521a90d22ff365de0 Apr 16 14:29:47.360548 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:47.360507 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38b0864f_2448_4948_b5f1_5231888bbbac.slice/crio-2efc89934bc3fd4d209f6ab5849cc27c57103665d62d5f407167c78c1fde0fb8 WatchSource:0}: Error finding container 2efc89934bc3fd4d209f6ab5849cc27c57103665d62d5f407167c78c1fde0fb8: Status 404 returned error can't find the container with id 2efc89934bc3fd4d209f6ab5849cc27c57103665d62d5f407167c78c1fde0fb8 Apr 16 14:29:47.361565 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:29:47.361526 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1491249e_0f9d_4121_abb3_d99d4022d023.slice/crio-de5252217acb12fe6b71bc95cbfe0372b73ec7ec08e6c968d503eb5c28a83e79 WatchSource:0}: Error finding container de5252217acb12fe6b71bc95cbfe0372b73ec7ec08e6c968d503eb5c28a83e79: Status 404 returned error can't find the container with id de5252217acb12fe6b71bc95cbfe0372b73ec7ec08e6c968d503eb5c28a83e79 Apr 16 14:29:47.438931 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:47.438704 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lldq\" (UniqueName: \"kubernetes.io/projected/da437e36-c272-4ed1-b496-1ae5412f861e-kube-api-access-7lldq\") pod \"network-check-target-4bcxv\" (UID: \"da437e36-c272-4ed1-b496-1ae5412f861e\") " pod="openshift-network-diagnostics/network-check-target-4bcxv" Apr 16 14:29:47.439016 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:47.438865 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:29:47.439016 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:47.438987 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:29:47.439016 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:47.439003 2576 projected.go:194] Error preparing data for projected volume kube-api-access-7lldq for pod openshift-network-diagnostics/network-check-target-4bcxv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:29:47.439126 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:47.439065 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/da437e36-c272-4ed1-b496-1ae5412f861e-kube-api-access-7lldq podName:da437e36-c272-4ed1-b496-1ae5412f861e nodeName:}" failed. No retries permitted until 2026-04-16 14:29:48.439045426 +0000 UTC m=+4.265755839 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-7lldq" (UniqueName: "kubernetes.io/projected/da437e36-c272-4ed1-b496-1ae5412f861e-kube-api-access-7lldq") pod "network-check-target-4bcxv" (UID: "da437e36-c272-4ed1-b496-1ae5412f861e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:29:47.659511 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:47.659395 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 14:24:45 +0000 UTC" deadline="2027-11-16 22:05:10.629718659 +0000 UTC" Apr 16 14:29:47.659511 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:47.659430 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13903h35m22.970292967s" Apr 16 14:29:47.777415 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:47.776903 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4bcxv" Apr 16 14:29:47.777415 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:47.777035 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4bcxv" podUID="da437e36-c272-4ed1-b496-1ae5412f861e" Apr 16 14:29:47.795082 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:47.794973 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-wg4rx" event={"ID":"b102258d-1ae2-4f45-910d-aa49b03d1a3b","Type":"ContainerStarted","Data":"aa34ba27e6de26150b5fb1f8cc36244e084a57560e4e657521a90d22ff365de0"} Apr 16 14:29:47.800031 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:47.799948 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" event={"ID":"312033f4-d50f-4d5d-a1ca-6e77e0428786","Type":"ContainerStarted","Data":"40954c4982d76d3f704bb99168485cc97a3aefa386f4b9de4267538984491141"} Apr 16 14:29:47.807599 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:47.807403 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-x7zbw" event={"ID":"6079e191-6467-414e-9381-0ffd91e44ab4","Type":"ContainerStarted","Data":"4f3136dce32a12b78f0f9c244ff4e620a38e4fe54f87342f3b4bce81ae036f46"} Apr 16 14:29:47.818580 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:47.818483 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7cn8b" event={"ID":"efcece22-1582-4835-82c7-1489ad265dca","Type":"ContainerStarted","Data":"3e47234e94f311cd97e7e7edb20ca0384fdf971d00858146d96225ef9134d699"} Apr 16 14:29:47.821007 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:47.820906 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pp5jb" event={"ID":"65679206-3d4d-45b3-a88a-b812c2e49d08","Type":"ContainerStarted","Data":"1aa53fda70162ab839e14402a09375c57f1fbd7e1ab4e741f6a85417e8343328"} Apr 16 14:29:47.841928 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:47.841885 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vnzc4" 
event={"ID":"0a9bf7c4-4e80-4445-84c8-06bf12cc01b8","Type":"ContainerStarted","Data":"44f05422f7274bb609d825f2a9229223c160ec9e3dd081c0710055d74920e7b0"} Apr 16 14:29:47.850315 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:47.850277 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-mn4h5" event={"ID":"3cf0a81b-ae4c-46b8-b16c-e93bd4e87102","Type":"ContainerStarted","Data":"380e7760f18f9718c3de70d055efc9b6744cf16469f99055d45773f9342ecaa0"} Apr 16 14:29:47.856069 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:47.856008 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-173.ec2.internal" event={"ID":"a454133692b7d59775381b8452362b38","Type":"ContainerStarted","Data":"0c94e7fff4273491fc90143854fa0bc86cfbf49b26ad254ff8e68979a576858f"} Apr 16 14:29:47.879832 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:47.879697 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-29nkz" event={"ID":"1491249e-0f9d-4121-abb3-d99d4022d023","Type":"ContainerStarted","Data":"de5252217acb12fe6b71bc95cbfe0372b73ec7ec08e6c968d503eb5c28a83e79"} Apr 16 14:29:47.887649 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:47.887590 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-cn9zb" event={"ID":"38b0864f-2448-4948-b5f1-5231888bbbac","Type":"ContainerStarted","Data":"2efc89934bc3fd4d209f6ab5849cc27c57103665d62d5f407167c78c1fde0fb8"} Apr 16 14:29:48.349784 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:48.349205 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfa07533-d734-4829-bde0-6c0327bd79a9-metrics-certs\") pod \"network-metrics-daemon-kbtb7\" (UID: \"bfa07533-d734-4829-bde0-6c0327bd79a9\") " pod="openshift-multus/network-metrics-daemon-kbtb7" Apr 16 14:29:48.349784 ip-10-0-128-173 kubenswrapper[2576]: E0416 
14:29:48.349353 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:29:48.349784 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:48.349417 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfa07533-d734-4829-bde0-6c0327bd79a9-metrics-certs podName:bfa07533-d734-4829-bde0-6c0327bd79a9 nodeName:}" failed. No retries permitted until 2026-04-16 14:29:50.349397093 +0000 UTC m=+6.176107503 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bfa07533-d734-4829-bde0-6c0327bd79a9-metrics-certs") pod "network-metrics-daemon-kbtb7" (UID: "bfa07533-d734-4829-bde0-6c0327bd79a9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:29:48.449870 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:48.449830 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lldq\" (UniqueName: \"kubernetes.io/projected/da437e36-c272-4ed1-b496-1ae5412f861e-kube-api-access-7lldq\") pod \"network-check-target-4bcxv\" (UID: \"da437e36-c272-4ed1-b496-1ae5412f861e\") " pod="openshift-network-diagnostics/network-check-target-4bcxv" Apr 16 14:29:48.450055 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:48.449996 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:29:48.450055 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:48.450016 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:29:48.450055 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:48.450030 2576 projected.go:194] Error preparing data for projected volume kube-api-access-7lldq for pod 
openshift-network-diagnostics/network-check-target-4bcxv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:29:48.450218 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:48.450090 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/da437e36-c272-4ed1-b496-1ae5412f861e-kube-api-access-7lldq podName:da437e36-c272-4ed1-b496-1ae5412f861e nodeName:}" failed. No retries permitted until 2026-04-16 14:29:50.450070894 +0000 UTC m=+6.276781291 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-7lldq" (UniqueName: "kubernetes.io/projected/da437e36-c272-4ed1-b496-1ae5412f861e-kube-api-access-7lldq") pod "network-check-target-4bcxv" (UID: "da437e36-c272-4ed1-b496-1ae5412f861e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:29:48.779850 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:48.779766 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbtb7" Apr 16 14:29:48.780304 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:48.779913 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kbtb7" podUID="bfa07533-d734-4829-bde0-6c0327bd79a9" Apr 16 14:29:48.914113 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:48.914073 2576 generic.go:358] "Generic (PLEG): container finished" podID="b93a143e50d6e55a1c399c4b395a32e8" containerID="f81fde84b60fafc68b53296b5d0379f799e8a712c68887f0d56114b9caf6239a" exitCode=0 Apr 16 14:29:48.915093 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:48.915061 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-173.ec2.internal" event={"ID":"b93a143e50d6e55a1c399c4b395a32e8","Type":"ContainerDied","Data":"f81fde84b60fafc68b53296b5d0379f799e8a712c68887f0d56114b9caf6239a"} Apr 16 14:29:48.931717 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:48.930068 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-173.ec2.internal" podStartSLOduration=3.930046612 podStartE2EDuration="3.930046612s" podCreationTimestamp="2026-04-16 14:29:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:29:47.881381911 +0000 UTC m=+3.708092329" watchObservedRunningTime="2026-04-16 14:29:48.930046612 +0000 UTC m=+4.756757024" Apr 16 14:29:49.777948 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:49.777394 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4bcxv" Apr 16 14:29:49.777948 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:49.777544 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4bcxv" podUID="da437e36-c272-4ed1-b496-1ae5412f861e" Apr 16 14:29:49.921051 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:49.921012 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-173.ec2.internal" event={"ID":"b93a143e50d6e55a1c399c4b395a32e8","Type":"ContainerStarted","Data":"4937d4cf3cd4e2e6d7b294a1ed719e2c7a97817a5041e8f7ea441f2ace693436"} Apr 16 14:29:50.367294 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:50.367252 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfa07533-d734-4829-bde0-6c0327bd79a9-metrics-certs\") pod \"network-metrics-daemon-kbtb7\" (UID: \"bfa07533-d734-4829-bde0-6c0327bd79a9\") " pod="openshift-multus/network-metrics-daemon-kbtb7" Apr 16 14:29:50.367489 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:50.367453 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:29:50.367568 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:50.367523 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfa07533-d734-4829-bde0-6c0327bd79a9-metrics-certs podName:bfa07533-d734-4829-bde0-6c0327bd79a9 nodeName:}" failed. No retries permitted until 2026-04-16 14:29:54.367504833 +0000 UTC m=+10.194215233 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bfa07533-d734-4829-bde0-6c0327bd79a9-metrics-certs") pod "network-metrics-daemon-kbtb7" (UID: "bfa07533-d734-4829-bde0-6c0327bd79a9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:29:50.468245 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:50.468099 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lldq\" (UniqueName: \"kubernetes.io/projected/da437e36-c272-4ed1-b496-1ae5412f861e-kube-api-access-7lldq\") pod \"network-check-target-4bcxv\" (UID: \"da437e36-c272-4ed1-b496-1ae5412f861e\") " pod="openshift-network-diagnostics/network-check-target-4bcxv" Apr 16 14:29:50.468420 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:50.468292 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:29:50.468420 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:50.468313 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:29:50.468420 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:50.468325 2576 projected.go:194] Error preparing data for projected volume kube-api-access-7lldq for pod openshift-network-diagnostics/network-check-target-4bcxv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:29:50.468420 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:50.468390 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/da437e36-c272-4ed1-b496-1ae5412f861e-kube-api-access-7lldq podName:da437e36-c272-4ed1-b496-1ae5412f861e nodeName:}" failed. 
No retries permitted until 2026-04-16 14:29:54.468370528 +0000 UTC m=+10.295080931 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-7lldq" (UniqueName: "kubernetes.io/projected/da437e36-c272-4ed1-b496-1ae5412f861e-kube-api-access-7lldq") pod "network-check-target-4bcxv" (UID: "da437e36-c272-4ed1-b496-1ae5412f861e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:29:50.781339 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:50.780820 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbtb7" Apr 16 14:29:50.781339 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:50.780947 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbtb7" podUID="bfa07533-d734-4829-bde0-6c0327bd79a9" Apr 16 14:29:51.777736 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:51.777700 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4bcxv" Apr 16 14:29:51.778197 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:51.777837 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4bcxv" podUID="da437e36-c272-4ed1-b496-1ae5412f861e" Apr 16 14:29:52.781208 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:52.780724 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbtb7" Apr 16 14:29:52.781208 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:52.780856 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbtb7" podUID="bfa07533-d734-4829-bde0-6c0327bd79a9" Apr 16 14:29:53.776817 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:53.776788 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4bcxv" Apr 16 14:29:53.777010 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:53.776917 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4bcxv" podUID="da437e36-c272-4ed1-b496-1ae5412f861e" Apr 16 14:29:54.223045 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:54.222831 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-173.ec2.internal" podStartSLOduration=9.222810584 podStartE2EDuration="9.222810584s" podCreationTimestamp="2026-04-16 14:29:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:29:49.936259143 +0000 UTC m=+5.762969560" watchObservedRunningTime="2026-04-16 14:29:54.222810584 +0000 UTC m=+10.049521001" Apr 16 14:29:54.223501 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:54.223163 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-xnzwr"] Apr 16 14:29:54.226411 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:54.226340 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xnzwr" Apr 16 14:29:54.226580 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:54.226427 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-xnzwr" podUID="f35ee2aa-f1ac-4d97-bf10-86c8f12bc700" Apr 16 14:29:54.300077 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:54.300016 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f35ee2aa-f1ac-4d97-bf10-86c8f12bc700-dbus\") pod \"global-pull-secret-syncer-xnzwr\" (UID: \"f35ee2aa-f1ac-4d97-bf10-86c8f12bc700\") " pod="kube-system/global-pull-secret-syncer-xnzwr" Apr 16 14:29:54.300077 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:54.300076 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f35ee2aa-f1ac-4d97-bf10-86c8f12bc700-kubelet-config\") pod \"global-pull-secret-syncer-xnzwr\" (UID: \"f35ee2aa-f1ac-4d97-bf10-86c8f12bc700\") " pod="kube-system/global-pull-secret-syncer-xnzwr" Apr 16 14:29:54.300330 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:54.300169 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f35ee2aa-f1ac-4d97-bf10-86c8f12bc700-original-pull-secret\") pod \"global-pull-secret-syncer-xnzwr\" (UID: \"f35ee2aa-f1ac-4d97-bf10-86c8f12bc700\") " pod="kube-system/global-pull-secret-syncer-xnzwr" Apr 16 14:29:54.400613 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:54.400574 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f35ee2aa-f1ac-4d97-bf10-86c8f12bc700-original-pull-secret\") pod \"global-pull-secret-syncer-xnzwr\" (UID: \"f35ee2aa-f1ac-4d97-bf10-86c8f12bc700\") " pod="kube-system/global-pull-secret-syncer-xnzwr" Apr 16 14:29:54.400787 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:54.400671 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/bfa07533-d734-4829-bde0-6c0327bd79a9-metrics-certs\") pod \"network-metrics-daemon-kbtb7\" (UID: \"bfa07533-d734-4829-bde0-6c0327bd79a9\") " pod="openshift-multus/network-metrics-daemon-kbtb7" Apr 16 14:29:54.400787 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:54.400701 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f35ee2aa-f1ac-4d97-bf10-86c8f12bc700-dbus\") pod \"global-pull-secret-syncer-xnzwr\" (UID: \"f35ee2aa-f1ac-4d97-bf10-86c8f12bc700\") " pod="kube-system/global-pull-secret-syncer-xnzwr" Apr 16 14:29:54.400787 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:54.400743 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 14:29:54.400956 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:54.400823 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f35ee2aa-f1ac-4d97-bf10-86c8f12bc700-original-pull-secret podName:f35ee2aa-f1ac-4d97-bf10-86c8f12bc700 nodeName:}" failed. No retries permitted until 2026-04-16 14:29:54.900799804 +0000 UTC m=+10.727510256 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f35ee2aa-f1ac-4d97-bf10-86c8f12bc700-original-pull-secret") pod "global-pull-secret-syncer-xnzwr" (UID: "f35ee2aa-f1ac-4d97-bf10-86c8f12bc700") : object "kube-system"/"original-pull-secret" not registered Apr 16 14:29:54.400956 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:54.400820 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f35ee2aa-f1ac-4d97-bf10-86c8f12bc700-kubelet-config\") pod \"global-pull-secret-syncer-xnzwr\" (UID: \"f35ee2aa-f1ac-4d97-bf10-86c8f12bc700\") " pod="kube-system/global-pull-secret-syncer-xnzwr" Apr 16 14:29:54.400956 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:54.400746 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f35ee2aa-f1ac-4d97-bf10-86c8f12bc700-kubelet-config\") pod \"global-pull-secret-syncer-xnzwr\" (UID: \"f35ee2aa-f1ac-4d97-bf10-86c8f12bc700\") " pod="kube-system/global-pull-secret-syncer-xnzwr" Apr 16 14:29:54.400956 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:54.400828 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:29:54.400956 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:54.400919 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfa07533-d734-4829-bde0-6c0327bd79a9-metrics-certs podName:bfa07533-d734-4829-bde0-6c0327bd79a9 nodeName:}" failed. No retries permitted until 2026-04-16 14:30:02.400906683 +0000 UTC m=+18.227617091 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bfa07533-d734-4829-bde0-6c0327bd79a9-metrics-certs") pod "network-metrics-daemon-kbtb7" (UID: "bfa07533-d734-4829-bde0-6c0327bd79a9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:29:54.401173 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:54.400962 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f35ee2aa-f1ac-4d97-bf10-86c8f12bc700-dbus\") pod \"global-pull-secret-syncer-xnzwr\" (UID: \"f35ee2aa-f1ac-4d97-bf10-86c8f12bc700\") " pod="kube-system/global-pull-secret-syncer-xnzwr" Apr 16 14:29:54.502332 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:54.502139 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lldq\" (UniqueName: \"kubernetes.io/projected/da437e36-c272-4ed1-b496-1ae5412f861e-kube-api-access-7lldq\") pod \"network-check-target-4bcxv\" (UID: \"da437e36-c272-4ed1-b496-1ae5412f861e\") " pod="openshift-network-diagnostics/network-check-target-4bcxv" Apr 16 14:29:54.502493 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:54.502341 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:29:54.502493 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:54.502358 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:29:54.502493 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:54.502372 2576 projected.go:194] Error preparing data for projected volume kube-api-access-7lldq for pod openshift-network-diagnostics/network-check-target-4bcxv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:29:54.502493 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:54.502433 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/da437e36-c272-4ed1-b496-1ae5412f861e-kube-api-access-7lldq podName:da437e36-c272-4ed1-b496-1ae5412f861e nodeName:}" failed. No retries permitted until 2026-04-16 14:30:02.50241456 +0000 UTC m=+18.329124976 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-7lldq" (UniqueName: "kubernetes.io/projected/da437e36-c272-4ed1-b496-1ae5412f861e-kube-api-access-7lldq") pod "network-check-target-4bcxv" (UID: "da437e36-c272-4ed1-b496-1ae5412f861e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:29:54.780787 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:54.780635 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbtb7" Apr 16 14:29:54.780948 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:54.780808 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kbtb7" podUID="bfa07533-d734-4829-bde0-6c0327bd79a9" Apr 16 14:29:54.904694 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:54.904617 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f35ee2aa-f1ac-4d97-bf10-86c8f12bc700-original-pull-secret\") pod \"global-pull-secret-syncer-xnzwr\" (UID: \"f35ee2aa-f1ac-4d97-bf10-86c8f12bc700\") " pod="kube-system/global-pull-secret-syncer-xnzwr" Apr 16 14:29:54.904848 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:54.904773 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 14:29:54.904911 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:54.904851 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f35ee2aa-f1ac-4d97-bf10-86c8f12bc700-original-pull-secret podName:f35ee2aa-f1ac-4d97-bf10-86c8f12bc700 nodeName:}" failed. No retries permitted until 2026-04-16 14:29:55.904830328 +0000 UTC m=+11.731540728 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f35ee2aa-f1ac-4d97-bf10-86c8f12bc700-original-pull-secret") pod "global-pull-secret-syncer-xnzwr" (UID: "f35ee2aa-f1ac-4d97-bf10-86c8f12bc700") : object "kube-system"/"original-pull-secret" not registered Apr 16 14:29:55.777607 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:55.777570 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4bcxv" Apr 16 14:29:55.778063 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:55.777570 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-xnzwr" Apr 16 14:29:55.778063 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:55.777698 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4bcxv" podUID="da437e36-c272-4ed1-b496-1ae5412f861e" Apr 16 14:29:55.778063 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:55.777816 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xnzwr" podUID="f35ee2aa-f1ac-4d97-bf10-86c8f12bc700" Apr 16 14:29:55.913745 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:55.913703 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f35ee2aa-f1ac-4d97-bf10-86c8f12bc700-original-pull-secret\") pod \"global-pull-secret-syncer-xnzwr\" (UID: \"f35ee2aa-f1ac-4d97-bf10-86c8f12bc700\") " pod="kube-system/global-pull-secret-syncer-xnzwr" Apr 16 14:29:55.913949 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:55.913868 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 14:29:55.913949 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:55.913934 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f35ee2aa-f1ac-4d97-bf10-86c8f12bc700-original-pull-secret podName:f35ee2aa-f1ac-4d97-bf10-86c8f12bc700 nodeName:}" failed. 
No retries permitted until 2026-04-16 14:29:57.91391495 +0000 UTC m=+13.740625354 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f35ee2aa-f1ac-4d97-bf10-86c8f12bc700-original-pull-secret") pod "global-pull-secret-syncer-xnzwr" (UID: "f35ee2aa-f1ac-4d97-bf10-86c8f12bc700") : object "kube-system"/"original-pull-secret" not registered Apr 16 14:29:56.777817 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:56.777725 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbtb7" Apr 16 14:29:56.778256 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:56.777884 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbtb7" podUID="bfa07533-d734-4829-bde0-6c0327bd79a9" Apr 16 14:29:57.776869 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:57.776830 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xnzwr" Apr 16 14:29:57.777044 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:57.776830 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4bcxv" Apr 16 14:29:57.777121 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:57.777069 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4bcxv" podUID="da437e36-c272-4ed1-b496-1ae5412f861e" Apr 16 14:29:57.777121 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:57.776957 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xnzwr" podUID="f35ee2aa-f1ac-4d97-bf10-86c8f12bc700" Apr 16 14:29:57.930712 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:57.930516 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f35ee2aa-f1ac-4d97-bf10-86c8f12bc700-original-pull-secret\") pod \"global-pull-secret-syncer-xnzwr\" (UID: \"f35ee2aa-f1ac-4d97-bf10-86c8f12bc700\") " pod="kube-system/global-pull-secret-syncer-xnzwr" Apr 16 14:29:57.931099 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:57.930674 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 14:29:57.931099 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:57.930812 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f35ee2aa-f1ac-4d97-bf10-86c8f12bc700-original-pull-secret podName:f35ee2aa-f1ac-4d97-bf10-86c8f12bc700 nodeName:}" failed. No retries permitted until 2026-04-16 14:30:01.930790052 +0000 UTC m=+17.757500459 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f35ee2aa-f1ac-4d97-bf10-86c8f12bc700-original-pull-secret") pod "global-pull-secret-syncer-xnzwr" (UID: "f35ee2aa-f1ac-4d97-bf10-86c8f12bc700") : object "kube-system"/"original-pull-secret" not registered Apr 16 14:29:58.777099 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:58.777061 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbtb7" Apr 16 14:29:58.777279 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:58.777200 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbtb7" podUID="bfa07533-d734-4829-bde0-6c0327bd79a9" Apr 16 14:29:59.777026 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:59.776999 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4bcxv" Apr 16 14:29:59.777400 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:29:59.776999 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xnzwr" Apr 16 14:29:59.777400 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:59.777103 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4bcxv" podUID="da437e36-c272-4ed1-b496-1ae5412f861e" Apr 16 14:29:59.777400 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:29:59.777157 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xnzwr" podUID="f35ee2aa-f1ac-4d97-bf10-86c8f12bc700" Apr 16 14:30:00.777137 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:00.777098 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbtb7" Apr 16 14:30:00.777623 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:00.777282 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbtb7" podUID="bfa07533-d734-4829-bde0-6c0327bd79a9" Apr 16 14:30:01.777268 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:01.777233 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4bcxv" Apr 16 14:30:01.777736 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:01.777357 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4bcxv" podUID="da437e36-c272-4ed1-b496-1ae5412f861e" Apr 16 14:30:01.777736 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:01.777449 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xnzwr" Apr 16 14:30:01.777736 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:01.777586 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xnzwr" podUID="f35ee2aa-f1ac-4d97-bf10-86c8f12bc700" Apr 16 14:30:01.960510 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:01.960476 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f35ee2aa-f1ac-4d97-bf10-86c8f12bc700-original-pull-secret\") pod \"global-pull-secret-syncer-xnzwr\" (UID: \"f35ee2aa-f1ac-4d97-bf10-86c8f12bc700\") " pod="kube-system/global-pull-secret-syncer-xnzwr" Apr 16 14:30:01.960765 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:01.960629 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 14:30:01.960765 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:01.960693 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f35ee2aa-f1ac-4d97-bf10-86c8f12bc700-original-pull-secret podName:f35ee2aa-f1ac-4d97-bf10-86c8f12bc700 nodeName:}" failed. No retries permitted until 2026-04-16 14:30:09.960672633 +0000 UTC m=+25.787383038 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f35ee2aa-f1ac-4d97-bf10-86c8f12bc700-original-pull-secret") pod "global-pull-secret-syncer-xnzwr" (UID: "f35ee2aa-f1ac-4d97-bf10-86c8f12bc700") : object "kube-system"/"original-pull-secret" not registered Apr 16 14:30:02.465146 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:02.465100 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfa07533-d734-4829-bde0-6c0327bd79a9-metrics-certs\") pod \"network-metrics-daemon-kbtb7\" (UID: \"bfa07533-d734-4829-bde0-6c0327bd79a9\") " pod="openshift-multus/network-metrics-daemon-kbtb7" Apr 16 14:30:02.465308 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:02.465235 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:30:02.465383 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:02.465309 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfa07533-d734-4829-bde0-6c0327bd79a9-metrics-certs podName:bfa07533-d734-4829-bde0-6c0327bd79a9 nodeName:}" failed. No retries permitted until 2026-04-16 14:30:18.465288651 +0000 UTC m=+34.291999049 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bfa07533-d734-4829-bde0-6c0327bd79a9-metrics-certs") pod "network-metrics-daemon-kbtb7" (UID: "bfa07533-d734-4829-bde0-6c0327bd79a9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:30:02.566334 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:02.566291 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lldq\" (UniqueName: \"kubernetes.io/projected/da437e36-c272-4ed1-b496-1ae5412f861e-kube-api-access-7lldq\") pod \"network-check-target-4bcxv\" (UID: \"da437e36-c272-4ed1-b496-1ae5412f861e\") " pod="openshift-network-diagnostics/network-check-target-4bcxv" Apr 16 14:30:02.566522 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:02.566457 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:30:02.566522 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:02.566482 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:30:02.566522 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:02.566492 2576 projected.go:194] Error preparing data for projected volume kube-api-access-7lldq for pod openshift-network-diagnostics/network-check-target-4bcxv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:30:02.566703 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:02.566567 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/da437e36-c272-4ed1-b496-1ae5412f861e-kube-api-access-7lldq podName:da437e36-c272-4ed1-b496-1ae5412f861e nodeName:}" failed. 
No retries permitted until 2026-04-16 14:30:18.566553366 +0000 UTC m=+34.393263760 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-7lldq" (UniqueName: "kubernetes.io/projected/da437e36-c272-4ed1-b496-1ae5412f861e-kube-api-access-7lldq") pod "network-check-target-4bcxv" (UID: "da437e36-c272-4ed1-b496-1ae5412f861e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:30:02.776909 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:02.776816 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbtb7" Apr 16 14:30:02.777072 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:02.776959 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbtb7" podUID="bfa07533-d734-4829-bde0-6c0327bd79a9" Apr 16 14:30:03.777231 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:03.777194 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4bcxv" Apr 16 14:30:03.777716 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:03.777200 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xnzwr" Apr 16 14:30:03.777716 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:03.777310 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4bcxv" podUID="da437e36-c272-4ed1-b496-1ae5412f861e" Apr 16 14:30:03.777716 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:03.777412 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xnzwr" podUID="f35ee2aa-f1ac-4d97-bf10-86c8f12bc700" Apr 16 14:30:04.778270 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:04.778094 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbtb7" Apr 16 14:30:04.778961 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:04.778337 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kbtb7" podUID="bfa07533-d734-4829-bde0-6c0327bd79a9" Apr 16 14:30:04.948163 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:04.947936 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-29nkz" event={"ID":"1491249e-0f9d-4121-abb3-d99d4022d023","Type":"ContainerStarted","Data":"c0b6b956ea1f633d92eaa28b2098d5a1d8b9de28c99619ec1bee20b6dbb786bb"} Apr 16 14:30:04.949454 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:04.949424 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-wg4rx" event={"ID":"b102258d-1ae2-4f45-910d-aa49b03d1a3b","Type":"ContainerStarted","Data":"cfaccff7f0d35d190fddf4bc8243dc606f4834ffb82375f1862a4294bf14bfe1"} Apr 16 14:30:04.951282 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:04.951262 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7qpv_312033f4-d50f-4d5d-a1ca-6e77e0428786/ovn-acl-logging/0.log" Apr 16 14:30:04.951616 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:04.951592 2576 generic.go:358] "Generic (PLEG): container finished" podID="312033f4-d50f-4d5d-a1ca-6e77e0428786" containerID="3ddf7279ce022c4c968adcc0e30da4faddd226ea5ea689fa3c71c58d13612efb" exitCode=1 Apr 16 14:30:04.951726 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:04.951658 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" event={"ID":"312033f4-d50f-4d5d-a1ca-6e77e0428786","Type":"ContainerStarted","Data":"3f3cea94651f4b4e03a8a9343a1ce505c678f912a648987228b94664e5341227"} Apr 16 14:30:04.951726 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:04.951692 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" event={"ID":"312033f4-d50f-4d5d-a1ca-6e77e0428786","Type":"ContainerDied","Data":"3ddf7279ce022c4c968adcc0e30da4faddd226ea5ea689fa3c71c58d13612efb"} Apr 16 
14:30:04.951726 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:04.951708 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" event={"ID":"312033f4-d50f-4d5d-a1ca-6e77e0428786","Type":"ContainerStarted","Data":"fbd64938b22da38c5201f604b69ed333d3bbfc2d9d4e4508ec19d563a47f3336"} Apr 16 14:30:04.953131 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:04.953109 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-x7zbw" event={"ID":"6079e191-6467-414e-9381-0ffd91e44ab4","Type":"ContainerStarted","Data":"cd327b9053a5d0e7bc785e6de4433cdbd11fb0f10d9bcba1263cd0cae28209f9"} Apr 16 14:30:04.954457 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:04.954433 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7cn8b" event={"ID":"efcece22-1582-4835-82c7-1489ad265dca","Type":"ContainerStarted","Data":"4c994d05ddc7ca7c199b752be8e985ec493517485065453ce7157876ca65b7a6"} Apr 16 14:30:04.955737 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:04.955717 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pp5jb" event={"ID":"65679206-3d4d-45b3-a88a-b812c2e49d08","Type":"ContainerStarted","Data":"7e80f09fb8f158b9e9334ad5b53d6105d6fe7f8b75f464f6ac0aa3d06c9c4697"} Apr 16 14:30:04.957129 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:04.957106 2576 generic.go:358] "Generic (PLEG): container finished" podID="0a9bf7c4-4e80-4445-84c8-06bf12cc01b8" containerID="6685a043e543fe6ba4ffc6d7413726d96b2b1bb8b7e4583b29c60d7a15afb7f4" exitCode=0 Apr 16 14:30:04.957226 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:04.957196 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vnzc4" event={"ID":"0a9bf7c4-4e80-4445-84c8-06bf12cc01b8","Type":"ContainerDied","Data":"6685a043e543fe6ba4ffc6d7413726d96b2b1bb8b7e4583b29c60d7a15afb7f4"} Apr 16 14:30:04.958574 ip-10-0-128-173 
kubenswrapper[2576]: I0416 14:30:04.958524 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-mn4h5" event={"ID":"3cf0a81b-ae4c-46b8-b16c-e93bd4e87102","Type":"ContainerStarted","Data":"83c712c2be85f85d54d18ffcaf13f076e8e6ef1dc9783537ed492123f84235fb"} Apr 16 14:30:04.964050 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:04.964014 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-29nkz" podStartSLOduration=4.174793553 podStartE2EDuration="20.964002729s" podCreationTimestamp="2026-04-16 14:29:44 +0000 UTC" firstStartedPulling="2026-04-16 14:29:47.363668837 +0000 UTC m=+3.190379232" lastFinishedPulling="2026-04-16 14:30:04.152878008 +0000 UTC m=+19.979588408" observedRunningTime="2026-04-16 14:30:04.963569242 +0000 UTC m=+20.790279660" watchObservedRunningTime="2026-04-16 14:30:04.964002729 +0000 UTC m=+20.790713146" Apr 16 14:30:04.977208 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:04.977167 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-x7zbw" podStartSLOduration=4.181724055 podStartE2EDuration="20.977147767s" podCreationTimestamp="2026-04-16 14:29:44 +0000 UTC" firstStartedPulling="2026-04-16 14:29:47.357420598 +0000 UTC m=+3.184130994" lastFinishedPulling="2026-04-16 14:30:04.152844297 +0000 UTC m=+19.979554706" observedRunningTime="2026-04-16 14:30:04.977146192 +0000 UTC m=+20.803856610" watchObservedRunningTime="2026-04-16 14:30:04.977147767 +0000 UTC m=+20.803858176" Apr 16 14:30:04.990898 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:04.990847 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-mn4h5" podStartSLOduration=4.190440123 podStartE2EDuration="20.990833028s" podCreationTimestamp="2026-04-16 14:29:44 +0000 UTC" firstStartedPulling="2026-04-16 14:29:47.352421933 +0000 UTC m=+3.179132352" lastFinishedPulling="2026-04-16 
14:30:04.152814847 +0000 UTC m=+19.979525257" observedRunningTime="2026-04-16 14:30:04.990821156 +0000 UTC m=+20.817531573" watchObservedRunningTime="2026-04-16 14:30:04.990833028 +0000 UTC m=+20.817543438" Apr 16 14:30:05.027782 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:05.027727 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-7cn8b" podStartSLOduration=4.193429062 podStartE2EDuration="21.027711941s" podCreationTimestamp="2026-04-16 14:29:44 +0000 UTC" firstStartedPulling="2026-04-16 14:29:47.355311906 +0000 UTC m=+3.182022318" lastFinishedPulling="2026-04-16 14:30:04.189594783 +0000 UTC m=+20.016305197" observedRunningTime="2026-04-16 14:30:05.027602104 +0000 UTC m=+20.854312512" watchObservedRunningTime="2026-04-16 14:30:05.027711941 +0000 UTC m=+20.854422357" Apr 16 14:30:05.048916 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:05.048869 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-wg4rx" podStartSLOduration=4.255041155 podStartE2EDuration="21.04885685s" podCreationTimestamp="2026-04-16 14:29:44 +0000 UTC" firstStartedPulling="2026-04-16 14:29:47.361562582 +0000 UTC m=+3.188272981" lastFinishedPulling="2026-04-16 14:30:04.155378267 +0000 UTC m=+19.982088676" observedRunningTime="2026-04-16 14:30:05.048609307 +0000 UTC m=+20.875319724" watchObservedRunningTime="2026-04-16 14:30:05.04885685 +0000 UTC m=+20.875567267" Apr 16 14:30:05.738923 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:05.738893 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-mn4h5" Apr 16 14:30:05.740024 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:05.739995 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-mn4h5" Apr 16 14:30:05.777025 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:05.776997 2576 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xnzwr" Apr 16 14:30:05.777158 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:05.777041 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4bcxv" Apr 16 14:30:05.777158 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:05.777129 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xnzwr" podUID="f35ee2aa-f1ac-4d97-bf10-86c8f12bc700" Apr 16 14:30:05.777275 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:05.777219 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4bcxv" podUID="da437e36-c272-4ed1-b496-1ae5412f861e" Apr 16 14:30:05.798325 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:05.798198 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 14:30:05.961680 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:05.961594 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-cn9zb" event={"ID":"38b0864f-2448-4948-b5f1-5231888bbbac","Type":"ContainerStarted","Data":"aea601e209ee0117918e4f429468489c8a091ec2f07c32cc82565625a5539fc2"} Apr 16 14:30:05.964522 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:05.964493 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7qpv_312033f4-d50f-4d5d-a1ca-6e77e0428786/ovn-acl-logging/0.log" Apr 16 14:30:05.964961 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:05.964919 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" event={"ID":"312033f4-d50f-4d5d-a1ca-6e77e0428786","Type":"ContainerStarted","Data":"d11a3e2157e139599a8bd5f9b37ebc7cbf792e154e5e8133264df5a2244e880e"} Apr 16 14:30:05.964961 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:05.964958 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" event={"ID":"312033f4-d50f-4d5d-a1ca-6e77e0428786","Type":"ContainerStarted","Data":"31623964ea1da451faef8eca8d3adcecacd6b217c2eb96884cf65f0a6546be6d"} Apr 16 14:30:05.965164 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:05.964972 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" event={"ID":"312033f4-d50f-4d5d-a1ca-6e77e0428786","Type":"ContainerStarted","Data":"5b1a3867fff2e5ccc22f4756b46b50658db7b01db6a5081204fef8a0d42cbefc"} Apr 16 14:30:05.966908 
ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:05.966869 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pp5jb" event={"ID":"65679206-3d4d-45b3-a88a-b812c2e49d08","Type":"ContainerStarted","Data":"a3b758915699493ed98db7e4bb6e3965625a88779c93704b7d633fc71091b9b7"} Apr 16 14:30:05.967667 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:05.967645 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-mn4h5" Apr 16 14:30:05.968252 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:05.968236 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-mn4h5" Apr 16 14:30:05.979653 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:05.979604 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-cn9zb" podStartSLOduration=5.188736051 podStartE2EDuration="21.979586868s" podCreationTimestamp="2026-04-16 14:29:44 +0000 UTC" firstStartedPulling="2026-04-16 14:29:47.362610837 +0000 UTC m=+3.189321250" lastFinishedPulling="2026-04-16 14:30:04.153461671 +0000 UTC m=+19.980172067" observedRunningTime="2026-04-16 14:30:05.979068245 +0000 UTC m=+21.805778663" watchObservedRunningTime="2026-04-16 14:30:05.979586868 +0000 UTC m=+21.806297287" Apr 16 14:30:06.696100 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:06.695997 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T14:30:05.798217246Z","UUID":"72de7866-784a-4b3f-8de0-7bed45e24134","Handler":null,"Name":"","Endpoint":""} Apr 16 14:30:06.698218 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:06.698177 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock 
versions: 1.0.0 Apr 16 14:30:06.698218 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:06.698211 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 14:30:06.777048 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:06.777025 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbtb7" Apr 16 14:30:06.777207 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:06.777156 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbtb7" podUID="bfa07533-d734-4829-bde0-6c0327bd79a9" Apr 16 14:30:06.971386 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:06.971080 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pp5jb" event={"ID":"65679206-3d4d-45b3-a88a-b812c2e49d08","Type":"ContainerStarted","Data":"37128086b70682231a86c585ad9c160b9ae1c79d5994b4a89f2d6bbbc74ae25d"} Apr 16 14:30:06.988479 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:06.988417 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pp5jb" podStartSLOduration=3.575469482 podStartE2EDuration="22.988396954s" podCreationTimestamp="2026-04-16 14:29:44 +0000 UTC" firstStartedPulling="2026-04-16 14:29:47.360634247 +0000 UTC m=+3.187344654" lastFinishedPulling="2026-04-16 14:30:06.773561722 +0000 UTC m=+22.600272126" observedRunningTime="2026-04-16 14:30:06.988128283 +0000 UTC m=+22.814838699" watchObservedRunningTime="2026-04-16 14:30:06.988396954 +0000 UTC m=+22.815107385" Apr 16 14:30:07.777019 ip-10-0-128-173 
kubenswrapper[2576]: I0416 14:30:07.776985 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xnzwr" Apr 16 14:30:07.777019 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:07.777012 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4bcxv" Apr 16 14:30:07.777279 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:07.777102 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xnzwr" podUID="f35ee2aa-f1ac-4d97-bf10-86c8f12bc700" Apr 16 14:30:07.777279 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:07.777165 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4bcxv" podUID="da437e36-c272-4ed1-b496-1ae5412f861e" Apr 16 14:30:07.976181 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:07.976148 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7qpv_312033f4-d50f-4d5d-a1ca-6e77e0428786/ovn-acl-logging/0.log" Apr 16 14:30:07.976712 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:07.976559 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" event={"ID":"312033f4-d50f-4d5d-a1ca-6e77e0428786","Type":"ContainerStarted","Data":"285108829e2816da816c94d1080250f613efe52b31f7a58bd7298aed4a672357"} Apr 16 14:30:08.777468 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:08.777430 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbtb7" Apr 16 14:30:08.777680 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:08.777566 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbtb7" podUID="bfa07533-d734-4829-bde0-6c0327bd79a9" Apr 16 14:30:09.777138 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:09.777103 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4bcxv" Apr 16 14:30:09.777760 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:09.777104 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-xnzwr"
Apr 16 14:30:09.777760 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:09.777217 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4bcxv" podUID="da437e36-c272-4ed1-b496-1ae5412f861e"
Apr 16 14:30:09.777760 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:09.777316 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xnzwr" podUID="f35ee2aa-f1ac-4d97-bf10-86c8f12bc700"
Apr 16 14:30:09.983154 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:09.983050 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7qpv_312033f4-d50f-4d5d-a1ca-6e77e0428786/ovn-acl-logging/0.log"
Apr 16 14:30:09.983489 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:09.983469 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" event={"ID":"312033f4-d50f-4d5d-a1ca-6e77e0428786","Type":"ContainerStarted","Data":"35e27d4b29a2ef38048ee0318395797a7d005e326babc62ee808b4a7835833ed"}
Apr 16 14:30:09.983867 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:09.983821 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv"
Apr 16 14:30:09.984181 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:09.984161 2576 scope.go:117] "RemoveContainer" containerID="3ddf7279ce022c4c968adcc0e30da4faddd226ea5ea689fa3c71c58d13612efb"
Apr 16 14:30:09.985305 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:09.985284 2576 generic.go:358] "Generic (PLEG): container finished" podID="0a9bf7c4-4e80-4445-84c8-06bf12cc01b8" containerID="6e3009915a8db2b38603fc88630a75464a05d7b9847a773bdee561140a1171b5" exitCode=0
Apr 16 14:30:09.985405 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:09.985316 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vnzc4" event={"ID":"0a9bf7c4-4e80-4445-84c8-06bf12cc01b8","Type":"ContainerDied","Data":"6e3009915a8db2b38603fc88630a75464a05d7b9847a773bdee561140a1171b5"}
Apr 16 14:30:10.001316 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:10.001293 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv"
Apr 16 14:30:10.024352 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:10.024328 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f35ee2aa-f1ac-4d97-bf10-86c8f12bc700-original-pull-secret\") pod \"global-pull-secret-syncer-xnzwr\" (UID: \"f35ee2aa-f1ac-4d97-bf10-86c8f12bc700\") " pod="kube-system/global-pull-secret-syncer-xnzwr"
Apr 16 14:30:10.024453 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:10.024438 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 14:30:10.024512 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:10.024503 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f35ee2aa-f1ac-4d97-bf10-86c8f12bc700-original-pull-secret podName:f35ee2aa-f1ac-4d97-bf10-86c8f12bc700 nodeName:}" failed. No retries permitted until 2026-04-16 14:30:26.024489642 +0000 UTC m=+41.851200037 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f35ee2aa-f1ac-4d97-bf10-86c8f12bc700-original-pull-secret") pod "global-pull-secret-syncer-xnzwr" (UID: "f35ee2aa-f1ac-4d97-bf10-86c8f12bc700") : object "kube-system"/"original-pull-secret" not registered
Apr 16 14:30:10.777066 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:10.777037 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbtb7"
Apr 16 14:30:10.777244 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:10.777153 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbtb7" podUID="bfa07533-d734-4829-bde0-6c0327bd79a9"
Apr 16 14:30:10.991694 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:10.991663 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7qpv_312033f4-d50f-4d5d-a1ca-6e77e0428786/ovn-acl-logging/0.log"
Apr 16 14:30:10.992101 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:10.992066 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" event={"ID":"312033f4-d50f-4d5d-a1ca-6e77e0428786","Type":"ContainerStarted","Data":"11b5669f50916ab757203a4ed169ddbec5d3ac1882e433a879798d428196aa26"}
Apr 16 14:30:10.992720 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:10.992700 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv"
Apr 16 14:30:10.992800 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:10.992748 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv"
Apr 16 14:30:10.996367 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:10.996337 2576 generic.go:358] "Generic (PLEG): container finished" podID="0a9bf7c4-4e80-4445-84c8-06bf12cc01b8" containerID="928dd45ba2dee97b2ead16ca191c28cbc9fbeef57d926622c1619e160ff4607c" exitCode=0
Apr 16 14:30:10.996519 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:10.996382 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vnzc4" event={"ID":"0a9bf7c4-4e80-4445-84c8-06bf12cc01b8","Type":"ContainerDied","Data":"928dd45ba2dee97b2ead16ca191c28cbc9fbeef57d926622c1619e160ff4607c"}
Apr 16 14:30:11.024808 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:11.024754 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" podStartSLOduration=9.892484937999999 podStartE2EDuration="27.024739068s" podCreationTimestamp="2026-04-16 14:29:44 +0000 UTC" firstStartedPulling="2026-04-16 14:29:47.358518154 +0000 UTC m=+3.185228550" lastFinishedPulling="2026-04-16 14:30:04.490772285 +0000 UTC m=+20.317482680" observedRunningTime="2026-04-16 14:30:11.024195912 +0000 UTC m=+26.850906328" watchObservedRunningTime="2026-04-16 14:30:11.024739068 +0000 UTC m=+26.851449484"
Apr 16 14:30:11.025132 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:11.025112 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv"
Apr 16 14:30:11.668323 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:11.668077 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kbtb7"]
Apr 16 14:30:11.668482 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:11.668427 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbtb7"
Apr 16 14:30:11.668588 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:11.668563 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbtb7" podUID="bfa07533-d734-4829-bde0-6c0327bd79a9"
Apr 16 14:30:11.676928 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:11.676892 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xnzwr"]
Apr 16 14:30:11.677074 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:11.677041 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xnzwr"
Apr 16 14:30:11.677173 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:11.677150 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xnzwr" podUID="f35ee2aa-f1ac-4d97-bf10-86c8f12bc700"
Apr 16 14:30:11.677636 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:11.677613 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-4bcxv"]
Apr 16 14:30:11.677745 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:11.677728 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4bcxv"
Apr 16 14:30:11.677844 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:11.677823 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4bcxv" podUID="da437e36-c272-4ed1-b496-1ae5412f861e"
Apr 16 14:30:12.000014 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:11.999923 2576 generic.go:358] "Generic (PLEG): container finished" podID="0a9bf7c4-4e80-4445-84c8-06bf12cc01b8" containerID="918bb4b3381402722fff2fb3450bc81b0056564a52b587d3c44750c1b0ec77b9" exitCode=0
Apr 16 14:30:12.000385 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:12.000009 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vnzc4" event={"ID":"0a9bf7c4-4e80-4445-84c8-06bf12cc01b8","Type":"ContainerDied","Data":"918bb4b3381402722fff2fb3450bc81b0056564a52b587d3c44750c1b0ec77b9"}
Apr 16 14:30:13.776755 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:13.776721 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4bcxv"
Apr 16 14:30:13.777232 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:13.776771 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbtb7"
Apr 16 14:30:13.777232 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:13.776862 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4bcxv" podUID="da437e36-c272-4ed1-b496-1ae5412f861e"
Apr 16 14:30:13.777232 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:13.776946 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xnzwr"
Apr 16 14:30:13.777232 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:13.777031 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xnzwr" podUID="f35ee2aa-f1ac-4d97-bf10-86c8f12bc700"
Apr 16 14:30:13.777232 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:13.776944 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbtb7" podUID="bfa07533-d734-4829-bde0-6c0327bd79a9"
Apr 16 14:30:15.777641 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:15.777607 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4bcxv"
Apr 16 14:30:15.778404 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:15.777607 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbtb7"
Apr 16 14:30:15.778404 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:15.777732 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4bcxv" podUID="da437e36-c272-4ed1-b496-1ae5412f861e"
Apr 16 14:30:15.778404 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:15.777848 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kbtb7" podUID="bfa07533-d734-4829-bde0-6c0327bd79a9"
Apr 16 14:30:15.778404 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:15.777608 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xnzwr"
Apr 16 14:30:15.778404 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:15.777964 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xnzwr" podUID="f35ee2aa-f1ac-4d97-bf10-86c8f12bc700"
Apr 16 14:30:17.448889 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.448856 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-173.ec2.internal" event="NodeReady"
Apr 16 14:30:17.449398 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.449035 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 14:30:17.494781 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.494746 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-99g2n"]
Apr 16 14:30:17.519514 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.519485 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2tkjc"]
Apr 16 14:30:17.519706 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.519653 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-99g2n"
Apr 16 14:30:17.521997 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.521978 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 14:30:17.522715 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.522699 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dqsrn\""
Apr 16 14:30:17.522897 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.522865 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 14:30:17.542061 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.542037 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-99g2n"]
Apr 16 14:30:17.542061 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.542062 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2tkjc"]
Apr 16 14:30:17.542176 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.542159 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2tkjc"
Apr 16 14:30:17.544361 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.544342 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 14:30:17.544476 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.544364 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 14:30:17.544476 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.544424 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-zpkvj\""
Apr 16 14:30:17.544476 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.544436 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 14:30:17.685549 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.685503 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7jmq\" (UniqueName: \"kubernetes.io/projected/8e9a80d4-54ef-459f-b21f-e013f853f63d-kube-api-access-c7jmq\") pod \"ingress-canary-2tkjc\" (UID: \"8e9a80d4-54ef-459f-b21f-e013f853f63d\") " pod="openshift-ingress-canary/ingress-canary-2tkjc"
Apr 16 14:30:17.685708 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.685555 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c18fae62-a7e8-473e-a375-928abb494bf2-tmp-dir\") pod \"dns-default-99g2n\" (UID: \"c18fae62-a7e8-473e-a375-928abb494bf2\") " pod="openshift-dns/dns-default-99g2n"
Apr 16 14:30:17.685708 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.685625 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c18fae62-a7e8-473e-a375-928abb494bf2-metrics-tls\") pod \"dns-default-99g2n\" (UID: \"c18fae62-a7e8-473e-a375-928abb494bf2\") " pod="openshift-dns/dns-default-99g2n"
Apr 16 14:30:17.685708 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.685676 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c18fae62-a7e8-473e-a375-928abb494bf2-config-volume\") pod \"dns-default-99g2n\" (UID: \"c18fae62-a7e8-473e-a375-928abb494bf2\") " pod="openshift-dns/dns-default-99g2n"
Apr 16 14:30:17.685803 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.685725 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e9a80d4-54ef-459f-b21f-e013f853f63d-cert\") pod \"ingress-canary-2tkjc\" (UID: \"8e9a80d4-54ef-459f-b21f-e013f853f63d\") " pod="openshift-ingress-canary/ingress-canary-2tkjc"
Apr 16 14:30:17.685803 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.685746 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6vzx\" (UniqueName: \"kubernetes.io/projected/c18fae62-a7e8-473e-a375-928abb494bf2-kube-api-access-p6vzx\") pod \"dns-default-99g2n\" (UID: \"c18fae62-a7e8-473e-a375-928abb494bf2\") " pod="openshift-dns/dns-default-99g2n"
Apr 16 14:30:17.777576 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.777440 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbtb7"
Apr 16 14:30:17.777710 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.777440 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4bcxv"
Apr 16 14:30:17.777710 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.777440 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xnzwr"
Apr 16 14:30:17.780781 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.780752 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 14:30:17.780922 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.780872 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 14:30:17.781001 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.780988 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bl9dn\""
Apr 16 14:30:17.781503 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.781488 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-x8s5q\""
Apr 16 14:30:17.781588 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.781576 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 14:30:17.781785 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.781764 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 14:30:17.786492 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.786473 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c7jmq\" (UniqueName: \"kubernetes.io/projected/8e9a80d4-54ef-459f-b21f-e013f853f63d-kube-api-access-c7jmq\") pod \"ingress-canary-2tkjc\" (UID: \"8e9a80d4-54ef-459f-b21f-e013f853f63d\") " pod="openshift-ingress-canary/ingress-canary-2tkjc"
Apr 16 14:30:17.786591 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.786505 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c18fae62-a7e8-473e-a375-928abb494bf2-tmp-dir\") pod \"dns-default-99g2n\" (UID: \"c18fae62-a7e8-473e-a375-928abb494bf2\") " pod="openshift-dns/dns-default-99g2n"
Apr 16 14:30:17.786591 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.786551 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c18fae62-a7e8-473e-a375-928abb494bf2-metrics-tls\") pod \"dns-default-99g2n\" (UID: \"c18fae62-a7e8-473e-a375-928abb494bf2\") " pod="openshift-dns/dns-default-99g2n"
Apr 16 14:30:17.786693 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.786605 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c18fae62-a7e8-473e-a375-928abb494bf2-config-volume\") pod \"dns-default-99g2n\" (UID: \"c18fae62-a7e8-473e-a375-928abb494bf2\") " pod="openshift-dns/dns-default-99g2n"
Apr 16 14:30:17.786693 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.786643 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e9a80d4-54ef-459f-b21f-e013f853f63d-cert\") pod \"ingress-canary-2tkjc\" (UID: \"8e9a80d4-54ef-459f-b21f-e013f853f63d\") " pod="openshift-ingress-canary/ingress-canary-2tkjc"
Apr 16 14:30:17.786693 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.786669 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p6vzx\" (UniqueName: \"kubernetes.io/projected/c18fae62-a7e8-473e-a375-928abb494bf2-kube-api-access-p6vzx\") pod \"dns-default-99g2n\" (UID: \"c18fae62-a7e8-473e-a375-928abb494bf2\") " pod="openshift-dns/dns-default-99g2n"
Apr 16 14:30:17.786858 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:17.786722 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:30:17.786858 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:17.786794 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c18fae62-a7e8-473e-a375-928abb494bf2-metrics-tls podName:c18fae62-a7e8-473e-a375-928abb494bf2 nodeName:}" failed. No retries permitted until 2026-04-16 14:30:18.286772217 +0000 UTC m=+34.113482630 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c18fae62-a7e8-473e-a375-928abb494bf2-metrics-tls") pod "dns-default-99g2n" (UID: "c18fae62-a7e8-473e-a375-928abb494bf2") : secret "dns-default-metrics-tls" not found
Apr 16 14:30:17.786858 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.786810 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c18fae62-a7e8-473e-a375-928abb494bf2-tmp-dir\") pod \"dns-default-99g2n\" (UID: \"c18fae62-a7e8-473e-a375-928abb494bf2\") " pod="openshift-dns/dns-default-99g2n"
Apr 16 14:30:17.787004 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:17.786904 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:30:17.787004 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:17.786958 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e9a80d4-54ef-459f-b21f-e013f853f63d-cert podName:8e9a80d4-54ef-459f-b21f-e013f853f63d nodeName:}" failed. No retries permitted until 2026-04-16 14:30:18.286942502 +0000 UTC m=+34.113652924 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8e9a80d4-54ef-459f-b21f-e013f853f63d-cert") pod "ingress-canary-2tkjc" (UID: "8e9a80d4-54ef-459f-b21f-e013f853f63d") : secret "canary-serving-cert" not found
Apr 16 14:30:17.787193 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.787176 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c18fae62-a7e8-473e-a375-928abb494bf2-config-volume\") pod \"dns-default-99g2n\" (UID: \"c18fae62-a7e8-473e-a375-928abb494bf2\") " pod="openshift-dns/dns-default-99g2n"
Apr 16 14:30:17.798126 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.798107 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6vzx\" (UniqueName: \"kubernetes.io/projected/c18fae62-a7e8-473e-a375-928abb494bf2-kube-api-access-p6vzx\") pod \"dns-default-99g2n\" (UID: \"c18fae62-a7e8-473e-a375-928abb494bf2\") " pod="openshift-dns/dns-default-99g2n"
Apr 16 14:30:17.798373 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:17.798353 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7jmq\" (UniqueName: \"kubernetes.io/projected/8e9a80d4-54ef-459f-b21f-e013f853f63d-kube-api-access-c7jmq\") pod \"ingress-canary-2tkjc\" (UID: \"8e9a80d4-54ef-459f-b21f-e013f853f63d\") " pod="openshift-ingress-canary/ingress-canary-2tkjc"
Apr 16 14:30:18.290194 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:18.289933 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e9a80d4-54ef-459f-b21f-e013f853f63d-cert\") pod \"ingress-canary-2tkjc\" (UID: \"8e9a80d4-54ef-459f-b21f-e013f853f63d\") " pod="openshift-ingress-canary/ingress-canary-2tkjc"
Apr 16 14:30:18.290322 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:18.290219 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c18fae62-a7e8-473e-a375-928abb494bf2-metrics-tls\") pod \"dns-default-99g2n\" (UID: \"c18fae62-a7e8-473e-a375-928abb494bf2\") " pod="openshift-dns/dns-default-99g2n"
Apr 16 14:30:18.290322 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:18.290076 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:30:18.290322 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:18.290300 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:30:18.290432 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:18.290347 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c18fae62-a7e8-473e-a375-928abb494bf2-metrics-tls podName:c18fae62-a7e8-473e-a375-928abb494bf2 nodeName:}" failed. No retries permitted until 2026-04-16 14:30:19.290328858 +0000 UTC m=+35.117039260 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c18fae62-a7e8-473e-a375-928abb494bf2-metrics-tls") pod "dns-default-99g2n" (UID: "c18fae62-a7e8-473e-a375-928abb494bf2") : secret "dns-default-metrics-tls" not found
Apr 16 14:30:18.290432 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:18.290362 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e9a80d4-54ef-459f-b21f-e013f853f63d-cert podName:8e9a80d4-54ef-459f-b21f-e013f853f63d nodeName:}" failed. No retries permitted until 2026-04-16 14:30:19.290354335 +0000 UTC m=+35.117064730 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8e9a80d4-54ef-459f-b21f-e013f853f63d-cert") pod "ingress-canary-2tkjc" (UID: "8e9a80d4-54ef-459f-b21f-e013f853f63d") : secret "canary-serving-cert" not found
Apr 16 14:30:18.491850 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:18.491814 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfa07533-d734-4829-bde0-6c0327bd79a9-metrics-certs\") pod \"network-metrics-daemon-kbtb7\" (UID: \"bfa07533-d734-4829-bde0-6c0327bd79a9\") " pod="openshift-multus/network-metrics-daemon-kbtb7"
Apr 16 14:30:18.492309 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:18.491945 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 14:30:18.492309 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:18.491998 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfa07533-d734-4829-bde0-6c0327bd79a9-metrics-certs podName:bfa07533-d734-4829-bde0-6c0327bd79a9 nodeName:}" failed. No retries permitted until 2026-04-16 14:30:50.491983 +0000 UTC m=+66.318693395 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bfa07533-d734-4829-bde0-6c0327bd79a9-metrics-certs") pod "network-metrics-daemon-kbtb7" (UID: "bfa07533-d734-4829-bde0-6c0327bd79a9") : secret "metrics-daemon-secret" not found
Apr 16 14:30:18.592733 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:18.592654 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lldq\" (UniqueName: \"kubernetes.io/projected/da437e36-c272-4ed1-b496-1ae5412f861e-kube-api-access-7lldq\") pod \"network-check-target-4bcxv\" (UID: \"da437e36-c272-4ed1-b496-1ae5412f861e\") " pod="openshift-network-diagnostics/network-check-target-4bcxv"
Apr 16 14:30:18.595310 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:18.595283 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lldq\" (UniqueName: \"kubernetes.io/projected/da437e36-c272-4ed1-b496-1ae5412f861e-kube-api-access-7lldq\") pod \"network-check-target-4bcxv\" (UID: \"da437e36-c272-4ed1-b496-1ae5412f861e\") " pod="openshift-network-diagnostics/network-check-target-4bcxv"
Apr 16 14:30:18.698503 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:18.698469 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4bcxv"
Apr 16 14:30:18.868438 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:18.868408 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-4bcxv"]
Apr 16 14:30:18.873126 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:30:18.873089 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda437e36_c272_4ed1_b496_1ae5412f861e.slice/crio-9618b5d66989928695628ae7e455db07c4605ff133a35270980b1d8a9410bd63 WatchSource:0}: Error finding container 9618b5d66989928695628ae7e455db07c4605ff133a35270980b1d8a9410bd63: Status 404 returned error can't find the container with id 9618b5d66989928695628ae7e455db07c4605ff133a35270980b1d8a9410bd63
Apr 16 14:30:19.016256 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:19.016222 2576 generic.go:358] "Generic (PLEG): container finished" podID="0a9bf7c4-4e80-4445-84c8-06bf12cc01b8" containerID="e2c47eca5ec3387c595385e5f31bfd236bd31c91575f2db24cba3efbbe094c65" exitCode=0
Apr 16 14:30:19.016438 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:19.016297 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vnzc4" event={"ID":"0a9bf7c4-4e80-4445-84c8-06bf12cc01b8","Type":"ContainerDied","Data":"e2c47eca5ec3387c595385e5f31bfd236bd31c91575f2db24cba3efbbe094c65"}
Apr 16 14:30:19.017554 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:19.017242 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-4bcxv" event={"ID":"da437e36-c272-4ed1-b496-1ae5412f861e","Type":"ContainerStarted","Data":"9618b5d66989928695628ae7e455db07c4605ff133a35270980b1d8a9410bd63"}
Apr 16 14:30:19.300236 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:19.300196 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c18fae62-a7e8-473e-a375-928abb494bf2-metrics-tls\") pod \"dns-default-99g2n\" (UID: \"c18fae62-a7e8-473e-a375-928abb494bf2\") " pod="openshift-dns/dns-default-99g2n"
Apr 16 14:30:19.300409 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:19.300284 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e9a80d4-54ef-459f-b21f-e013f853f63d-cert\") pod \"ingress-canary-2tkjc\" (UID: \"8e9a80d4-54ef-459f-b21f-e013f853f63d\") " pod="openshift-ingress-canary/ingress-canary-2tkjc"
Apr 16 14:30:19.300409 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:19.300330 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:30:19.300409 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:19.300402 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c18fae62-a7e8-473e-a375-928abb494bf2-metrics-tls podName:c18fae62-a7e8-473e-a375-928abb494bf2 nodeName:}" failed. No retries permitted until 2026-04-16 14:30:21.300381882 +0000 UTC m=+37.127092288 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c18fae62-a7e8-473e-a375-928abb494bf2-metrics-tls") pod "dns-default-99g2n" (UID: "c18fae62-a7e8-473e-a375-928abb494bf2") : secret "dns-default-metrics-tls" not found
Apr 16 14:30:19.300591 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:19.300423 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:30:19.300591 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:19.300480 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e9a80d4-54ef-459f-b21f-e013f853f63d-cert podName:8e9a80d4-54ef-459f-b21f-e013f853f63d nodeName:}" failed. No retries permitted until 2026-04-16 14:30:21.300465932 +0000 UTC m=+37.127176327 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8e9a80d4-54ef-459f-b21f-e013f853f63d-cert") pod "ingress-canary-2tkjc" (UID: "8e9a80d4-54ef-459f-b21f-e013f853f63d") : secret "canary-serving-cert" not found
Apr 16 14:30:20.021959 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:20.021923 2576 generic.go:358] "Generic (PLEG): container finished" podID="0a9bf7c4-4e80-4445-84c8-06bf12cc01b8" containerID="4879f4c849db09ec0c9a1a9b02d6a92835d2592ae9975e10b55571de63fc26b3" exitCode=0
Apr 16 14:30:20.022492 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:20.021982 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vnzc4" event={"ID":"0a9bf7c4-4e80-4445-84c8-06bf12cc01b8","Type":"ContainerDied","Data":"4879f4c849db09ec0c9a1a9b02d6a92835d2592ae9975e10b55571de63fc26b3"}
Apr 16 14:30:21.028080 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:21.027845 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vnzc4" event={"ID":"0a9bf7c4-4e80-4445-84c8-06bf12cc01b8","Type":"ContainerStarted","Data":"de67ad8da7b8f76a8b78cb0034dfd7619f76471538801e14eafe69791b993c74"}
Apr 16 14:30:21.055725 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:21.055629 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-vnzc4" podStartSLOduration=6.470467906 podStartE2EDuration="37.055609398s" podCreationTimestamp="2026-04-16 14:29:44 +0000 UTC" firstStartedPulling="2026-04-16 14:29:47.357418096 +0000 UTC m=+3.184128505" lastFinishedPulling="2026-04-16 14:30:17.942559588 +0000 UTC m=+33.769269997" observedRunningTime="2026-04-16 14:30:21.053424944 +0000 UTC m=+36.880135375" watchObservedRunningTime="2026-04-16 14:30:21.055609398 +0000 UTC m=+36.882319814"
Apr 16 14:30:21.316494 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:21.316405 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c18fae62-a7e8-473e-a375-928abb494bf2-metrics-tls\") pod \"dns-default-99g2n\" (UID: \"c18fae62-a7e8-473e-a375-928abb494bf2\") " pod="openshift-dns/dns-default-99g2n"
Apr 16 14:30:21.316678 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:21.316501 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e9a80d4-54ef-459f-b21f-e013f853f63d-cert\") pod \"ingress-canary-2tkjc\" (UID: \"8e9a80d4-54ef-459f-b21f-e013f853f63d\") " pod="openshift-ingress-canary/ingress-canary-2tkjc"
Apr 16 14:30:21.316678 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:21.316582 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:30:21.316678 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:21.316625 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:30:21.316678 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:21.316661 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c18fae62-a7e8-473e-a375-928abb494bf2-metrics-tls podName:c18fae62-a7e8-473e-a375-928abb494bf2 nodeName:}" failed. No retries permitted until 2026-04-16 14:30:25.316642553 +0000 UTC m=+41.143352949 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c18fae62-a7e8-473e-a375-928abb494bf2-metrics-tls") pod "dns-default-99g2n" (UID: "c18fae62-a7e8-473e-a375-928abb494bf2") : secret "dns-default-metrics-tls" not found Apr 16 14:30:21.316678 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:21.316681 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e9a80d4-54ef-459f-b21f-e013f853f63d-cert podName:8e9a80d4-54ef-459f-b21f-e013f853f63d nodeName:}" failed. No retries permitted until 2026-04-16 14:30:25.316671658 +0000 UTC m=+41.143382053 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8e9a80d4-54ef-459f-b21f-e013f853f63d-cert") pod "ingress-canary-2tkjc" (UID: "8e9a80d4-54ef-459f-b21f-e013f853f63d") : secret "canary-serving-cert" not found Apr 16 14:30:22.031427 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:22.031395 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-4bcxv" event={"ID":"da437e36-c272-4ed1-b496-1ae5412f861e","Type":"ContainerStarted","Data":"18d3926c5a1db765e06d0bea30324c24d0d2d5f16bd7983fe3eb6d117f1e39c6"} Apr 16 14:30:23.033943 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:23.033902 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-4bcxv" Apr 16 14:30:23.058307 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:23.058260 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-4bcxv" podStartSLOduration=36.044155909 podStartE2EDuration="39.058246071s" podCreationTimestamp="2026-04-16 14:29:44 +0000 UTC" firstStartedPulling="2026-04-16 14:30:18.874988144 +0000 UTC m=+34.701698540" lastFinishedPulling="2026-04-16 14:30:21.889078303 +0000 UTC m=+37.715788702" observedRunningTime="2026-04-16 
14:30:23.057698903 +0000 UTC m=+38.884409320" watchObservedRunningTime="2026-04-16 14:30:23.058246071 +0000 UTC m=+38.884956488" Apr 16 14:30:25.343603 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:25.343525 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c18fae62-a7e8-473e-a375-928abb494bf2-metrics-tls\") pod \"dns-default-99g2n\" (UID: \"c18fae62-a7e8-473e-a375-928abb494bf2\") " pod="openshift-dns/dns-default-99g2n" Apr 16 14:30:25.343991 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:25.343642 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e9a80d4-54ef-459f-b21f-e013f853f63d-cert\") pod \"ingress-canary-2tkjc\" (UID: \"8e9a80d4-54ef-459f-b21f-e013f853f63d\") " pod="openshift-ingress-canary/ingress-canary-2tkjc" Apr 16 14:30:25.343991 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:25.343687 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:30:25.343991 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:25.343742 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:30:25.343991 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:25.343769 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c18fae62-a7e8-473e-a375-928abb494bf2-metrics-tls podName:c18fae62-a7e8-473e-a375-928abb494bf2 nodeName:}" failed. No retries permitted until 2026-04-16 14:30:33.343748399 +0000 UTC m=+49.170458813 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c18fae62-a7e8-473e-a375-928abb494bf2-metrics-tls") pod "dns-default-99g2n" (UID: "c18fae62-a7e8-473e-a375-928abb494bf2") : secret "dns-default-metrics-tls" not found Apr 16 14:30:25.343991 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:25.343786 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e9a80d4-54ef-459f-b21f-e013f853f63d-cert podName:8e9a80d4-54ef-459f-b21f-e013f853f63d nodeName:}" failed. No retries permitted until 2026-04-16 14:30:33.343778576 +0000 UTC m=+49.170488972 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8e9a80d4-54ef-459f-b21f-e013f853f63d-cert") pod "ingress-canary-2tkjc" (UID: "8e9a80d4-54ef-459f-b21f-e013f853f63d") : secret "canary-serving-cert" not found Apr 16 14:30:26.048374 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:26.048341 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f35ee2aa-f1ac-4d97-bf10-86c8f12bc700-original-pull-secret\") pod \"global-pull-secret-syncer-xnzwr\" (UID: \"f35ee2aa-f1ac-4d97-bf10-86c8f12bc700\") " pod="kube-system/global-pull-secret-syncer-xnzwr" Apr 16 14:30:26.052220 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:26.052188 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f35ee2aa-f1ac-4d97-bf10-86c8f12bc700-original-pull-secret\") pod \"global-pull-secret-syncer-xnzwr\" (UID: \"f35ee2aa-f1ac-4d97-bf10-86c8f12bc700\") " pod="kube-system/global-pull-secret-syncer-xnzwr" Apr 16 14:30:26.193868 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:26.193827 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-xnzwr" Apr 16 14:30:26.308372 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:26.308292 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xnzwr"] Apr 16 14:30:26.312145 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:30:26.312112 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf35ee2aa_f1ac_4d97_bf10_86c8f12bc700.slice/crio-680627b9afcdb257d22caf01a5f2006e07b01eb447e4a0880a9795f803772141 WatchSource:0}: Error finding container 680627b9afcdb257d22caf01a5f2006e07b01eb447e4a0880a9795f803772141: Status 404 returned error can't find the container with id 680627b9afcdb257d22caf01a5f2006e07b01eb447e4a0880a9795f803772141 Apr 16 14:30:27.042485 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:27.042432 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xnzwr" event={"ID":"f35ee2aa-f1ac-4d97-bf10-86c8f12bc700","Type":"ContainerStarted","Data":"680627b9afcdb257d22caf01a5f2006e07b01eb447e4a0880a9795f803772141"} Apr 16 14:30:31.051860 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:31.051820 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xnzwr" event={"ID":"f35ee2aa-f1ac-4d97-bf10-86c8f12bc700","Type":"ContainerStarted","Data":"c0b53681bfb9f1b2b2a43ee545a088b31eb89e3f043297ec6d7ae4e8eb105d51"} Apr 16 14:30:31.068436 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:31.068392 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-xnzwr" podStartSLOduration=33.173734399 podStartE2EDuration="37.068377651s" podCreationTimestamp="2026-04-16 14:29:54 +0000 UTC" firstStartedPulling="2026-04-16 14:30:26.313878763 +0000 UTC m=+42.140589163" lastFinishedPulling="2026-04-16 14:30:30.208522019 +0000 UTC m=+46.035232415" 
observedRunningTime="2026-04-16 14:30:31.068289655 +0000 UTC m=+46.895000071" watchObservedRunningTime="2026-04-16 14:30:31.068377651 +0000 UTC m=+46.895088067" Apr 16 14:30:33.397677 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:33.397638 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c18fae62-a7e8-473e-a375-928abb494bf2-metrics-tls\") pod \"dns-default-99g2n\" (UID: \"c18fae62-a7e8-473e-a375-928abb494bf2\") " pod="openshift-dns/dns-default-99g2n" Apr 16 14:30:33.398060 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:33.397689 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e9a80d4-54ef-459f-b21f-e013f853f63d-cert\") pod \"ingress-canary-2tkjc\" (UID: \"8e9a80d4-54ef-459f-b21f-e013f853f63d\") " pod="openshift-ingress-canary/ingress-canary-2tkjc" Apr 16 14:30:33.398060 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:33.397777 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:30:33.398060 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:33.397778 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:30:33.398060 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:33.397836 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e9a80d4-54ef-459f-b21f-e013f853f63d-cert podName:8e9a80d4-54ef-459f-b21f-e013f853f63d nodeName:}" failed. No retries permitted until 2026-04-16 14:30:49.397818797 +0000 UTC m=+65.224529195 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8e9a80d4-54ef-459f-b21f-e013f853f63d-cert") pod "ingress-canary-2tkjc" (UID: "8e9a80d4-54ef-459f-b21f-e013f853f63d") : secret "canary-serving-cert" not found Apr 16 14:30:33.398060 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:33.397849 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c18fae62-a7e8-473e-a375-928abb494bf2-metrics-tls podName:c18fae62-a7e8-473e-a375-928abb494bf2 nodeName:}" failed. No retries permitted until 2026-04-16 14:30:49.397843888 +0000 UTC m=+65.224554284 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c18fae62-a7e8-473e-a375-928abb494bf2-metrics-tls") pod "dns-default-99g2n" (UID: "c18fae62-a7e8-473e-a375-928abb494bf2") : secret "dns-default-metrics-tls" not found Apr 16 14:30:43.013117 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:43.013084 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p7qpv" Apr 16 14:30:49.497571 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:49.497497 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c18fae62-a7e8-473e-a375-928abb494bf2-metrics-tls\") pod \"dns-default-99g2n\" (UID: \"c18fae62-a7e8-473e-a375-928abb494bf2\") " pod="openshift-dns/dns-default-99g2n" Apr 16 14:30:49.497963 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:49.497597 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e9a80d4-54ef-459f-b21f-e013f853f63d-cert\") pod \"ingress-canary-2tkjc\" (UID: \"8e9a80d4-54ef-459f-b21f-e013f853f63d\") " pod="openshift-ingress-canary/ingress-canary-2tkjc" Apr 16 14:30:49.497963 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:49.497710 2576 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:30:49.497963 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:49.497771 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c18fae62-a7e8-473e-a375-928abb494bf2-metrics-tls podName:c18fae62-a7e8-473e-a375-928abb494bf2 nodeName:}" failed. No retries permitted until 2026-04-16 14:31:21.497753501 +0000 UTC m=+97.324463897 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c18fae62-a7e8-473e-a375-928abb494bf2-metrics-tls") pod "dns-default-99g2n" (UID: "c18fae62-a7e8-473e-a375-928abb494bf2") : secret "dns-default-metrics-tls" not found Apr 16 14:30:49.497963 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:49.497720 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:30:49.497963 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:49.497835 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e9a80d4-54ef-459f-b21f-e013f853f63d-cert podName:8e9a80d4-54ef-459f-b21f-e013f853f63d nodeName:}" failed. No retries permitted until 2026-04-16 14:31:21.497820955 +0000 UTC m=+97.324531352 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8e9a80d4-54ef-459f-b21f-e013f853f63d-cert") pod "ingress-canary-2tkjc" (UID: "8e9a80d4-54ef-459f-b21f-e013f853f63d") : secret "canary-serving-cert" not found Apr 16 14:30:50.503370 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:50.503314 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfa07533-d734-4829-bde0-6c0327bd79a9-metrics-certs\") pod \"network-metrics-daemon-kbtb7\" (UID: \"bfa07533-d734-4829-bde0-6c0327bd79a9\") " pod="openshift-multus/network-metrics-daemon-kbtb7" Apr 16 14:30:50.503793 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:50.503466 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 14:30:50.503793 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:30:50.503554 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfa07533-d734-4829-bde0-6c0327bd79a9-metrics-certs podName:bfa07533-d734-4829-bde0-6c0327bd79a9 nodeName:}" failed. No retries permitted until 2026-04-16 14:31:54.503519253 +0000 UTC m=+130.330229648 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bfa07533-d734-4829-bde0-6c0327bd79a9-metrics-certs") pod "network-metrics-daemon-kbtb7" (UID: "bfa07533-d734-4829-bde0-6c0327bd79a9") : secret "metrics-daemon-secret" not found Apr 16 14:30:54.038898 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:30:54.038863 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-4bcxv" Apr 16 14:31:21.516258 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:21.516143 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c18fae62-a7e8-473e-a375-928abb494bf2-metrics-tls\") pod \"dns-default-99g2n\" (UID: \"c18fae62-a7e8-473e-a375-928abb494bf2\") " pod="openshift-dns/dns-default-99g2n" Apr 16 14:31:21.516258 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:21.516223 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e9a80d4-54ef-459f-b21f-e013f853f63d-cert\") pod \"ingress-canary-2tkjc\" (UID: \"8e9a80d4-54ef-459f-b21f-e013f853f63d\") " pod="openshift-ingress-canary/ingress-canary-2tkjc" Apr 16 14:31:21.516716 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:21.516313 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:31:21.516716 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:21.516392 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e9a80d4-54ef-459f-b21f-e013f853f63d-cert podName:8e9a80d4-54ef-459f-b21f-e013f853f63d nodeName:}" failed. No retries permitted until 2026-04-16 14:32:25.516375858 +0000 UTC m=+161.343086257 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8e9a80d4-54ef-459f-b21f-e013f853f63d-cert") pod "ingress-canary-2tkjc" (UID: "8e9a80d4-54ef-459f-b21f-e013f853f63d") : secret "canary-serving-cert" not found Apr 16 14:31:21.516716 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:21.516313 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:31:21.516716 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:21.516476 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c18fae62-a7e8-473e-a375-928abb494bf2-metrics-tls podName:c18fae62-a7e8-473e-a375-928abb494bf2 nodeName:}" failed. No retries permitted until 2026-04-16 14:32:25.516460762 +0000 UTC m=+161.343171170 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c18fae62-a7e8-473e-a375-928abb494bf2-metrics-tls") pod "dns-default-99g2n" (UID: "c18fae62-a7e8-473e-a375-928abb494bf2") : secret "dns-default-metrics-tls" not found Apr 16 14:31:47.079204 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.079161 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-985g6"] Apr 16 14:31:47.082204 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.082180 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7gxz5"] Apr 16 14:31:47.082354 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.082335 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-985g6" Apr 16 14:31:47.085061 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.085041 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7gxz5" Apr 16 14:31:47.094232 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.094207 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 14:31:47.098840 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.098819 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 14:31:47.098960 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.098848 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 14:31:47.098960 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.098847 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:31:47.098960 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.098855 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-h4gbw\"" Apr 16 14:31:47.100940 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.100921 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 14:31:47.101047 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.101033 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-g2nmb\"" Apr 16 14:31:47.104261 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.104243 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 16 14:31:47.113832 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.113808 2576 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 16 14:31:47.114851 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.114825 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-cnkw9"] Apr 16 14:31:47.117762 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.117741 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-985g6"] Apr 16 14:31:47.117762 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.117766 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7gxz5"] Apr 16 14:31:47.117899 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.117880 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-cnkw9" Apr 16 14:31:47.122498 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.122477 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 14:31:47.122652 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.122515 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-s9szl\"" Apr 16 14:31:47.123713 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.123693 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 14:31:47.126116 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.126099 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 14:31:47.134690 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.134672 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 14:31:47.136459 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.136431 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 14:31:47.144570 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.144545 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-cnkw9"] Apr 16 14:31:47.187644 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.187607 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/05104d65-51f0-434e-945d-2714c95daadd-snapshots\") pod \"insights-operator-5785d4fcdd-cnkw9\" (UID: \"05104d65-51f0-434e-945d-2714c95daadd\") " pod="openshift-insights/insights-operator-5785d4fcdd-cnkw9" Apr 16 14:31:47.187823 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.187656 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/9e2f0c20-f27f-4dee-a225-e224c0239ba4-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-985g6\" (UID: \"9e2f0c20-f27f-4dee-a225-e224c0239ba4\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-985g6" Apr 16 14:31:47.187823 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.187675 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05104d65-51f0-434e-945d-2714c95daadd-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-cnkw9\" (UID: \"05104d65-51f0-434e-945d-2714c95daadd\") " pod="openshift-insights/insights-operator-5785d4fcdd-cnkw9" Apr 16 14:31:47.187823 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.187797 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cd7h\" (UniqueName: \"kubernetes.io/projected/82558a12-6f14-43c9-848d-c3d942592e1d-kube-api-access-8cd7h\") pod \"cluster-samples-operator-667775844f-7gxz5\" (UID: \"82558a12-6f14-43c9-848d-c3d942592e1d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7gxz5" Apr 16 14:31:47.187923 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.187835 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9e2f0c20-f27f-4dee-a225-e224c0239ba4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-985g6\" (UID: \"9e2f0c20-f27f-4dee-a225-e224c0239ba4\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-985g6" Apr 16 14:31:47.187923 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.187854 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/05104d65-51f0-434e-945d-2714c95daadd-tmp\") pod \"insights-operator-5785d4fcdd-cnkw9\" (UID: \"05104d65-51f0-434e-945d-2714c95daadd\") " pod="openshift-insights/insights-operator-5785d4fcdd-cnkw9" Apr 16 14:31:47.187923 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.187881 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/82558a12-6f14-43c9-848d-c3d942592e1d-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-7gxz5\" (UID: \"82558a12-6f14-43c9-848d-c3d942592e1d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7gxz5" Apr 16 14:31:47.187923 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.187898 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/05104d65-51f0-434e-945d-2714c95daadd-serving-cert\") pod \"insights-operator-5785d4fcdd-cnkw9\" (UID: \"05104d65-51f0-434e-945d-2714c95daadd\") " pod="openshift-insights/insights-operator-5785d4fcdd-cnkw9"
Apr 16 14:31:47.188041 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.187923 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05104d65-51f0-434e-945d-2714c95daadd-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-cnkw9\" (UID: \"05104d65-51f0-434e-945d-2714c95daadd\") " pod="openshift-insights/insights-operator-5785d4fcdd-cnkw9"
Apr 16 14:31:47.188041 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.187977 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w65cn\" (UniqueName: \"kubernetes.io/projected/9e2f0c20-f27f-4dee-a225-e224c0239ba4-kube-api-access-w65cn\") pod \"cluster-monitoring-operator-6667474d89-985g6\" (UID: \"9e2f0c20-f27f-4dee-a225-e224c0239ba4\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-985g6"
Apr 16 14:31:47.188041 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.188009 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsngd\" (UniqueName: \"kubernetes.io/projected/05104d65-51f0-434e-945d-2714c95daadd-kube-api-access-nsngd\") pod \"insights-operator-5785d4fcdd-cnkw9\" (UID: \"05104d65-51f0-434e-945d-2714c95daadd\") " pod="openshift-insights/insights-operator-5785d4fcdd-cnkw9"
Apr 16 14:31:47.194943 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.194914 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-7bpgr"]
Apr 16 14:31:47.197733 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.197715 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7bpgr"
Apr 16 14:31:47.199926 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.199899 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 16 14:31:47.200071 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.199935 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 16 14:31:47.200071 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.199962 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 16 14:31:47.200071 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.199964 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-zfn2s\""
Apr 16 14:31:47.200271 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.200253 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 16 14:31:47.209165 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.209141 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-7bpgr"]
Apr 16 14:31:47.274112 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.274081 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-d8v8k"]
Apr 16 14:31:47.276887 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.276857 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-d8v8k"
Apr 16 14:31:47.277101 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.277080 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-7f7fd7cc4b-cbgrm"]
Apr 16 14:31:47.279963 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.279940 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7f7fd7cc4b-cbgrm"
Apr 16 14:31:47.280485 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.280466 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 16 14:31:47.280604 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.280484 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 16 14:31:47.280604 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.280489 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 16 14:31:47.280604 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.280556 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 16 14:31:47.280869 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.280849 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-82k7j\""
Apr 16 14:31:47.282089 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.282070 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-cxdtk\""
Apr 16 14:31:47.282181 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.282115 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 16 14:31:47.282714 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.282698 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 16 14:31:47.283002 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.282986 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 16 14:31:47.283068 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.282991 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 16 14:31:47.283133 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.283065 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 16 14:31:47.283692 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.283674 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 16 14:31:47.285575 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.285557 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-d8v8k"]
Apr 16 14:31:47.288889 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.288843 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcn29\" (UniqueName: \"kubernetes.io/projected/1a2c84e9-4fda-4619-a86a-36c809b14446-kube-api-access-kcn29\") pod \"service-ca-operator-69965bb79d-7bpgr\" (UID: \"1a2c84e9-4fda-4619-a86a-36c809b14446\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7bpgr"
Apr 16 14:31:47.288992 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.288914 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a2c84e9-4fda-4619-a86a-36c809b14446-serving-cert\") pod \"service-ca-operator-69965bb79d-7bpgr\" (UID: \"1a2c84e9-4fda-4619-a86a-36c809b14446\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7bpgr"
Apr 16 14:31:47.288992 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.288963 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/9e2f0c20-f27f-4dee-a225-e224c0239ba4-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-985g6\" (UID: \"9e2f0c20-f27f-4dee-a225-e224c0239ba4\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-985g6"
Apr 16 14:31:47.289089 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.288995 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05104d65-51f0-434e-945d-2714c95daadd-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-cnkw9\" (UID: \"05104d65-51f0-434e-945d-2714c95daadd\") " pod="openshift-insights/insights-operator-5785d4fcdd-cnkw9"
Apr 16 14:31:47.289246 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.289223 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8cd7h\" (UniqueName: \"kubernetes.io/projected/82558a12-6f14-43c9-848d-c3d942592e1d-kube-api-access-8cd7h\") pod \"cluster-samples-operator-667775844f-7gxz5\" (UID: \"82558a12-6f14-43c9-848d-c3d942592e1d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7gxz5"
Apr 16 14:31:47.289330 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.289261 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9e2f0c20-f27f-4dee-a225-e224c0239ba4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-985g6\" (UID: \"9e2f0c20-f27f-4dee-a225-e224c0239ba4\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-985g6"
Apr 16 14:31:47.289330 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.289291 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/05104d65-51f0-434e-945d-2714c95daadd-tmp\") pod \"insights-operator-5785d4fcdd-cnkw9\" (UID: \"05104d65-51f0-434e-945d-2714c95daadd\") " pod="openshift-insights/insights-operator-5785d4fcdd-cnkw9"
Apr 16 14:31:47.289330 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.289322 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a2c84e9-4fda-4619-a86a-36c809b14446-config\") pod \"service-ca-operator-69965bb79d-7bpgr\" (UID: \"1a2c84e9-4fda-4619-a86a-36c809b14446\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7bpgr"
Apr 16 14:31:47.289484 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.289352 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/82558a12-6f14-43c9-848d-c3d942592e1d-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-7gxz5\" (UID: \"82558a12-6f14-43c9-848d-c3d942592e1d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7gxz5"
Apr 16 14:31:47.289484 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:47.289374 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 14:31:47.289484 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:47.289445 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e2f0c20-f27f-4dee-a225-e224c0239ba4-cluster-monitoring-operator-tls podName:9e2f0c20-f27f-4dee-a225-e224c0239ba4 nodeName:}" failed. No retries permitted until 2026-04-16 14:31:47.789423895 +0000 UTC m=+123.616134295 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9e2f0c20-f27f-4dee-a225-e224c0239ba4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-985g6" (UID: "9e2f0c20-f27f-4dee-a225-e224c0239ba4") : secret "cluster-monitoring-operator-tls" not found
Apr 16 14:31:47.289673 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.289379 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05104d65-51f0-434e-945d-2714c95daadd-serving-cert\") pod \"insights-operator-5785d4fcdd-cnkw9\" (UID: \"05104d65-51f0-434e-945d-2714c95daadd\") " pod="openshift-insights/insights-operator-5785d4fcdd-cnkw9"
Apr 16 14:31:47.289728 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.289711 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05104d65-51f0-434e-945d-2714c95daadd-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-cnkw9\" (UID: \"05104d65-51f0-434e-945d-2714c95daadd\") " pod="openshift-insights/insights-operator-5785d4fcdd-cnkw9"
Apr 16 14:31:47.289778 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.289749 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w65cn\" (UniqueName: \"kubernetes.io/projected/9e2f0c20-f27f-4dee-a225-e224c0239ba4-kube-api-access-w65cn\") pod \"cluster-monitoring-operator-6667474d89-985g6\" (UID: \"9e2f0c20-f27f-4dee-a225-e224c0239ba4\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-985g6"
Apr 16 14:31:47.289833 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.289801 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nsngd\" (UniqueName: \"kubernetes.io/projected/05104d65-51f0-434e-945d-2714c95daadd-kube-api-access-nsngd\") pod \"insights-operator-5785d4fcdd-cnkw9\" (UID: \"05104d65-51f0-434e-945d-2714c95daadd\") " pod="openshift-insights/insights-operator-5785d4fcdd-cnkw9"
Apr 16 14:31:47.289995 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.289835 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/05104d65-51f0-434e-945d-2714c95daadd-snapshots\") pod \"insights-operator-5785d4fcdd-cnkw9\" (UID: \"05104d65-51f0-434e-945d-2714c95daadd\") " pod="openshift-insights/insights-operator-5785d4fcdd-cnkw9"
Apr 16 14:31:47.289995 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.289877 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/9e2f0c20-f27f-4dee-a225-e224c0239ba4-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-985g6\" (UID: \"9e2f0c20-f27f-4dee-a225-e224c0239ba4\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-985g6"
Apr 16 14:31:47.289995 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.289899 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/05104d65-51f0-434e-945d-2714c95daadd-tmp\") pod \"insights-operator-5785d4fcdd-cnkw9\" (UID: \"05104d65-51f0-434e-945d-2714c95daadd\") " pod="openshift-insights/insights-operator-5785d4fcdd-cnkw9"
Apr 16 14:31:47.289995 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.289929 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05104d65-51f0-434e-945d-2714c95daadd-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-cnkw9\" (UID: \"05104d65-51f0-434e-945d-2714c95daadd\") " pod="openshift-insights/insights-operator-5785d4fcdd-cnkw9"
Apr 16 14:31:47.290333 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:47.290212 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 14:31:47.290333 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:47.290303 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82558a12-6f14-43c9-848d-c3d942592e1d-samples-operator-tls podName:82558a12-6f14-43c9-848d-c3d942592e1d nodeName:}" failed. No retries permitted until 2026-04-16 14:31:47.790284829 +0000 UTC m=+123.616995229 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/82558a12-6f14-43c9-848d-c3d942592e1d-samples-operator-tls") pod "cluster-samples-operator-667775844f-7gxz5" (UID: "82558a12-6f14-43c9-848d-c3d942592e1d") : secret "samples-operator-tls" not found
Apr 16 14:31:47.290747 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.290691 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05104d65-51f0-434e-945d-2714c95daadd-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-cnkw9\" (UID: \"05104d65-51f0-434e-945d-2714c95daadd\") " pod="openshift-insights/insights-operator-5785d4fcdd-cnkw9"
Apr 16 14:31:47.290848 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.290832 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/05104d65-51f0-434e-945d-2714c95daadd-snapshots\") pod \"insights-operator-5785d4fcdd-cnkw9\" (UID: \"05104d65-51f0-434e-945d-2714c95daadd\") " pod="openshift-insights/insights-operator-5785d4fcdd-cnkw9"
Apr 16 14:31:47.292417 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.292397 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7f7fd7cc4b-cbgrm"]
Apr 16 14:31:47.293299 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.293279 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05104d65-51f0-434e-945d-2714c95daadd-serving-cert\") pod \"insights-operator-5785d4fcdd-cnkw9\" (UID: \"05104d65-51f0-434e-945d-2714c95daadd\") " pod="openshift-insights/insights-operator-5785d4fcdd-cnkw9"
Apr 16 14:31:47.305820 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.305796 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cd7h\" (UniqueName: \"kubernetes.io/projected/82558a12-6f14-43c9-848d-c3d942592e1d-kube-api-access-8cd7h\") pod \"cluster-samples-operator-667775844f-7gxz5\" (UID: \"82558a12-6f14-43c9-848d-c3d942592e1d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7gxz5"
Apr 16 14:31:47.305942 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.305897 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w65cn\" (UniqueName: \"kubernetes.io/projected/9e2f0c20-f27f-4dee-a225-e224c0239ba4-kube-api-access-w65cn\") pod \"cluster-monitoring-operator-6667474d89-985g6\" (UID: \"9e2f0c20-f27f-4dee-a225-e224c0239ba4\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-985g6"
Apr 16 14:31:47.308548 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.308514 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsngd\" (UniqueName: \"kubernetes.io/projected/05104d65-51f0-434e-945d-2714c95daadd-kube-api-access-nsngd\") pod \"insights-operator-5785d4fcdd-cnkw9\" (UID: \"05104d65-51f0-434e-945d-2714c95daadd\") " pod="openshift-insights/insights-operator-5785d4fcdd-cnkw9"
Apr 16 14:31:47.390590 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.390552 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kcn29\" (UniqueName: \"kubernetes.io/projected/1a2c84e9-4fda-4619-a86a-36c809b14446-kube-api-access-kcn29\") pod \"service-ca-operator-69965bb79d-7bpgr\" (UID: \"1a2c84e9-4fda-4619-a86a-36c809b14446\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7bpgr"
Apr 16 14:31:47.390778 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.390596 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/93dee602-e658-4746-861c-192789af4e9c-default-certificate\") pod \"router-default-7f7fd7cc4b-cbgrm\" (UID: \"93dee602-e658-4746-861c-192789af4e9c\") " pod="openshift-ingress/router-default-7f7fd7cc4b-cbgrm"
Apr 16 14:31:47.390778 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.390698 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93dee602-e658-4746-861c-192789af4e9c-metrics-certs\") pod \"router-default-7f7fd7cc4b-cbgrm\" (UID: \"93dee602-e658-4746-861c-192789af4e9c\") " pod="openshift-ingress/router-default-7f7fd7cc4b-cbgrm"
Apr 16 14:31:47.390778 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.390717 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93dee602-e658-4746-861c-192789af4e9c-service-ca-bundle\") pod \"router-default-7f7fd7cc4b-cbgrm\" (UID: \"93dee602-e658-4746-861c-192789af4e9c\") " pod="openshift-ingress/router-default-7f7fd7cc4b-cbgrm"
Apr 16 14:31:47.390778 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.390737 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a2c84e9-4fda-4619-a86a-36c809b14446-config\") pod \"service-ca-operator-69965bb79d-7bpgr\" (UID: \"1a2c84e9-4fda-4619-a86a-36c809b14446\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7bpgr"
Apr 16 14:31:47.390958 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.390850 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/93dee602-e658-4746-861c-192789af4e9c-stats-auth\") pod \"router-default-7f7fd7cc4b-cbgrm\" (UID: \"93dee602-e658-4746-861c-192789af4e9c\") " pod="openshift-ingress/router-default-7f7fd7cc4b-cbgrm"
Apr 16 14:31:47.390958 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.390889 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a2c84e9-4fda-4619-a86a-36c809b14446-serving-cert\") pod \"service-ca-operator-69965bb79d-7bpgr\" (UID: \"1a2c84e9-4fda-4619-a86a-36c809b14446\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7bpgr"
Apr 16 14:31:47.390958 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.390918 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1ee0bbf-24fd-4084-b536-8d53b20944b9-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-d8v8k\" (UID: \"d1ee0bbf-24fd-4084-b536-8d53b20944b9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-d8v8k"
Apr 16 14:31:47.390958 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.390955 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p4b6\" (UniqueName: \"kubernetes.io/projected/93dee602-e658-4746-861c-192789af4e9c-kube-api-access-4p4b6\") pod \"router-default-7f7fd7cc4b-cbgrm\" (UID: \"93dee602-e658-4746-861c-192789af4e9c\") " pod="openshift-ingress/router-default-7f7fd7cc4b-cbgrm"
Apr 16 14:31:47.391108 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.391009 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1ee0bbf-24fd-4084-b536-8d53b20944b9-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-d8v8k\" (UID: \"d1ee0bbf-24fd-4084-b536-8d53b20944b9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-d8v8k"
Apr 16 14:31:47.391108 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.391034 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8467\" (UniqueName: \"kubernetes.io/projected/d1ee0bbf-24fd-4084-b536-8d53b20944b9-kube-api-access-t8467\") pod \"kube-storage-version-migrator-operator-756bb7d76f-d8v8k\" (UID: \"d1ee0bbf-24fd-4084-b536-8d53b20944b9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-d8v8k"
Apr 16 14:31:47.391749 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.391726 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a2c84e9-4fda-4619-a86a-36c809b14446-config\") pod \"service-ca-operator-69965bb79d-7bpgr\" (UID: \"1a2c84e9-4fda-4619-a86a-36c809b14446\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7bpgr"
Apr 16 14:31:47.393297 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.393272 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a2c84e9-4fda-4619-a86a-36c809b14446-serving-cert\") pod \"service-ca-operator-69965bb79d-7bpgr\" (UID: \"1a2c84e9-4fda-4619-a86a-36c809b14446\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7bpgr"
Apr 16 14:31:47.400176 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.400151 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcn29\" (UniqueName: \"kubernetes.io/projected/1a2c84e9-4fda-4619-a86a-36c809b14446-kube-api-access-kcn29\") pod \"service-ca-operator-69965bb79d-7bpgr\" (UID: \"1a2c84e9-4fda-4619-a86a-36c809b14446\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7bpgr"
Apr 16 14:31:47.427289 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.427253 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-cnkw9"
Apr 16 14:31:47.492080 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.492051 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93dee602-e658-4746-861c-192789af4e9c-service-ca-bundle\") pod \"router-default-7f7fd7cc4b-cbgrm\" (UID: \"93dee602-e658-4746-861c-192789af4e9c\") " pod="openshift-ingress/router-default-7f7fd7cc4b-cbgrm"
Apr 16 14:31:47.492257 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.492200 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/93dee602-e658-4746-861c-192789af4e9c-stats-auth\") pod \"router-default-7f7fd7cc4b-cbgrm\" (UID: \"93dee602-e658-4746-861c-192789af4e9c\") " pod="openshift-ingress/router-default-7f7fd7cc4b-cbgrm"
Apr 16 14:31:47.492257 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.492231 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1ee0bbf-24fd-4084-b536-8d53b20944b9-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-d8v8k\" (UID: \"d1ee0bbf-24fd-4084-b536-8d53b20944b9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-d8v8k"
Apr 16 14:31:47.492384 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:47.492269 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/93dee602-e658-4746-861c-192789af4e9c-service-ca-bundle podName:93dee602-e658-4746-861c-192789af4e9c nodeName:}" failed. No retries permitted until 2026-04-16 14:31:47.992242657 +0000 UTC m=+123.818953073 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/93dee602-e658-4746-861c-192789af4e9c-service-ca-bundle") pod "router-default-7f7fd7cc4b-cbgrm" (UID: "93dee602-e658-4746-861c-192789af4e9c") : configmap references non-existent config key: service-ca.crt
Apr 16 14:31:47.492384 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.492339 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4p4b6\" (UniqueName: \"kubernetes.io/projected/93dee602-e658-4746-861c-192789af4e9c-kube-api-access-4p4b6\") pod \"router-default-7f7fd7cc4b-cbgrm\" (UID: \"93dee602-e658-4746-861c-192789af4e9c\") " pod="openshift-ingress/router-default-7f7fd7cc4b-cbgrm"
Apr 16 14:31:47.492492 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.492411 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1ee0bbf-24fd-4084-b536-8d53b20944b9-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-d8v8k\" (UID: \"d1ee0bbf-24fd-4084-b536-8d53b20944b9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-d8v8k"
Apr 16 14:31:47.492492 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.492454 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t8467\" (UniqueName: \"kubernetes.io/projected/d1ee0bbf-24fd-4084-b536-8d53b20944b9-kube-api-access-t8467\") pod \"kube-storage-version-migrator-operator-756bb7d76f-d8v8k\" (UID: \"d1ee0bbf-24fd-4084-b536-8d53b20944b9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-d8v8k"
Apr 16 14:31:47.492613 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.492498 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/93dee602-e658-4746-861c-192789af4e9c-default-certificate\") pod \"router-default-7f7fd7cc4b-cbgrm\" (UID: \"93dee602-e658-4746-861c-192789af4e9c\") " pod="openshift-ingress/router-default-7f7fd7cc4b-cbgrm"
Apr 16 14:31:47.492613 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.492549 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93dee602-e658-4746-861c-192789af4e9c-metrics-certs\") pod \"router-default-7f7fd7cc4b-cbgrm\" (UID: \"93dee602-e658-4746-861c-192789af4e9c\") " pod="openshift-ingress/router-default-7f7fd7cc4b-cbgrm"
Apr 16 14:31:47.492713 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:47.492669 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 14:31:47.492771 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:47.492721 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93dee602-e658-4746-861c-192789af4e9c-metrics-certs podName:93dee602-e658-4746-861c-192789af4e9c nodeName:}" failed. No retries permitted until 2026-04-16 14:31:47.992703719 +0000 UTC m=+123.819414116 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93dee602-e658-4746-861c-192789af4e9c-metrics-certs") pod "router-default-7f7fd7cc4b-cbgrm" (UID: "93dee602-e658-4746-861c-192789af4e9c") : secret "router-metrics-certs-default" not found
Apr 16 14:31:47.492820 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.492801 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1ee0bbf-24fd-4084-b536-8d53b20944b9-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-d8v8k\" (UID: \"d1ee0bbf-24fd-4084-b536-8d53b20944b9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-d8v8k"
Apr 16 14:31:47.495010 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.494978 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1ee0bbf-24fd-4084-b536-8d53b20944b9-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-d8v8k\" (UID: \"d1ee0bbf-24fd-4084-b536-8d53b20944b9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-d8v8k"
Apr 16 14:31:47.495147 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.495019 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/93dee602-e658-4746-861c-192789af4e9c-stats-auth\") pod \"router-default-7f7fd7cc4b-cbgrm\" (UID: \"93dee602-e658-4746-861c-192789af4e9c\") " pod="openshift-ingress/router-default-7f7fd7cc4b-cbgrm"
Apr 16 14:31:47.495147 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.495135 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/93dee602-e658-4746-861c-192789af4e9c-default-certificate\") pod \"router-default-7f7fd7cc4b-cbgrm\" (UID: \"93dee602-e658-4746-861c-192789af4e9c\") " pod="openshift-ingress/router-default-7f7fd7cc4b-cbgrm"
Apr 16 14:31:47.501933 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.501901 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8467\" (UniqueName: \"kubernetes.io/projected/d1ee0bbf-24fd-4084-b536-8d53b20944b9-kube-api-access-t8467\") pod \"kube-storage-version-migrator-operator-756bb7d76f-d8v8k\" (UID: \"d1ee0bbf-24fd-4084-b536-8d53b20944b9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-d8v8k"
Apr 16 14:31:47.502101 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.502058 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p4b6\" (UniqueName: \"kubernetes.io/projected/93dee602-e658-4746-861c-192789af4e9c-kube-api-access-4p4b6\") pod \"router-default-7f7fd7cc4b-cbgrm\" (UID: \"93dee602-e658-4746-861c-192789af4e9c\") " pod="openshift-ingress/router-default-7f7fd7cc4b-cbgrm"
Apr 16 14:31:47.506869 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.506844 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7bpgr"
Apr 16 14:31:47.546876 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.546839 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-cnkw9"]
Apr 16 14:31:47.549886 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:31:47.549858 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05104d65_51f0_434e_945d_2714c95daadd.slice/crio-f69134f87002d72ffe7ca1886e2b5da162b4a5c581ee36407fe895e268f01ba7 WatchSource:0}: Error finding container f69134f87002d72ffe7ca1886e2b5da162b4a5c581ee36407fe895e268f01ba7: Status 404 returned error can't find the container with id f69134f87002d72ffe7ca1886e2b5da162b4a5c581ee36407fe895e268f01ba7
Apr 16 14:31:47.587828 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.587803 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-d8v8k"
Apr 16 14:31:47.626176 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.626112 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-7bpgr"]
Apr 16 14:31:47.630589 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:31:47.630560 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a2c84e9_4fda_4619_a86a_36c809b14446.slice/crio-e0d7519d63941226929ed70fec825d5e53905180c594e5b85c18297df4fb891e WatchSource:0}: Error finding container e0d7519d63941226929ed70fec825d5e53905180c594e5b85c18297df4fb891e: Status 404 returned error can't find the container with id e0d7519d63941226929ed70fec825d5e53905180c594e5b85c18297df4fb891e
Apr 16 14:31:47.706638 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.706603 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-d8v8k"]
Apr 16 14:31:47.709319 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:31:47.709291 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1ee0bbf_24fd_4084_b536_8d53b20944b9.slice/crio-635aeffd69f506684bb10eab4dc42889ce92fe145fa122f12a6a3fdb4778e428 WatchSource:0}: Error finding container 635aeffd69f506684bb10eab4dc42889ce92fe145fa122f12a6a3fdb4778e428: Status 404 returned error can't find the container with id 635aeffd69f506684bb10eab4dc42889ce92fe145fa122f12a6a3fdb4778e428
Apr 16 14:31:47.795853 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.795820 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9e2f0c20-f27f-4dee-a225-e224c0239ba4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-985g6\" (UID: \"9e2f0c20-f27f-4dee-a225-e224c0239ba4\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-985g6"
Apr 16 14:31:47.796020 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.795934 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/82558a12-6f14-43c9-848d-c3d942592e1d-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-7gxz5\" (UID: \"82558a12-6f14-43c9-848d-c3d942592e1d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7gxz5"
Apr 16 14:31:47.796020 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:47.795971 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 14:31:47.796089 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:47.796025 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 14:31:47.796089 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:47.796042 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e2f0c20-f27f-4dee-a225-e224c0239ba4-cluster-monitoring-operator-tls podName:9e2f0c20-f27f-4dee-a225-e224c0239ba4 nodeName:}" failed. No retries permitted until 2026-04-16 14:31:48.796025754 +0000 UTC m=+124.622736153 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9e2f0c20-f27f-4dee-a225-e224c0239ba4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-985g6" (UID: "9e2f0c20-f27f-4dee-a225-e224c0239ba4") : secret "cluster-monitoring-operator-tls" not found
Apr 16 14:31:47.796089 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:47.796073 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82558a12-6f14-43c9-848d-c3d942592e1d-samples-operator-tls podName:82558a12-6f14-43c9-848d-c3d942592e1d nodeName:}" failed. No retries permitted until 2026-04-16 14:31:48.796066619 +0000 UTC m=+124.622777014 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/82558a12-6f14-43c9-848d-c3d942592e1d-samples-operator-tls") pod "cluster-samples-operator-667775844f-7gxz5" (UID: "82558a12-6f14-43c9-848d-c3d942592e1d") : secret "samples-operator-tls" not found Apr 16 14:31:47.998464 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.998364 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93dee602-e658-4746-861c-192789af4e9c-metrics-certs\") pod \"router-default-7f7fd7cc4b-cbgrm\" (UID: \"93dee602-e658-4746-861c-192789af4e9c\") " pod="openshift-ingress/router-default-7f7fd7cc4b-cbgrm" Apr 16 14:31:47.998464 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:47.998419 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 14:31:47.998708 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:47.998492 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93dee602-e658-4746-861c-192789af4e9c-metrics-certs podName:93dee602-e658-4746-861c-192789af4e9c nodeName:}" failed. No retries permitted until 2026-04-16 14:31:48.998470971 +0000 UTC m=+124.825181366 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93dee602-e658-4746-861c-192789af4e9c-metrics-certs") pod "router-default-7f7fd7cc4b-cbgrm" (UID: "93dee602-e658-4746-861c-192789af4e9c") : secret "router-metrics-certs-default" not found Apr 16 14:31:47.998708 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:47.998425 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93dee602-e658-4746-861c-192789af4e9c-service-ca-bundle\") pod \"router-default-7f7fd7cc4b-cbgrm\" (UID: \"93dee602-e658-4746-861c-192789af4e9c\") " pod="openshift-ingress/router-default-7f7fd7cc4b-cbgrm" Apr 16 14:31:47.998708 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:47.998524 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/93dee602-e658-4746-861c-192789af4e9c-service-ca-bundle podName:93dee602-e658-4746-861c-192789af4e9c nodeName:}" failed. No retries permitted until 2026-04-16 14:31:48.998508902 +0000 UTC m=+124.825219300 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/93dee602-e658-4746-861c-192789af4e9c-service-ca-bundle") pod "router-default-7f7fd7cc4b-cbgrm" (UID: "93dee602-e658-4746-861c-192789af4e9c") : configmap references non-existent config key: service-ca.crt Apr 16 14:31:48.208881 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:48.208823 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-d8v8k" event={"ID":"d1ee0bbf-24fd-4084-b536-8d53b20944b9","Type":"ContainerStarted","Data":"635aeffd69f506684bb10eab4dc42889ce92fe145fa122f12a6a3fdb4778e428"} Apr 16 14:31:48.210123 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:48.210093 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7bpgr" event={"ID":"1a2c84e9-4fda-4619-a86a-36c809b14446","Type":"ContainerStarted","Data":"e0d7519d63941226929ed70fec825d5e53905180c594e5b85c18297df4fb891e"} Apr 16 14:31:48.211517 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:48.211473 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-cnkw9" event={"ID":"05104d65-51f0-434e-945d-2714c95daadd","Type":"ContainerStarted","Data":"f69134f87002d72ffe7ca1886e2b5da162b4a5c581ee36407fe895e268f01ba7"} Apr 16 14:31:48.807145 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:48.806636 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9e2f0c20-f27f-4dee-a225-e224c0239ba4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-985g6\" (UID: \"9e2f0c20-f27f-4dee-a225-e224c0239ba4\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-985g6" Apr 16 14:31:48.807145 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:48.806803 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/82558a12-6f14-43c9-848d-c3d942592e1d-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-7gxz5\" (UID: \"82558a12-6f14-43c9-848d-c3d942592e1d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7gxz5" Apr 16 14:31:48.807145 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:48.806967 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 14:31:48.807145 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:48.806986 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 14:31:48.807145 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:48.807048 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82558a12-6f14-43c9-848d-c3d942592e1d-samples-operator-tls podName:82558a12-6f14-43c9-848d-c3d942592e1d nodeName:}" failed. No retries permitted until 2026-04-16 14:31:50.807027895 +0000 UTC m=+126.633738305 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/82558a12-6f14-43c9-848d-c3d942592e1d-samples-operator-tls") pod "cluster-samples-operator-667775844f-7gxz5" (UID: "82558a12-6f14-43c9-848d-c3d942592e1d") : secret "samples-operator-tls" not found Apr 16 14:31:48.807145 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:48.807069 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e2f0c20-f27f-4dee-a225-e224c0239ba4-cluster-monitoring-operator-tls podName:9e2f0c20-f27f-4dee-a225-e224c0239ba4 nodeName:}" failed. No retries permitted until 2026-04-16 14:31:50.807059382 +0000 UTC m=+126.633769778 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9e2f0c20-f27f-4dee-a225-e224c0239ba4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-985g6" (UID: "9e2f0c20-f27f-4dee-a225-e224c0239ba4") : secret "cluster-monitoring-operator-tls" not found Apr 16 14:31:49.009726 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:49.009680 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93dee602-e658-4746-861c-192789af4e9c-metrics-certs\") pod \"router-default-7f7fd7cc4b-cbgrm\" (UID: \"93dee602-e658-4746-861c-192789af4e9c\") " pod="openshift-ingress/router-default-7f7fd7cc4b-cbgrm" Apr 16 14:31:49.009903 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:49.009737 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93dee602-e658-4746-861c-192789af4e9c-service-ca-bundle\") pod \"router-default-7f7fd7cc4b-cbgrm\" (UID: \"93dee602-e658-4746-861c-192789af4e9c\") " pod="openshift-ingress/router-default-7f7fd7cc4b-cbgrm" Apr 16 14:31:49.009903 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:49.009838 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 14:31:49.009995 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:49.009912 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93dee602-e658-4746-861c-192789af4e9c-metrics-certs podName:93dee602-e658-4746-861c-192789af4e9c nodeName:}" failed. No retries permitted until 2026-04-16 14:31:51.009892956 +0000 UTC m=+126.836603364 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93dee602-e658-4746-861c-192789af4e9c-metrics-certs") pod "router-default-7f7fd7cc4b-cbgrm" (UID: "93dee602-e658-4746-861c-192789af4e9c") : secret "router-metrics-certs-default" not found Apr 16 14:31:49.009995 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:49.009932 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/93dee602-e658-4746-861c-192789af4e9c-service-ca-bundle podName:93dee602-e658-4746-861c-192789af4e9c nodeName:}" failed. No retries permitted until 2026-04-16 14:31:51.009922134 +0000 UTC m=+126.836632543 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/93dee602-e658-4746-861c-192789af4e9c-service-ca-bundle") pod "router-default-7f7fd7cc4b-cbgrm" (UID: "93dee602-e658-4746-861c-192789af4e9c") : configmap references non-existent config key: service-ca.crt Apr 16 14:31:50.827906 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:50.827862 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/82558a12-6f14-43c9-848d-c3d942592e1d-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-7gxz5\" (UID: \"82558a12-6f14-43c9-848d-c3d942592e1d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7gxz5" Apr 16 14:31:50.828320 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:50.827956 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9e2f0c20-f27f-4dee-a225-e224c0239ba4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-985g6\" (UID: \"9e2f0c20-f27f-4dee-a225-e224c0239ba4\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-985g6" Apr 16 14:31:50.828320 ip-10-0-128-173 kubenswrapper[2576]: 
E0416 14:31:50.828038 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 14:31:50.828320 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:50.828089 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 14:31:50.828320 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:50.828123 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82558a12-6f14-43c9-848d-c3d942592e1d-samples-operator-tls podName:82558a12-6f14-43c9-848d-c3d942592e1d nodeName:}" failed. No retries permitted until 2026-04-16 14:31:54.828101695 +0000 UTC m=+130.654812095 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/82558a12-6f14-43c9-848d-c3d942592e1d-samples-operator-tls") pod "cluster-samples-operator-667775844f-7gxz5" (UID: "82558a12-6f14-43c9-848d-c3d942592e1d") : secret "samples-operator-tls" not found Apr 16 14:31:50.828320 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:50.828142 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e2f0c20-f27f-4dee-a225-e224c0239ba4-cluster-monitoring-operator-tls podName:9e2f0c20-f27f-4dee-a225-e224c0239ba4 nodeName:}" failed. No retries permitted until 2026-04-16 14:31:54.828133433 +0000 UTC m=+130.654843832 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9e2f0c20-f27f-4dee-a225-e224c0239ba4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-985g6" (UID: "9e2f0c20-f27f-4dee-a225-e224c0239ba4") : secret "cluster-monitoring-operator-tls" not found Apr 16 14:31:51.030345 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:51.030293 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93dee602-e658-4746-861c-192789af4e9c-metrics-certs\") pod \"router-default-7f7fd7cc4b-cbgrm\" (UID: \"93dee602-e658-4746-861c-192789af4e9c\") " pod="openshift-ingress/router-default-7f7fd7cc4b-cbgrm" Apr 16 14:31:51.030345 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:51.030351 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93dee602-e658-4746-861c-192789af4e9c-service-ca-bundle\") pod \"router-default-7f7fd7cc4b-cbgrm\" (UID: \"93dee602-e658-4746-861c-192789af4e9c\") " pod="openshift-ingress/router-default-7f7fd7cc4b-cbgrm" Apr 16 14:31:51.030698 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:51.030491 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 14:31:51.030698 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:51.030526 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/93dee602-e658-4746-861c-192789af4e9c-service-ca-bundle podName:93dee602-e658-4746-861c-192789af4e9c nodeName:}" failed. No retries permitted until 2026-04-16 14:31:55.030507328 +0000 UTC m=+130.857217774 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/93dee602-e658-4746-861c-192789af4e9c-service-ca-bundle") pod "router-default-7f7fd7cc4b-cbgrm" (UID: "93dee602-e658-4746-861c-192789af4e9c") : configmap references non-existent config key: service-ca.crt Apr 16 14:31:51.030698 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:51.030579 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93dee602-e658-4746-861c-192789af4e9c-metrics-certs podName:93dee602-e658-4746-861c-192789af4e9c nodeName:}" failed. No retries permitted until 2026-04-16 14:31:55.030561715 +0000 UTC m=+130.857272115 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93dee602-e658-4746-861c-192789af4e9c-metrics-certs") pod "router-default-7f7fd7cc4b-cbgrm" (UID: "93dee602-e658-4746-861c-192789af4e9c") : secret "router-metrics-certs-default" not found Apr 16 14:31:51.218978 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:51.218935 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7bpgr" event={"ID":"1a2c84e9-4fda-4619-a86a-36c809b14446","Type":"ContainerStarted","Data":"df277f6edc3268218d003789c83f5654e05d15a9d8ab55276722ce5196649c6a"} Apr 16 14:31:51.220476 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:51.220444 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-cnkw9" event={"ID":"05104d65-51f0-434e-945d-2714c95daadd","Type":"ContainerStarted","Data":"48018831df0279fa95aebed63289786cc06ffc8b44bc693b0c175be7d3805229"} Apr 16 14:31:51.221697 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:51.221671 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-d8v8k" 
event={"ID":"d1ee0bbf-24fd-4084-b536-8d53b20944b9","Type":"ContainerStarted","Data":"5008b15899279ced1093d41c0818f6e655a8cd1e422d65c83f1b5f0ea91a3b3b"} Apr 16 14:31:51.238950 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:51.238874 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7bpgr" podStartSLOduration=1.66159158 podStartE2EDuration="4.238859162s" podCreationTimestamp="2026-04-16 14:31:47 +0000 UTC" firstStartedPulling="2026-04-16 14:31:47.63232128 +0000 UTC m=+123.459031688" lastFinishedPulling="2026-04-16 14:31:50.209588858 +0000 UTC m=+126.036299270" observedRunningTime="2026-04-16 14:31:51.238147176 +0000 UTC m=+127.064857593" watchObservedRunningTime="2026-04-16 14:31:51.238859162 +0000 UTC m=+127.065569578" Apr 16 14:31:51.255050 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:51.254991 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-cnkw9" podStartSLOduration=1.600305438 podStartE2EDuration="4.254972441s" podCreationTimestamp="2026-04-16 14:31:47 +0000 UTC" firstStartedPulling="2026-04-16 14:31:47.552192259 +0000 UTC m=+123.378902654" lastFinishedPulling="2026-04-16 14:31:50.206859258 +0000 UTC m=+126.033569657" observedRunningTime="2026-04-16 14:31:51.254751797 +0000 UTC m=+127.081462215" watchObservedRunningTime="2026-04-16 14:31:51.254972441 +0000 UTC m=+127.081682871" Apr 16 14:31:51.280363 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:51.280306 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-d8v8k" podStartSLOduration=1.7773567209999999 podStartE2EDuration="4.280287837s" podCreationTimestamp="2026-04-16 14:31:47 +0000 UTC" firstStartedPulling="2026-04-16 14:31:47.711043185 +0000 UTC m=+123.537753581" lastFinishedPulling="2026-04-16 14:31:50.213974302 
+0000 UTC m=+126.040684697" observedRunningTime="2026-04-16 14:31:51.27979804 +0000 UTC m=+127.106508456" watchObservedRunningTime="2026-04-16 14:31:51.280287837 +0000 UTC m=+127.106998257" Apr 16 14:31:53.831508 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:53.831478 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-x7zbw_6079e191-6467-414e-9381-0ffd91e44ab4/dns-node-resolver/0.log" Apr 16 14:31:54.434703 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:54.434675 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-29nkz_1491249e-0f9d-4121-abb3-d99d4022d023/node-ca/0.log" Apr 16 14:31:54.563174 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:54.563121 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfa07533-d734-4829-bde0-6c0327bd79a9-metrics-certs\") pod \"network-metrics-daemon-kbtb7\" (UID: \"bfa07533-d734-4829-bde0-6c0327bd79a9\") " pod="openshift-multus/network-metrics-daemon-kbtb7" Apr 16 14:31:54.563354 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:54.563274 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 14:31:54.563400 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:54.563365 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bfa07533-d734-4829-bde0-6c0327bd79a9-metrics-certs podName:bfa07533-d734-4829-bde0-6c0327bd79a9 nodeName:}" failed. No retries permitted until 2026-04-16 14:33:56.563348892 +0000 UTC m=+252.390059291 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bfa07533-d734-4829-bde0-6c0327bd79a9-metrics-certs") pod "network-metrics-daemon-kbtb7" (UID: "bfa07533-d734-4829-bde0-6c0327bd79a9") : secret "metrics-daemon-secret" not found Apr 16 14:31:54.865848 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:54.865811 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/82558a12-6f14-43c9-848d-c3d942592e1d-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-7gxz5\" (UID: \"82558a12-6f14-43c9-848d-c3d942592e1d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7gxz5" Apr 16 14:31:54.866223 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:54.865900 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9e2f0c20-f27f-4dee-a225-e224c0239ba4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-985g6\" (UID: \"9e2f0c20-f27f-4dee-a225-e224c0239ba4\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-985g6" Apr 16 14:31:54.866223 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:54.865967 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 14:31:54.866223 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:54.866033 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82558a12-6f14-43c9-848d-c3d942592e1d-samples-operator-tls podName:82558a12-6f14-43c9-848d-c3d942592e1d nodeName:}" failed. No retries permitted until 2026-04-16 14:32:02.866014621 +0000 UTC m=+138.692725025 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/82558a12-6f14-43c9-848d-c3d942592e1d-samples-operator-tls") pod "cluster-samples-operator-667775844f-7gxz5" (UID: "82558a12-6f14-43c9-848d-c3d942592e1d") : secret "samples-operator-tls" not found Apr 16 14:31:54.866223 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:54.866040 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 14:31:54.866223 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:54.866105 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e2f0c20-f27f-4dee-a225-e224c0239ba4-cluster-monitoring-operator-tls podName:9e2f0c20-f27f-4dee-a225-e224c0239ba4 nodeName:}" failed. No retries permitted until 2026-04-16 14:32:02.866091136 +0000 UTC m=+138.692801530 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9e2f0c20-f27f-4dee-a225-e224c0239ba4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-985g6" (UID: "9e2f0c20-f27f-4dee-a225-e224c0239ba4") : secret "cluster-monitoring-operator-tls" not found Apr 16 14:31:55.067512 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:55.067471 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93dee602-e658-4746-861c-192789af4e9c-metrics-certs\") pod \"router-default-7f7fd7cc4b-cbgrm\" (UID: \"93dee602-e658-4746-861c-192789af4e9c\") " pod="openshift-ingress/router-default-7f7fd7cc4b-cbgrm" Apr 16 14:31:55.067512 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:31:55.067512 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93dee602-e658-4746-861c-192789af4e9c-service-ca-bundle\") pod 
\"router-default-7f7fd7cc4b-cbgrm\" (UID: \"93dee602-e658-4746-861c-192789af4e9c\") " pod="openshift-ingress/router-default-7f7fd7cc4b-cbgrm" Apr 16 14:31:55.067714 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:55.067642 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 14:31:55.067714 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:55.067697 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/93dee602-e658-4746-861c-192789af4e9c-service-ca-bundle podName:93dee602-e658-4746-861c-192789af4e9c nodeName:}" failed. No retries permitted until 2026-04-16 14:32:03.067680215 +0000 UTC m=+138.894390614 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/93dee602-e658-4746-861c-192789af4e9c-service-ca-bundle") pod "router-default-7f7fd7cc4b-cbgrm" (UID: "93dee602-e658-4746-861c-192789af4e9c") : configmap references non-existent config key: service-ca.crt Apr 16 14:31:55.067714 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:31:55.067715 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93dee602-e658-4746-861c-192789af4e9c-metrics-certs podName:93dee602-e658-4746-861c-192789af4e9c nodeName:}" failed. No retries permitted until 2026-04-16 14:32:03.067707922 +0000 UTC m=+138.894418317 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93dee602-e658-4746-861c-192789af4e9c-metrics-certs") pod "router-default-7f7fd7cc4b-cbgrm" (UID: "93dee602-e658-4746-861c-192789af4e9c") : secret "router-metrics-certs-default" not found Apr 16 14:32:02.932968 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:02.932922 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9e2f0c20-f27f-4dee-a225-e224c0239ba4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-985g6\" (UID: \"9e2f0c20-f27f-4dee-a225-e224c0239ba4\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-985g6" Apr 16 14:32:02.933362 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:02.933023 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/82558a12-6f14-43c9-848d-c3d942592e1d-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-7gxz5\" (UID: \"82558a12-6f14-43c9-848d-c3d942592e1d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7gxz5" Apr 16 14:32:02.933362 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:32:02.933070 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 14:32:02.933362 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:32:02.933200 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e2f0c20-f27f-4dee-a225-e224c0239ba4-cluster-monitoring-operator-tls podName:9e2f0c20-f27f-4dee-a225-e224c0239ba4 nodeName:}" failed. No retries permitted until 2026-04-16 14:32:18.933174106 +0000 UTC m=+154.759884501 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/9e2f0c20-f27f-4dee-a225-e224c0239ba4-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-985g6" (UID: "9e2f0c20-f27f-4dee-a225-e224c0239ba4") : secret "cluster-monitoring-operator-tls" not found
Apr 16 14:32:02.936015 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:02.935990 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/82558a12-6f14-43c9-848d-c3d942592e1d-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-7gxz5\" (UID: \"82558a12-6f14-43c9-848d-c3d942592e1d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7gxz5"
Apr 16 14:32:02.996363 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:02.996327 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7gxz5"
Apr 16 14:32:03.112629 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:03.112607 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7gxz5"]
Apr 16 14:32:03.135528 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:03.135501 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93dee602-e658-4746-861c-192789af4e9c-metrics-certs\") pod \"router-default-7f7fd7cc4b-cbgrm\" (UID: \"93dee602-e658-4746-861c-192789af4e9c\") " pod="openshift-ingress/router-default-7f7fd7cc4b-cbgrm"
Apr 16 14:32:03.135669 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:03.135555 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93dee602-e658-4746-861c-192789af4e9c-service-ca-bundle\") pod \"router-default-7f7fd7cc4b-cbgrm\" (UID: \"93dee602-e658-4746-861c-192789af4e9c\") " pod="openshift-ingress/router-default-7f7fd7cc4b-cbgrm"
Apr 16 14:32:03.135773 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:32:03.135758 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/93dee602-e658-4746-861c-192789af4e9c-service-ca-bundle podName:93dee602-e658-4746-861c-192789af4e9c nodeName:}" failed. No retries permitted until 2026-04-16 14:32:19.135740469 +0000 UTC m=+154.962450864 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/93dee602-e658-4746-861c-192789af4e9c-service-ca-bundle") pod "router-default-7f7fd7cc4b-cbgrm" (UID: "93dee602-e658-4746-861c-192789af4e9c") : configmap references non-existent config key: service-ca.crt
Apr 16 14:32:03.137790 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:03.137767 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93dee602-e658-4746-861c-192789af4e9c-metrics-certs\") pod \"router-default-7f7fd7cc4b-cbgrm\" (UID: \"93dee602-e658-4746-861c-192789af4e9c\") " pod="openshift-ingress/router-default-7f7fd7cc4b-cbgrm"
Apr 16 14:32:03.254423 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:03.254340 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7gxz5" event={"ID":"82558a12-6f14-43c9-848d-c3d942592e1d","Type":"ContainerStarted","Data":"0210d32088b61682c32403ae6d3356ed23617c8b3debc6cf3da60c47b130b6ed"}
Apr 16 14:32:05.261651 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:05.261608 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7gxz5" event={"ID":"82558a12-6f14-43c9-848d-c3d942592e1d","Type":"ContainerStarted","Data":"f656022ba1c1db2a8e17947547d7870df670ed009e85ecce6b38f194a106a41b"}
Apr 16 14:32:05.261651 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:05.261652 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7gxz5" event={"ID":"82558a12-6f14-43c9-848d-c3d942592e1d","Type":"ContainerStarted","Data":"220c3029df628f31c5fd50497f7274e1e9e2ef1729b090677ce0ac640e5dff01"}
Apr 16 14:32:05.280150 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:05.280099 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-7gxz5" podStartSLOduration=16.671440587 podStartE2EDuration="18.280080677s" podCreationTimestamp="2026-04-16 14:31:47 +0000 UTC" firstStartedPulling="2026-04-16 14:32:03.157004861 +0000 UTC m=+138.983715257" lastFinishedPulling="2026-04-16 14:32:04.76564495 +0000 UTC m=+140.592355347" observedRunningTime="2026-04-16 14:32:05.278929214 +0000 UTC m=+141.105639643" watchObservedRunningTime="2026-04-16 14:32:05.280080677 +0000 UTC m=+141.106791128"
Apr 16 14:32:15.518855 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.518815 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-8fb7x"]
Apr 16 14:32:15.520919 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.520902 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8fb7x"
Apr 16 14:32:15.528783 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.528751 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 14:32:15.529035 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.529004 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 14:32:15.529221 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.529207 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-k65jf\""
Apr 16 14:32:15.531644 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.531623 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7759f9cdf-skkws"]
Apr 16 14:32:15.533565 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.533550 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7759f9cdf-skkws"
Apr 16 14:32:15.537982 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.537963 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/426b5a99-d698-419c-b42f-63c90eadfa2b-bound-sa-token\") pod \"image-registry-7759f9cdf-skkws\" (UID: \"426b5a99-d698-419c-b42f-63c90eadfa2b\") " pod="openshift-image-registry/image-registry-7759f9cdf-skkws"
Apr 16 14:32:15.538074 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.537993 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/eb1c882f-bfac-4890-a59f-3f3d81e30081-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8fb7x\" (UID: \"eb1c882f-bfac-4890-a59f-3f3d81e30081\") " pod="openshift-insights/insights-runtime-extractor-8fb7x"
Apr 16 14:32:15.538074 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.538014 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b79gp\" (UniqueName: \"kubernetes.io/projected/426b5a99-d698-419c-b42f-63c90eadfa2b-kube-api-access-b79gp\") pod \"image-registry-7759f9cdf-skkws\" (UID: \"426b5a99-d698-419c-b42f-63c90eadfa2b\") " pod="openshift-image-registry/image-registry-7759f9cdf-skkws"
Apr 16 14:32:15.538188 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.538119 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/eb1c882f-bfac-4890-a59f-3f3d81e30081-crio-socket\") pod \"insights-runtime-extractor-8fb7x\" (UID: \"eb1c882f-bfac-4890-a59f-3f3d81e30081\") " pod="openshift-insights/insights-runtime-extractor-8fb7x"
Apr 16 14:32:15.538188 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.538156 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/426b5a99-d698-419c-b42f-63c90eadfa2b-registry-tls\") pod \"image-registry-7759f9cdf-skkws\" (UID: \"426b5a99-d698-419c-b42f-63c90eadfa2b\") " pod="openshift-image-registry/image-registry-7759f9cdf-skkws"
Apr 16 14:32:15.538188 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.538181 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/eb1c882f-bfac-4890-a59f-3f3d81e30081-data-volume\") pod \"insights-runtime-extractor-8fb7x\" (UID: \"eb1c882f-bfac-4890-a59f-3f3d81e30081\") " pod="openshift-insights/insights-runtime-extractor-8fb7x"
Apr 16 14:32:15.538343 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.538217 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/426b5a99-d698-419c-b42f-63c90eadfa2b-ca-trust-extracted\") pod \"image-registry-7759f9cdf-skkws\" (UID: \"426b5a99-d698-419c-b42f-63c90eadfa2b\") " pod="openshift-image-registry/image-registry-7759f9cdf-skkws"
Apr 16 14:32:15.538343 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.538246 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/426b5a99-d698-419c-b42f-63c90eadfa2b-installation-pull-secrets\") pod \"image-registry-7759f9cdf-skkws\" (UID: \"426b5a99-d698-419c-b42f-63c90eadfa2b\") " pod="openshift-image-registry/image-registry-7759f9cdf-skkws"
Apr 16 14:32:15.538343 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.538287 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/426b5a99-d698-419c-b42f-63c90eadfa2b-image-registry-private-configuration\") pod \"image-registry-7759f9cdf-skkws\" (UID: \"426b5a99-d698-419c-b42f-63c90eadfa2b\") " pod="openshift-image-registry/image-registry-7759f9cdf-skkws"
Apr 16 14:32:15.538343 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.538322 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/426b5a99-d698-419c-b42f-63c90eadfa2b-registry-certificates\") pod \"image-registry-7759f9cdf-skkws\" (UID: \"426b5a99-d698-419c-b42f-63c90eadfa2b\") " pod="openshift-image-registry/image-registry-7759f9cdf-skkws"
Apr 16 14:32:15.538558 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.538363 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/426b5a99-d698-419c-b42f-63c90eadfa2b-trusted-ca\") pod \"image-registry-7759f9cdf-skkws\" (UID: \"426b5a99-d698-419c-b42f-63c90eadfa2b\") " pod="openshift-image-registry/image-registry-7759f9cdf-skkws"
Apr 16 14:32:15.538558 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.538389 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/eb1c882f-bfac-4890-a59f-3f3d81e30081-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8fb7x\" (UID: \"eb1c882f-bfac-4890-a59f-3f3d81e30081\") " pod="openshift-insights/insights-runtime-extractor-8fb7x"
Apr 16 14:32:15.538558 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.538417 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4hq5\" (UniqueName: \"kubernetes.io/projected/eb1c882f-bfac-4890-a59f-3f3d81e30081-kube-api-access-f4hq5\") pod \"insights-runtime-extractor-8fb7x\" (UID: \"eb1c882f-bfac-4890-a59f-3f3d81e30081\") 
" pod="openshift-insights/insights-runtime-extractor-8fb7x"
Apr 16 14:32:15.540127 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.540111 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 14:32:15.543464 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.543444 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8fb7x"]
Apr 16 14:32:15.545716 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.545700 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-r52pv\""
Apr 16 14:32:15.547596 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.547580 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 14:32:15.557416 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.557394 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 14:32:15.559659 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.559642 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 14:32:15.564165 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.564144 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7759f9cdf-skkws"]
Apr 16 14:32:15.639507 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.639473 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/426b5a99-d698-419c-b42f-63c90eadfa2b-image-registry-private-configuration\") pod \"image-registry-7759f9cdf-skkws\" (UID: \"426b5a99-d698-419c-b42f-63c90eadfa2b\") " pod="openshift-image-registry/image-registry-7759f9cdf-skkws"
Apr 16 14:32:15.639695 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.639514 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/426b5a99-d698-419c-b42f-63c90eadfa2b-registry-certificates\") pod \"image-registry-7759f9cdf-skkws\" (UID: \"426b5a99-d698-419c-b42f-63c90eadfa2b\") " pod="openshift-image-registry/image-registry-7759f9cdf-skkws"
Apr 16 14:32:15.639695 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.639581 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/426b5a99-d698-419c-b42f-63c90eadfa2b-trusted-ca\") pod \"image-registry-7759f9cdf-skkws\" (UID: \"426b5a99-d698-419c-b42f-63c90eadfa2b\") " pod="openshift-image-registry/image-registry-7759f9cdf-skkws"
Apr 16 14:32:15.639695 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.639606 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/eb1c882f-bfac-4890-a59f-3f3d81e30081-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8fb7x\" (UID: \"eb1c882f-bfac-4890-a59f-3f3d81e30081\") " pod="openshift-insights/insights-runtime-extractor-8fb7x"
Apr 16 14:32:15.639695 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.639629 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f4hq5\" (UniqueName: \"kubernetes.io/projected/eb1c882f-bfac-4890-a59f-3f3d81e30081-kube-api-access-f4hq5\") pod \"insights-runtime-extractor-8fb7x\" (UID: \"eb1c882f-bfac-4890-a59f-3f3d81e30081\") " pod="openshift-insights/insights-runtime-extractor-8fb7x"
Apr 16 14:32:15.639695 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.639669 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/426b5a99-d698-419c-b42f-63c90eadfa2b-bound-sa-token\") pod \"image-registry-7759f9cdf-skkws\" (UID: \"426b5a99-d698-419c-b42f-63c90eadfa2b\") " pod="openshift-image-registry/image-registry-7759f9cdf-skkws"
Apr 16 14:32:15.639883 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.639706 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/eb1c882f-bfac-4890-a59f-3f3d81e30081-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8fb7x\" (UID: \"eb1c882f-bfac-4890-a59f-3f3d81e30081\") " pod="openshift-insights/insights-runtime-extractor-8fb7x"
Apr 16 14:32:15.639883 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.639762 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b79gp\" (UniqueName: \"kubernetes.io/projected/426b5a99-d698-419c-b42f-63c90eadfa2b-kube-api-access-b79gp\") pod \"image-registry-7759f9cdf-skkws\" (UID: \"426b5a99-d698-419c-b42f-63c90eadfa2b\") " pod="openshift-image-registry/image-registry-7759f9cdf-skkws"
Apr 16 14:32:15.639883 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.639869 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/eb1c882f-bfac-4890-a59f-3f3d81e30081-crio-socket\") pod \"insights-runtime-extractor-8fb7x\" (UID: \"eb1c882f-bfac-4890-a59f-3f3d81e30081\") " pod="openshift-insights/insights-runtime-extractor-8fb7x"
Apr 16 14:32:15.640050 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.639906 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/426b5a99-d698-419c-b42f-63c90eadfa2b-registry-tls\") pod \"image-registry-7759f9cdf-skkws\" (UID: \"426b5a99-d698-419c-b42f-63c90eadfa2b\") " pod="openshift-image-registry/image-registry-7759f9cdf-skkws"
Apr 16 14:32:15.640050 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.639931 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/eb1c882f-bfac-4890-a59f-3f3d81e30081-data-volume\") pod \"insights-runtime-extractor-8fb7x\" (UID: \"eb1c882f-bfac-4890-a59f-3f3d81e30081\") " pod="openshift-insights/insights-runtime-extractor-8fb7x"
Apr 16 14:32:15.640050 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.639971 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/426b5a99-d698-419c-b42f-63c90eadfa2b-ca-trust-extracted\") pod \"image-registry-7759f9cdf-skkws\" (UID: \"426b5a99-d698-419c-b42f-63c90eadfa2b\") " pod="openshift-image-registry/image-registry-7759f9cdf-skkws"
Apr 16 14:32:15.640050 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.640002 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/426b5a99-d698-419c-b42f-63c90eadfa2b-installation-pull-secrets\") pod \"image-registry-7759f9cdf-skkws\" (UID: \"426b5a99-d698-419c-b42f-63c90eadfa2b\") " pod="openshift-image-registry/image-registry-7759f9cdf-skkws"
Apr 16 14:32:15.640273 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.640155 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/eb1c882f-bfac-4890-a59f-3f3d81e30081-crio-socket\") pod \"insights-runtime-extractor-8fb7x\" (UID: \"eb1c882f-bfac-4890-a59f-3f3d81e30081\") " pod="openshift-insights/insights-runtime-extractor-8fb7x"
Apr 16 14:32:15.640273 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.640220 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/eb1c882f-bfac-4890-a59f-3f3d81e30081-kube-rbac-proxy-cm\") pod 
\"insights-runtime-extractor-8fb7x\" (UID: \"eb1c882f-bfac-4890-a59f-3f3d81e30081\") " pod="openshift-insights/insights-runtime-extractor-8fb7x"
Apr 16 14:32:15.640662 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.640470 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/eb1c882f-bfac-4890-a59f-3f3d81e30081-data-volume\") pod \"insights-runtime-extractor-8fb7x\" (UID: \"eb1c882f-bfac-4890-a59f-3f3d81e30081\") " pod="openshift-insights/insights-runtime-extractor-8fb7x"
Apr 16 14:32:15.640891 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.640869 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/426b5a99-d698-419c-b42f-63c90eadfa2b-registry-certificates\") pod \"image-registry-7759f9cdf-skkws\" (UID: \"426b5a99-d698-419c-b42f-63c90eadfa2b\") " pod="openshift-image-registry/image-registry-7759f9cdf-skkws"
Apr 16 14:32:15.641230 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.641206 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/426b5a99-d698-419c-b42f-63c90eadfa2b-ca-trust-extracted\") pod \"image-registry-7759f9cdf-skkws\" (UID: \"426b5a99-d698-419c-b42f-63c90eadfa2b\") " pod="openshift-image-registry/image-registry-7759f9cdf-skkws"
Apr 16 14:32:15.641344 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.641320 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/426b5a99-d698-419c-b42f-63c90eadfa2b-trusted-ca\") pod \"image-registry-7759f9cdf-skkws\" (UID: \"426b5a99-d698-419c-b42f-63c90eadfa2b\") " pod="openshift-image-registry/image-registry-7759f9cdf-skkws"
Apr 16 14:32:15.642449 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.642431 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/426b5a99-d698-419c-b42f-63c90eadfa2b-registry-tls\") pod \"image-registry-7759f9cdf-skkws\" (UID: \"426b5a99-d698-419c-b42f-63c90eadfa2b\") " pod="openshift-image-registry/image-registry-7759f9cdf-skkws"
Apr 16 14:32:15.642608 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.642591 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/426b5a99-d698-419c-b42f-63c90eadfa2b-installation-pull-secrets\") pod \"image-registry-7759f9cdf-skkws\" (UID: \"426b5a99-d698-419c-b42f-63c90eadfa2b\") " pod="openshift-image-registry/image-registry-7759f9cdf-skkws"
Apr 16 14:32:15.642672 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.642627 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/eb1c882f-bfac-4890-a59f-3f3d81e30081-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8fb7x\" (UID: \"eb1c882f-bfac-4890-a59f-3f3d81e30081\") " pod="openshift-insights/insights-runtime-extractor-8fb7x"
Apr 16 14:32:15.642708 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.642694 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/426b5a99-d698-419c-b42f-63c90eadfa2b-image-registry-private-configuration\") pod \"image-registry-7759f9cdf-skkws\" (UID: \"426b5a99-d698-419c-b42f-63c90eadfa2b\") " pod="openshift-image-registry/image-registry-7759f9cdf-skkws"
Apr 16 14:32:15.654258 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.654227 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/426b5a99-d698-419c-b42f-63c90eadfa2b-bound-sa-token\") pod \"image-registry-7759f9cdf-skkws\" (UID: \"426b5a99-d698-419c-b42f-63c90eadfa2b\") " pod="openshift-image-registry/image-registry-7759f9cdf-skkws"
Apr 16 14:32:15.654387 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.654366 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4hq5\" (UniqueName: \"kubernetes.io/projected/eb1c882f-bfac-4890-a59f-3f3d81e30081-kube-api-access-f4hq5\") pod \"insights-runtime-extractor-8fb7x\" (UID: \"eb1c882f-bfac-4890-a59f-3f3d81e30081\") " pod="openshift-insights/insights-runtime-extractor-8fb7x"
Apr 16 14:32:15.654615 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.654598 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b79gp\" (UniqueName: \"kubernetes.io/projected/426b5a99-d698-419c-b42f-63c90eadfa2b-kube-api-access-b79gp\") pod \"image-registry-7759f9cdf-skkws\" (UID: \"426b5a99-d698-419c-b42f-63c90eadfa2b\") " pod="openshift-image-registry/image-registry-7759f9cdf-skkws"
Apr 16 14:32:15.829777 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.829695 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8fb7x"
Apr 16 14:32:15.841085 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.841060 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7759f9cdf-skkws"
Apr 16 14:32:15.959147 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.959024 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8fb7x"]
Apr 16 14:32:15.961596 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:32:15.961560 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb1c882f_bfac_4890_a59f_3f3d81e30081.slice/crio-5c5a6c9a4e4d6334dd74324e1a6704abd1bb971a8b29e7ce4fe192287c91eac4 WatchSource:0}: Error finding container 5c5a6c9a4e4d6334dd74324e1a6704abd1bb971a8b29e7ce4fe192287c91eac4: Status 404 returned error can't find the container with id 5c5a6c9a4e4d6334dd74324e1a6704abd1bb971a8b29e7ce4fe192287c91eac4
Apr 16 14:32:15.974844 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:15.974823 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7759f9cdf-skkws"]
Apr 16 14:32:15.977571 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:32:15.977519 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod426b5a99_d698_419c_b42f_63c90eadfa2b.slice/crio-88493ac61b1284c987542ff205eb9053cea6a38706ffcdd3baeb9d7a2ae701a4 WatchSource:0}: Error finding container 88493ac61b1284c987542ff205eb9053cea6a38706ffcdd3baeb9d7a2ae701a4: Status 404 returned error can't find the container with id 88493ac61b1284c987542ff205eb9053cea6a38706ffcdd3baeb9d7a2ae701a4
Apr 16 14:32:16.290028 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:16.289982 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7759f9cdf-skkws" event={"ID":"426b5a99-d698-419c-b42f-63c90eadfa2b","Type":"ContainerStarted","Data":"a6220bb10607bb46a123545515223335c3adbf5fe87fbcf9e0d4cb902d828fec"}
Apr 16 14:32:16.290221 ip-10-0-128-173 
kubenswrapper[2576]: I0416 14:32:16.290034 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7759f9cdf-skkws" event={"ID":"426b5a99-d698-419c-b42f-63c90eadfa2b","Type":"ContainerStarted","Data":"88493ac61b1284c987542ff205eb9053cea6a38706ffcdd3baeb9d7a2ae701a4"}
Apr 16 14:32:16.290221 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:16.290095 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7759f9cdf-skkws"
Apr 16 14:32:16.291142 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:16.291118 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8fb7x" event={"ID":"eb1c882f-bfac-4890-a59f-3f3d81e30081","Type":"ContainerStarted","Data":"0bf1a87f191915c5b38bf75f31dceea52c00a4cc189d808d7fa92663d5e8ff6a"}
Apr 16 14:32:16.291142 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:16.291144 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8fb7x" event={"ID":"eb1c882f-bfac-4890-a59f-3f3d81e30081","Type":"ContainerStarted","Data":"5c5a6c9a4e4d6334dd74324e1a6704abd1bb971a8b29e7ce4fe192287c91eac4"}
Apr 16 14:32:16.310335 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:16.310288 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7759f9cdf-skkws" podStartSLOduration=1.310272186 podStartE2EDuration="1.310272186s" podCreationTimestamp="2026-04-16 14:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:32:16.30972028 +0000 UTC m=+152.136430698" watchObservedRunningTime="2026-04-16 14:32:16.310272186 +0000 UTC m=+152.136982625"
Apr 16 14:32:17.296454 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:17.296409 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8fb7x" event={"ID":"eb1c882f-bfac-4890-a59f-3f3d81e30081","Type":"ContainerStarted","Data":"3ebf783987ef5fad79652c592443e8b2bdba9bc296220d145967bd7b0871576f"}
Apr 16 14:32:18.300215 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:18.300174 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8fb7x" event={"ID":"eb1c882f-bfac-4890-a59f-3f3d81e30081","Type":"ContainerStarted","Data":"5790f05b325fd2fc2c41e959a6165e57dc1b32e0ce349db84292412fe5f06e08"}
Apr 16 14:32:18.319371 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:18.319322 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-8fb7x" podStartSLOduration=1.5606802370000001 podStartE2EDuration="3.319309398s" podCreationTimestamp="2026-04-16 14:32:15 +0000 UTC" firstStartedPulling="2026-04-16 14:32:16.018151885 +0000 UTC m=+151.844862286" lastFinishedPulling="2026-04-16 14:32:17.776781033 +0000 UTC m=+153.603491447" observedRunningTime="2026-04-16 14:32:18.318746809 +0000 UTC m=+154.145457225" watchObservedRunningTime="2026-04-16 14:32:18.319309398 +0000 UTC m=+154.146019815"
Apr 16 14:32:18.969101 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:18.969058 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9e2f0c20-f27f-4dee-a225-e224c0239ba4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-985g6\" (UID: \"9e2f0c20-f27f-4dee-a225-e224c0239ba4\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-985g6"
Apr 16 14:32:18.971437 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:18.971402 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/9e2f0c20-f27f-4dee-a225-e224c0239ba4-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-985g6\" (UID: \"9e2f0c20-f27f-4dee-a225-e224c0239ba4\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-985g6"
Apr 16 14:32:19.170733 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:19.170678 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93dee602-e658-4746-861c-192789af4e9c-service-ca-bundle\") pod \"router-default-7f7fd7cc4b-cbgrm\" (UID: \"93dee602-e658-4746-861c-192789af4e9c\") " pod="openshift-ingress/router-default-7f7fd7cc4b-cbgrm"
Apr 16 14:32:19.171261 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:19.171239 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93dee602-e658-4746-861c-192789af4e9c-service-ca-bundle\") pod \"router-default-7f7fd7cc4b-cbgrm\" (UID: \"93dee602-e658-4746-861c-192789af4e9c\") " pod="openshift-ingress/router-default-7f7fd7cc4b-cbgrm"
Apr 16 14:32:19.191698 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:19.191664 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-985g6"
Apr 16 14:32:19.312068 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:19.311966 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-985g6"]
Apr 16 14:32:19.314406 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:32:19.314380 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e2f0c20_f27f_4dee_a225_e224c0239ba4.slice/crio-09c40a9fead3f18a3fd16c4bf3314120cd0b24c063f7900acb6fb6b7167a0d5d WatchSource:0}: Error finding container 09c40a9fead3f18a3fd16c4bf3314120cd0b24c063f7900acb6fb6b7167a0d5d: Status 404 returned error can't find the container with id 09c40a9fead3f18a3fd16c4bf3314120cd0b24c063f7900acb6fb6b7167a0d5d
Apr 16 14:32:19.395281 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:19.395238 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7f7fd7cc4b-cbgrm"
Apr 16 14:32:19.529331 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:19.529294 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7f7fd7cc4b-cbgrm"]
Apr 16 14:32:19.531729 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:32:19.531707 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93dee602_e658_4746_861c_192789af4e9c.slice/crio-e16a7625a6e2d4b4db5094c34483703076ea5ed3ddf840cc811cfb3a0ae299ed WatchSource:0}: Error finding container e16a7625a6e2d4b4db5094c34483703076ea5ed3ddf840cc811cfb3a0ae299ed: Status 404 returned error can't find the container with id e16a7625a6e2d4b4db5094c34483703076ea5ed3ddf840cc811cfb3a0ae299ed
Apr 16 14:32:20.308445 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:20.308406 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress/router-default-7f7fd7cc4b-cbgrm" event={"ID":"93dee602-e658-4746-861c-192789af4e9c","Type":"ContainerStarted","Data":"f9efba9c62914dc8282cc33370700d774a2845a7d3330220494882985c391bb1"}
Apr 16 14:32:20.308445 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:20.308452 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7f7fd7cc4b-cbgrm" event={"ID":"93dee602-e658-4746-861c-192789af4e9c","Type":"ContainerStarted","Data":"e16a7625a6e2d4b4db5094c34483703076ea5ed3ddf840cc811cfb3a0ae299ed"}
Apr 16 14:32:20.309655 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:20.309624 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-985g6" event={"ID":"9e2f0c20-f27f-4dee-a225-e224c0239ba4","Type":"ContainerStarted","Data":"09c40a9fead3f18a3fd16c4bf3314120cd0b24c063f7900acb6fb6b7167a0d5d"}
Apr 16 14:32:20.329987 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:20.329934 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-7f7fd7cc4b-cbgrm" podStartSLOduration=33.329919374 podStartE2EDuration="33.329919374s" podCreationTimestamp="2026-04-16 14:31:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:32:20.32927779 +0000 UTC m=+156.155988210" watchObservedRunningTime="2026-04-16 14:32:20.329919374 +0000 UTC m=+156.156629791"
Apr 16 14:32:20.395861 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:20.395823 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7f7fd7cc4b-cbgrm"
Apr 16 14:32:20.398677 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:20.398643 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-7f7fd7cc4b-cbgrm"
Apr 16 14:32:20.529514 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:32:20.529467 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-99g2n" podUID="c18fae62-a7e8-473e-a375-928abb494bf2"
Apr 16 14:32:20.549805 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:32:20.549767 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-2tkjc" podUID="8e9a80d4-54ef-459f-b21f-e013f853f63d"
Apr 16 14:32:20.787152 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:32:20.787106 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-kbtb7" podUID="bfa07533-d734-4829-bde0-6c0327bd79a9"
Apr 16 14:32:21.235067 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:21.235033 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-4w2j2"]
Apr 16 14:32:21.237543 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:21.237512 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-4w2j2"
Apr 16 14:32:21.239932 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:21.239905 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 16 14:32:21.240049 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:21.239992 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-g5599\""
Apr 16 14:32:21.246017 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:21.245997 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-4w2j2"]
Apr 16 14:32:21.290331 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:21.290293 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/e883d262-4ffe-42cc-847d-aacbe36bf9c5-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-4w2j2\" (UID: \"e883d262-4ffe-42cc-847d-aacbe36bf9c5\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-4w2j2"
Apr 16 14:32:21.313681 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:21.313638 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-985g6" event={"ID":"9e2f0c20-f27f-4dee-a225-e224c0239ba4","Type":"ContainerStarted","Data":"aad04dbf6c4aa23e2da1cb4f72346d1ed4112006cc6ea8083e0064ecb0f510b4"}
Apr 16 14:32:21.313681 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:21.313662 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2tkjc"
Apr 16 14:32:21.314011 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:21.313990 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-99g2n" Apr 16 14:32:21.314127 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:21.314076 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-7f7fd7cc4b-cbgrm" Apr 16 14:32:21.315292 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:21.315277 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-7f7fd7cc4b-cbgrm" Apr 16 14:32:21.331596 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:21.331556 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-985g6" podStartSLOduration=32.930611636 podStartE2EDuration="34.331527732s" podCreationTimestamp="2026-04-16 14:31:47 +0000 UTC" firstStartedPulling="2026-04-16 14:32:19.316158618 +0000 UTC m=+155.142869028" lastFinishedPulling="2026-04-16 14:32:20.717074726 +0000 UTC m=+156.543785124" observedRunningTime="2026-04-16 14:32:21.330695439 +0000 UTC m=+157.157405855" watchObservedRunningTime="2026-04-16 14:32:21.331527732 +0000 UTC m=+157.158238169" Apr 16 14:32:21.392033 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:21.391790 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/e883d262-4ffe-42cc-847d-aacbe36bf9c5-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-4w2j2\" (UID: \"e883d262-4ffe-42cc-847d-aacbe36bf9c5\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-4w2j2" Apr 16 14:32:21.392033 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:32:21.391930 2576 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Apr 16 14:32:21.392033 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:32:21.392016 2576 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/e883d262-4ffe-42cc-847d-aacbe36bf9c5-tls-certificates podName:e883d262-4ffe-42cc-847d-aacbe36bf9c5 nodeName:}" failed. No retries permitted until 2026-04-16 14:32:21.891993384 +0000 UTC m=+157.718703783 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/e883d262-4ffe-42cc-847d-aacbe36bf9c5-tls-certificates") pod "prometheus-operator-admission-webhook-9cb97cd87-4w2j2" (UID: "e883d262-4ffe-42cc-847d-aacbe36bf9c5") : secret "prometheus-operator-admission-webhook-tls" not found Apr 16 14:32:21.897010 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:21.896966 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/e883d262-4ffe-42cc-847d-aacbe36bf9c5-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-4w2j2\" (UID: \"e883d262-4ffe-42cc-847d-aacbe36bf9c5\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-4w2j2" Apr 16 14:32:21.899381 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:21.899343 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/e883d262-4ffe-42cc-847d-aacbe36bf9c5-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-4w2j2\" (UID: \"e883d262-4ffe-42cc-847d-aacbe36bf9c5\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-4w2j2" Apr 16 14:32:22.147228 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:22.147144 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-4w2j2" Apr 16 14:32:22.259233 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:22.259196 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-4w2j2"] Apr 16 14:32:22.262196 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:32:22.262168 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode883d262_4ffe_42cc_847d_aacbe36bf9c5.slice/crio-459179c33ffea043bdc1120a837f2b2c9abc474a60a25926c6c58dfbf41a205f WatchSource:0}: Error finding container 459179c33ffea043bdc1120a837f2b2c9abc474a60a25926c6c58dfbf41a205f: Status 404 returned error can't find the container with id 459179c33ffea043bdc1120a837f2b2c9abc474a60a25926c6c58dfbf41a205f Apr 16 14:32:22.317512 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:22.317472 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-4w2j2" event={"ID":"e883d262-4ffe-42cc-847d-aacbe36bf9c5","Type":"ContainerStarted","Data":"459179c33ffea043bdc1120a837f2b2c9abc474a60a25926c6c58dfbf41a205f"} Apr 16 14:32:23.321471 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:23.321430 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-4w2j2" event={"ID":"e883d262-4ffe-42cc-847d-aacbe36bf9c5","Type":"ContainerStarted","Data":"3c36d8da2db9fe6fd2ecc9c9fb0d3e777f1e8780d7656b0a0bcacd5917f7a7ad"} Apr 16 14:32:23.321916 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:23.321731 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-4w2j2" Apr 16 14:32:23.326815 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:23.326791 2576 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-4w2j2" Apr 16 14:32:23.340188 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:23.340144 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-4w2j2" podStartSLOduration=1.43991462 podStartE2EDuration="2.340130706s" podCreationTimestamp="2026-04-16 14:32:21 +0000 UTC" firstStartedPulling="2026-04-16 14:32:22.264075251 +0000 UTC m=+158.090785659" lastFinishedPulling="2026-04-16 14:32:23.164291336 +0000 UTC m=+158.991001745" observedRunningTime="2026-04-16 14:32:23.33887811 +0000 UTC m=+159.165588528" watchObservedRunningTime="2026-04-16 14:32:23.340130706 +0000 UTC m=+159.166841123" Apr 16 14:32:23.463150 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:23.463066 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5655bf44bb-z5sn5"] Apr 16 14:32:23.467860 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:23.467837 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5655bf44bb-z5sn5" Apr 16 14:32:23.474424 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:23.474395 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 14:32:23.474622 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:23.474600 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 14:32:23.474622 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:23.474608 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 14:32:23.474760 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:23.474686 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 14:32:23.474891 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:23.474873 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 14:32:23.474963 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:23.474914 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 14:32:23.474963 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:23.474918 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-5hq8m\"" Apr 16 14:32:23.475064 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:23.475000 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 14:32:23.479578 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:23.479557 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5655bf44bb-z5sn5"] Apr 16 14:32:23.510084 ip-10-0-128-173 kubenswrapper[2576]: I0416 
14:32:23.510044 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b2ad56c7-d37f-486d-bc07-683c2aa2437e-console-oauth-config\") pod \"console-5655bf44bb-z5sn5\" (UID: \"b2ad56c7-d37f-486d-bc07-683c2aa2437e\") " pod="openshift-console/console-5655bf44bb-z5sn5" Apr 16 14:32:23.510084 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:23.510087 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b2ad56c7-d37f-486d-bc07-683c2aa2437e-console-config\") pod \"console-5655bf44bb-z5sn5\" (UID: \"b2ad56c7-d37f-486d-bc07-683c2aa2437e\") " pod="openshift-console/console-5655bf44bb-z5sn5" Apr 16 14:32:23.510260 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:23.510109 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b2ad56c7-d37f-486d-bc07-683c2aa2437e-service-ca\") pod \"console-5655bf44bb-z5sn5\" (UID: \"b2ad56c7-d37f-486d-bc07-683c2aa2437e\") " pod="openshift-console/console-5655bf44bb-z5sn5" Apr 16 14:32:23.510260 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:23.510198 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s774k\" (UniqueName: \"kubernetes.io/projected/b2ad56c7-d37f-486d-bc07-683c2aa2437e-kube-api-access-s774k\") pod \"console-5655bf44bb-z5sn5\" (UID: \"b2ad56c7-d37f-486d-bc07-683c2aa2437e\") " pod="openshift-console/console-5655bf44bb-z5sn5" Apr 16 14:32:23.510375 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:23.510265 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b2ad56c7-d37f-486d-bc07-683c2aa2437e-console-serving-cert\") pod \"console-5655bf44bb-z5sn5\" (UID: 
\"b2ad56c7-d37f-486d-bc07-683c2aa2437e\") " pod="openshift-console/console-5655bf44bb-z5sn5" Apr 16 14:32:23.510375 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:23.510295 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b2ad56c7-d37f-486d-bc07-683c2aa2437e-oauth-serving-cert\") pod \"console-5655bf44bb-z5sn5\" (UID: \"b2ad56c7-d37f-486d-bc07-683c2aa2437e\") " pod="openshift-console/console-5655bf44bb-z5sn5" Apr 16 14:32:23.611365 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:23.611329 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s774k\" (UniqueName: \"kubernetes.io/projected/b2ad56c7-d37f-486d-bc07-683c2aa2437e-kube-api-access-s774k\") pod \"console-5655bf44bb-z5sn5\" (UID: \"b2ad56c7-d37f-486d-bc07-683c2aa2437e\") " pod="openshift-console/console-5655bf44bb-z5sn5" Apr 16 14:32:23.611542 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:23.611388 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b2ad56c7-d37f-486d-bc07-683c2aa2437e-console-serving-cert\") pod \"console-5655bf44bb-z5sn5\" (UID: \"b2ad56c7-d37f-486d-bc07-683c2aa2437e\") " pod="openshift-console/console-5655bf44bb-z5sn5" Apr 16 14:32:23.611542 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:23.611410 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b2ad56c7-d37f-486d-bc07-683c2aa2437e-oauth-serving-cert\") pod \"console-5655bf44bb-z5sn5\" (UID: \"b2ad56c7-d37f-486d-bc07-683c2aa2437e\") " pod="openshift-console/console-5655bf44bb-z5sn5" Apr 16 14:32:23.611542 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:23.611447 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/b2ad56c7-d37f-486d-bc07-683c2aa2437e-console-oauth-config\") pod \"console-5655bf44bb-z5sn5\" (UID: \"b2ad56c7-d37f-486d-bc07-683c2aa2437e\") " pod="openshift-console/console-5655bf44bb-z5sn5" Apr 16 14:32:23.611542 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:23.611468 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b2ad56c7-d37f-486d-bc07-683c2aa2437e-console-config\") pod \"console-5655bf44bb-z5sn5\" (UID: \"b2ad56c7-d37f-486d-bc07-683c2aa2437e\") " pod="openshift-console/console-5655bf44bb-z5sn5" Apr 16 14:32:23.611542 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:23.611483 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b2ad56c7-d37f-486d-bc07-683c2aa2437e-service-ca\") pod \"console-5655bf44bb-z5sn5\" (UID: \"b2ad56c7-d37f-486d-bc07-683c2aa2437e\") " pod="openshift-console/console-5655bf44bb-z5sn5" Apr 16 14:32:23.612085 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:23.612057 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b2ad56c7-d37f-486d-bc07-683c2aa2437e-oauth-serving-cert\") pod \"console-5655bf44bb-z5sn5\" (UID: \"b2ad56c7-d37f-486d-bc07-683c2aa2437e\") " pod="openshift-console/console-5655bf44bb-z5sn5" Apr 16 14:32:23.612237 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:23.612216 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b2ad56c7-d37f-486d-bc07-683c2aa2437e-console-config\") pod \"console-5655bf44bb-z5sn5\" (UID: \"b2ad56c7-d37f-486d-bc07-683c2aa2437e\") " pod="openshift-console/console-5655bf44bb-z5sn5" Apr 16 14:32:23.612311 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:23.612264 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"service-ca\" (UniqueName: \"kubernetes.io/configmap/b2ad56c7-d37f-486d-bc07-683c2aa2437e-service-ca\") pod \"console-5655bf44bb-z5sn5\" (UID: \"b2ad56c7-d37f-486d-bc07-683c2aa2437e\") " pod="openshift-console/console-5655bf44bb-z5sn5" Apr 16 14:32:23.613876 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:23.613846 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b2ad56c7-d37f-486d-bc07-683c2aa2437e-console-serving-cert\") pod \"console-5655bf44bb-z5sn5\" (UID: \"b2ad56c7-d37f-486d-bc07-683c2aa2437e\") " pod="openshift-console/console-5655bf44bb-z5sn5" Apr 16 14:32:23.614024 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:23.614005 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b2ad56c7-d37f-486d-bc07-683c2aa2437e-console-oauth-config\") pod \"console-5655bf44bb-z5sn5\" (UID: \"b2ad56c7-d37f-486d-bc07-683c2aa2437e\") " pod="openshift-console/console-5655bf44bb-z5sn5" Apr 16 14:32:23.622742 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:23.622716 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s774k\" (UniqueName: \"kubernetes.io/projected/b2ad56c7-d37f-486d-bc07-683c2aa2437e-kube-api-access-s774k\") pod \"console-5655bf44bb-z5sn5\" (UID: \"b2ad56c7-d37f-486d-bc07-683c2aa2437e\") " pod="openshift-console/console-5655bf44bb-z5sn5" Apr 16 14:32:23.777337 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:23.777242 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5655bf44bb-z5sn5" Apr 16 14:32:23.892135 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:23.892105 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5655bf44bb-z5sn5"] Apr 16 14:32:23.907459 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:32:23.907422 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2ad56c7_d37f_486d_bc07_683c2aa2437e.slice/crio-fc6b21151fb147e06df6d0b74c6c71f50a96b432f5405373d27f97932decbf03 WatchSource:0}: Error finding container fc6b21151fb147e06df6d0b74c6c71f50a96b432f5405373d27f97932decbf03: Status 404 returned error can't find the container with id fc6b21151fb147e06df6d0b74c6c71f50a96b432f5405373d27f97932decbf03 Apr 16 14:32:24.308609 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:24.308575 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-swf4k"] Apr 16 14:32:24.354697 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:24.354663 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-swf4k"] Apr 16 14:32:24.354697 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:24.354695 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5655bf44bb-z5sn5" event={"ID":"b2ad56c7-d37f-486d-bc07-683c2aa2437e","Type":"ContainerStarted","Data":"fc6b21151fb147e06df6d0b74c6c71f50a96b432f5405373d27f97932decbf03"} Apr 16 14:32:24.355080 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:24.354919 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-swf4k" Apr 16 14:32:24.358668 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:24.358645 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 16 14:32:24.358668 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:24.358665 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 16 14:32:24.358853 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:24.358685 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 14:32:24.358853 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:24.358710 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-9b55q\"" Apr 16 14:32:24.418937 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:24.418895 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbcmd\" (UniqueName: \"kubernetes.io/projected/4f1aa63b-175e-4630-a754-83cfe1a1942b-kube-api-access-pbcmd\") pod \"prometheus-operator-78f957474d-swf4k\" (UID: \"4f1aa63b-175e-4630-a754-83cfe1a1942b\") " pod="openshift-monitoring/prometheus-operator-78f957474d-swf4k" Apr 16 14:32:24.419120 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:24.419032 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4f1aa63b-175e-4630-a754-83cfe1a1942b-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-swf4k\" (UID: \"4f1aa63b-175e-4630-a754-83cfe1a1942b\") " pod="openshift-monitoring/prometheus-operator-78f957474d-swf4k" Apr 16 14:32:24.419184 ip-10-0-128-173 
kubenswrapper[2576]: I0416 14:32:24.419143 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/4f1aa63b-175e-4630-a754-83cfe1a1942b-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-swf4k\" (UID: \"4f1aa63b-175e-4630-a754-83cfe1a1942b\") " pod="openshift-monitoring/prometheus-operator-78f957474d-swf4k" Apr 16 14:32:24.419233 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:24.419180 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4f1aa63b-175e-4630-a754-83cfe1a1942b-metrics-client-ca\") pod \"prometheus-operator-78f957474d-swf4k\" (UID: \"4f1aa63b-175e-4630-a754-83cfe1a1942b\") " pod="openshift-monitoring/prometheus-operator-78f957474d-swf4k" Apr 16 14:32:24.520376 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:24.520338 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4f1aa63b-175e-4630-a754-83cfe1a1942b-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-swf4k\" (UID: \"4f1aa63b-175e-4630-a754-83cfe1a1942b\") " pod="openshift-monitoring/prometheus-operator-78f957474d-swf4k" Apr 16 14:32:24.520573 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:24.520423 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/4f1aa63b-175e-4630-a754-83cfe1a1942b-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-swf4k\" (UID: \"4f1aa63b-175e-4630-a754-83cfe1a1942b\") " pod="openshift-monitoring/prometheus-operator-78f957474d-swf4k" Apr 16 14:32:24.520573 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:24.520441 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4f1aa63b-175e-4630-a754-83cfe1a1942b-metrics-client-ca\") pod \"prometheus-operator-78f957474d-swf4k\" (UID: \"4f1aa63b-175e-4630-a754-83cfe1a1942b\") " pod="openshift-monitoring/prometheus-operator-78f957474d-swf4k" Apr 16 14:32:24.520573 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:24.520492 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pbcmd\" (UniqueName: \"kubernetes.io/projected/4f1aa63b-175e-4630-a754-83cfe1a1942b-kube-api-access-pbcmd\") pod \"prometheus-operator-78f957474d-swf4k\" (UID: \"4f1aa63b-175e-4630-a754-83cfe1a1942b\") " pod="openshift-monitoring/prometheus-operator-78f957474d-swf4k" Apr 16 14:32:24.520750 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:32:24.520575 2576 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 16 14:32:24.520750 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:32:24.520653 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f1aa63b-175e-4630-a754-83cfe1a1942b-prometheus-operator-tls podName:4f1aa63b-175e-4630-a754-83cfe1a1942b nodeName:}" failed. No retries permitted until 2026-04-16 14:32:25.020636247 +0000 UTC m=+160.847346642 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/4f1aa63b-175e-4630-a754-83cfe1a1942b-prometheus-operator-tls") pod "prometheus-operator-78f957474d-swf4k" (UID: "4f1aa63b-175e-4630-a754-83cfe1a1942b") : secret "prometheus-operator-tls" not found Apr 16 14:32:24.521139 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:24.521119 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4f1aa63b-175e-4630-a754-83cfe1a1942b-metrics-client-ca\") pod \"prometheus-operator-78f957474d-swf4k\" (UID: \"4f1aa63b-175e-4630-a754-83cfe1a1942b\") " pod="openshift-monitoring/prometheus-operator-78f957474d-swf4k" Apr 16 14:32:24.522791 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:24.522770 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4f1aa63b-175e-4630-a754-83cfe1a1942b-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-swf4k\" (UID: \"4f1aa63b-175e-4630-a754-83cfe1a1942b\") " pod="openshift-monitoring/prometheus-operator-78f957474d-swf4k" Apr 16 14:32:24.530161 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:24.530139 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbcmd\" (UniqueName: \"kubernetes.io/projected/4f1aa63b-175e-4630-a754-83cfe1a1942b-kube-api-access-pbcmd\") pod \"prometheus-operator-78f957474d-swf4k\" (UID: \"4f1aa63b-175e-4630-a754-83cfe1a1942b\") " pod="openshift-monitoring/prometheus-operator-78f957474d-swf4k" Apr 16 14:32:25.023327 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:25.023287 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/4f1aa63b-175e-4630-a754-83cfe1a1942b-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-swf4k\" (UID: 
\"4f1aa63b-175e-4630-a754-83cfe1a1942b\") " pod="openshift-monitoring/prometheus-operator-78f957474d-swf4k" Apr 16 14:32:25.023518 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:32:25.023429 2576 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 16 14:32:25.023518 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:32:25.023493 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f1aa63b-175e-4630-a754-83cfe1a1942b-prometheus-operator-tls podName:4f1aa63b-175e-4630-a754-83cfe1a1942b nodeName:}" failed. No retries permitted until 2026-04-16 14:32:26.023477851 +0000 UTC m=+161.850188245 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/4f1aa63b-175e-4630-a754-83cfe1a1942b-prometheus-operator-tls") pod "prometheus-operator-78f957474d-swf4k" (UID: "4f1aa63b-175e-4630-a754-83cfe1a1942b") : secret "prometheus-operator-tls" not found Apr 16 14:32:25.527542 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:25.527501 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c18fae62-a7e8-473e-a375-928abb494bf2-metrics-tls\") pod \"dns-default-99g2n\" (UID: \"c18fae62-a7e8-473e-a375-928abb494bf2\") " pod="openshift-dns/dns-default-99g2n" Apr 16 14:32:25.528127 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:25.527614 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e9a80d4-54ef-459f-b21f-e013f853f63d-cert\") pod \"ingress-canary-2tkjc\" (UID: \"8e9a80d4-54ef-459f-b21f-e013f853f63d\") " pod="openshift-ingress-canary/ingress-canary-2tkjc" Apr 16 14:32:25.530348 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:25.530324 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/c18fae62-a7e8-473e-a375-928abb494bf2-metrics-tls\") pod \"dns-default-99g2n\" (UID: \"c18fae62-a7e8-473e-a375-928abb494bf2\") " pod="openshift-dns/dns-default-99g2n" Apr 16 14:32:25.530465 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:25.530444 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e9a80d4-54ef-459f-b21f-e013f853f63d-cert\") pod \"ingress-canary-2tkjc\" (UID: \"8e9a80d4-54ef-459f-b21f-e013f853f63d\") " pod="openshift-ingress-canary/ingress-canary-2tkjc" Apr 16 14:32:25.816900 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:25.816817 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-zpkvj\"" Apr 16 14:32:25.817696 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:25.817671 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dqsrn\"" Apr 16 14:32:25.824773 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:25.824748 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-99g2n" Apr 16 14:32:25.824773 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:25.824768 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2tkjc" Apr 16 14:32:25.968587 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:25.968552 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-99g2n"] Apr 16 14:32:25.973869 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:32:25.973828 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc18fae62_a7e8_473e_a375_928abb494bf2.slice/crio-d32f2e6b2f694a6dfce3b64f81f9c215906bfa8fd2f76e1e2fe4f5406292891d WatchSource:0}: Error finding container d32f2e6b2f694a6dfce3b64f81f9c215906bfa8fd2f76e1e2fe4f5406292891d: Status 404 returned error can't find the container with id d32f2e6b2f694a6dfce3b64f81f9c215906bfa8fd2f76e1e2fe4f5406292891d Apr 16 14:32:25.989871 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:25.989835 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2tkjc"] Apr 16 14:32:25.992724 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:32:25.992693 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e9a80d4_54ef_459f_b21f_e013f853f63d.slice/crio-b052dd3308d6c387170a4d13c3715c96a8b47d6219ad24f619ae6e3e8a45fafb WatchSource:0}: Error finding container b052dd3308d6c387170a4d13c3715c96a8b47d6219ad24f619ae6e3e8a45fafb: Status 404 returned error can't find the container with id b052dd3308d6c387170a4d13c3715c96a8b47d6219ad24f619ae6e3e8a45fafb Apr 16 14:32:26.033471 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:26.033438 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/4f1aa63b-175e-4630-a754-83cfe1a1942b-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-swf4k\" (UID: \"4f1aa63b-175e-4630-a754-83cfe1a1942b\") " 
pod="openshift-monitoring/prometheus-operator-78f957474d-swf4k" Apr 16 14:32:26.036699 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:26.036669 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/4f1aa63b-175e-4630-a754-83cfe1a1942b-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-swf4k\" (UID: \"4f1aa63b-175e-4630-a754-83cfe1a1942b\") " pod="openshift-monitoring/prometheus-operator-78f957474d-swf4k" Apr 16 14:32:26.164588 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:26.164551 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-swf4k" Apr 16 14:32:26.289986 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:26.289954 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-swf4k"] Apr 16 14:32:26.332410 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:26.332371 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2tkjc" event={"ID":"8e9a80d4-54ef-459f-b21f-e013f853f63d","Type":"ContainerStarted","Data":"b052dd3308d6c387170a4d13c3715c96a8b47d6219ad24f619ae6e3e8a45fafb"} Apr 16 14:32:26.333514 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:26.333482 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-99g2n" event={"ID":"c18fae62-a7e8-473e-a375-928abb494bf2","Type":"ContainerStarted","Data":"d32f2e6b2f694a6dfce3b64f81f9c215906bfa8fd2f76e1e2fe4f5406292891d"} Apr 16 14:32:27.224107 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:32:27.224062 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f1aa63b_175e_4630_a754_83cfe1a1942b.slice/crio-8fde1aea6e12bf89e3442da1ebdef90595fe56d157dbfd94bc3b3dcc24214880 WatchSource:0}: Error finding container 
8fde1aea6e12bf89e3442da1ebdef90595fe56d157dbfd94bc3b3dcc24214880: Status 404 returned error can't find the container with id 8fde1aea6e12bf89e3442da1ebdef90595fe56d157dbfd94bc3b3dcc24214880 Apr 16 14:32:27.338407 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:27.338331 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-swf4k" event={"ID":"4f1aa63b-175e-4630-a754-83cfe1a1942b","Type":"ContainerStarted","Data":"8fde1aea6e12bf89e3442da1ebdef90595fe56d157dbfd94bc3b3dcc24214880"} Apr 16 14:32:28.346008 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:28.345970 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5655bf44bb-z5sn5" event={"ID":"b2ad56c7-d37f-486d-bc07-683c2aa2437e","Type":"ContainerStarted","Data":"48eb2158891286cca482b5d9b32832ba3fcd483e76ac1ab1b4b6a0ffc6f53fb4"} Apr 16 14:32:28.368920 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:28.368870 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5655bf44bb-z5sn5" podStartSLOduration=2.00365776 podStartE2EDuration="5.36885388s" podCreationTimestamp="2026-04-16 14:32:23 +0000 UTC" firstStartedPulling="2026-04-16 14:32:23.90928231 +0000 UTC m=+159.735992706" lastFinishedPulling="2026-04-16 14:32:27.274478428 +0000 UTC m=+163.101188826" observedRunningTime="2026-04-16 14:32:28.367160134 +0000 UTC m=+164.193870552" watchObservedRunningTime="2026-04-16 14:32:28.36885388 +0000 UTC m=+164.195564298" Apr 16 14:32:29.358133 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:29.358093 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2tkjc" event={"ID":"8e9a80d4-54ef-459f-b21f-e013f853f63d","Type":"ContainerStarted","Data":"3c055f569ae629488f337d9c40bd5323da0257365a497f9740134f1edc7746fd"} Apr 16 14:32:29.362091 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:29.361265 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/dns-default-99g2n" event={"ID":"c18fae62-a7e8-473e-a375-928abb494bf2","Type":"ContainerStarted","Data":"33e10094ca8d120b0e3351c0f7afccd650dcfb4f3a726c19b1d543c4f7fe79ad"} Apr 16 14:32:29.365474 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:29.365083 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-swf4k" event={"ID":"4f1aa63b-175e-4630-a754-83cfe1a1942b","Type":"ContainerStarted","Data":"53a8a6fb8771d47061d6ef0e43770f57428b7f583c84821b04030a427ae86e49"} Apr 16 14:32:29.365474 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:29.365115 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-swf4k" event={"ID":"4f1aa63b-175e-4630-a754-83cfe1a1942b","Type":"ContainerStarted","Data":"3672c4ef0b1e802ef8ede60c03c5aa21279aea3725727edaa7faad16d7d34f2a"} Apr 16 14:32:29.374137 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:29.374087 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2tkjc" podStartSLOduration=129.26919792 podStartE2EDuration="2m12.374068565s" podCreationTimestamp="2026-04-16 14:30:17 +0000 UTC" firstStartedPulling="2026-04-16 14:32:25.994990572 +0000 UTC m=+161.821700982" lastFinishedPulling="2026-04-16 14:32:29.099861215 +0000 UTC m=+164.926571627" observedRunningTime="2026-04-16 14:32:29.373491131 +0000 UTC m=+165.200201548" watchObservedRunningTime="2026-04-16 14:32:29.374068565 +0000 UTC m=+165.200778984" Apr 16 14:32:29.390240 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:29.390188 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-78f957474d-swf4k" podStartSLOduration=3.5180337550000003 podStartE2EDuration="5.3901693s" podCreationTimestamp="2026-04-16 14:32:24 +0000 UTC" firstStartedPulling="2026-04-16 14:32:27.227724913 +0000 UTC m=+163.054435322" 
lastFinishedPulling="2026-04-16 14:32:29.099860472 +0000 UTC m=+164.926570867" observedRunningTime="2026-04-16 14:32:29.389205406 +0000 UTC m=+165.215915823" watchObservedRunningTime="2026-04-16 14:32:29.3901693 +0000 UTC m=+165.216879717" Apr 16 14:32:30.369754 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:30.369713 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-99g2n" event={"ID":"c18fae62-a7e8-473e-a375-928abb494bf2","Type":"ContainerStarted","Data":"111a9b6a89a17e238c4f362fb4e8ab0058017fb7868c937244305ac8736f4d33"} Apr 16 14:32:30.388920 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:30.388873 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-99g2n" podStartSLOduration=130.267781235 podStartE2EDuration="2m13.3888592s" podCreationTimestamp="2026-04-16 14:30:17 +0000 UTC" firstStartedPulling="2026-04-16 14:32:25.976092346 +0000 UTC m=+161.802802740" lastFinishedPulling="2026-04-16 14:32:29.0971703 +0000 UTC m=+164.923880705" observedRunningTime="2026-04-16 14:32:30.388094279 +0000 UTC m=+166.214804697" watchObservedRunningTime="2026-04-16 14:32:30.3888592 +0000 UTC m=+166.215569617" Apr 16 14:32:31.372630 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.372592 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-99g2n" Apr 16 14:32:31.673852 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.673772 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-t5bf8"] Apr 16 14:32:31.677680 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.677665 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-t5bf8" Apr 16 14:32:31.680361 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.680338 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 14:32:31.680489 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.680454 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 14:32:31.680905 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.680888 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-rzz5v\"" Apr 16 14:32:31.694141 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.694119 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-t5bf8"] Apr 16 14:32:31.717836 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.717810 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-cmxww"] Apr 16 14:32:31.722197 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.722174 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-cmxww" Apr 16 14:32:31.725375 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.725352 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 14:32:31.725489 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.725357 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-6npkq\"" Apr 16 14:32:31.725489 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.725400 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 14:32:31.725901 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.725887 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 14:32:31.777468 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.777436 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbtb7" Apr 16 14:32:31.782025 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.782006 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/053bb478-a29a-4c10-8bab-0896952ba633-metrics-client-ca\") pod \"node-exporter-cmxww\" (UID: \"053bb478-a29a-4c10-8bab-0896952ba633\") " pod="openshift-monitoring/node-exporter-cmxww" Apr 16 14:32:31.782084 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.782037 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/053bb478-a29a-4c10-8bab-0896952ba633-node-exporter-accelerators-collector-config\") pod \"node-exporter-cmxww\" (UID: \"053bb478-a29a-4c10-8bab-0896952ba633\") " pod="openshift-monitoring/node-exporter-cmxww" Apr 16 14:32:31.782084 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.782059 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb2b182b-d0b0-43b9-b51f-f2ed68482a29-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-t5bf8\" (UID: \"cb2b182b-d0b0-43b9-b51f-f2ed68482a29\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-t5bf8" Apr 16 14:32:31.782084 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.782077 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/053bb478-a29a-4c10-8bab-0896952ba633-node-exporter-tls\") pod \"node-exporter-cmxww\" (UID: \"053bb478-a29a-4c10-8bab-0896952ba633\") " pod="openshift-monitoring/node-exporter-cmxww" Apr 16 14:32:31.782213 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.782128 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/053bb478-a29a-4c10-8bab-0896952ba633-node-exporter-wtmp\") pod \"node-exporter-cmxww\" (UID: \"053bb478-a29a-4c10-8bab-0896952ba633\") " pod="openshift-monitoring/node-exporter-cmxww" Apr 16 14:32:31.782213 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.782147 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/053bb478-a29a-4c10-8bab-0896952ba633-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cmxww\" (UID: \"053bb478-a29a-4c10-8bab-0896952ba633\") " pod="openshift-monitoring/node-exporter-cmxww" Apr 16 14:32:31.782213 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.782163 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q65jm\" (UniqueName: \"kubernetes.io/projected/cb2b182b-d0b0-43b9-b51f-f2ed68482a29-kube-api-access-q65jm\") pod \"openshift-state-metrics-5669946b84-t5bf8\" (UID: \"cb2b182b-d0b0-43b9-b51f-f2ed68482a29\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-t5bf8" Apr 16 14:32:31.782300 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.782207 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/053bb478-a29a-4c10-8bab-0896952ba633-root\") pod \"node-exporter-cmxww\" (UID: \"053bb478-a29a-4c10-8bab-0896952ba633\") " pod="openshift-monitoring/node-exporter-cmxww" Apr 16 14:32:31.782300 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.782245 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/053bb478-a29a-4c10-8bab-0896952ba633-node-exporter-textfile\") pod 
\"node-exporter-cmxww\" (UID: \"053bb478-a29a-4c10-8bab-0896952ba633\") " pod="openshift-monitoring/node-exporter-cmxww" Apr 16 14:32:31.782300 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.782274 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxrbm\" (UniqueName: \"kubernetes.io/projected/053bb478-a29a-4c10-8bab-0896952ba633-kube-api-access-wxrbm\") pod \"node-exporter-cmxww\" (UID: \"053bb478-a29a-4c10-8bab-0896952ba633\") " pod="openshift-monitoring/node-exporter-cmxww" Apr 16 14:32:31.782300 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.782292 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/053bb478-a29a-4c10-8bab-0896952ba633-sys\") pod \"node-exporter-cmxww\" (UID: \"053bb478-a29a-4c10-8bab-0896952ba633\") " pod="openshift-monitoring/node-exporter-cmxww" Apr 16 14:32:31.782441 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.782360 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cb2b182b-d0b0-43b9-b51f-f2ed68482a29-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-t5bf8\" (UID: \"cb2b182b-d0b0-43b9-b51f-f2ed68482a29\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-t5bf8" Apr 16 14:32:31.782441 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.782380 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cb2b182b-d0b0-43b9-b51f-f2ed68482a29-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-t5bf8\" (UID: \"cb2b182b-d0b0-43b9-b51f-f2ed68482a29\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-t5bf8" Apr 16 14:32:31.883421 ip-10-0-128-173 
kubenswrapper[2576]: I0416 14:32:31.883384 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wxrbm\" (UniqueName: \"kubernetes.io/projected/053bb478-a29a-4c10-8bab-0896952ba633-kube-api-access-wxrbm\") pod \"node-exporter-cmxww\" (UID: \"053bb478-a29a-4c10-8bab-0896952ba633\") " pod="openshift-monitoring/node-exporter-cmxww" Apr 16 14:32:31.883421 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.883423 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/053bb478-a29a-4c10-8bab-0896952ba633-sys\") pod \"node-exporter-cmxww\" (UID: \"053bb478-a29a-4c10-8bab-0896952ba633\") " pod="openshift-monitoring/node-exporter-cmxww" Apr 16 14:32:31.883661 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.883465 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cb2b182b-d0b0-43b9-b51f-f2ed68482a29-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-t5bf8\" (UID: \"cb2b182b-d0b0-43b9-b51f-f2ed68482a29\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-t5bf8" Apr 16 14:32:31.883661 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.883483 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cb2b182b-d0b0-43b9-b51f-f2ed68482a29-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-t5bf8\" (UID: \"cb2b182b-d0b0-43b9-b51f-f2ed68482a29\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-t5bf8" Apr 16 14:32:31.883661 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.883512 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/053bb478-a29a-4c10-8bab-0896952ba633-metrics-client-ca\") pod 
\"node-exporter-cmxww\" (UID: \"053bb478-a29a-4c10-8bab-0896952ba633\") " pod="openshift-monitoring/node-exporter-cmxww" Apr 16 14:32:31.883661 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.883588 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/053bb478-a29a-4c10-8bab-0896952ba633-sys\") pod \"node-exporter-cmxww\" (UID: \"053bb478-a29a-4c10-8bab-0896952ba633\") " pod="openshift-monitoring/node-exporter-cmxww" Apr 16 14:32:31.883661 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.883635 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/053bb478-a29a-4c10-8bab-0896952ba633-node-exporter-accelerators-collector-config\") pod \"node-exporter-cmxww\" (UID: \"053bb478-a29a-4c10-8bab-0896952ba633\") " pod="openshift-monitoring/node-exporter-cmxww" Apr 16 14:32:31.883891 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.883664 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb2b182b-d0b0-43b9-b51f-f2ed68482a29-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-t5bf8\" (UID: \"cb2b182b-d0b0-43b9-b51f-f2ed68482a29\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-t5bf8" Apr 16 14:32:31.883891 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.883693 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/053bb478-a29a-4c10-8bab-0896952ba633-node-exporter-tls\") pod \"node-exporter-cmxww\" (UID: \"053bb478-a29a-4c10-8bab-0896952ba633\") " pod="openshift-monitoring/node-exporter-cmxww" Apr 16 14:32:31.883891 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.883752 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/053bb478-a29a-4c10-8bab-0896952ba633-node-exporter-wtmp\") pod \"node-exporter-cmxww\" (UID: \"053bb478-a29a-4c10-8bab-0896952ba633\") " pod="openshift-monitoring/node-exporter-cmxww" Apr 16 14:32:31.883891 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.883785 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/053bb478-a29a-4c10-8bab-0896952ba633-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cmxww\" (UID: \"053bb478-a29a-4c10-8bab-0896952ba633\") " pod="openshift-monitoring/node-exporter-cmxww" Apr 16 14:32:31.883891 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.883813 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q65jm\" (UniqueName: \"kubernetes.io/projected/cb2b182b-d0b0-43b9-b51f-f2ed68482a29-kube-api-access-q65jm\") pod \"openshift-state-metrics-5669946b84-t5bf8\" (UID: \"cb2b182b-d0b0-43b9-b51f-f2ed68482a29\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-t5bf8" Apr 16 14:32:31.883891 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:32:31.883822 2576 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 16 14:32:31.883891 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.883840 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/053bb478-a29a-4c10-8bab-0896952ba633-root\") pod \"node-exporter-cmxww\" (UID: \"053bb478-a29a-4c10-8bab-0896952ba633\") " pod="openshift-monitoring/node-exporter-cmxww" Apr 16 14:32:31.883891 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.883869 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/053bb478-a29a-4c10-8bab-0896952ba633-node-exporter-textfile\") pod \"node-exporter-cmxww\" (UID: \"053bb478-a29a-4c10-8bab-0896952ba633\") " pod="openshift-monitoring/node-exporter-cmxww" Apr 16 14:32:31.883891 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:32:31.883893 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb2b182b-d0b0-43b9-b51f-f2ed68482a29-openshift-state-metrics-tls podName:cb2b182b-d0b0-43b9-b51f-f2ed68482a29 nodeName:}" failed. No retries permitted until 2026-04-16 14:32:32.383869274 +0000 UTC m=+168.210579669 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/cb2b182b-d0b0-43b9-b51f-f2ed68482a29-openshift-state-metrics-tls") pod "openshift-state-metrics-5669946b84-t5bf8" (UID: "cb2b182b-d0b0-43b9-b51f-f2ed68482a29") : secret "openshift-state-metrics-tls" not found Apr 16 14:32:31.884358 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.884169 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/053bb478-a29a-4c10-8bab-0896952ba633-node-exporter-textfile\") pod \"node-exporter-cmxww\" (UID: \"053bb478-a29a-4c10-8bab-0896952ba633\") " pod="openshift-monitoring/node-exporter-cmxww" Apr 16 14:32:31.884358 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.884257 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/053bb478-a29a-4c10-8bab-0896952ba633-metrics-client-ca\") pod \"node-exporter-cmxww\" (UID: \"053bb478-a29a-4c10-8bab-0896952ba633\") " pod="openshift-monitoring/node-exporter-cmxww" Apr 16 14:32:31.884358 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.884334 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/053bb478-a29a-4c10-8bab-0896952ba633-root\") 
pod \"node-exporter-cmxww\" (UID: \"053bb478-a29a-4c10-8bab-0896952ba633\") " pod="openshift-monitoring/node-exporter-cmxww" Apr 16 14:32:31.884358 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.884338 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cb2b182b-d0b0-43b9-b51f-f2ed68482a29-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-t5bf8\" (UID: \"cb2b182b-d0b0-43b9-b51f-f2ed68482a29\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-t5bf8" Apr 16 14:32:31.884570 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.884465 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/053bb478-a29a-4c10-8bab-0896952ba633-node-exporter-wtmp\") pod \"node-exporter-cmxww\" (UID: \"053bb478-a29a-4c10-8bab-0896952ba633\") " pod="openshift-monitoring/node-exporter-cmxww" Apr 16 14:32:31.884878 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.884857 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/053bb478-a29a-4c10-8bab-0896952ba633-node-exporter-accelerators-collector-config\") pod \"node-exporter-cmxww\" (UID: \"053bb478-a29a-4c10-8bab-0896952ba633\") " pod="openshift-monitoring/node-exporter-cmxww" Apr 16 14:32:31.886277 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.886255 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/053bb478-a29a-4c10-8bab-0896952ba633-node-exporter-tls\") pod \"node-exporter-cmxww\" (UID: \"053bb478-a29a-4c10-8bab-0896952ba633\") " pod="openshift-monitoring/node-exporter-cmxww" Apr 16 14:32:31.886558 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.886516 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/053bb478-a29a-4c10-8bab-0896952ba633-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cmxww\" (UID: \"053bb478-a29a-4c10-8bab-0896952ba633\") " pod="openshift-monitoring/node-exporter-cmxww" Apr 16 14:32:31.886617 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.886601 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cb2b182b-d0b0-43b9-b51f-f2ed68482a29-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-t5bf8\" (UID: \"cb2b182b-d0b0-43b9-b51f-f2ed68482a29\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-t5bf8" Apr 16 14:32:31.893429 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.893404 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxrbm\" (UniqueName: \"kubernetes.io/projected/053bb478-a29a-4c10-8bab-0896952ba633-kube-api-access-wxrbm\") pod \"node-exporter-cmxww\" (UID: \"053bb478-a29a-4c10-8bab-0896952ba633\") " pod="openshift-monitoring/node-exporter-cmxww" Apr 16 14:32:31.893722 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:31.893704 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q65jm\" (UniqueName: \"kubernetes.io/projected/cb2b182b-d0b0-43b9-b51f-f2ed68482a29-kube-api-access-q65jm\") pod \"openshift-state-metrics-5669946b84-t5bf8\" (UID: \"cb2b182b-d0b0-43b9-b51f-f2ed68482a29\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-t5bf8" Apr 16 14:32:32.032189 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:32.032120 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-cmxww" Apr 16 14:32:32.041938 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:32:32.041900 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod053bb478_a29a_4c10_8bab_0896952ba633.slice/crio-2152d44b76454e5f01ba1bfa7787ecdba5ab3c6e287cda9eb0963cf6af885b5d WatchSource:0}: Error finding container 2152d44b76454e5f01ba1bfa7787ecdba5ab3c6e287cda9eb0963cf6af885b5d: Status 404 returned error can't find the container with id 2152d44b76454e5f01ba1bfa7787ecdba5ab3c6e287cda9eb0963cf6af885b5d Apr 16 14:32:32.376829 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:32.376783 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cmxww" event={"ID":"053bb478-a29a-4c10-8bab-0896952ba633","Type":"ContainerStarted","Data":"2152d44b76454e5f01ba1bfa7787ecdba5ab3c6e287cda9eb0963cf6af885b5d"} Apr 16 14:32:32.387572 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:32.387546 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb2b182b-d0b0-43b9-b51f-f2ed68482a29-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-t5bf8\" (UID: \"cb2b182b-d0b0-43b9-b51f-f2ed68482a29\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-t5bf8" Apr 16 14:32:32.387697 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:32:32.387668 2576 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 16 14:32:32.387741 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:32:32.387718 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cb2b182b-d0b0-43b9-b51f-f2ed68482a29-openshift-state-metrics-tls podName:cb2b182b-d0b0-43b9-b51f-f2ed68482a29 nodeName:}" failed. 
No retries permitted until 2026-04-16 14:32:33.387704516 +0000 UTC m=+169.214414912 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/cb2b182b-d0b0-43b9-b51f-f2ed68482a29-openshift-state-metrics-tls") pod "openshift-state-metrics-5669946b84-t5bf8" (UID: "cb2b182b-d0b0-43b9-b51f-f2ed68482a29") : secret "openshift-state-metrics-tls" not found Apr 16 14:32:33.380594 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:33.380494 2576 generic.go:358] "Generic (PLEG): container finished" podID="053bb478-a29a-4c10-8bab-0896952ba633" containerID="e1194957dda49dbe092f72f8162ab52347f5404f3ad2193ede456bd51ed4e26d" exitCode=0 Apr 16 14:32:33.380594 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:33.380562 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cmxww" event={"ID":"053bb478-a29a-4c10-8bab-0896952ba633","Type":"ContainerDied","Data":"e1194957dda49dbe092f72f8162ab52347f5404f3ad2193ede456bd51ed4e26d"} Apr 16 14:32:33.396213 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:33.396188 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb2b182b-d0b0-43b9-b51f-f2ed68482a29-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-t5bf8\" (UID: \"cb2b182b-d0b0-43b9-b51f-f2ed68482a29\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-t5bf8" Apr 16 14:32:33.398610 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:33.398588 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb2b182b-d0b0-43b9-b51f-f2ed68482a29-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-t5bf8\" (UID: \"cb2b182b-d0b0-43b9-b51f-f2ed68482a29\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-t5bf8" Apr 16 14:32:33.486721 ip-10-0-128-173 
kubenswrapper[2576]: I0416 14:32:33.486696 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-t5bf8" Apr 16 14:32:33.622510 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:33.622473 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-t5bf8"] Apr 16 14:32:33.625587 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:32:33.625522 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb2b182b_d0b0_43b9_b51f_f2ed68482a29.slice/crio-4108636d29322cf59252982bb3b5a859df865834a13a93351aeafa556a5b9c2e WatchSource:0}: Error finding container 4108636d29322cf59252982bb3b5a859df865834a13a93351aeafa556a5b9c2e: Status 404 returned error can't find the container with id 4108636d29322cf59252982bb3b5a859df865834a13a93351aeafa556a5b9c2e Apr 16 14:32:33.777937 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:33.777905 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5655bf44bb-z5sn5" Apr 16 14:32:33.778088 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:33.777950 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5655bf44bb-z5sn5" Apr 16 14:32:33.779346 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:33.779319 2576 patch_prober.go:28] interesting pod/console-5655bf44bb-z5sn5 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.133.0.17:8443/health\": dial tcp 10.133.0.17:8443: connect: connection refused" start-of-body= Apr 16 14:32:33.779437 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:33.779363 2576 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-5655bf44bb-z5sn5" podUID="b2ad56c7-d37f-486d-bc07-683c2aa2437e" containerName="console" probeResult="failure" 
output="Get \"https://10.133.0.17:8443/health\": dial tcp 10.133.0.17:8443: connect: connection refused" Apr 16 14:32:34.388087 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.388020 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cmxww" event={"ID":"053bb478-a29a-4c10-8bab-0896952ba633","Type":"ContainerStarted","Data":"70e04ab36c34754a458074458db04c8440628792aa96a3b257943af1462c318e"} Apr 16 14:32:34.388087 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.388076 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cmxww" event={"ID":"053bb478-a29a-4c10-8bab-0896952ba633","Type":"ContainerStarted","Data":"ceefa965ecf90483ea22b91e14bfd547b2a83606260d035082d2a296e1246a0c"} Apr 16 14:32:34.390443 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.390414 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-t5bf8" event={"ID":"cb2b182b-d0b0-43b9-b51f-f2ed68482a29","Type":"ContainerStarted","Data":"a489b5122d4fdbe18ed44fce4a00cf6496b9f9dd68623761c833f47673a9d0b0"} Apr 16 14:32:34.390610 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.390447 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-t5bf8" event={"ID":"cb2b182b-d0b0-43b9-b51f-f2ed68482a29","Type":"ContainerStarted","Data":"5bb4bb4945779c445d1a2c2de88f5f4656011c0cf45d91639208fe3df58316b0"} Apr 16 14:32:34.390610 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.390461 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-t5bf8" event={"ID":"cb2b182b-d0b0-43b9-b51f-f2ed68482a29","Type":"ContainerStarted","Data":"4108636d29322cf59252982bb3b5a859df865834a13a93351aeafa556a5b9c2e"} Apr 16 14:32:34.411114 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.411063 2576 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-monitoring/node-exporter-cmxww" podStartSLOduration=2.679692498 podStartE2EDuration="3.411045687s" podCreationTimestamp="2026-04-16 14:32:31 +0000 UTC" firstStartedPulling="2026-04-16 14:32:32.043582424 +0000 UTC m=+167.870292820" lastFinishedPulling="2026-04-16 14:32:32.7749356 +0000 UTC m=+168.601646009" observedRunningTime="2026-04-16 14:32:34.409646156 +0000 UTC m=+170.236356575" watchObservedRunningTime="2026-04-16 14:32:34.411045687 +0000 UTC m=+170.237756104" Apr 16 14:32:34.703637 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.703609 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-6db4656d8f-fq84c"] Apr 16 14:32:34.707340 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.707325 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6db4656d8f-fq84c" Apr 16 14:32:34.710751 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.710727 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-2wgtw\"" Apr 16 14:32:34.711030 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.711010 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-18vulkvtq9tup\"" Apr 16 14:32:34.711030 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.711022 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 16 14:32:34.711405 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.711242 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 16 14:32:34.711405 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.711323 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 16 14:32:34.711405 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.711384 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 16 14:32:34.718424 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.718393 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 16 14:32:34.723409 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.723384 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6db4656d8f-fq84c"] Apr 16 14:32:34.808541 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.808501 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4cc933c7-84a7-44d3-85d7-7dd10e8c29e7-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6db4656d8f-fq84c\" (UID: \"4cc933c7-84a7-44d3-85d7-7dd10e8c29e7\") " pod="openshift-monitoring/thanos-querier-6db4656d8f-fq84c" Apr 16 14:32:34.808711 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.808581 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/4cc933c7-84a7-44d3-85d7-7dd10e8c29e7-secret-thanos-querier-tls\") pod \"thanos-querier-6db4656d8f-fq84c\" (UID: \"4cc933c7-84a7-44d3-85d7-7dd10e8c29e7\") " pod="openshift-monitoring/thanos-querier-6db4656d8f-fq84c" Apr 16 14:32:34.808711 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.808646 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/4cc933c7-84a7-44d3-85d7-7dd10e8c29e7-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6db4656d8f-fq84c\" (UID: \"4cc933c7-84a7-44d3-85d7-7dd10e8c29e7\") " pod="openshift-monitoring/thanos-querier-6db4656d8f-fq84c" Apr 16 14:32:34.808790 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.808693 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4cc933c7-84a7-44d3-85d7-7dd10e8c29e7-metrics-client-ca\") pod \"thanos-querier-6db4656d8f-fq84c\" (UID: \"4cc933c7-84a7-44d3-85d7-7dd10e8c29e7\") " pod="openshift-monitoring/thanos-querier-6db4656d8f-fq84c" Apr 16 14:32:34.808790 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.808738 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4cc933c7-84a7-44d3-85d7-7dd10e8c29e7-secret-grpc-tls\") pod \"thanos-querier-6db4656d8f-fq84c\" (UID: \"4cc933c7-84a7-44d3-85d7-7dd10e8c29e7\") " pod="openshift-monitoring/thanos-querier-6db4656d8f-fq84c" Apr 16 14:32:34.808790 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.808782 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dw2m\" (UniqueName: \"kubernetes.io/projected/4cc933c7-84a7-44d3-85d7-7dd10e8c29e7-kube-api-access-7dw2m\") pod \"thanos-querier-6db4656d8f-fq84c\" (UID: \"4cc933c7-84a7-44d3-85d7-7dd10e8c29e7\") " pod="openshift-monitoring/thanos-querier-6db4656d8f-fq84c" Apr 16 14:32:34.808880 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.808839 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/4cc933c7-84a7-44d3-85d7-7dd10e8c29e7-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6db4656d8f-fq84c\" 
(UID: \"4cc933c7-84a7-44d3-85d7-7dd10e8c29e7\") " pod="openshift-monitoring/thanos-querier-6db4656d8f-fq84c" Apr 16 14:32:34.808880 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.808861 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/4cc933c7-84a7-44d3-85d7-7dd10e8c29e7-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6db4656d8f-fq84c\" (UID: \"4cc933c7-84a7-44d3-85d7-7dd10e8c29e7\") " pod="openshift-monitoring/thanos-querier-6db4656d8f-fq84c" Apr 16 14:32:34.909421 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.909377 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4cc933c7-84a7-44d3-85d7-7dd10e8c29e7-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6db4656d8f-fq84c\" (UID: \"4cc933c7-84a7-44d3-85d7-7dd10e8c29e7\") " pod="openshift-monitoring/thanos-querier-6db4656d8f-fq84c" Apr 16 14:32:34.909421 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.909420 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/4cc933c7-84a7-44d3-85d7-7dd10e8c29e7-secret-thanos-querier-tls\") pod \"thanos-querier-6db4656d8f-fq84c\" (UID: \"4cc933c7-84a7-44d3-85d7-7dd10e8c29e7\") " pod="openshift-monitoring/thanos-querier-6db4656d8f-fq84c" Apr 16 14:32:34.909673 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.909491 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4cc933c7-84a7-44d3-85d7-7dd10e8c29e7-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6db4656d8f-fq84c\" (UID: \"4cc933c7-84a7-44d3-85d7-7dd10e8c29e7\") " pod="openshift-monitoring/thanos-querier-6db4656d8f-fq84c" Apr 
16 14:32:34.909673 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.909554 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4cc933c7-84a7-44d3-85d7-7dd10e8c29e7-metrics-client-ca\") pod \"thanos-querier-6db4656d8f-fq84c\" (UID: \"4cc933c7-84a7-44d3-85d7-7dd10e8c29e7\") " pod="openshift-monitoring/thanos-querier-6db4656d8f-fq84c" Apr 16 14:32:34.909673 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.909582 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4cc933c7-84a7-44d3-85d7-7dd10e8c29e7-secret-grpc-tls\") pod \"thanos-querier-6db4656d8f-fq84c\" (UID: \"4cc933c7-84a7-44d3-85d7-7dd10e8c29e7\") " pod="openshift-monitoring/thanos-querier-6db4656d8f-fq84c" Apr 16 14:32:34.909781 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.909718 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7dw2m\" (UniqueName: \"kubernetes.io/projected/4cc933c7-84a7-44d3-85d7-7dd10e8c29e7-kube-api-access-7dw2m\") pod \"thanos-querier-6db4656d8f-fq84c\" (UID: \"4cc933c7-84a7-44d3-85d7-7dd10e8c29e7\") " pod="openshift-monitoring/thanos-querier-6db4656d8f-fq84c" Apr 16 14:32:34.909781 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.909768 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/4cc933c7-84a7-44d3-85d7-7dd10e8c29e7-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6db4656d8f-fq84c\" (UID: \"4cc933c7-84a7-44d3-85d7-7dd10e8c29e7\") " pod="openshift-monitoring/thanos-querier-6db4656d8f-fq84c" Apr 16 14:32:34.909878 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.909804 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: 
\"kubernetes.io/secret/4cc933c7-84a7-44d3-85d7-7dd10e8c29e7-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6db4656d8f-fq84c\" (UID: \"4cc933c7-84a7-44d3-85d7-7dd10e8c29e7\") " pod="openshift-monitoring/thanos-querier-6db4656d8f-fq84c" Apr 16 14:32:34.910396 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.910367 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4cc933c7-84a7-44d3-85d7-7dd10e8c29e7-metrics-client-ca\") pod \"thanos-querier-6db4656d8f-fq84c\" (UID: \"4cc933c7-84a7-44d3-85d7-7dd10e8c29e7\") " pod="openshift-monitoring/thanos-querier-6db4656d8f-fq84c" Apr 16 14:32:34.912423 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.912386 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/4cc933c7-84a7-44d3-85d7-7dd10e8c29e7-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6db4656d8f-fq84c\" (UID: \"4cc933c7-84a7-44d3-85d7-7dd10e8c29e7\") " pod="openshift-monitoring/thanos-querier-6db4656d8f-fq84c" Apr 16 14:32:34.912561 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.912520 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/4cc933c7-84a7-44d3-85d7-7dd10e8c29e7-secret-thanos-querier-tls\") pod \"thanos-querier-6db4656d8f-fq84c\" (UID: \"4cc933c7-84a7-44d3-85d7-7dd10e8c29e7\") " pod="openshift-monitoring/thanos-querier-6db4656d8f-fq84c" Apr 16 14:32:34.912648 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.912609 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/4cc933c7-84a7-44d3-85d7-7dd10e8c29e7-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6db4656d8f-fq84c\" (UID: 
\"4cc933c7-84a7-44d3-85d7-7dd10e8c29e7\") " pod="openshift-monitoring/thanos-querier-6db4656d8f-fq84c" Apr 16 14:32:34.912705 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.912692 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4cc933c7-84a7-44d3-85d7-7dd10e8c29e7-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6db4656d8f-fq84c\" (UID: \"4cc933c7-84a7-44d3-85d7-7dd10e8c29e7\") " pod="openshift-monitoring/thanos-querier-6db4656d8f-fq84c" Apr 16 14:32:34.912759 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.912744 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4cc933c7-84a7-44d3-85d7-7dd10e8c29e7-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6db4656d8f-fq84c\" (UID: \"4cc933c7-84a7-44d3-85d7-7dd10e8c29e7\") " pod="openshift-monitoring/thanos-querier-6db4656d8f-fq84c" Apr 16 14:32:34.912804 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.912769 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4cc933c7-84a7-44d3-85d7-7dd10e8c29e7-secret-grpc-tls\") pod \"thanos-querier-6db4656d8f-fq84c\" (UID: \"4cc933c7-84a7-44d3-85d7-7dd10e8c29e7\") " pod="openshift-monitoring/thanos-querier-6db4656d8f-fq84c" Apr 16 14:32:34.919190 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:34.919165 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dw2m\" (UniqueName: \"kubernetes.io/projected/4cc933c7-84a7-44d3-85d7-7dd10e8c29e7-kube-api-access-7dw2m\") pod \"thanos-querier-6db4656d8f-fq84c\" (UID: \"4cc933c7-84a7-44d3-85d7-7dd10e8c29e7\") " pod="openshift-monitoring/thanos-querier-6db4656d8f-fq84c" Apr 16 14:32:35.036899 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:35.036805 2576 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6db4656d8f-fq84c" Apr 16 14:32:35.164113 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:35.164079 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6db4656d8f-fq84c"] Apr 16 14:32:35.167574 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:32:35.167520 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cc933c7_84a7_44d3_85d7_7dd10e8c29e7.slice/crio-1ca4e2d03ac5ab9856b3e82b9706eba7247e863e68cd2c71f2f3e55fe4fc8ea4 WatchSource:0}: Error finding container 1ca4e2d03ac5ab9856b3e82b9706eba7247e863e68cd2c71f2f3e55fe4fc8ea4: Status 404 returned error can't find the container with id 1ca4e2d03ac5ab9856b3e82b9706eba7247e863e68cd2c71f2f3e55fe4fc8ea4 Apr 16 14:32:35.394333 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:35.394290 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6db4656d8f-fq84c" event={"ID":"4cc933c7-84a7-44d3-85d7-7dd10e8c29e7","Type":"ContainerStarted","Data":"1ca4e2d03ac5ab9856b3e82b9706eba7247e863e68cd2c71f2f3e55fe4fc8ea4"} Apr 16 14:32:35.396084 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:35.396056 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-t5bf8" event={"ID":"cb2b182b-d0b0-43b9-b51f-f2ed68482a29","Type":"ContainerStarted","Data":"a6d7fc9bc264434f2bcee6a10461fb15732a2e1f67db9f21e07639f4dfb20dfe"} Apr 16 14:32:35.413375 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:35.413332 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5669946b84-t5bf8" podStartSLOduration=3.491321047 podStartE2EDuration="4.413319341s" podCreationTimestamp="2026-04-16 14:32:31 +0000 UTC" firstStartedPulling="2026-04-16 14:32:33.7555549 +0000 UTC m=+169.582265300" 
lastFinishedPulling="2026-04-16 14:32:34.677553186 +0000 UTC m=+170.504263594" observedRunningTime="2026-04-16 14:32:35.412741413 +0000 UTC m=+171.239451830" watchObservedRunningTime="2026-04-16 14:32:35.413319341 +0000 UTC m=+171.240029757" Apr 16 14:32:35.846871 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:35.846772 2576 patch_prober.go:28] interesting pod/image-registry-7759f9cdf-skkws container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 14:32:35.846871 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:35.846835 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-7759f9cdf-skkws" podUID="426b5a99-d698-419c-b42f-63c90eadfa2b" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:32:36.454082 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:36.454044 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-npx5k"] Apr 16 14:32:36.458330 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:36.458311 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-npx5k" Apr 16 14:32:36.460426 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:36.460400 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 16 14:32:36.460553 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:36.460472 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-p5str\"" Apr 16 14:32:36.467063 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:36.466769 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-npx5k"] Apr 16 14:32:36.527667 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:36.527630 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c0e272d7-9472-49b2-bfae-f11a54b2ad97-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-npx5k\" (UID: \"c0e272d7-9472-49b2-bfae-f11a54b2ad97\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-npx5k" Apr 16 14:32:36.628371 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:36.628329 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c0e272d7-9472-49b2-bfae-f11a54b2ad97-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-npx5k\" (UID: \"c0e272d7-9472-49b2-bfae-f11a54b2ad97\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-npx5k" Apr 16 14:32:36.628561 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:32:36.628497 2576 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 16 14:32:36.628629 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:32:36.628586 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/c0e272d7-9472-49b2-bfae-f11a54b2ad97-monitoring-plugin-cert podName:c0e272d7-9472-49b2-bfae-f11a54b2ad97 nodeName:}" failed. No retries permitted until 2026-04-16 14:32:37.12856354 +0000 UTC m=+172.955273937 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/c0e272d7-9472-49b2-bfae-f11a54b2ad97-monitoring-plugin-cert") pod "monitoring-plugin-5876b4bbc7-npx5k" (UID: "c0e272d7-9472-49b2-bfae-f11a54b2ad97") : secret "monitoring-plugin-cert" not found Apr 16 14:32:37.133963 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:37.133921 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c0e272d7-9472-49b2-bfae-f11a54b2ad97-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-npx5k\" (UID: \"c0e272d7-9472-49b2-bfae-f11a54b2ad97\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-npx5k" Apr 16 14:32:37.136392 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:37.136365 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c0e272d7-9472-49b2-bfae-f11a54b2ad97-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-npx5k\" (UID: \"c0e272d7-9472-49b2-bfae-f11a54b2ad97\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-npx5k" Apr 16 14:32:37.300084 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:37.299999 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7759f9cdf-skkws" Apr 16 14:32:37.371375 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:37.371345 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-npx5k" Apr 16 14:32:37.411992 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:37.411959 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6db4656d8f-fq84c" event={"ID":"4cc933c7-84a7-44d3-85d7-7dd10e8c29e7","Type":"ContainerStarted","Data":"7b381b47fdeb82c44a3f8ba5d35cf91c3d8107414a6f4e128658639c01803bad"} Apr 16 14:32:37.411992 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:37.411995 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6db4656d8f-fq84c" event={"ID":"4cc933c7-84a7-44d3-85d7-7dd10e8c29e7","Type":"ContainerStarted","Data":"0f8730f6d6553241d815befa4ccfb5499312b5b97e4486e4043b58c5027cc808"} Apr 16 14:32:37.412191 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:37.412005 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6db4656d8f-fq84c" event={"ID":"4cc933c7-84a7-44d3-85d7-7dd10e8c29e7","Type":"ContainerStarted","Data":"611ba57965dc2e7c9dd232330a632c56868ecc65d8968f0d28d9da5df2022890"} Apr 16 14:32:37.501868 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:37.501840 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-npx5k"] Apr 16 14:32:37.504335 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:32:37.504307 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0e272d7_9472_49b2_bfae_f11a54b2ad97.slice/crio-664ab8715ca7d4625a6187e34885d57feaa4a4ef4d29ebf620095088429f8608 WatchSource:0}: Error finding container 664ab8715ca7d4625a6187e34885d57feaa4a4ef4d29ebf620095088429f8608: Status 404 returned error can't find the container with id 664ab8715ca7d4625a6187e34885d57feaa4a4ef4d29ebf620095088429f8608 Apr 16 14:32:37.659407 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:37.659367 2576 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["openshift-console/console-5655bf44bb-z5sn5"] Apr 16 14:32:37.697325 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:37.697094 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6d8b7c7569-c5wc7"] Apr 16 14:32:37.700772 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:37.700745 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d8b7c7569-c5wc7" Apr 16 14:32:37.707980 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:37.707946 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 14:32:37.714184 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:37.711988 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d8b7c7569-c5wc7"] Apr 16 14:32:37.740594 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:37.740499 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d60b1250-426f-4e15-8265-b3e5324dff48-oauth-serving-cert\") pod \"console-6d8b7c7569-c5wc7\" (UID: \"d60b1250-426f-4e15-8265-b3e5324dff48\") " pod="openshift-console/console-6d8b7c7569-c5wc7" Apr 16 14:32:37.740594 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:37.740560 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d60b1250-426f-4e15-8265-b3e5324dff48-console-serving-cert\") pod \"console-6d8b7c7569-c5wc7\" (UID: \"d60b1250-426f-4e15-8265-b3e5324dff48\") " pod="openshift-console/console-6d8b7c7569-c5wc7" Apr 16 14:32:37.740864 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:37.740621 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bg4j\" (UniqueName: 
\"kubernetes.io/projected/d60b1250-426f-4e15-8265-b3e5324dff48-kube-api-access-4bg4j\") pod \"console-6d8b7c7569-c5wc7\" (UID: \"d60b1250-426f-4e15-8265-b3e5324dff48\") " pod="openshift-console/console-6d8b7c7569-c5wc7" Apr 16 14:32:37.740864 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:37.740698 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d60b1250-426f-4e15-8265-b3e5324dff48-console-config\") pod \"console-6d8b7c7569-c5wc7\" (UID: \"d60b1250-426f-4e15-8265-b3e5324dff48\") " pod="openshift-console/console-6d8b7c7569-c5wc7" Apr 16 14:32:37.740864 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:37.740715 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d60b1250-426f-4e15-8265-b3e5324dff48-trusted-ca-bundle\") pod \"console-6d8b7c7569-c5wc7\" (UID: \"d60b1250-426f-4e15-8265-b3e5324dff48\") " pod="openshift-console/console-6d8b7c7569-c5wc7" Apr 16 14:32:37.740864 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:37.740746 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d60b1250-426f-4e15-8265-b3e5324dff48-console-oauth-config\") pod \"console-6d8b7c7569-c5wc7\" (UID: \"d60b1250-426f-4e15-8265-b3e5324dff48\") " pod="openshift-console/console-6d8b7c7569-c5wc7" Apr 16 14:32:37.740864 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:37.740766 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d60b1250-426f-4e15-8265-b3e5324dff48-service-ca\") pod \"console-6d8b7c7569-c5wc7\" (UID: \"d60b1250-426f-4e15-8265-b3e5324dff48\") " pod="openshift-console/console-6d8b7c7569-c5wc7" Apr 16 14:32:37.842190 ip-10-0-128-173 kubenswrapper[2576]: 
I0416 14:32:37.842146 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d60b1250-426f-4e15-8265-b3e5324dff48-oauth-serving-cert\") pod \"console-6d8b7c7569-c5wc7\" (UID: \"d60b1250-426f-4e15-8265-b3e5324dff48\") " pod="openshift-console/console-6d8b7c7569-c5wc7"
Apr 16 14:32:37.842374 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:37.842202 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d60b1250-426f-4e15-8265-b3e5324dff48-console-serving-cert\") pod \"console-6d8b7c7569-c5wc7\" (UID: \"d60b1250-426f-4e15-8265-b3e5324dff48\") " pod="openshift-console/console-6d8b7c7569-c5wc7"
Apr 16 14:32:37.842374 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:37.842235 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4bg4j\" (UniqueName: \"kubernetes.io/projected/d60b1250-426f-4e15-8265-b3e5324dff48-kube-api-access-4bg4j\") pod \"console-6d8b7c7569-c5wc7\" (UID: \"d60b1250-426f-4e15-8265-b3e5324dff48\") " pod="openshift-console/console-6d8b7c7569-c5wc7"
Apr 16 14:32:37.842374 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:37.842307 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d60b1250-426f-4e15-8265-b3e5324dff48-console-config\") pod \"console-6d8b7c7569-c5wc7\" (UID: \"d60b1250-426f-4e15-8265-b3e5324dff48\") " pod="openshift-console/console-6d8b7c7569-c5wc7"
Apr 16 14:32:37.842374 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:37.842332 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d60b1250-426f-4e15-8265-b3e5324dff48-trusted-ca-bundle\") pod \"console-6d8b7c7569-c5wc7\" (UID: \"d60b1250-426f-4e15-8265-b3e5324dff48\") " pod="openshift-console/console-6d8b7c7569-c5wc7"
Apr 16 14:32:37.842374 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:37.842365 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d60b1250-426f-4e15-8265-b3e5324dff48-console-oauth-config\") pod \"console-6d8b7c7569-c5wc7\" (UID: \"d60b1250-426f-4e15-8265-b3e5324dff48\") " pod="openshift-console/console-6d8b7c7569-c5wc7"
Apr 16 14:32:37.842636 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:37.842396 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d60b1250-426f-4e15-8265-b3e5324dff48-service-ca\") pod \"console-6d8b7c7569-c5wc7\" (UID: \"d60b1250-426f-4e15-8265-b3e5324dff48\") " pod="openshift-console/console-6d8b7c7569-c5wc7"
Apr 16 14:32:37.843265 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:37.843236 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d60b1250-426f-4e15-8265-b3e5324dff48-service-ca\") pod \"console-6d8b7c7569-c5wc7\" (UID: \"d60b1250-426f-4e15-8265-b3e5324dff48\") " pod="openshift-console/console-6d8b7c7569-c5wc7"
Apr 16 14:32:37.843960 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:37.843827 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d60b1250-426f-4e15-8265-b3e5324dff48-oauth-serving-cert\") pod \"console-6d8b7c7569-c5wc7\" (UID: \"d60b1250-426f-4e15-8265-b3e5324dff48\") " pod="openshift-console/console-6d8b7c7569-c5wc7"
Apr 16 14:32:37.844129 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:37.844100 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d60b1250-426f-4e15-8265-b3e5324dff48-console-config\") pod \"console-6d8b7c7569-c5wc7\" (UID: \"d60b1250-426f-4e15-8265-b3e5324dff48\") " pod="openshift-console/console-6d8b7c7569-c5wc7"
Apr 16 14:32:37.844387 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:37.844364 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d60b1250-426f-4e15-8265-b3e5324dff48-trusted-ca-bundle\") pod \"console-6d8b7c7569-c5wc7\" (UID: \"d60b1250-426f-4e15-8265-b3e5324dff48\") " pod="openshift-console/console-6d8b7c7569-c5wc7"
Apr 16 14:32:37.846366 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:37.846335 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d60b1250-426f-4e15-8265-b3e5324dff48-console-oauth-config\") pod \"console-6d8b7c7569-c5wc7\" (UID: \"d60b1250-426f-4e15-8265-b3e5324dff48\") " pod="openshift-console/console-6d8b7c7569-c5wc7"
Apr 16 14:32:37.846366 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:37.846358 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d60b1250-426f-4e15-8265-b3e5324dff48-console-serving-cert\") pod \"console-6d8b7c7569-c5wc7\" (UID: \"d60b1250-426f-4e15-8265-b3e5324dff48\") " pod="openshift-console/console-6d8b7c7569-c5wc7"
Apr 16 14:32:37.852641 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:37.852621 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bg4j\" (UniqueName: \"kubernetes.io/projected/d60b1250-426f-4e15-8265-b3e5324dff48-kube-api-access-4bg4j\") pod \"console-6d8b7c7569-c5wc7\" (UID: \"d60b1250-426f-4e15-8265-b3e5324dff48\") " pod="openshift-console/console-6d8b7c7569-c5wc7"
Apr 16 14:32:38.013285 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:38.013259 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d8b7c7569-c5wc7"
Apr 16 14:32:38.177025 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:38.176992 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d8b7c7569-c5wc7"]
Apr 16 14:32:38.181014 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:32:38.180975 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd60b1250_426f_4e15_8265_b3e5324dff48.slice/crio-2661d3641ccc5e8507c6a19deaf38867ba401776736d2cb1d5ea90eef4803fe9 WatchSource:0}: Error finding container 2661d3641ccc5e8507c6a19deaf38867ba401776736d2cb1d5ea90eef4803fe9: Status 404 returned error can't find the container with id 2661d3641ccc5e8507c6a19deaf38867ba401776736d2cb1d5ea90eef4803fe9
Apr 16 14:32:38.420188 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:38.420141 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6db4656d8f-fq84c" event={"ID":"4cc933c7-84a7-44d3-85d7-7dd10e8c29e7","Type":"ContainerStarted","Data":"5a6eff1e421dffd921e676baa1500c1e1df03fbf411436c6ffc2d0b2376fba14"}
Apr 16 14:32:38.420188 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:38.420186 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6db4656d8f-fq84c" event={"ID":"4cc933c7-84a7-44d3-85d7-7dd10e8c29e7","Type":"ContainerStarted","Data":"8104721d2c94bb1318fac2c3345880a6442aa17fa1e8a484db75a7dfb2ee9640"}
Apr 16 14:32:38.420430 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:38.420201 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6db4656d8f-fq84c" event={"ID":"4cc933c7-84a7-44d3-85d7-7dd10e8c29e7","Type":"ContainerStarted","Data":"49e2f1839506d26a33f1adf531fa99dfbf8c9d3a5fda9534b67ef6520ea6a17a"}
Apr 16 14:32:38.420430 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:38.420332 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-6db4656d8f-fq84c"
Apr 16 14:32:38.422175 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:38.421944 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d8b7c7569-c5wc7" event={"ID":"d60b1250-426f-4e15-8265-b3e5324dff48","Type":"ContainerStarted","Data":"3704e00d241645c88557709c0bfd3e3210a59739c9910c2c3d350952b032a632"}
Apr 16 14:32:38.422175 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:38.421985 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d8b7c7569-c5wc7" event={"ID":"d60b1250-426f-4e15-8265-b3e5324dff48","Type":"ContainerStarted","Data":"2661d3641ccc5e8507c6a19deaf38867ba401776736d2cb1d5ea90eef4803fe9"}
Apr 16 14:32:38.423235 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:38.423211 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-npx5k" event={"ID":"c0e272d7-9472-49b2-bfae-f11a54b2ad97","Type":"ContainerStarted","Data":"664ab8715ca7d4625a6187e34885d57feaa4a4ef4d29ebf620095088429f8608"}
Apr 16 14:32:38.443553 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:38.443442 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-6db4656d8f-fq84c" podStartSLOduration=1.704716822 podStartE2EDuration="4.443422453s" podCreationTimestamp="2026-04-16 14:32:34 +0000 UTC" firstStartedPulling="2026-04-16 14:32:35.16938981 +0000 UTC m=+170.996100205" lastFinishedPulling="2026-04-16 14:32:37.908095438 +0000 UTC m=+173.734805836" observedRunningTime="2026-04-16 14:32:38.441275017 +0000 UTC m=+174.267985445" watchObservedRunningTime="2026-04-16 14:32:38.443422453 +0000 UTC m=+174.270132874"
Apr 16 14:32:38.463068 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:38.463010 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6d8b7c7569-c5wc7" podStartSLOduration=1.462995178 podStartE2EDuration="1.462995178s" podCreationTimestamp="2026-04-16 14:32:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:32:38.461483058 +0000 UTC m=+174.288193498" watchObservedRunningTime="2026-04-16 14:32:38.462995178 +0000 UTC m=+174.289705594"
Apr 16 14:32:39.427091 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:39.427040 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-npx5k" event={"ID":"c0e272d7-9472-49b2-bfae-f11a54b2ad97","Type":"ContainerStarted","Data":"840d10628bf865ba1ba27214281c5e9233dbbd89e8b82b341503d775e0279cee"}
Apr 16 14:32:39.427091 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:39.427098 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-npx5k"
Apr 16 14:32:39.431861 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:39.431836 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-npx5k"
Apr 16 14:32:39.443433 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:39.443387 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-npx5k" podStartSLOduration=2.109143688 podStartE2EDuration="3.443373757s" podCreationTimestamp="2026-04-16 14:32:36 +0000 UTC" firstStartedPulling="2026-04-16 14:32:37.5061217 +0000 UTC m=+173.332832094" lastFinishedPulling="2026-04-16 14:32:38.840351764 +0000 UTC m=+174.667062163" observedRunningTime="2026-04-16 14:32:39.442242741 +0000 UTC m=+175.268953155" watchObservedRunningTime="2026-04-16 14:32:39.443373757 +0000 UTC m=+175.270084211"
Apr 16 14:32:41.379016 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:41.378989 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-99g2n"
Apr 16 14:32:44.433271 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:44.433245 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-6db4656d8f-fq84c"
Apr 16 14:32:48.013902 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:48.013852 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6d8b7c7569-c5wc7"
Apr 16 14:32:48.013902 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:48.013898 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6d8b7c7569-c5wc7"
Apr 16 14:32:48.018624 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:48.018588 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6d8b7c7569-c5wc7"
Apr 16 14:32:48.459401 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:48.459375 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6d8b7c7569-c5wc7"
Apr 16 14:32:56.479928 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:56.479894 2576 generic.go:358] "Generic (PLEG): container finished" podID="d1ee0bbf-24fd-4084-b536-8d53b20944b9" containerID="5008b15899279ced1093d41c0818f6e655a8cd1e422d65c83f1b5f0ea91a3b3b" exitCode=0
Apr 16 14:32:56.480313 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:56.479968 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-d8v8k" event={"ID":"d1ee0bbf-24fd-4084-b536-8d53b20944b9","Type":"ContainerDied","Data":"5008b15899279ced1093d41c0818f6e655a8cd1e422d65c83f1b5f0ea91a3b3b"}
Apr 16 14:32:56.480313 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:56.480309 2576 scope.go:117] "RemoveContainer" containerID="5008b15899279ced1093d41c0818f6e655a8cd1e422d65c83f1b5f0ea91a3b3b"
Apr 16 14:32:57.485082 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:32:57.485045 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-d8v8k" event={"ID":"d1ee0bbf-24fd-4084-b536-8d53b20944b9","Type":"ContainerStarted","Data":"802d641fe102124b856e32ebde0e01a8e27bf2f1d53f481112f5c2c91cc8525b"}
Apr 16 14:33:02.683231 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:02.683183 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5655bf44bb-z5sn5" podUID="b2ad56c7-d37f-486d-bc07-683c2aa2437e" containerName="console" containerID="cri-o://48eb2158891286cca482b5d9b32832ba3fcd483e76ac1ab1b4b6a0ffc6f53fb4" gracePeriod=15
Apr 16 14:33:02.928526 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:02.928502 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5655bf44bb-z5sn5_b2ad56c7-d37f-486d-bc07-683c2aa2437e/console/0.log"
Apr 16 14:33:02.928659 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:02.928581 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5655bf44bb-z5sn5"
Apr 16 14:33:02.968789 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:02.968693 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s774k\" (UniqueName: \"kubernetes.io/projected/b2ad56c7-d37f-486d-bc07-683c2aa2437e-kube-api-access-s774k\") pod \"b2ad56c7-d37f-486d-bc07-683c2aa2437e\" (UID: \"b2ad56c7-d37f-486d-bc07-683c2aa2437e\") "
Apr 16 14:33:02.968789 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:02.968776 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b2ad56c7-d37f-486d-bc07-683c2aa2437e-console-oauth-config\") pod \"b2ad56c7-d37f-486d-bc07-683c2aa2437e\" (UID: \"b2ad56c7-d37f-486d-bc07-683c2aa2437e\") "
Apr 16 14:33:02.969026 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:02.968808 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b2ad56c7-d37f-486d-bc07-683c2aa2437e-service-ca\") pod \"b2ad56c7-d37f-486d-bc07-683c2aa2437e\" (UID: \"b2ad56c7-d37f-486d-bc07-683c2aa2437e\") "
Apr 16 14:33:02.969026 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:02.968839 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b2ad56c7-d37f-486d-bc07-683c2aa2437e-console-serving-cert\") pod \"b2ad56c7-d37f-486d-bc07-683c2aa2437e\" (UID: \"b2ad56c7-d37f-486d-bc07-683c2aa2437e\") "
Apr 16 14:33:02.969026 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:02.968863 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b2ad56c7-d37f-486d-bc07-683c2aa2437e-oauth-serving-cert\") pod \"b2ad56c7-d37f-486d-bc07-683c2aa2437e\" (UID: \"b2ad56c7-d37f-486d-bc07-683c2aa2437e\") "
Apr 16 14:33:02.969268 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:02.969234 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2ad56c7-d37f-486d-bc07-683c2aa2437e-service-ca" (OuterVolumeSpecName: "service-ca") pod "b2ad56c7-d37f-486d-bc07-683c2aa2437e" (UID: "b2ad56c7-d37f-486d-bc07-683c2aa2437e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:33:02.969407 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:02.969239 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2ad56c7-d37f-486d-bc07-683c2aa2437e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b2ad56c7-d37f-486d-bc07-683c2aa2437e" (UID: "b2ad56c7-d37f-486d-bc07-683c2aa2437e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:33:02.971086 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:02.971055 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2ad56c7-d37f-486d-bc07-683c2aa2437e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b2ad56c7-d37f-486d-bc07-683c2aa2437e" (UID: "b2ad56c7-d37f-486d-bc07-683c2aa2437e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:33:02.971189 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:02.971057 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2ad56c7-d37f-486d-bc07-683c2aa2437e-kube-api-access-s774k" (OuterVolumeSpecName: "kube-api-access-s774k") pod "b2ad56c7-d37f-486d-bc07-683c2aa2437e" (UID: "b2ad56c7-d37f-486d-bc07-683c2aa2437e"). InnerVolumeSpecName "kube-api-access-s774k". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:33:02.971341 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:02.971318 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2ad56c7-d37f-486d-bc07-683c2aa2437e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b2ad56c7-d37f-486d-bc07-683c2aa2437e" (UID: "b2ad56c7-d37f-486d-bc07-683c2aa2437e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:33:03.070259 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:03.070221 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b2ad56c7-d37f-486d-bc07-683c2aa2437e-console-config\") pod \"b2ad56c7-d37f-486d-bc07-683c2aa2437e\" (UID: \"b2ad56c7-d37f-486d-bc07-683c2aa2437e\") "
Apr 16 14:33:03.070450 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:03.070414 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s774k\" (UniqueName: \"kubernetes.io/projected/b2ad56c7-d37f-486d-bc07-683c2aa2437e-kube-api-access-s774k\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\""
Apr 16 14:33:03.070450 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:03.070427 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b2ad56c7-d37f-486d-bc07-683c2aa2437e-console-oauth-config\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\""
Apr 16 14:33:03.070450 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:03.070437 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b2ad56c7-d37f-486d-bc07-683c2aa2437e-service-ca\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\""
Apr 16 14:33:03.070450 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:03.070446 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b2ad56c7-d37f-486d-bc07-683c2aa2437e-console-serving-cert\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\""
Apr 16 14:33:03.070695 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:03.070455 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b2ad56c7-d37f-486d-bc07-683c2aa2437e-oauth-serving-cert\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\""
Apr 16 14:33:03.070695 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:03.070641 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2ad56c7-d37f-486d-bc07-683c2aa2437e-console-config" (OuterVolumeSpecName: "console-config") pod "b2ad56c7-d37f-486d-bc07-683c2aa2437e" (UID: "b2ad56c7-d37f-486d-bc07-683c2aa2437e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:33:03.171867 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:03.171829 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b2ad56c7-d37f-486d-bc07-683c2aa2437e-console-config\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\""
Apr 16 14:33:03.504237 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:03.504207 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5655bf44bb-z5sn5_b2ad56c7-d37f-486d-bc07-683c2aa2437e/console/0.log"
Apr 16 14:33:03.504412 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:03.504248 2576 generic.go:358] "Generic (PLEG): container finished" podID="b2ad56c7-d37f-486d-bc07-683c2aa2437e" containerID="48eb2158891286cca482b5d9b32832ba3fcd483e76ac1ab1b4b6a0ffc6f53fb4" exitCode=2
Apr 16 14:33:03.504412 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:03.504284 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5655bf44bb-z5sn5" event={"ID":"b2ad56c7-d37f-486d-bc07-683c2aa2437e","Type":"ContainerDied","Data":"48eb2158891286cca482b5d9b32832ba3fcd483e76ac1ab1b4b6a0ffc6f53fb4"}
Apr 16 14:33:03.504412 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:03.504307 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5655bf44bb-z5sn5" event={"ID":"b2ad56c7-d37f-486d-bc07-683c2aa2437e","Type":"ContainerDied","Data":"fc6b21151fb147e06df6d0b74c6c71f50a96b432f5405373d27f97932decbf03"}
Apr 16 14:33:03.504412 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:03.504322 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5655bf44bb-z5sn5"
Apr 16 14:33:03.504412 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:03.504330 2576 scope.go:117] "RemoveContainer" containerID="48eb2158891286cca482b5d9b32832ba3fcd483e76ac1ab1b4b6a0ffc6f53fb4"
Apr 16 14:33:03.513587 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:03.513563 2576 scope.go:117] "RemoveContainer" containerID="48eb2158891286cca482b5d9b32832ba3fcd483e76ac1ab1b4b6a0ffc6f53fb4"
Apr 16 14:33:03.513886 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:33:03.513865 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48eb2158891286cca482b5d9b32832ba3fcd483e76ac1ab1b4b6a0ffc6f53fb4\": container with ID starting with 48eb2158891286cca482b5d9b32832ba3fcd483e76ac1ab1b4b6a0ffc6f53fb4 not found: ID does not exist" containerID="48eb2158891286cca482b5d9b32832ba3fcd483e76ac1ab1b4b6a0ffc6f53fb4"
Apr 16 14:33:03.513926 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:03.513896 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48eb2158891286cca482b5d9b32832ba3fcd483e76ac1ab1b4b6a0ffc6f53fb4"} err="failed to get container status \"48eb2158891286cca482b5d9b32832ba3fcd483e76ac1ab1b4b6a0ffc6f53fb4\": rpc error: code = NotFound desc = could not find container \"48eb2158891286cca482b5d9b32832ba3fcd483e76ac1ab1b4b6a0ffc6f53fb4\": container with ID starting with 48eb2158891286cca482b5d9b32832ba3fcd483e76ac1ab1b4b6a0ffc6f53fb4 not found: ID does not exist"
Apr 16 14:33:03.528178 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:03.528148 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5655bf44bb-z5sn5"]
Apr 16 14:33:03.533742 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:03.533720 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5655bf44bb-z5sn5"]
Apr 16 14:33:04.781629 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:04.781592 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2ad56c7-d37f-486d-bc07-683c2aa2437e" path="/var/lib/kubelet/pods/b2ad56c7-d37f-486d-bc07-683c2aa2437e/volumes"
Apr 16 14:33:11.533243 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:11.533205 2576 generic.go:358] "Generic (PLEG): container finished" podID="1a2c84e9-4fda-4619-a86a-36c809b14446" containerID="df277f6edc3268218d003789c83f5654e05d15a9d8ab55276722ce5196649c6a" exitCode=0
Apr 16 14:33:11.533656 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:11.533279 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7bpgr" event={"ID":"1a2c84e9-4fda-4619-a86a-36c809b14446","Type":"ContainerDied","Data":"df277f6edc3268218d003789c83f5654e05d15a9d8ab55276722ce5196649c6a"}
Apr 16 14:33:11.533701 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:11.533661 2576 scope.go:117] "RemoveContainer" containerID="df277f6edc3268218d003789c83f5654e05d15a9d8ab55276722ce5196649c6a"
Apr 16 14:33:12.538470 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:12.538426 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-7bpgr" event={"ID":"1a2c84e9-4fda-4619-a86a-36c809b14446","Type":"ContainerStarted","Data":"1903e5eed06635d74abd8622a077b443cbe643a954924e9aed1ea40a6fb1cc33"}
Apr 16 14:33:21.567628 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:21.567588 2576 generic.go:358] "Generic (PLEG): container finished" podID="05104d65-51f0-434e-945d-2714c95daadd" containerID="48018831df0279fa95aebed63289786cc06ffc8b44bc693b0c175be7d3805229" exitCode=0
Apr 16 14:33:21.568027 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:21.567663 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-cnkw9" event={"ID":"05104d65-51f0-434e-945d-2714c95daadd","Type":"ContainerDied","Data":"48018831df0279fa95aebed63289786cc06ffc8b44bc693b0c175be7d3805229"}
Apr 16 14:33:21.568027 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:21.568008 2576 scope.go:117] "RemoveContainer" containerID="48018831df0279fa95aebed63289786cc06ffc8b44bc693b0c175be7d3805229"
Apr 16 14:33:22.572046 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:22.572011 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-cnkw9" event={"ID":"05104d65-51f0-434e-945d-2714c95daadd","Type":"ContainerStarted","Data":"847b8afd8d8e7bf9f237c2eb2c46597ac39490bd7938f23cccbba807256cc91c"}
Apr 16 14:33:56.649570 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:56.649515 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfa07533-d734-4829-bde0-6c0327bd79a9-metrics-certs\") pod \"network-metrics-daemon-kbtb7\" (UID: \"bfa07533-d734-4829-bde0-6c0327bd79a9\") " pod="openshift-multus/network-metrics-daemon-kbtb7"
Apr 16 14:33:56.652189 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:56.652159 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bfa07533-d734-4829-bde0-6c0327bd79a9-metrics-certs\") pod \"network-metrics-daemon-kbtb7\" (UID: \"bfa07533-d734-4829-bde0-6c0327bd79a9\") " pod="openshift-multus/network-metrics-daemon-kbtb7"
Apr 16 14:33:56.680471 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:56.680444 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bl9dn\""
Apr 16 14:33:56.688312 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:56.688291 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kbtb7"
Apr 16 14:33:56.809395 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:56.809252 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kbtb7"]
Apr 16 14:33:56.812058 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:33:56.812030 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfa07533_d734_4829_bde0_6c0327bd79a9.slice/crio-ff71ef4e0df89b835ffe1e8e82e76483b3cc6a24647e1db26639c09c7d55516e WatchSource:0}: Error finding container ff71ef4e0df89b835ffe1e8e82e76483b3cc6a24647e1db26639c09c7d55516e: Status 404 returned error can't find the container with id ff71ef4e0df89b835ffe1e8e82e76483b3cc6a24647e1db26639c09c7d55516e
Apr 16 14:33:57.678552 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:57.678496 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kbtb7" event={"ID":"bfa07533-d734-4829-bde0-6c0327bd79a9","Type":"ContainerStarted","Data":"ff71ef4e0df89b835ffe1e8e82e76483b3cc6a24647e1db26639c09c7d55516e"}
Apr 16 14:33:58.682622 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:58.682580 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kbtb7" event={"ID":"bfa07533-d734-4829-bde0-6c0327bd79a9","Type":"ContainerStarted","Data":"9a19bb0181f9cc5fab501fa167c7da180cee4ada8b0d0b67dc388eaf729d460d"}
Apr 16 14:33:58.683020 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:58.682628 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kbtb7" event={"ID":"bfa07533-d734-4829-bde0-6c0327bd79a9","Type":"ContainerStarted","Data":"c3f552b85cec9d506adbcca4e070e0dba3227f037f2c83857f0c3f2923198aeb"}
Apr 16 14:33:58.700446 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:33:58.700380 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-kbtb7" podStartSLOduration=253.755137579 podStartE2EDuration="4m14.700361652s" podCreationTimestamp="2026-04-16 14:29:44 +0000 UTC" firstStartedPulling="2026-04-16 14:33:56.813908668 +0000 UTC m=+252.640619062" lastFinishedPulling="2026-04-16 14:33:57.759132736 +0000 UTC m=+253.585843135" observedRunningTime="2026-04-16 14:33:58.700121598 +0000 UTC m=+254.526832012" watchObservedRunningTime="2026-04-16 14:33:58.700361652 +0000 UTC m=+254.527072070"
Apr 16 14:34:00.664079 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:00.664043 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7b8dd7b4f4-sntz7"]
Apr 16 14:34:00.664727 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:00.664704 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b2ad56c7-d37f-486d-bc07-683c2aa2437e" containerName="console"
Apr 16 14:34:00.664822 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:00.664729 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2ad56c7-d37f-486d-bc07-683c2aa2437e" containerName="console"
Apr 16 14:34:00.664884 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:00.664875 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b2ad56c7-d37f-486d-bc07-683c2aa2437e" containerName="console"
Apr 16 14:34:00.667911 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:00.667887 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b8dd7b4f4-sntz7"
Apr 16 14:34:00.679624 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:00.679599 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b8dd7b4f4-sntz7"]
Apr 16 14:34:00.782990 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:00.782962 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/655b7dd2-ab87-497f-bec0-0861a5df5874-service-ca\") pod \"console-7b8dd7b4f4-sntz7\" (UID: \"655b7dd2-ab87-497f-bec0-0861a5df5874\") " pod="openshift-console/console-7b8dd7b4f4-sntz7"
Apr 16 14:34:00.783182 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:00.782994 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/655b7dd2-ab87-497f-bec0-0861a5df5874-console-config\") pod \"console-7b8dd7b4f4-sntz7\" (UID: \"655b7dd2-ab87-497f-bec0-0861a5df5874\") " pod="openshift-console/console-7b8dd7b4f4-sntz7"
Apr 16 14:34:00.783182 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:00.783020 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/655b7dd2-ab87-497f-bec0-0861a5df5874-oauth-serving-cert\") pod \"console-7b8dd7b4f4-sntz7\" (UID: \"655b7dd2-ab87-497f-bec0-0861a5df5874\") " pod="openshift-console/console-7b8dd7b4f4-sntz7"
Apr 16 14:34:00.783182 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:00.783103 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/655b7dd2-ab87-497f-bec0-0861a5df5874-console-oauth-config\") pod \"console-7b8dd7b4f4-sntz7\" (UID: \"655b7dd2-ab87-497f-bec0-0861a5df5874\") " pod="openshift-console/console-7b8dd7b4f4-sntz7"
Apr 16 14:34:00.783182 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:00.783160 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/655b7dd2-ab87-497f-bec0-0861a5df5874-trusted-ca-bundle\") pod \"console-7b8dd7b4f4-sntz7\" (UID: \"655b7dd2-ab87-497f-bec0-0861a5df5874\") " pod="openshift-console/console-7b8dd7b4f4-sntz7"
Apr 16 14:34:00.783349 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:00.783204 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vkkp\" (UniqueName: \"kubernetes.io/projected/655b7dd2-ab87-497f-bec0-0861a5df5874-kube-api-access-7vkkp\") pod \"console-7b8dd7b4f4-sntz7\" (UID: \"655b7dd2-ab87-497f-bec0-0861a5df5874\") " pod="openshift-console/console-7b8dd7b4f4-sntz7"
Apr 16 14:34:00.783349 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:00.783248 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/655b7dd2-ab87-497f-bec0-0861a5df5874-console-serving-cert\") pod \"console-7b8dd7b4f4-sntz7\" (UID: \"655b7dd2-ab87-497f-bec0-0861a5df5874\") " pod="openshift-console/console-7b8dd7b4f4-sntz7"
Apr 16 14:34:00.884610 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:00.884567 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/655b7dd2-ab87-497f-bec0-0861a5df5874-console-serving-cert\") pod \"console-7b8dd7b4f4-sntz7\" (UID: \"655b7dd2-ab87-497f-bec0-0861a5df5874\") " pod="openshift-console/console-7b8dd7b4f4-sntz7"
Apr 16 14:34:00.884810 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:00.884629 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/655b7dd2-ab87-497f-bec0-0861a5df5874-service-ca\") pod \"console-7b8dd7b4f4-sntz7\" (UID: \"655b7dd2-ab87-497f-bec0-0861a5df5874\") " pod="openshift-console/console-7b8dd7b4f4-sntz7"
Apr 16 14:34:00.884810 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:00.884648 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/655b7dd2-ab87-497f-bec0-0861a5df5874-console-config\") pod \"console-7b8dd7b4f4-sntz7\" (UID: \"655b7dd2-ab87-497f-bec0-0861a5df5874\") " pod="openshift-console/console-7b8dd7b4f4-sntz7"
Apr 16 14:34:00.884810 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:00.884762 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/655b7dd2-ab87-497f-bec0-0861a5df5874-oauth-serving-cert\") pod \"console-7b8dd7b4f4-sntz7\" (UID: \"655b7dd2-ab87-497f-bec0-0861a5df5874\") " pod="openshift-console/console-7b8dd7b4f4-sntz7"
Apr 16 14:34:00.884810 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:00.884809 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/655b7dd2-ab87-497f-bec0-0861a5df5874-console-oauth-config\") pod \"console-7b8dd7b4f4-sntz7\" (UID: \"655b7dd2-ab87-497f-bec0-0861a5df5874\") " pod="openshift-console/console-7b8dd7b4f4-sntz7"
Apr 16 14:34:00.885017 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:00.884846 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/655b7dd2-ab87-497f-bec0-0861a5df5874-trusted-ca-bundle\") pod \"console-7b8dd7b4f4-sntz7\" (UID: \"655b7dd2-ab87-497f-bec0-0861a5df5874\") " pod="openshift-console/console-7b8dd7b4f4-sntz7"
Apr 16 14:34:00.885017 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:00.884865 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7vkkp\" (UniqueName: \"kubernetes.io/projected/655b7dd2-ab87-497f-bec0-0861a5df5874-kube-api-access-7vkkp\") pod \"console-7b8dd7b4f4-sntz7\" (UID: \"655b7dd2-ab87-497f-bec0-0861a5df5874\") " pod="openshift-console/console-7b8dd7b4f4-sntz7"
Apr 16 14:34:00.885390 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:00.885365 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/655b7dd2-ab87-497f-bec0-0861a5df5874-console-config\") pod \"console-7b8dd7b4f4-sntz7\" (UID: \"655b7dd2-ab87-497f-bec0-0861a5df5874\") " pod="openshift-console/console-7b8dd7b4f4-sntz7"
Apr 16 14:34:00.885390 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:00.885384 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/655b7dd2-ab87-497f-bec0-0861a5df5874-service-ca\") pod \"console-7b8dd7b4f4-sntz7\" (UID: \"655b7dd2-ab87-497f-bec0-0861a5df5874\") " pod="openshift-console/console-7b8dd7b4f4-sntz7"
Apr 16 14:34:00.885578 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:00.885396 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/655b7dd2-ab87-497f-bec0-0861a5df5874-oauth-serving-cert\") pod \"console-7b8dd7b4f4-sntz7\" (UID: \"655b7dd2-ab87-497f-bec0-0861a5df5874\") " pod="openshift-console/console-7b8dd7b4f4-sntz7"
Apr 16 14:34:00.885854 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:00.885827 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/655b7dd2-ab87-497f-bec0-0861a5df5874-trusted-ca-bundle\") pod \"console-7b8dd7b4f4-sntz7\" (UID: \"655b7dd2-ab87-497f-bec0-0861a5df5874\") " pod="openshift-console/console-7b8dd7b4f4-sntz7"
Apr 16 14:34:00.887191 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:00.887167 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/655b7dd2-ab87-497f-bec0-0861a5df5874-console-serving-cert\") pod \"console-7b8dd7b4f4-sntz7\" (UID: \"655b7dd2-ab87-497f-bec0-0861a5df5874\") " pod="openshift-console/console-7b8dd7b4f4-sntz7" Apr 16 14:34:00.887356 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:00.887339 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/655b7dd2-ab87-497f-bec0-0861a5df5874-console-oauth-config\") pod \"console-7b8dd7b4f4-sntz7\" (UID: \"655b7dd2-ab87-497f-bec0-0861a5df5874\") " pod="openshift-console/console-7b8dd7b4f4-sntz7" Apr 16 14:34:00.893700 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:00.893680 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vkkp\" (UniqueName: \"kubernetes.io/projected/655b7dd2-ab87-497f-bec0-0861a5df5874-kube-api-access-7vkkp\") pod \"console-7b8dd7b4f4-sntz7\" (UID: \"655b7dd2-ab87-497f-bec0-0861a5df5874\") " pod="openshift-console/console-7b8dd7b4f4-sntz7" Apr 16 14:34:00.977673 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:00.977586 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b8dd7b4f4-sntz7" Apr 16 14:34:01.097998 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:01.097859 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b8dd7b4f4-sntz7"] Apr 16 14:34:01.100967 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:34:01.100940 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod655b7dd2_ab87_497f_bec0_0861a5df5874.slice/crio-e0e5e65eacc55bfd4e7e2671533afdb95e86f2248bb11d5d8a3ca8e317352e23 WatchSource:0}: Error finding container e0e5e65eacc55bfd4e7e2671533afdb95e86f2248bb11d5d8a3ca8e317352e23: Status 404 returned error can't find the container with id e0e5e65eacc55bfd4e7e2671533afdb95e86f2248bb11d5d8a3ca8e317352e23 Apr 16 14:34:01.694235 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:01.694197 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b8dd7b4f4-sntz7" event={"ID":"655b7dd2-ab87-497f-bec0-0861a5df5874","Type":"ContainerStarted","Data":"d709737c8f5094de24cc00e0b7a896b4090cf3a8bc53687e37721326fead71b0"} Apr 16 14:34:01.694235 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:01.694241 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b8dd7b4f4-sntz7" event={"ID":"655b7dd2-ab87-497f-bec0-0861a5df5874","Type":"ContainerStarted","Data":"e0e5e65eacc55bfd4e7e2671533afdb95e86f2248bb11d5d8a3ca8e317352e23"} Apr 16 14:34:01.712686 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:01.712633 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7b8dd7b4f4-sntz7" podStartSLOduration=1.7126190650000002 podStartE2EDuration="1.712619065s" podCreationTimestamp="2026-04-16 14:34:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:34:01.71228141 +0000 UTC 
m=+257.538991828" watchObservedRunningTime="2026-04-16 14:34:01.712619065 +0000 UTC m=+257.539329481" Apr 16 14:34:10.979185 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:10.979087 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7b8dd7b4f4-sntz7" Apr 16 14:34:10.979846 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:10.979823 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7b8dd7b4f4-sntz7" Apr 16 14:34:10.983390 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:10.983369 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7b8dd7b4f4-sntz7" Apr 16 14:34:11.730461 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:11.730433 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7b8dd7b4f4-sntz7" Apr 16 14:34:11.781725 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:11.781689 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6d8b7c7569-c5wc7"] Apr 16 14:34:36.801795 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:36.801736 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6d8b7c7569-c5wc7" podUID="d60b1250-426f-4e15-8265-b3e5324dff48" containerName="console" containerID="cri-o://3704e00d241645c88557709c0bfd3e3210a59739c9910c2c3d350952b032a632" gracePeriod=15 Apr 16 14:34:37.043949 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:37.043924 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6d8b7c7569-c5wc7_d60b1250-426f-4e15-8265-b3e5324dff48/console/0.log" Apr 16 14:34:37.044094 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:37.043987 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6d8b7c7569-c5wc7" Apr 16 14:34:37.083076 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:37.082986 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d60b1250-426f-4e15-8265-b3e5324dff48-console-serving-cert\") pod \"d60b1250-426f-4e15-8265-b3e5324dff48\" (UID: \"d60b1250-426f-4e15-8265-b3e5324dff48\") " Apr 16 14:34:37.083076 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:37.083046 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bg4j\" (UniqueName: \"kubernetes.io/projected/d60b1250-426f-4e15-8265-b3e5324dff48-kube-api-access-4bg4j\") pod \"d60b1250-426f-4e15-8265-b3e5324dff48\" (UID: \"d60b1250-426f-4e15-8265-b3e5324dff48\") " Apr 16 14:34:37.083309 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:37.083084 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d60b1250-426f-4e15-8265-b3e5324dff48-console-config\") pod \"d60b1250-426f-4e15-8265-b3e5324dff48\" (UID: \"d60b1250-426f-4e15-8265-b3e5324dff48\") " Apr 16 14:34:37.083309 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:37.083117 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d60b1250-426f-4e15-8265-b3e5324dff48-trusted-ca-bundle\") pod \"d60b1250-426f-4e15-8265-b3e5324dff48\" (UID: \"d60b1250-426f-4e15-8265-b3e5324dff48\") " Apr 16 14:34:37.083309 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:37.083170 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d60b1250-426f-4e15-8265-b3e5324dff48-service-ca\") pod \"d60b1250-426f-4e15-8265-b3e5324dff48\" (UID: \"d60b1250-426f-4e15-8265-b3e5324dff48\") " Apr 16 14:34:37.083309 ip-10-0-128-173 
kubenswrapper[2576]: I0416 14:34:37.083210 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d60b1250-426f-4e15-8265-b3e5324dff48-oauth-serving-cert\") pod \"d60b1250-426f-4e15-8265-b3e5324dff48\" (UID: \"d60b1250-426f-4e15-8265-b3e5324dff48\") " Apr 16 14:34:37.083309 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:37.083250 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d60b1250-426f-4e15-8265-b3e5324dff48-console-oauth-config\") pod \"d60b1250-426f-4e15-8265-b3e5324dff48\" (UID: \"d60b1250-426f-4e15-8265-b3e5324dff48\") " Apr 16 14:34:37.083647 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:37.083616 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d60b1250-426f-4e15-8265-b3e5324dff48-console-config" (OuterVolumeSpecName: "console-config") pod "d60b1250-426f-4e15-8265-b3e5324dff48" (UID: "d60b1250-426f-4e15-8265-b3e5324dff48"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:34:37.083727 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:37.083639 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d60b1250-426f-4e15-8265-b3e5324dff48-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d60b1250-426f-4e15-8265-b3e5324dff48" (UID: "d60b1250-426f-4e15-8265-b3e5324dff48"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:34:37.083790 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:37.083735 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d60b1250-426f-4e15-8265-b3e5324dff48-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d60b1250-426f-4e15-8265-b3e5324dff48" (UID: "d60b1250-426f-4e15-8265-b3e5324dff48"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:34:37.083871 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:37.083852 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d60b1250-426f-4e15-8265-b3e5324dff48-service-ca" (OuterVolumeSpecName: "service-ca") pod "d60b1250-426f-4e15-8265-b3e5324dff48" (UID: "d60b1250-426f-4e15-8265-b3e5324dff48"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:34:37.085336 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:37.085316 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d60b1250-426f-4e15-8265-b3e5324dff48-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d60b1250-426f-4e15-8265-b3e5324dff48" (UID: "d60b1250-426f-4e15-8265-b3e5324dff48"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:34:37.085445 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:37.085394 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d60b1250-426f-4e15-8265-b3e5324dff48-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d60b1250-426f-4e15-8265-b3e5324dff48" (UID: "d60b1250-426f-4e15-8265-b3e5324dff48"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:34:37.085506 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:37.085473 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d60b1250-426f-4e15-8265-b3e5324dff48-kube-api-access-4bg4j" (OuterVolumeSpecName: "kube-api-access-4bg4j") pod "d60b1250-426f-4e15-8265-b3e5324dff48" (UID: "d60b1250-426f-4e15-8265-b3e5324dff48"). InnerVolumeSpecName "kube-api-access-4bg4j". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:34:37.184410 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:37.184370 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d60b1250-426f-4e15-8265-b3e5324dff48-console-serving-cert\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:34:37.184410 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:37.184404 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4bg4j\" (UniqueName: \"kubernetes.io/projected/d60b1250-426f-4e15-8265-b3e5324dff48-kube-api-access-4bg4j\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:34:37.184410 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:37.184418 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d60b1250-426f-4e15-8265-b3e5324dff48-console-config\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:34:37.184692 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:37.184432 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d60b1250-426f-4e15-8265-b3e5324dff48-trusted-ca-bundle\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:34:37.184692 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:37.184446 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/d60b1250-426f-4e15-8265-b3e5324dff48-service-ca\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:34:37.184692 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:37.184458 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d60b1250-426f-4e15-8265-b3e5324dff48-oauth-serving-cert\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:34:37.184692 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:37.184470 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d60b1250-426f-4e15-8265-b3e5324dff48-console-oauth-config\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:34:37.808218 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:37.808190 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6d8b7c7569-c5wc7_d60b1250-426f-4e15-8265-b3e5324dff48/console/0.log" Apr 16 14:34:37.808702 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:37.808231 2576 generic.go:358] "Generic (PLEG): container finished" podID="d60b1250-426f-4e15-8265-b3e5324dff48" containerID="3704e00d241645c88557709c0bfd3e3210a59739c9910c2c3d350952b032a632" exitCode=2 Apr 16 14:34:37.808702 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:37.808266 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d8b7c7569-c5wc7" event={"ID":"d60b1250-426f-4e15-8265-b3e5324dff48","Type":"ContainerDied","Data":"3704e00d241645c88557709c0bfd3e3210a59739c9910c2c3d350952b032a632"} Apr 16 14:34:37.808702 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:37.808291 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6d8b7c7569-c5wc7" Apr 16 14:34:37.808702 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:37.808314 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d8b7c7569-c5wc7" event={"ID":"d60b1250-426f-4e15-8265-b3e5324dff48","Type":"ContainerDied","Data":"2661d3641ccc5e8507c6a19deaf38867ba401776736d2cb1d5ea90eef4803fe9"} Apr 16 14:34:37.808702 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:37.808336 2576 scope.go:117] "RemoveContainer" containerID="3704e00d241645c88557709c0bfd3e3210a59739c9910c2c3d350952b032a632" Apr 16 14:34:37.817017 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:37.817000 2576 scope.go:117] "RemoveContainer" containerID="3704e00d241645c88557709c0bfd3e3210a59739c9910c2c3d350952b032a632" Apr 16 14:34:37.817345 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:34:37.817326 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3704e00d241645c88557709c0bfd3e3210a59739c9910c2c3d350952b032a632\": container with ID starting with 3704e00d241645c88557709c0bfd3e3210a59739c9910c2c3d350952b032a632 not found: ID does not exist" containerID="3704e00d241645c88557709c0bfd3e3210a59739c9910c2c3d350952b032a632" Apr 16 14:34:37.817391 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:37.817353 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3704e00d241645c88557709c0bfd3e3210a59739c9910c2c3d350952b032a632"} err="failed to get container status \"3704e00d241645c88557709c0bfd3e3210a59739c9910c2c3d350952b032a632\": rpc error: code = NotFound desc = could not find container \"3704e00d241645c88557709c0bfd3e3210a59739c9910c2c3d350952b032a632\": container with ID starting with 3704e00d241645c88557709c0bfd3e3210a59739c9910c2c3d350952b032a632 not found: ID does not exist" Apr 16 14:34:37.831745 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:37.831716 2576 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6d8b7c7569-c5wc7"] Apr 16 14:34:37.836963 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:37.836928 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6d8b7c7569-c5wc7"] Apr 16 14:34:38.781724 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:38.781688 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d60b1250-426f-4e15-8265-b3e5324dff48" path="/var/lib/kubelet/pods/d60b1250-426f-4e15-8265-b3e5324dff48/volumes" Apr 16 14:34:44.642380 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:44.642343 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7qpv_312033f4-d50f-4d5d-a1ca-6e77e0428786/ovn-acl-logging/0.log" Apr 16 14:34:44.642901 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:44.642508 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7qpv_312033f4-d50f-4d5d-a1ca-6e77e0428786/ovn-acl-logging/0.log" Apr 16 14:34:44.646276 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:34:44.646256 2576 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 14:35:10.936292 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:10.936253 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6478cddcdb-dwk27"] Apr 16 14:35:10.938660 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:10.936587 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d60b1250-426f-4e15-8265-b3e5324dff48" containerName="console" Apr 16 14:35:10.938660 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:10.936599 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d60b1250-426f-4e15-8265-b3e5324dff48" containerName="console" Apr 16 14:35:10.938660 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:10.936653 2576 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="d60b1250-426f-4e15-8265-b3e5324dff48" containerName="console" Apr 16 14:35:10.939552 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:10.939519 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6478cddcdb-dwk27" Apr 16 14:35:10.949102 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:10.949069 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6478cddcdb-dwk27"] Apr 16 14:35:11.072366 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:11.072318 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78hsp\" (UniqueName: \"kubernetes.io/projected/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-kube-api-access-78hsp\") pod \"console-6478cddcdb-dwk27\" (UID: \"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8\") " pod="openshift-console/console-6478cddcdb-dwk27" Apr 16 14:35:11.072366 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:11.072366 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-console-serving-cert\") pod \"console-6478cddcdb-dwk27\" (UID: \"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8\") " pod="openshift-console/console-6478cddcdb-dwk27" Apr 16 14:35:11.072622 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:11.072389 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-console-config\") pod \"console-6478cddcdb-dwk27\" (UID: \"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8\") " pod="openshift-console/console-6478cddcdb-dwk27" Apr 16 14:35:11.072622 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:11.072503 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-service-ca\") pod \"console-6478cddcdb-dwk27\" (UID: \"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8\") " pod="openshift-console/console-6478cddcdb-dwk27" Apr 16 14:35:11.072622 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:11.072560 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-trusted-ca-bundle\") pod \"console-6478cddcdb-dwk27\" (UID: \"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8\") " pod="openshift-console/console-6478cddcdb-dwk27" Apr 16 14:35:11.072622 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:11.072604 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-console-oauth-config\") pod \"console-6478cddcdb-dwk27\" (UID: \"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8\") " pod="openshift-console/console-6478cddcdb-dwk27" Apr 16 14:35:11.072622 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:11.072620 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-oauth-serving-cert\") pod \"console-6478cddcdb-dwk27\" (UID: \"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8\") " pod="openshift-console/console-6478cddcdb-dwk27" Apr 16 14:35:11.173142 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:11.173105 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78hsp\" (UniqueName: \"kubernetes.io/projected/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-kube-api-access-78hsp\") pod \"console-6478cddcdb-dwk27\" (UID: \"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8\") " pod="openshift-console/console-6478cddcdb-dwk27" Apr 16 14:35:11.173332 ip-10-0-128-173 kubenswrapper[2576]: 
I0416 14:35:11.173154 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-console-serving-cert\") pod \"console-6478cddcdb-dwk27\" (UID: \"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8\") " pod="openshift-console/console-6478cddcdb-dwk27" Apr 16 14:35:11.173332 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:11.173177 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-console-config\") pod \"console-6478cddcdb-dwk27\" (UID: \"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8\") " pod="openshift-console/console-6478cddcdb-dwk27" Apr 16 14:35:11.173332 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:11.173264 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-service-ca\") pod \"console-6478cddcdb-dwk27\" (UID: \"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8\") " pod="openshift-console/console-6478cddcdb-dwk27" Apr 16 14:35:11.173332 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:11.173292 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-trusted-ca-bundle\") pod \"console-6478cddcdb-dwk27\" (UID: \"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8\") " pod="openshift-console/console-6478cddcdb-dwk27" Apr 16 14:35:11.173570 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:11.173333 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-console-oauth-config\") pod \"console-6478cddcdb-dwk27\" (UID: \"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8\") " pod="openshift-console/console-6478cddcdb-dwk27" Apr 
16 14:35:11.173570 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:11.173356 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-oauth-serving-cert\") pod \"console-6478cddcdb-dwk27\" (UID: \"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8\") " pod="openshift-console/console-6478cddcdb-dwk27" Apr 16 14:35:11.174017 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:11.173983 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-console-config\") pod \"console-6478cddcdb-dwk27\" (UID: \"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8\") " pod="openshift-console/console-6478cddcdb-dwk27" Apr 16 14:35:11.174125 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:11.174053 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-service-ca\") pod \"console-6478cddcdb-dwk27\" (UID: \"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8\") " pod="openshift-console/console-6478cddcdb-dwk27" Apr 16 14:35:11.174125 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:11.174055 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-oauth-serving-cert\") pod \"console-6478cddcdb-dwk27\" (UID: \"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8\") " pod="openshift-console/console-6478cddcdb-dwk27" Apr 16 14:35:11.174290 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:11.174260 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-trusted-ca-bundle\") pod \"console-6478cddcdb-dwk27\" (UID: \"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8\") " 
pod="openshift-console/console-6478cddcdb-dwk27" Apr 16 14:35:11.175891 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:11.175867 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-console-serving-cert\") pod \"console-6478cddcdb-dwk27\" (UID: \"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8\") " pod="openshift-console/console-6478cddcdb-dwk27" Apr 16 14:35:11.176167 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:11.176149 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-console-oauth-config\") pod \"console-6478cddcdb-dwk27\" (UID: \"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8\") " pod="openshift-console/console-6478cddcdb-dwk27" Apr 16 14:35:11.182472 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:11.182448 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-78hsp\" (UniqueName: \"kubernetes.io/projected/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-kube-api-access-78hsp\") pod \"console-6478cddcdb-dwk27\" (UID: \"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8\") " pod="openshift-console/console-6478cddcdb-dwk27" Apr 16 14:35:11.251479 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:11.251388 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6478cddcdb-dwk27" Apr 16 14:35:11.376959 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:11.376934 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6478cddcdb-dwk27"] Apr 16 14:35:11.379005 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:35:11.378977 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45d27a9c_0fdf_4aba_a99b_14f1baf7c2e8.slice/crio-380fcdfd672f11635eb050306de282b4bc5a0066a4f050e54a801d8c17153003 WatchSource:0}: Error finding container 380fcdfd672f11635eb050306de282b4bc5a0066a4f050e54a801d8c17153003: Status 404 returned error can't find the container with id 380fcdfd672f11635eb050306de282b4bc5a0066a4f050e54a801d8c17153003 Apr 16 14:35:11.380786 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:11.380769 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:35:11.915885 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:11.915843 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6478cddcdb-dwk27" event={"ID":"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8","Type":"ContainerStarted","Data":"0e81418e0410647c5481bf66519d3dc4d81a920ae0003ddf5d0fc926d49984ee"} Apr 16 14:35:11.915885 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:11.915885 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6478cddcdb-dwk27" event={"ID":"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8","Type":"ContainerStarted","Data":"380fcdfd672f11635eb050306de282b4bc5a0066a4f050e54a801d8c17153003"} Apr 16 14:35:11.935271 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:11.935215 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6478cddcdb-dwk27" podStartSLOduration=1.935198312 podStartE2EDuration="1.935198312s" podCreationTimestamp="2026-04-16 14:35:10 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:35:11.934111015 +0000 UTC m=+327.760821433" watchObservedRunningTime="2026-04-16 14:35:11.935198312 +0000 UTC m=+327.761908729" Apr 16 14:35:21.252214 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:21.252170 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6478cddcdb-dwk27" Apr 16 14:35:21.252214 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:21.252216 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6478cddcdb-dwk27" Apr 16 14:35:21.257181 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:21.257156 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6478cddcdb-dwk27" Apr 16 14:35:21.949017 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:21.948988 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6478cddcdb-dwk27" Apr 16 14:35:22.000799 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:22.000760 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7b8dd7b4f4-sntz7"] Apr 16 14:35:47.021518 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:47.021406 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7b8dd7b4f4-sntz7" podUID="655b7dd2-ab87-497f-bec0-0861a5df5874" containerName="console" containerID="cri-o://d709737c8f5094de24cc00e0b7a896b4090cf3a8bc53687e37721326fead71b0" gracePeriod=15 Apr 16 14:35:47.265644 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:47.265615 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b8dd7b4f4-sntz7_655b7dd2-ab87-497f-bec0-0861a5df5874/console/0.log" Apr 16 14:35:47.265806 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:47.265694 2576 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b8dd7b4f4-sntz7" Apr 16 14:35:47.378695 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:47.378668 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/655b7dd2-ab87-497f-bec0-0861a5df5874-oauth-serving-cert\") pod \"655b7dd2-ab87-497f-bec0-0861a5df5874\" (UID: \"655b7dd2-ab87-497f-bec0-0861a5df5874\") " Apr 16 14:35:47.378876 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:47.378707 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/655b7dd2-ab87-497f-bec0-0861a5df5874-trusted-ca-bundle\") pod \"655b7dd2-ab87-497f-bec0-0861a5df5874\" (UID: \"655b7dd2-ab87-497f-bec0-0861a5df5874\") " Apr 16 14:35:47.378876 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:47.378819 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/655b7dd2-ab87-497f-bec0-0861a5df5874-service-ca\") pod \"655b7dd2-ab87-497f-bec0-0861a5df5874\" (UID: \"655b7dd2-ab87-497f-bec0-0861a5df5874\") " Apr 16 14:35:47.378876 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:47.378860 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/655b7dd2-ab87-497f-bec0-0861a5df5874-console-config\") pod \"655b7dd2-ab87-497f-bec0-0861a5df5874\" (UID: \"655b7dd2-ab87-497f-bec0-0861a5df5874\") " Apr 16 14:35:47.379047 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:47.378908 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/655b7dd2-ab87-497f-bec0-0861a5df5874-console-oauth-config\") pod \"655b7dd2-ab87-497f-bec0-0861a5df5874\" (UID: 
\"655b7dd2-ab87-497f-bec0-0861a5df5874\") " Apr 16 14:35:47.379047 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:47.378949 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/655b7dd2-ab87-497f-bec0-0861a5df5874-console-serving-cert\") pod \"655b7dd2-ab87-497f-bec0-0861a5df5874\" (UID: \"655b7dd2-ab87-497f-bec0-0861a5df5874\") " Apr 16 14:35:47.379047 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:47.378978 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vkkp\" (UniqueName: \"kubernetes.io/projected/655b7dd2-ab87-497f-bec0-0861a5df5874-kube-api-access-7vkkp\") pod \"655b7dd2-ab87-497f-bec0-0861a5df5874\" (UID: \"655b7dd2-ab87-497f-bec0-0861a5df5874\") " Apr 16 14:35:47.379194 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:47.379161 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/655b7dd2-ab87-497f-bec0-0861a5df5874-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "655b7dd2-ab87-497f-bec0-0861a5df5874" (UID: "655b7dd2-ab87-497f-bec0-0861a5df5874"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:35:47.379253 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:47.379185 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/655b7dd2-ab87-497f-bec0-0861a5df5874-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "655b7dd2-ab87-497f-bec0-0861a5df5874" (UID: "655b7dd2-ab87-497f-bec0-0861a5df5874"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:35:47.379253 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:47.379237 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/655b7dd2-ab87-497f-bec0-0861a5df5874-oauth-serving-cert\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:35:47.379361 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:47.379309 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/655b7dd2-ab87-497f-bec0-0861a5df5874-service-ca" (OuterVolumeSpecName: "service-ca") pod "655b7dd2-ab87-497f-bec0-0861a5df5874" (UID: "655b7dd2-ab87-497f-bec0-0861a5df5874"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:35:47.379361 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:47.379314 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/655b7dd2-ab87-497f-bec0-0861a5df5874-console-config" (OuterVolumeSpecName: "console-config") pod "655b7dd2-ab87-497f-bec0-0861a5df5874" (UID: "655b7dd2-ab87-497f-bec0-0861a5df5874"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:35:47.381138 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:47.381118 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/655b7dd2-ab87-497f-bec0-0861a5df5874-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "655b7dd2-ab87-497f-bec0-0861a5df5874" (UID: "655b7dd2-ab87-497f-bec0-0861a5df5874"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:35:47.381263 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:47.381246 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/655b7dd2-ab87-497f-bec0-0861a5df5874-kube-api-access-7vkkp" (OuterVolumeSpecName: "kube-api-access-7vkkp") pod "655b7dd2-ab87-497f-bec0-0861a5df5874" (UID: "655b7dd2-ab87-497f-bec0-0861a5df5874"). InnerVolumeSpecName "kube-api-access-7vkkp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:35:47.381321 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:47.381244 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/655b7dd2-ab87-497f-bec0-0861a5df5874-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "655b7dd2-ab87-497f-bec0-0861a5df5874" (UID: "655b7dd2-ab87-497f-bec0-0861a5df5874"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:35:47.480065 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:47.480022 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/655b7dd2-ab87-497f-bec0-0861a5df5874-trusted-ca-bundle\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:35:47.480065 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:47.480059 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/655b7dd2-ab87-497f-bec0-0861a5df5874-service-ca\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:35:47.480065 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:47.480068 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/655b7dd2-ab87-497f-bec0-0861a5df5874-console-config\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:35:47.480065 ip-10-0-128-173 
kubenswrapper[2576]: I0416 14:35:47.480077 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/655b7dd2-ab87-497f-bec0-0861a5df5874-console-oauth-config\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:35:47.480312 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:47.480086 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/655b7dd2-ab87-497f-bec0-0861a5df5874-console-serving-cert\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:35:47.480312 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:47.480095 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7vkkp\" (UniqueName: \"kubernetes.io/projected/655b7dd2-ab87-497f-bec0-0861a5df5874-kube-api-access-7vkkp\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:35:48.021291 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:48.021264 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b8dd7b4f4-sntz7_655b7dd2-ab87-497f-bec0-0861a5df5874/console/0.log" Apr 16 14:35:48.021475 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:48.021304 2576 generic.go:358] "Generic (PLEG): container finished" podID="655b7dd2-ab87-497f-bec0-0861a5df5874" containerID="d709737c8f5094de24cc00e0b7a896b4090cf3a8bc53687e37721326fead71b0" exitCode=2 Apr 16 14:35:48.021475 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:48.021343 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b8dd7b4f4-sntz7" event={"ID":"655b7dd2-ab87-497f-bec0-0861a5df5874","Type":"ContainerDied","Data":"d709737c8f5094de24cc00e0b7a896b4090cf3a8bc53687e37721326fead71b0"} Apr 16 14:35:48.021475 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:48.021365 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b8dd7b4f4-sntz7" 
event={"ID":"655b7dd2-ab87-497f-bec0-0861a5df5874","Type":"ContainerDied","Data":"e0e5e65eacc55bfd4e7e2671533afdb95e86f2248bb11d5d8a3ca8e317352e23"} Apr 16 14:35:48.021475 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:48.021379 2576 scope.go:117] "RemoveContainer" containerID="d709737c8f5094de24cc00e0b7a896b4090cf3a8bc53687e37721326fead71b0" Apr 16 14:35:48.021475 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:48.021376 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b8dd7b4f4-sntz7" Apr 16 14:35:48.029485 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:48.029463 2576 scope.go:117] "RemoveContainer" containerID="d709737c8f5094de24cc00e0b7a896b4090cf3a8bc53687e37721326fead71b0" Apr 16 14:35:48.029794 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:35:48.029775 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d709737c8f5094de24cc00e0b7a896b4090cf3a8bc53687e37721326fead71b0\": container with ID starting with d709737c8f5094de24cc00e0b7a896b4090cf3a8bc53687e37721326fead71b0 not found: ID does not exist" containerID="d709737c8f5094de24cc00e0b7a896b4090cf3a8bc53687e37721326fead71b0" Apr 16 14:35:48.029859 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:48.029803 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d709737c8f5094de24cc00e0b7a896b4090cf3a8bc53687e37721326fead71b0"} err="failed to get container status \"d709737c8f5094de24cc00e0b7a896b4090cf3a8bc53687e37721326fead71b0\": rpc error: code = NotFound desc = could not find container \"d709737c8f5094de24cc00e0b7a896b4090cf3a8bc53687e37721326fead71b0\": container with ID starting with d709737c8f5094de24cc00e0b7a896b4090cf3a8bc53687e37721326fead71b0 not found: ID does not exist" Apr 16 14:35:48.043415 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:48.043381 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-console/console-7b8dd7b4f4-sntz7"] Apr 16 14:35:48.048341 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:48.048312 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7b8dd7b4f4-sntz7"] Apr 16 14:35:48.781353 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:35:48.781314 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="655b7dd2-ab87-497f-bec0-0861a5df5874" path="/var/lib/kubelet/pods/655b7dd2-ab87-497f-bec0-0861a5df5874/volumes" Apr 16 14:37:09.778353 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:09.778266 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5f68gn"] Apr 16 14:37:09.778840 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:09.778610 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="655b7dd2-ab87-497f-bec0-0861a5df5874" containerName="console" Apr 16 14:37:09.778840 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:09.778622 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="655b7dd2-ab87-497f-bec0-0861a5df5874" containerName="console" Apr 16 14:37:09.778840 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:09.778680 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="655b7dd2-ab87-497f-bec0-0861a5df5874" containerName="console" Apr 16 14:37:09.781690 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:09.781674 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5f68gn" Apr 16 14:37:09.784201 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:09.784180 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 14:37:09.784332 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:09.784187 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ftngz\"" Apr 16 14:37:09.785028 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:09.785012 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 14:37:09.791140 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:09.791111 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5f68gn"] Apr 16 14:37:09.868338 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:09.868297 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5eea32f2-f038-4325-91b3-c2116204c050-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5f68gn\" (UID: \"5eea32f2-f038-4325-91b3-c2116204c050\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5f68gn" Apr 16 14:37:09.868338 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:09.868347 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5eea32f2-f038-4325-91b3-c2116204c050-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5f68gn\" (UID: \"5eea32f2-f038-4325-91b3-c2116204c050\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5f68gn" Apr 16 14:37:09.868651 
ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:09.868488 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt87d\" (UniqueName: \"kubernetes.io/projected/5eea32f2-f038-4325-91b3-c2116204c050-kube-api-access-mt87d\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5f68gn\" (UID: \"5eea32f2-f038-4325-91b3-c2116204c050\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5f68gn" Apr 16 14:37:09.969600 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:09.969526 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mt87d\" (UniqueName: \"kubernetes.io/projected/5eea32f2-f038-4325-91b3-c2116204c050-kube-api-access-mt87d\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5f68gn\" (UID: \"5eea32f2-f038-4325-91b3-c2116204c050\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5f68gn" Apr 16 14:37:09.969803 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:09.969620 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5eea32f2-f038-4325-91b3-c2116204c050-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5f68gn\" (UID: \"5eea32f2-f038-4325-91b3-c2116204c050\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5f68gn" Apr 16 14:37:09.969803 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:09.969642 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5eea32f2-f038-4325-91b3-c2116204c050-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5f68gn\" (UID: \"5eea32f2-f038-4325-91b3-c2116204c050\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5f68gn" Apr 16 14:37:09.970032 ip-10-0-128-173 
kubenswrapper[2576]: I0416 14:37:09.970009 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5eea32f2-f038-4325-91b3-c2116204c050-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5f68gn\" (UID: \"5eea32f2-f038-4325-91b3-c2116204c050\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5f68gn" Apr 16 14:37:09.970082 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:09.970034 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5eea32f2-f038-4325-91b3-c2116204c050-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5f68gn\" (UID: \"5eea32f2-f038-4325-91b3-c2116204c050\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5f68gn" Apr 16 14:37:09.978813 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:09.978782 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt87d\" (UniqueName: \"kubernetes.io/projected/5eea32f2-f038-4325-91b3-c2116204c050-kube-api-access-mt87d\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5f68gn\" (UID: \"5eea32f2-f038-4325-91b3-c2116204c050\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5f68gn" Apr 16 14:37:10.092246 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:10.092155 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5f68gn" Apr 16 14:37:10.220893 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:10.220853 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5f68gn"] Apr 16 14:37:10.223990 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:37:10.223960 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5eea32f2_f038_4325_91b3_c2116204c050.slice/crio-82750d97d1dcb9cd08ecfb52a221e45f4dd1e78939318e610f405e2655aca87f WatchSource:0}: Error finding container 82750d97d1dcb9cd08ecfb52a221e45f4dd1e78939318e610f405e2655aca87f: Status 404 returned error can't find the container with id 82750d97d1dcb9cd08ecfb52a221e45f4dd1e78939318e610f405e2655aca87f Apr 16 14:37:10.260988 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:10.260943 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5f68gn" event={"ID":"5eea32f2-f038-4325-91b3-c2116204c050","Type":"ContainerStarted","Data":"82750d97d1dcb9cd08ecfb52a221e45f4dd1e78939318e610f405e2655aca87f"} Apr 16 14:37:15.281009 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:15.280906 2576 generic.go:358] "Generic (PLEG): container finished" podID="5eea32f2-f038-4325-91b3-c2116204c050" containerID="dc1a7d1739e3ce9ebac719f42c360ed8c55ae80535b130fe87eff413d35f7396" exitCode=0 Apr 16 14:37:15.281009 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:15.280997 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5f68gn" event={"ID":"5eea32f2-f038-4325-91b3-c2116204c050","Type":"ContainerDied","Data":"dc1a7d1739e3ce9ebac719f42c360ed8c55ae80535b130fe87eff413d35f7396"} Apr 16 14:37:18.293338 ip-10-0-128-173 kubenswrapper[2576]: 
I0416 14:37:18.293304 2576 generic.go:358] "Generic (PLEG): container finished" podID="5eea32f2-f038-4325-91b3-c2116204c050" containerID="edc966e793bdf20d4859d483aff71e92cfccb43e7d9203c48fdc5a98858bdbc5" exitCode=0 Apr 16 14:37:18.293764 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:18.293385 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5f68gn" event={"ID":"5eea32f2-f038-4325-91b3-c2116204c050","Type":"ContainerDied","Data":"edc966e793bdf20d4859d483aff71e92cfccb43e7d9203c48fdc5a98858bdbc5"} Apr 16 14:37:24.313307 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:24.313268 2576 generic.go:358] "Generic (PLEG): container finished" podID="5eea32f2-f038-4325-91b3-c2116204c050" containerID="d916d41d2a86946970f1bbdb7beb8b2f7dca997097f2b6df0e6864ab24135557" exitCode=0 Apr 16 14:37:24.313696 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:24.313323 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5f68gn" event={"ID":"5eea32f2-f038-4325-91b3-c2116204c050","Type":"ContainerDied","Data":"d916d41d2a86946970f1bbdb7beb8b2f7dca997097f2b6df0e6864ab24135557"} Apr 16 14:37:25.434311 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:25.434288 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5f68gn" Apr 16 14:37:25.509483 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:25.509446 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5eea32f2-f038-4325-91b3-c2116204c050-util\") pod \"5eea32f2-f038-4325-91b3-c2116204c050\" (UID: \"5eea32f2-f038-4325-91b3-c2116204c050\") " Apr 16 14:37:25.509676 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:25.509505 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt87d\" (UniqueName: \"kubernetes.io/projected/5eea32f2-f038-4325-91b3-c2116204c050-kube-api-access-mt87d\") pod \"5eea32f2-f038-4325-91b3-c2116204c050\" (UID: \"5eea32f2-f038-4325-91b3-c2116204c050\") " Apr 16 14:37:25.509676 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:25.509569 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5eea32f2-f038-4325-91b3-c2116204c050-bundle\") pod \"5eea32f2-f038-4325-91b3-c2116204c050\" (UID: \"5eea32f2-f038-4325-91b3-c2116204c050\") " Apr 16 14:37:25.510316 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:25.510287 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5eea32f2-f038-4325-91b3-c2116204c050-bundle" (OuterVolumeSpecName: "bundle") pod "5eea32f2-f038-4325-91b3-c2116204c050" (UID: "5eea32f2-f038-4325-91b3-c2116204c050"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:37:25.511769 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:25.511748 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5eea32f2-f038-4325-91b3-c2116204c050-kube-api-access-mt87d" (OuterVolumeSpecName: "kube-api-access-mt87d") pod "5eea32f2-f038-4325-91b3-c2116204c050" (UID: "5eea32f2-f038-4325-91b3-c2116204c050"). InnerVolumeSpecName "kube-api-access-mt87d". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:37:25.515367 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:25.515340 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5eea32f2-f038-4325-91b3-c2116204c050-util" (OuterVolumeSpecName: "util") pod "5eea32f2-f038-4325-91b3-c2116204c050" (UID: "5eea32f2-f038-4325-91b3-c2116204c050"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:37:25.611135 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:25.611088 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5eea32f2-f038-4325-91b3-c2116204c050-util\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:37:25.611135 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:25.611133 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mt87d\" (UniqueName: \"kubernetes.io/projected/5eea32f2-f038-4325-91b3-c2116204c050-kube-api-access-mt87d\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:37:25.611135 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:25.611145 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5eea32f2-f038-4325-91b3-c2116204c050-bundle\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:37:26.320159 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:26.320119 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5f68gn" event={"ID":"5eea32f2-f038-4325-91b3-c2116204c050","Type":"ContainerDied","Data":"82750d97d1dcb9cd08ecfb52a221e45f4dd1e78939318e610f405e2655aca87f"} Apr 16 14:37:26.320159 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:26.320154 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82750d97d1dcb9cd08ecfb52a221e45f4dd1e78939318e610f405e2655aca87f" Apr 16 14:37:26.320159 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:26.320152 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5f68gn" Apr 16 14:37:31.860554 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:31.857655 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-2pk7p"] Apr 16 14:37:31.860554 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:31.858356 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5eea32f2-f038-4325-91b3-c2116204c050" containerName="util" Apr 16 14:37:31.860554 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:31.858382 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eea32f2-f038-4325-91b3-c2116204c050" containerName="util" Apr 16 14:37:31.860554 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:31.858403 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5eea32f2-f038-4325-91b3-c2116204c050" containerName="extract" Apr 16 14:37:31.860554 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:31.858419 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eea32f2-f038-4325-91b3-c2116204c050" containerName="extract" Apr 16 14:37:31.860554 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:31.858437 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="5eea32f2-f038-4325-91b3-c2116204c050" containerName="pull"
Apr 16 14:37:31.860554 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:31.858446 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eea32f2-f038-4325-91b3-c2116204c050" containerName="pull"
Apr 16 14:37:31.860554 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:31.858607 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="5eea32f2-f038-4325-91b3-c2116204c050" containerName="extract"
Apr 16 14:37:31.864351 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:31.864326 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-2pk7p"
Apr 16 14:37:31.867163 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:31.867128 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 16 14:37:31.867361 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:31.867301 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-m4qvx\""
Apr 16 14:37:31.867454 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:31.867379 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 16 14:37:31.868465 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:31.868445 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-2pk7p"]
Apr 16 14:37:31.962604 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:31.962571 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb6tg\" (UniqueName: \"kubernetes.io/projected/23a07936-d863-45f0-b683-c39e98cff402-kube-api-access-xb6tg\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-2pk7p\" (UID: \"23a07936-d863-45f0-b683-c39e98cff402\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-2pk7p"
Apr 16 14:37:31.962604 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:31.962611 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/23a07936-d863-45f0-b683-c39e98cff402-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-2pk7p\" (UID: \"23a07936-d863-45f0-b683-c39e98cff402\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-2pk7p"
Apr 16 14:37:32.063300 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:32.063269 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xb6tg\" (UniqueName: \"kubernetes.io/projected/23a07936-d863-45f0-b683-c39e98cff402-kube-api-access-xb6tg\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-2pk7p\" (UID: \"23a07936-d863-45f0-b683-c39e98cff402\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-2pk7p"
Apr 16 14:37:32.063300 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:32.063307 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/23a07936-d863-45f0-b683-c39e98cff402-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-2pk7p\" (UID: \"23a07936-d863-45f0-b683-c39e98cff402\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-2pk7p"
Apr 16 14:37:32.063697 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:32.063681 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/23a07936-d863-45f0-b683-c39e98cff402-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-2pk7p\" (UID: \"23a07936-d863-45f0-b683-c39e98cff402\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-2pk7p"
Apr 16 14:37:32.077750 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:32.077717 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb6tg\" (UniqueName: \"kubernetes.io/projected/23a07936-d863-45f0-b683-c39e98cff402-kube-api-access-xb6tg\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-2pk7p\" (UID: \"23a07936-d863-45f0-b683-c39e98cff402\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-2pk7p"
Apr 16 14:37:32.174932 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:32.174838 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-2pk7p"
Apr 16 14:37:32.304446 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:32.304336 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-2pk7p"]
Apr 16 14:37:32.307586 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:37:32.307555 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23a07936_d863_45f0_b683_c39e98cff402.slice/crio-590530b7360db249d79f2589edc2bc25415a95050df9e878b9763a81835a3ad9 WatchSource:0}: Error finding container 590530b7360db249d79f2589edc2bc25415a95050df9e878b9763a81835a3ad9: Status 404 returned error can't find the container with id 590530b7360db249d79f2589edc2bc25415a95050df9e878b9763a81835a3ad9
Apr 16 14:37:32.339737 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:32.339705 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-2pk7p" event={"ID":"23a07936-d863-45f0-b683-c39e98cff402","Type":"ContainerStarted","Data":"590530b7360db249d79f2589edc2bc25415a95050df9e878b9763a81835a3ad9"}
Apr 16 14:37:35.350341 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:35.350298 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-2pk7p" event={"ID":"23a07936-d863-45f0-b683-c39e98cff402","Type":"ContainerStarted","Data":"5577b6835bdcabc249ce7ab358193ca4873fb6aa1d85a69916bfee2047b1e2ee"}
Apr 16 14:37:35.372493 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:35.372427 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-2pk7p" podStartSLOduration=2.377096116 podStartE2EDuration="4.372408193s" podCreationTimestamp="2026-04-16 14:37:31 +0000 UTC" firstStartedPulling="2026-04-16 14:37:32.310013736 +0000 UTC m=+468.136724132" lastFinishedPulling="2026-04-16 14:37:34.305325813 +0000 UTC m=+470.132036209" observedRunningTime="2026-04-16 14:37:35.37003389 +0000 UTC m=+471.196744307" watchObservedRunningTime="2026-04-16 14:37:35.372408193 +0000 UTC m=+471.199118612"
Apr 16 14:37:36.565766 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:36.565729 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkk6bj"]
Apr 16 14:37:36.569140 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:36.569121 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkk6bj"
Apr 16 14:37:36.572709 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:36.572683 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ftngz\""
Apr 16 14:37:36.572842 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:36.572728 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 14:37:36.572842 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:36.572743 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 14:37:36.584215 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:36.584190 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkk6bj"]
Apr 16 14:37:36.606547 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:36.606495 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6fe5b746-4355-43ca-b5a6-81b53c221c6a-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkk6bj\" (UID: \"6fe5b746-4355-43ca-b5a6-81b53c221c6a\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkk6bj"
Apr 16 14:37:36.606689 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:36.606572 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6fe5b746-4355-43ca-b5a6-81b53c221c6a-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkk6bj\" (UID: \"6fe5b746-4355-43ca-b5a6-81b53c221c6a\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkk6bj"
Apr 16 14:37:36.606689 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:36.606595 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq45l\" (UniqueName: \"kubernetes.io/projected/6fe5b746-4355-43ca-b5a6-81b53c221c6a-kube-api-access-jq45l\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkk6bj\" (UID: \"6fe5b746-4355-43ca-b5a6-81b53c221c6a\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkk6bj"
Apr 16 14:37:36.707789 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:36.707744 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6fe5b746-4355-43ca-b5a6-81b53c221c6a-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkk6bj\" (UID: \"6fe5b746-4355-43ca-b5a6-81b53c221c6a\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkk6bj"
Apr 16 14:37:36.707975 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:36.707911 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6fe5b746-4355-43ca-b5a6-81b53c221c6a-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkk6bj\" (UID: \"6fe5b746-4355-43ca-b5a6-81b53c221c6a\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkk6bj"
Apr 16 14:37:36.707975 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:36.707947 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jq45l\" (UniqueName: \"kubernetes.io/projected/6fe5b746-4355-43ca-b5a6-81b53c221c6a-kube-api-access-jq45l\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkk6bj\" (UID: \"6fe5b746-4355-43ca-b5a6-81b53c221c6a\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkk6bj"
Apr 16 14:37:36.708182 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:36.708162 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6fe5b746-4355-43ca-b5a6-81b53c221c6a-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkk6bj\" (UID: \"6fe5b746-4355-43ca-b5a6-81b53c221c6a\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkk6bj"
Apr 16 14:37:36.708267 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:36.708244 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6fe5b746-4355-43ca-b5a6-81b53c221c6a-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkk6bj\" (UID: \"6fe5b746-4355-43ca-b5a6-81b53c221c6a\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkk6bj"
Apr 16 14:37:36.717416 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:36.717390 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq45l\" (UniqueName: \"kubernetes.io/projected/6fe5b746-4355-43ca-b5a6-81b53c221c6a-kube-api-access-jq45l\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkk6bj\" (UID: \"6fe5b746-4355-43ca-b5a6-81b53c221c6a\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkk6bj"
Apr 16 14:37:36.878339 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:36.878303 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkk6bj"
Apr 16 14:37:37.005892 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:37.005855 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkk6bj"]
Apr 16 14:37:37.010162 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:37:37.010116 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fe5b746_4355_43ca_b5a6_81b53c221c6a.slice/crio-f9caabc366825803262d00149b4ab4c4d79056251ff07eeebe7046558d2b0b80 WatchSource:0}: Error finding container f9caabc366825803262d00149b4ab4c4d79056251ff07eeebe7046558d2b0b80: Status 404 returned error can't find the container with id f9caabc366825803262d00149b4ab4c4d79056251ff07eeebe7046558d2b0b80
Apr 16 14:37:37.360772 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:37.360720 2576 generic.go:358] "Generic (PLEG): container finished" podID="6fe5b746-4355-43ca-b5a6-81b53c221c6a" containerID="923391c355cc295848cd9ae6c05ab4371bc9020f7913eb64e70cbe818ac4380e" exitCode=0
Apr 16 14:37:37.360964 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:37.360803 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkk6bj" event={"ID":"6fe5b746-4355-43ca-b5a6-81b53c221c6a","Type":"ContainerDied","Data":"923391c355cc295848cd9ae6c05ab4371bc9020f7913eb64e70cbe818ac4380e"}
Apr 16 14:37:37.360964 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:37.360843 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkk6bj" event={"ID":"6fe5b746-4355-43ca-b5a6-81b53c221c6a","Type":"ContainerStarted","Data":"f9caabc366825803262d00149b4ab4c4d79056251ff07eeebe7046558d2b0b80"}
Apr 16 14:37:40.372674 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:40.372639 2576 generic.go:358] "Generic (PLEG): container finished" podID="6fe5b746-4355-43ca-b5a6-81b53c221c6a" containerID="e222c13d8f8e1394ddb1ce99e47604f94ccb444deeb1c5fe1e77c5cb2c19f8bf" exitCode=0
Apr 16 14:37:40.373124 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:40.372724 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkk6bj" event={"ID":"6fe5b746-4355-43ca-b5a6-81b53c221c6a","Type":"ContainerDied","Data":"e222c13d8f8e1394ddb1ce99e47604f94ccb444deeb1c5fe1e77c5cb2c19f8bf"}
Apr 16 14:37:41.378187 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:41.378151 2576 generic.go:358] "Generic (PLEG): container finished" podID="6fe5b746-4355-43ca-b5a6-81b53c221c6a" containerID="096eede7de2205694ec0b4540f50a0fa963d252e2fe842ed970b3092690f4a15" exitCode=0
Apr 16 14:37:41.378589 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:41.378229 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkk6bj" event={"ID":"6fe5b746-4355-43ca-b5a6-81b53c221c6a","Type":"ContainerDied","Data":"096eede7de2205694ec0b4540f50a0fa963d252e2fe842ed970b3092690f4a15"}
Apr 16 14:37:42.500028 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:42.500001 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkk6bj"
Apr 16 14:37:42.557713 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:42.557670 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jq45l\" (UniqueName: \"kubernetes.io/projected/6fe5b746-4355-43ca-b5a6-81b53c221c6a-kube-api-access-jq45l\") pod \"6fe5b746-4355-43ca-b5a6-81b53c221c6a\" (UID: \"6fe5b746-4355-43ca-b5a6-81b53c221c6a\") "
Apr 16 14:37:42.557903 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:42.557740 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6fe5b746-4355-43ca-b5a6-81b53c221c6a-bundle\") pod \"6fe5b746-4355-43ca-b5a6-81b53c221c6a\" (UID: \"6fe5b746-4355-43ca-b5a6-81b53c221c6a\") "
Apr 16 14:37:42.557903 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:42.557810 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6fe5b746-4355-43ca-b5a6-81b53c221c6a-util\") pod \"6fe5b746-4355-43ca-b5a6-81b53c221c6a\" (UID: \"6fe5b746-4355-43ca-b5a6-81b53c221c6a\") "
Apr 16 14:37:42.558147 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:42.558122 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fe5b746-4355-43ca-b5a6-81b53c221c6a-bundle" (OuterVolumeSpecName: "bundle") pod "6fe5b746-4355-43ca-b5a6-81b53c221c6a" (UID: "6fe5b746-4355-43ca-b5a6-81b53c221c6a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:37:42.559808 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:42.559785 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fe5b746-4355-43ca-b5a6-81b53c221c6a-kube-api-access-jq45l" (OuterVolumeSpecName: "kube-api-access-jq45l") pod "6fe5b746-4355-43ca-b5a6-81b53c221c6a" (UID: "6fe5b746-4355-43ca-b5a6-81b53c221c6a"). InnerVolumeSpecName "kube-api-access-jq45l". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:37:42.564388 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:42.564361 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fe5b746-4355-43ca-b5a6-81b53c221c6a-util" (OuterVolumeSpecName: "util") pod "6fe5b746-4355-43ca-b5a6-81b53c221c6a" (UID: "6fe5b746-4355-43ca-b5a6-81b53c221c6a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:37:42.658675 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:42.658560 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6fe5b746-4355-43ca-b5a6-81b53c221c6a-util\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\""
Apr 16 14:37:42.658675 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:42.658611 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jq45l\" (UniqueName: \"kubernetes.io/projected/6fe5b746-4355-43ca-b5a6-81b53c221c6a-kube-api-access-jq45l\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\""
Apr 16 14:37:42.658675 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:42.658623 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6fe5b746-4355-43ca-b5a6-81b53c221c6a-bundle\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\""
Apr 16 14:37:43.387118 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:43.387080 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkk6bj" event={"ID":"6fe5b746-4355-43ca-b5a6-81b53c221c6a","Type":"ContainerDied","Data":"f9caabc366825803262d00149b4ab4c4d79056251ff07eeebe7046558d2b0b80"}
Apr 16 14:37:43.387118 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:43.387115 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9caabc366825803262d00149b4ab4c4d79056251ff07eeebe7046558d2b0b80"
Apr 16 14:37:43.387339 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:43.387128 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fkk6bj"
Apr 16 14:37:47.578351 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:47.578316 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-45br4"]
Apr 16 14:37:47.578755 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:47.578690 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6fe5b746-4355-43ca-b5a6-81b53c221c6a" containerName="extract"
Apr 16 14:37:47.578755 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:47.578703 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe5b746-4355-43ca-b5a6-81b53c221c6a" containerName="extract"
Apr 16 14:37:47.578755 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:47.578718 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6fe5b746-4355-43ca-b5a6-81b53c221c6a" containerName="util"
Apr 16 14:37:47.578755 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:47.578724 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe5b746-4355-43ca-b5a6-81b53c221c6a" containerName="util"
Apr 16 14:37:47.578755 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:47.578734 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6fe5b746-4355-43ca-b5a6-81b53c221c6a" containerName="pull"
Apr 16 14:37:47.578755 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:47.578739 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe5b746-4355-43ca-b5a6-81b53c221c6a" containerName="pull"
Apr 16 14:37:47.578937 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:47.578804 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="6fe5b746-4355-43ca-b5a6-81b53c221c6a" containerName="extract"
Apr 16 14:37:47.584152 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:47.584135 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-45br4"
Apr 16 14:37:47.586884 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:47.586860 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 16 14:37:47.587671 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:47.587656 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-9z8d2\""
Apr 16 14:37:47.587749 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:47.587695 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 16 14:37:47.591196 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:47.591097 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-45br4"]
Apr 16 14:37:47.701255 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:47.701214 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7e8f785f-9735-4841-b782-7ff0ca25110a-tmp\") pod \"openshift-lws-operator-bfc7f696d-45br4\" (UID: \"7e8f785f-9735-4841-b782-7ff0ca25110a\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-45br4"
Apr 16 14:37:47.701255 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:47.701250 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9swj\" (UniqueName: \"kubernetes.io/projected/7e8f785f-9735-4841-b782-7ff0ca25110a-kube-api-access-w9swj\") pod \"openshift-lws-operator-bfc7f696d-45br4\" (UID: \"7e8f785f-9735-4841-b782-7ff0ca25110a\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-45br4"
Apr 16 14:37:47.802140 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:47.802102 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7e8f785f-9735-4841-b782-7ff0ca25110a-tmp\") pod \"openshift-lws-operator-bfc7f696d-45br4\" (UID: \"7e8f785f-9735-4841-b782-7ff0ca25110a\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-45br4"
Apr 16 14:37:47.802140 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:47.802142 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w9swj\" (UniqueName: \"kubernetes.io/projected/7e8f785f-9735-4841-b782-7ff0ca25110a-kube-api-access-w9swj\") pod \"openshift-lws-operator-bfc7f696d-45br4\" (UID: \"7e8f785f-9735-4841-b782-7ff0ca25110a\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-45br4"
Apr 16 14:37:47.802590 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:47.802569 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7e8f785f-9735-4841-b782-7ff0ca25110a-tmp\") pod \"openshift-lws-operator-bfc7f696d-45br4\" (UID: \"7e8f785f-9735-4841-b782-7ff0ca25110a\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-45br4"
Apr 16 14:37:47.811698 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:47.811669 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9swj\" (UniqueName: \"kubernetes.io/projected/7e8f785f-9735-4841-b782-7ff0ca25110a-kube-api-access-w9swj\") pod \"openshift-lws-operator-bfc7f696d-45br4\" (UID: \"7e8f785f-9735-4841-b782-7ff0ca25110a\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-45br4"
Apr 16 14:37:47.907631 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:47.907601 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-45br4"
Apr 16 14:37:48.034497 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:48.034462 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-45br4"]
Apr 16 14:37:48.037722 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:37:48.037693 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e8f785f_9735_4841_b782_7ff0ca25110a.slice/crio-586db22d27e0326b31d5724f12fb2379eedfa551d219da0e786ac12e843758e9 WatchSource:0}: Error finding container 586db22d27e0326b31d5724f12fb2379eedfa551d219da0e786ac12e843758e9: Status 404 returned error can't find the container with id 586db22d27e0326b31d5724f12fb2379eedfa551d219da0e786ac12e843758e9
Apr 16 14:37:48.405553 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:48.405490 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-45br4" event={"ID":"7e8f785f-9735-4841-b782-7ff0ca25110a","Type":"ContainerStarted","Data":"586db22d27e0326b31d5724f12fb2379eedfa551d219da0e786ac12e843758e9"}
Apr 16 14:37:50.415060 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:50.415012 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-45br4" event={"ID":"7e8f785f-9735-4841-b782-7ff0ca25110a","Type":"ContainerStarted","Data":"fb367911d0a32827221832437a3b653efcaecc79f891a0555e44d279a471d9e7"}
Apr 16 14:37:50.431940 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:50.431884 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-45br4" podStartSLOduration=1.441657646 podStartE2EDuration="3.43186729s" podCreationTimestamp="2026-04-16 14:37:47 +0000 UTC" firstStartedPulling="2026-04-16 14:37:48.039277826 +0000 UTC m=+483.865988222" lastFinishedPulling="2026-04-16 14:37:50.029487458 +0000 UTC m=+485.856197866" observedRunningTime="2026-04-16 14:37:50.431520291 +0000 UTC m=+486.258230708" watchObservedRunningTime="2026-04-16 14:37:50.43186729 +0000 UTC m=+486.258577707"
Apr 16 14:37:53.146958 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:53.146912 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59x6pd"]
Apr 16 14:37:53.150756 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:53.150731 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59x6pd"
Apr 16 14:37:53.153218 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:53.153194 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 14:37:53.153331 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:53.153194 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 14:37:53.153970 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:53.153954 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ftngz\""
Apr 16 14:37:53.158822 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:53.158801 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59x6pd"]
Apr 16 14:37:53.251472 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:53.251440 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0846db0a-e1c4-499b-bf67-bb24d6da069c-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59x6pd\" (UID: \"0846db0a-e1c4-499b-bf67-bb24d6da069c\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59x6pd"
Apr 16 14:37:53.251653 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:53.251487 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0846db0a-e1c4-499b-bf67-bb24d6da069c-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59x6pd\" (UID: \"0846db0a-e1c4-499b-bf67-bb24d6da069c\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59x6pd"
Apr 16 14:37:53.251653 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:53.251507 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pdh9\" (UniqueName: \"kubernetes.io/projected/0846db0a-e1c4-499b-bf67-bb24d6da069c-kube-api-access-2pdh9\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59x6pd\" (UID: \"0846db0a-e1c4-499b-bf67-bb24d6da069c\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59x6pd"
Apr 16 14:37:53.352466 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:53.352428 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0846db0a-e1c4-499b-bf67-bb24d6da069c-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59x6pd\" (UID: \"0846db0a-e1c4-499b-bf67-bb24d6da069c\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59x6pd"
Apr 16 14:37:53.352688 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:53.352475 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0846db0a-e1c4-499b-bf67-bb24d6da069c-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59x6pd\" (UID: \"0846db0a-e1c4-499b-bf67-bb24d6da069c\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59x6pd"
Apr 16 14:37:53.352688 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:53.352494 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2pdh9\" (UniqueName: \"kubernetes.io/projected/0846db0a-e1c4-499b-bf67-bb24d6da069c-kube-api-access-2pdh9\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59x6pd\" (UID: \"0846db0a-e1c4-499b-bf67-bb24d6da069c\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59x6pd"
Apr 16 14:37:53.352872 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:53.352851 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0846db0a-e1c4-499b-bf67-bb24d6da069c-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59x6pd\" (UID: \"0846db0a-e1c4-499b-bf67-bb24d6da069c\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59x6pd"
Apr 16 14:37:53.352937 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:53.352892 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0846db0a-e1c4-499b-bf67-bb24d6da069c-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59x6pd\" (UID: \"0846db0a-e1c4-499b-bf67-bb24d6da069c\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59x6pd"
Apr 16 14:37:53.362797 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:53.362768 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pdh9\" (UniqueName: \"kubernetes.io/projected/0846db0a-e1c4-499b-bf67-bb24d6da069c-kube-api-access-2pdh9\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59x6pd\" (UID: \"0846db0a-e1c4-499b-bf67-bb24d6da069c\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59x6pd"
Apr 16 14:37:53.460804 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:53.460707 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59x6pd"
Apr 16 14:37:53.588033 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:53.588004 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59x6pd"]
Apr 16 14:37:53.590128 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:37:53.590095 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0846db0a_e1c4_499b_bf67_bb24d6da069c.slice/crio-a4c95b412a537872f3886119aa35fbd786b35330e210fdfdff2dec70117bf4d8 WatchSource:0}: Error finding container a4c95b412a537872f3886119aa35fbd786b35330e210fdfdff2dec70117bf4d8: Status 404 returned error can't find the container with id a4c95b412a537872f3886119aa35fbd786b35330e210fdfdff2dec70117bf4d8
Apr 16 14:37:54.430466 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:54.430428 2576 generic.go:358] "Generic (PLEG): container finished" podID="0846db0a-e1c4-499b-bf67-bb24d6da069c" containerID="f9dc13211b2ad9c9d39538e28432a238c619260cb744a45a0c0e27dfedbc02db" exitCode=0
Apr 16 14:37:54.430945 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:54.430523 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59x6pd" event={"ID":"0846db0a-e1c4-499b-bf67-bb24d6da069c","Type":"ContainerDied","Data":"f9dc13211b2ad9c9d39538e28432a238c619260cb744a45a0c0e27dfedbc02db"}
Apr 16 14:37:54.430945 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:54.430567 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59x6pd" event={"ID":"0846db0a-e1c4-499b-bf67-bb24d6da069c","Type":"ContainerStarted","Data":"a4c95b412a537872f3886119aa35fbd786b35330e210fdfdff2dec70117bf4d8"}
Apr 16 14:37:55.435254 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:55.435221 2576 generic.go:358] "Generic (PLEG): container finished" podID="0846db0a-e1c4-499b-bf67-bb24d6da069c" containerID="d4bebe97669893c5c664cfa6ecce1b90f1985bf6fd1d94703be21dada8e8cd87" exitCode=0
Apr 16 14:37:55.435647 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:55.435298 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59x6pd" event={"ID":"0846db0a-e1c4-499b-bf67-bb24d6da069c","Type":"ContainerDied","Data":"d4bebe97669893c5c664cfa6ecce1b90f1985bf6fd1d94703be21dada8e8cd87"}
Apr 16 14:37:56.440677 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:56.440636 2576 generic.go:358] "Generic (PLEG): container finished" podID="0846db0a-e1c4-499b-bf67-bb24d6da069c" containerID="91d16dd0b90237ca793f1d4a664810764a3c68fd9a3e4089a3742d0302281632" exitCode=0
Apr 16 14:37:56.441111 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:56.440710 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59x6pd" event={"ID":"0846db0a-e1c4-499b-bf67-bb24d6da069c","Type":"ContainerDied","Data":"91d16dd0b90237ca793f1d4a664810764a3c68fd9a3e4089a3742d0302281632"}
Apr 16 14:37:57.565158 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:57.565134 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59x6pd"
Apr 16 14:37:57.699508 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:57.699411 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pdh9\" (UniqueName: \"kubernetes.io/projected/0846db0a-e1c4-499b-bf67-bb24d6da069c-kube-api-access-2pdh9\") pod \"0846db0a-e1c4-499b-bf67-bb24d6da069c\" (UID: \"0846db0a-e1c4-499b-bf67-bb24d6da069c\") "
Apr 16 14:37:57.699508 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:57.699501 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0846db0a-e1c4-499b-bf67-bb24d6da069c-util\") pod \"0846db0a-e1c4-499b-bf67-bb24d6da069c\" (UID: \"0846db0a-e1c4-499b-bf67-bb24d6da069c\") "
Apr 16 14:37:57.699777 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:57.699558 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0846db0a-e1c4-499b-bf67-bb24d6da069c-bundle\") pod \"0846db0a-e1c4-499b-bf67-bb24d6da069c\" (UID: \"0846db0a-e1c4-499b-bf67-bb24d6da069c\") "
Apr 16 14:37:57.700404 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:57.700358 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0846db0a-e1c4-499b-bf67-bb24d6da069c-bundle" (OuterVolumeSpecName: "bundle") pod "0846db0a-e1c4-499b-bf67-bb24d6da069c" (UID: "0846db0a-e1c4-499b-bf67-bb24d6da069c"). InnerVolumeSpecName "bundle".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:37:57.701610 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:57.701587 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0846db0a-e1c4-499b-bf67-bb24d6da069c-kube-api-access-2pdh9" (OuterVolumeSpecName: "kube-api-access-2pdh9") pod "0846db0a-e1c4-499b-bf67-bb24d6da069c" (UID: "0846db0a-e1c4-499b-bf67-bb24d6da069c"). InnerVolumeSpecName "kube-api-access-2pdh9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:37:57.705103 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:57.705082 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0846db0a-e1c4-499b-bf67-bb24d6da069c-util" (OuterVolumeSpecName: "util") pod "0846db0a-e1c4-499b-bf67-bb24d6da069c" (UID: "0846db0a-e1c4-499b-bf67-bb24d6da069c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:37:57.800544 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:57.800500 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2pdh9\" (UniqueName: \"kubernetes.io/projected/0846db0a-e1c4-499b-bf67-bb24d6da069c-kube-api-access-2pdh9\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:37:57.800721 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:57.800563 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0846db0a-e1c4-499b-bf67-bb24d6da069c-util\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:37:57.800721 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:57.800574 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0846db0a-e1c4-499b-bf67-bb24d6da069c-bundle\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:37:58.448879 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:58.448844 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59x6pd" event={"ID":"0846db0a-e1c4-499b-bf67-bb24d6da069c","Type":"ContainerDied","Data":"a4c95b412a537872f3886119aa35fbd786b35330e210fdfdff2dec70117bf4d8"} Apr 16 14:37:58.448879 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:58.448870 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c59x6pd" Apr 16 14:37:58.448879 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:37:58.448880 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4c95b412a537872f3886119aa35fbd786b35330e210fdfdff2dec70117bf4d8" Apr 16 14:38:09.955755 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:09.955719 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cjpqj"] Apr 16 14:38:09.956218 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:09.956036 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0846db0a-e1c4-499b-bf67-bb24d6da069c" containerName="util" Apr 16 14:38:09.956218 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:09.956047 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0846db0a-e1c4-499b-bf67-bb24d6da069c" containerName="util" Apr 16 14:38:09.956218 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:09.956068 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0846db0a-e1c4-499b-bf67-bb24d6da069c" containerName="pull" Apr 16 14:38:09.956218 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:09.956074 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0846db0a-e1c4-499b-bf67-bb24d6da069c" containerName="pull" Apr 16 14:38:09.956218 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:09.956080 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="0846db0a-e1c4-499b-bf67-bb24d6da069c" containerName="extract" Apr 16 14:38:09.956218 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:09.956085 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0846db0a-e1c4-499b-bf67-bb24d6da069c" containerName="extract" Apr 16 14:38:09.956218 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:09.956154 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0846db0a-e1c4-499b-bf67-bb24d6da069c" containerName="extract" Apr 16 14:38:09.962948 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:09.962927 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cjpqj" Apr 16 14:38:09.966893 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:09.966854 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 14:38:09.966893 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:09.966881 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 14:38:09.967093 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:09.966963 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ftngz\"" Apr 16 14:38:09.967782 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:09.967758 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cjpqj"] Apr 16 14:38:10.005486 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:10.005452 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwktp\" (UniqueName: \"kubernetes.io/projected/e51d0efa-25de-4411-a19a-1cb76b172153-kube-api-access-kwktp\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cjpqj\" (UID: 
\"e51d0efa-25de-4411-a19a-1cb76b172153\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cjpqj" Apr 16 14:38:10.005665 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:10.005493 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e51d0efa-25de-4411-a19a-1cb76b172153-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cjpqj\" (UID: \"e51d0efa-25de-4411-a19a-1cb76b172153\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cjpqj" Apr 16 14:38:10.005665 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:10.005517 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e51d0efa-25de-4411-a19a-1cb76b172153-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cjpqj\" (UID: \"e51d0efa-25de-4411-a19a-1cb76b172153\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cjpqj" Apr 16 14:38:10.106056 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:10.106016 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e51d0efa-25de-4411-a19a-1cb76b172153-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cjpqj\" (UID: \"e51d0efa-25de-4411-a19a-1cb76b172153\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cjpqj" Apr 16 14:38:10.106252 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:10.106141 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kwktp\" (UniqueName: \"kubernetes.io/projected/e51d0efa-25de-4411-a19a-1cb76b172153-kube-api-access-kwktp\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cjpqj\" (UID: 
\"e51d0efa-25de-4411-a19a-1cb76b172153\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cjpqj" Apr 16 14:38:10.106252 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:10.106180 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e51d0efa-25de-4411-a19a-1cb76b172153-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cjpqj\" (UID: \"e51d0efa-25de-4411-a19a-1cb76b172153\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cjpqj" Apr 16 14:38:10.106710 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:10.106573 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e51d0efa-25de-4411-a19a-1cb76b172153-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cjpqj\" (UID: \"e51d0efa-25de-4411-a19a-1cb76b172153\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cjpqj" Apr 16 14:38:10.106710 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:10.106573 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e51d0efa-25de-4411-a19a-1cb76b172153-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cjpqj\" (UID: \"e51d0efa-25de-4411-a19a-1cb76b172153\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cjpqj" Apr 16 14:38:10.115063 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:10.115035 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwktp\" (UniqueName: \"kubernetes.io/projected/e51d0efa-25de-4411-a19a-1cb76b172153-kube-api-access-kwktp\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cjpqj\" (UID: \"e51d0efa-25de-4411-a19a-1cb76b172153\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cjpqj" Apr 16 14:38:10.273134 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:10.273043 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cjpqj" Apr 16 14:38:10.398249 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:10.398215 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cjpqj"] Apr 16 14:38:10.401646 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:38:10.401605 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode51d0efa_25de_4411_a19a_1cb76b172153.slice/crio-beab87a1e4f39a7e665178ec7590b6cfa4307e4ab351cb1a13bf93ddf7c035c4 WatchSource:0}: Error finding container beab87a1e4f39a7e665178ec7590b6cfa4307e4ab351cb1a13bf93ddf7c035c4: Status 404 returned error can't find the container with id beab87a1e4f39a7e665178ec7590b6cfa4307e4ab351cb1a13bf93ddf7c035c4 Apr 16 14:38:10.491339 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:10.491304 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cjpqj" event={"ID":"e51d0efa-25de-4411-a19a-1cb76b172153","Type":"ContainerStarted","Data":"25c0f93e5f2e13cb84a8994334928209d265bcbd02700cb1b27f9bceedb09707"} Apr 16 14:38:10.491339 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:10.491344 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cjpqj" event={"ID":"e51d0efa-25de-4411-a19a-1cb76b172153","Type":"ContainerStarted","Data":"beab87a1e4f39a7e665178ec7590b6cfa4307e4ab351cb1a13bf93ddf7c035c4"} Apr 16 14:38:10.649907 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:10.649865 2576 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-j99db"] Apr 16 14:38:10.653299 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:10.653277 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-j99db" Apr 16 14:38:10.655985 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:10.655950 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 16 14:38:10.656143 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:10.655988 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 16 14:38:10.656143 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:10.656066 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-rhxrn\"" Apr 16 14:38:10.656271 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:10.656141 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 16 14:38:10.656271 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:10.656190 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 16 14:38:10.663430 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:10.663408 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-j99db"] Apr 16 14:38:10.710935 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:10.710894 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c50824db-3db3-4978-9e36-c8ee6ff1c646-apiservice-cert\") pod 
\"opendatahub-operator-controller-manager-6f7bb56bb6-j99db\" (UID: \"c50824db-3db3-4978-9e36-c8ee6ff1c646\") " pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-j99db" Apr 16 14:38:10.710935 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:10.710941 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdnc5\" (UniqueName: \"kubernetes.io/projected/c50824db-3db3-4978-9e36-c8ee6ff1c646-kube-api-access-qdnc5\") pod \"opendatahub-operator-controller-manager-6f7bb56bb6-j99db\" (UID: \"c50824db-3db3-4978-9e36-c8ee6ff1c646\") " pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-j99db" Apr 16 14:38:10.711145 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:10.711002 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c50824db-3db3-4978-9e36-c8ee6ff1c646-webhook-cert\") pod \"opendatahub-operator-controller-manager-6f7bb56bb6-j99db\" (UID: \"c50824db-3db3-4978-9e36-c8ee6ff1c646\") " pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-j99db" Apr 16 14:38:10.811670 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:10.811632 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c50824db-3db3-4978-9e36-c8ee6ff1c646-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6f7bb56bb6-j99db\" (UID: \"c50824db-3db3-4978-9e36-c8ee6ff1c646\") " pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-j99db" Apr 16 14:38:10.811849 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:10.811687 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qdnc5\" (UniqueName: \"kubernetes.io/projected/c50824db-3db3-4978-9e36-c8ee6ff1c646-kube-api-access-qdnc5\") pod \"opendatahub-operator-controller-manager-6f7bb56bb6-j99db\" (UID: 
\"c50824db-3db3-4978-9e36-c8ee6ff1c646\") " pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-j99db" Apr 16 14:38:10.811849 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:10.811719 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c50824db-3db3-4978-9e36-c8ee6ff1c646-webhook-cert\") pod \"opendatahub-operator-controller-manager-6f7bb56bb6-j99db\" (UID: \"c50824db-3db3-4978-9e36-c8ee6ff1c646\") " pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-j99db" Apr 16 14:38:10.814216 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:10.814185 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c50824db-3db3-4978-9e36-c8ee6ff1c646-webhook-cert\") pod \"opendatahub-operator-controller-manager-6f7bb56bb6-j99db\" (UID: \"c50824db-3db3-4978-9e36-c8ee6ff1c646\") " pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-j99db" Apr 16 14:38:10.814327 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:10.814214 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c50824db-3db3-4978-9e36-c8ee6ff1c646-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6f7bb56bb6-j99db\" (UID: \"c50824db-3db3-4978-9e36-c8ee6ff1c646\") " pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-j99db" Apr 16 14:38:10.821963 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:10.821940 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdnc5\" (UniqueName: \"kubernetes.io/projected/c50824db-3db3-4978-9e36-c8ee6ff1c646-kube-api-access-qdnc5\") pod \"opendatahub-operator-controller-manager-6f7bb56bb6-j99db\" (UID: \"c50824db-3db3-4978-9e36-c8ee6ff1c646\") " pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-j99db" Apr 16 14:38:10.965100 ip-10-0-128-173 
kubenswrapper[2576]: I0416 14:38:10.965014 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-j99db" Apr 16 14:38:11.127896 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:11.127862 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-j99db"] Apr 16 14:38:11.131070 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:38:11.131037 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc50824db_3db3_4978_9e36_c8ee6ff1c646.slice/crio-90832a5556e7c4a5e50da38173941adefa554ca30dfbee67d4c7d7cd4456b061 WatchSource:0}: Error finding container 90832a5556e7c4a5e50da38173941adefa554ca30dfbee67d4c7d7cd4456b061: Status 404 returned error can't find the container with id 90832a5556e7c4a5e50da38173941adefa554ca30dfbee67d4c7d7cd4456b061 Apr 16 14:38:11.495645 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:11.495610 2576 generic.go:358] "Generic (PLEG): container finished" podID="e51d0efa-25de-4411-a19a-1cb76b172153" containerID="25c0f93e5f2e13cb84a8994334928209d265bcbd02700cb1b27f9bceedb09707" exitCode=0 Apr 16 14:38:11.495846 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:11.495683 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cjpqj" event={"ID":"e51d0efa-25de-4411-a19a-1cb76b172153","Type":"ContainerDied","Data":"25c0f93e5f2e13cb84a8994334928209d265bcbd02700cb1b27f9bceedb09707"} Apr 16 14:38:11.497062 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:11.497040 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-j99db" event={"ID":"c50824db-3db3-4978-9e36-c8ee6ff1c646","Type":"ContainerStarted","Data":"90832a5556e7c4a5e50da38173941adefa554ca30dfbee67d4c7d7cd4456b061"} Apr 16 
14:38:12.502936 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:12.502845 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cjpqj" event={"ID":"e51d0efa-25de-4411-a19a-1cb76b172153","Type":"ContainerStarted","Data":"394aac63016ab0fa44c00f44cff3ccb49e25ed374a0f600fa31a603c2c661da3"} Apr 16 14:38:13.507994 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:13.507960 2576 generic.go:358] "Generic (PLEG): container finished" podID="e51d0efa-25de-4411-a19a-1cb76b172153" containerID="394aac63016ab0fa44c00f44cff3ccb49e25ed374a0f600fa31a603c2c661da3" exitCode=0 Apr 16 14:38:13.508401 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:13.508023 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cjpqj" event={"ID":"e51d0efa-25de-4411-a19a-1cb76b172153","Type":"ContainerDied","Data":"394aac63016ab0fa44c00f44cff3ccb49e25ed374a0f600fa31a603c2c661da3"} Apr 16 14:38:14.513676 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:14.513642 2576 generic.go:358] "Generic (PLEG): container finished" podID="e51d0efa-25de-4411-a19a-1cb76b172153" containerID="998322826dcec71cc3f6dade955cd6cc85c11b88c0f0a200eaaf06773709bd5d" exitCode=0 Apr 16 14:38:14.514123 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:14.513730 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cjpqj" event={"ID":"e51d0efa-25de-4411-a19a-1cb76b172153","Type":"ContainerDied","Data":"998322826dcec71cc3f6dade955cd6cc85c11b88c0f0a200eaaf06773709bd5d"} Apr 16 14:38:14.515097 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:14.515075 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-j99db" 
event={"ID":"c50824db-3db3-4978-9e36-c8ee6ff1c646","Type":"ContainerStarted","Data":"cfb1c628e190436572a269b882e39a5f26e26be2d16693669e9eb466b76e9a10"} Apr 16 14:38:14.515189 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:14.515151 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-j99db" Apr 16 14:38:14.550559 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:14.550474 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-j99db" podStartSLOduration=2.131213298 podStartE2EDuration="4.550458072s" podCreationTimestamp="2026-04-16 14:38:10 +0000 UTC" firstStartedPulling="2026-04-16 14:38:11.132946115 +0000 UTC m=+506.959656513" lastFinishedPulling="2026-04-16 14:38:13.552190882 +0000 UTC m=+509.378901287" observedRunningTime="2026-04-16 14:38:14.549008528 +0000 UTC m=+510.375718946" watchObservedRunningTime="2026-04-16 14:38:14.550458072 +0000 UTC m=+510.377168486" Apr 16 14:38:15.649850 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:15.649819 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cjpqj" Apr 16 14:38:15.753625 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:15.753589 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwktp\" (UniqueName: \"kubernetes.io/projected/e51d0efa-25de-4411-a19a-1cb76b172153-kube-api-access-kwktp\") pod \"e51d0efa-25de-4411-a19a-1cb76b172153\" (UID: \"e51d0efa-25de-4411-a19a-1cb76b172153\") " Apr 16 14:38:15.753796 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:15.753683 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e51d0efa-25de-4411-a19a-1cb76b172153-util\") pod \"e51d0efa-25de-4411-a19a-1cb76b172153\" (UID: \"e51d0efa-25de-4411-a19a-1cb76b172153\") " Apr 16 14:38:15.753796 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:15.753708 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e51d0efa-25de-4411-a19a-1cb76b172153-bundle\") pod \"e51d0efa-25de-4411-a19a-1cb76b172153\" (UID: \"e51d0efa-25de-4411-a19a-1cb76b172153\") " Apr 16 14:38:15.754618 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:15.754580 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e51d0efa-25de-4411-a19a-1cb76b172153-bundle" (OuterVolumeSpecName: "bundle") pod "e51d0efa-25de-4411-a19a-1cb76b172153" (UID: "e51d0efa-25de-4411-a19a-1cb76b172153"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:38:15.755687 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:15.755661 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e51d0efa-25de-4411-a19a-1cb76b172153-kube-api-access-kwktp" (OuterVolumeSpecName: "kube-api-access-kwktp") pod "e51d0efa-25de-4411-a19a-1cb76b172153" (UID: "e51d0efa-25de-4411-a19a-1cb76b172153"). InnerVolumeSpecName "kube-api-access-kwktp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:38:15.760054 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:15.760025 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e51d0efa-25de-4411-a19a-1cb76b172153-util" (OuterVolumeSpecName: "util") pod "e51d0efa-25de-4411-a19a-1cb76b172153" (UID: "e51d0efa-25de-4411-a19a-1cb76b172153"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:38:15.854885 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:15.854856 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e51d0efa-25de-4411-a19a-1cb76b172153-bundle\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:38:15.854885 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:15.854886 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kwktp\" (UniqueName: \"kubernetes.io/projected/e51d0efa-25de-4411-a19a-1cb76b172153-kube-api-access-kwktp\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:38:15.855079 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:15.854903 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e51d0efa-25de-4411-a19a-1cb76b172153-util\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:38:16.526835 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:16.526799 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cjpqj" event={"ID":"e51d0efa-25de-4411-a19a-1cb76b172153","Type":"ContainerDied","Data":"beab87a1e4f39a7e665178ec7590b6cfa4307e4ab351cb1a13bf93ddf7c035c4"} Apr 16 14:38:16.526835 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:16.526836 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="beab87a1e4f39a7e665178ec7590b6cfa4307e4ab351cb1a13bf93ddf7c035c4" Apr 16 14:38:16.527054 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:16.526866 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9cjpqj" Apr 16 14:38:25.521699 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:25.521668 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6f7bb56bb6-j99db" Apr 16 14:38:29.392489 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:29.392444 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-6468c4fc75-pjxlq"] Apr 16 14:38:29.393000 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:29.392974 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e51d0efa-25de-4411-a19a-1cb76b172153" containerName="util" Apr 16 14:38:29.393000 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:29.392994 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e51d0efa-25de-4411-a19a-1cb76b172153" containerName="util" Apr 16 14:38:29.393123 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:29.393014 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e51d0efa-25de-4411-a19a-1cb76b172153" containerName="pull" Apr 16 14:38:29.393123 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:29.393022 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e51d0efa-25de-4411-a19a-1cb76b172153" containerName="pull" Apr 16 14:38:29.393123 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:29.393032 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e51d0efa-25de-4411-a19a-1cb76b172153" containerName="extract" Apr 16 14:38:29.393123 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:29.393042 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e51d0efa-25de-4411-a19a-1cb76b172153" containerName="extract" Apr 16 14:38:29.393304 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:29.393144 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e51d0efa-25de-4411-a19a-1cb76b172153" containerName="extract" Apr 16 14:38:29.397923 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:29.397899 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-6468c4fc75-pjxlq" Apr 16 14:38:29.401063 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:29.401037 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 16 14:38:29.401204 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:29.401063 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-5mw66\"" Apr 16 14:38:29.401204 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:29.401070 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 16 14:38:29.404338 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:29.404312 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-6468c4fc75-pjxlq"] Apr 16 14:38:29.468234 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:29.468197 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d568b9f7-2358-4b4c-8148-a6a3380124ab-tls-certs\") pod \"kube-auth-proxy-6468c4fc75-pjxlq\" (UID: \"d568b9f7-2358-4b4c-8148-a6a3380124ab\") " pod="openshift-ingress/kube-auth-proxy-6468c4fc75-pjxlq" Apr 16 14:38:29.468439 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:29.468330 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d568b9f7-2358-4b4c-8148-a6a3380124ab-tmp\") pod \"kube-auth-proxy-6468c4fc75-pjxlq\" (UID: \"d568b9f7-2358-4b4c-8148-a6a3380124ab\") " pod="openshift-ingress/kube-auth-proxy-6468c4fc75-pjxlq" Apr 16 14:38:29.468439 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:29.468397 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wwcz\" (UniqueName: \"kubernetes.io/projected/d568b9f7-2358-4b4c-8148-a6a3380124ab-kube-api-access-2wwcz\") pod \"kube-auth-proxy-6468c4fc75-pjxlq\" (UID: \"d568b9f7-2358-4b4c-8148-a6a3380124ab\") " pod="openshift-ingress/kube-auth-proxy-6468c4fc75-pjxlq" Apr 16 14:38:29.569647 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:29.569608 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d568b9f7-2358-4b4c-8148-a6a3380124ab-tmp\") pod \"kube-auth-proxy-6468c4fc75-pjxlq\" (UID: \"d568b9f7-2358-4b4c-8148-a6a3380124ab\") " pod="openshift-ingress/kube-auth-proxy-6468c4fc75-pjxlq" Apr 16 14:38:29.569647 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:29.569654 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2wwcz\" (UniqueName: \"kubernetes.io/projected/d568b9f7-2358-4b4c-8148-a6a3380124ab-kube-api-access-2wwcz\") pod \"kube-auth-proxy-6468c4fc75-pjxlq\" (UID: \"d568b9f7-2358-4b4c-8148-a6a3380124ab\") " pod="openshift-ingress/kube-auth-proxy-6468c4fc75-pjxlq" Apr 16 14:38:29.569927 ip-10-0-128-173 kubenswrapper[2576]: 
I0416 14:38:29.569681 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d568b9f7-2358-4b4c-8148-a6a3380124ab-tls-certs\") pod \"kube-auth-proxy-6468c4fc75-pjxlq\" (UID: \"d568b9f7-2358-4b4c-8148-a6a3380124ab\") " pod="openshift-ingress/kube-auth-proxy-6468c4fc75-pjxlq" Apr 16 14:38:29.572214 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:29.572187 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d568b9f7-2358-4b4c-8148-a6a3380124ab-tmp\") pod \"kube-auth-proxy-6468c4fc75-pjxlq\" (UID: \"d568b9f7-2358-4b4c-8148-a6a3380124ab\") " pod="openshift-ingress/kube-auth-proxy-6468c4fc75-pjxlq" Apr 16 14:38:29.572364 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:29.572343 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d568b9f7-2358-4b4c-8148-a6a3380124ab-tls-certs\") pod \"kube-auth-proxy-6468c4fc75-pjxlq\" (UID: \"d568b9f7-2358-4b4c-8148-a6a3380124ab\") " pod="openshift-ingress/kube-auth-proxy-6468c4fc75-pjxlq" Apr 16 14:38:29.579281 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:29.579253 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wwcz\" (UniqueName: \"kubernetes.io/projected/d568b9f7-2358-4b4c-8148-a6a3380124ab-kube-api-access-2wwcz\") pod \"kube-auth-proxy-6468c4fc75-pjxlq\" (UID: \"d568b9f7-2358-4b4c-8148-a6a3380124ab\") " pod="openshift-ingress/kube-auth-proxy-6468c4fc75-pjxlq" Apr 16 14:38:29.709312 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:29.709226 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-6468c4fc75-pjxlq" Apr 16 14:38:29.838499 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:29.838470 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-6468c4fc75-pjxlq"] Apr 16 14:38:29.841172 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:38:29.841142 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd568b9f7_2358_4b4c_8148_a6a3380124ab.slice/crio-81ee28da7899151b9ea600b1aa2067bb6322ae3710a6218feaf828b8f06a4789 WatchSource:0}: Error finding container 81ee28da7899151b9ea600b1aa2067bb6322ae3710a6218feaf828b8f06a4789: Status 404 returned error can't find the container with id 81ee28da7899151b9ea600b1aa2067bb6322ae3710a6218feaf828b8f06a4789 Apr 16 14:38:30.578762 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:30.578711 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-6468c4fc75-pjxlq" event={"ID":"d568b9f7-2358-4b4c-8148-a6a3380124ab","Type":"ContainerStarted","Data":"81ee28da7899151b9ea600b1aa2067bb6322ae3710a6218feaf828b8f06a4789"} Apr 16 14:38:31.361357 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:31.361313 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356zh96"] Apr 16 14:38:31.371065 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:31.371031 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356zh96" Apr 16 14:38:31.376034 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:31.376002 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 14:38:31.376209 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:31.376122 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ftngz\"" Apr 16 14:38:31.380798 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:31.380771 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 14:38:31.388269 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:31.388216 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356zh96"] Apr 16 14:38:31.478439 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:31.478400 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-6jdtr"] Apr 16 14:38:31.483575 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:31.483522 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-6jdtr" Apr 16 14:38:31.486553 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:31.486396 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 16 14:38:31.486553 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:31.486414 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-c6tj9\"" Apr 16 14:38:31.489496 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:31.489343 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/139ce253-6440-4983-afbd-1e1507899439-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356zh96\" (UID: \"139ce253-6440-4983-afbd-1e1507899439\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356zh96" Apr 16 14:38:31.489496 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:31.489391 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/139ce253-6440-4983-afbd-1e1507899439-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356zh96\" (UID: \"139ce253-6440-4983-afbd-1e1507899439\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356zh96" Apr 16 14:38:31.489717 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:31.489563 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsrxc\" (UniqueName: \"kubernetes.io/projected/139ce253-6440-4983-afbd-1e1507899439-kube-api-access-jsrxc\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356zh96\" (UID: \"139ce253-6440-4983-afbd-1e1507899439\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356zh96" Apr 16 14:38:31.492215 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:31.491920 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-6jdtr"] Apr 16 14:38:31.590715 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:31.590674 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/139ce253-6440-4983-afbd-1e1507899439-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356zh96\" (UID: \"139ce253-6440-4983-afbd-1e1507899439\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356zh96" Apr 16 14:38:31.591184 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:31.590730 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/139ce253-6440-4983-afbd-1e1507899439-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356zh96\" (UID: \"139ce253-6440-4983-afbd-1e1507899439\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356zh96" Apr 16 14:38:31.591184 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:31.590816 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a2890cc-c938-4ec0-ac01-7eea814cd68f-cert\") pod \"odh-model-controller-858dbf95b8-6jdtr\" (UID: \"8a2890cc-c938-4ec0-ac01-7eea814cd68f\") " pod="opendatahub/odh-model-controller-858dbf95b8-6jdtr" Apr 16 14:38:31.591184 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:31.590862 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jsrxc\" (UniqueName: \"kubernetes.io/projected/139ce253-6440-4983-afbd-1e1507899439-kube-api-access-jsrxc\") pod 
\"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356zh96\" (UID: \"139ce253-6440-4983-afbd-1e1507899439\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356zh96" Apr 16 14:38:31.591184 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:31.590893 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zddfs\" (UniqueName: \"kubernetes.io/projected/8a2890cc-c938-4ec0-ac01-7eea814cd68f-kube-api-access-zddfs\") pod \"odh-model-controller-858dbf95b8-6jdtr\" (UID: \"8a2890cc-c938-4ec0-ac01-7eea814cd68f\") " pod="opendatahub/odh-model-controller-858dbf95b8-6jdtr" Apr 16 14:38:31.591184 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:31.591162 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/139ce253-6440-4983-afbd-1e1507899439-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356zh96\" (UID: \"139ce253-6440-4983-afbd-1e1507899439\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356zh96" Apr 16 14:38:31.591433 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:31.591206 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/139ce253-6440-4983-afbd-1e1507899439-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356zh96\" (UID: \"139ce253-6440-4983-afbd-1e1507899439\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356zh96" Apr 16 14:38:31.601267 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:31.601235 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsrxc\" (UniqueName: \"kubernetes.io/projected/139ce253-6440-4983-afbd-1e1507899439-kube-api-access-jsrxc\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356zh96\" (UID: 
\"139ce253-6440-4983-afbd-1e1507899439\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356zh96" Apr 16 14:38:31.684881 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:31.684798 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356zh96" Apr 16 14:38:31.692396 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:31.692367 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a2890cc-c938-4ec0-ac01-7eea814cd68f-cert\") pod \"odh-model-controller-858dbf95b8-6jdtr\" (UID: \"8a2890cc-c938-4ec0-ac01-7eea814cd68f\") " pod="opendatahub/odh-model-controller-858dbf95b8-6jdtr" Apr 16 14:38:31.692519 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:31.692409 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zddfs\" (UniqueName: \"kubernetes.io/projected/8a2890cc-c938-4ec0-ac01-7eea814cd68f-kube-api-access-zddfs\") pod \"odh-model-controller-858dbf95b8-6jdtr\" (UID: \"8a2890cc-c938-4ec0-ac01-7eea814cd68f\") " pod="opendatahub/odh-model-controller-858dbf95b8-6jdtr" Apr 16 14:38:31.692607 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:38:31.692527 2576 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 16 14:38:31.692675 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:38:31.692618 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a2890cc-c938-4ec0-ac01-7eea814cd68f-cert podName:8a2890cc-c938-4ec0-ac01-7eea814cd68f nodeName:}" failed. No retries permitted until 2026-04-16 14:38:32.192599943 +0000 UTC m=+528.019310341 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8a2890cc-c938-4ec0-ac01-7eea814cd68f-cert") pod "odh-model-controller-858dbf95b8-6jdtr" (UID: "8a2890cc-c938-4ec0-ac01-7eea814cd68f") : secret "odh-model-controller-webhook-cert" not found Apr 16 14:38:31.702129 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:31.702097 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zddfs\" (UniqueName: \"kubernetes.io/projected/8a2890cc-c938-4ec0-ac01-7eea814cd68f-kube-api-access-zddfs\") pod \"odh-model-controller-858dbf95b8-6jdtr\" (UID: \"8a2890cc-c938-4ec0-ac01-7eea814cd68f\") " pod="opendatahub/odh-model-controller-858dbf95b8-6jdtr" Apr 16 14:38:31.982008 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:31.981986 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356zh96"] Apr 16 14:38:31.984100 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:38:31.984067 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod139ce253_6440_4983_afbd_1e1507899439.slice/crio-225376f33409817bbfbf556275dda960a61c30ba0db6e5b8fe57b6facb424210 WatchSource:0}: Error finding container 225376f33409817bbfbf556275dda960a61c30ba0db6e5b8fe57b6facb424210: Status 404 returned error can't find the container with id 225376f33409817bbfbf556275dda960a61c30ba0db6e5b8fe57b6facb424210 Apr 16 14:38:32.197178 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:32.197101 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a2890cc-c938-4ec0-ac01-7eea814cd68f-cert\") pod \"odh-model-controller-858dbf95b8-6jdtr\" (UID: \"8a2890cc-c938-4ec0-ac01-7eea814cd68f\") " pod="opendatahub/odh-model-controller-858dbf95b8-6jdtr" Apr 16 14:38:32.199939 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:32.199885 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a2890cc-c938-4ec0-ac01-7eea814cd68f-cert\") pod \"odh-model-controller-858dbf95b8-6jdtr\" (UID: \"8a2890cc-c938-4ec0-ac01-7eea814cd68f\") " pod="opendatahub/odh-model-controller-858dbf95b8-6jdtr" Apr 16 14:38:32.397386 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:32.397344 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-6jdtr" Apr 16 14:38:32.589947 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:32.589884 2576 generic.go:358] "Generic (PLEG): container finished" podID="139ce253-6440-4983-afbd-1e1507899439" containerID="04e31a26d178e2006b32151be5ee5c897b32564407f4eaefc405949d95edd647" exitCode=0 Apr 16 14:38:32.589947 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:32.589937 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356zh96" event={"ID":"139ce253-6440-4983-afbd-1e1507899439","Type":"ContainerDied","Data":"04e31a26d178e2006b32151be5ee5c897b32564407f4eaefc405949d95edd647"} Apr 16 14:38:32.590204 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:32.589983 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356zh96" event={"ID":"139ce253-6440-4983-afbd-1e1507899439","Type":"ContainerStarted","Data":"225376f33409817bbfbf556275dda960a61c30ba0db6e5b8fe57b6facb424210"} Apr 16 14:38:32.736987 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:32.736874 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-76cd85c697-vngk7"] Apr 16 14:38:32.742800 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:32.742767 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-76cd85c697-vngk7" Apr 16 14:38:32.746439 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:32.746380 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 16 14:38:32.746798 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:32.746688 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-kd2cq\"" Apr 16 14:38:32.746931 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:32.746861 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 16 14:38:32.746931 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:32.746911 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 16 14:38:32.750964 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:32.750580 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-76cd85c697-vngk7"] Apr 16 14:38:32.802092 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:32.802055 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fcae517d-7d7b-479d-a8fc-97e5bc02022b-cert\") pod \"lws-controller-manager-76cd85c697-vngk7\" (UID: \"fcae517d-7d7b-479d-a8fc-97e5bc02022b\") " pod="openshift-lws-operator/lws-controller-manager-76cd85c697-vngk7" Apr 16 14:38:32.802244 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:32.802125 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zc4z\" (UniqueName: \"kubernetes.io/projected/fcae517d-7d7b-479d-a8fc-97e5bc02022b-kube-api-access-4zc4z\") pod \"lws-controller-manager-76cd85c697-vngk7\" (UID: 
\"fcae517d-7d7b-479d-a8fc-97e5bc02022b\") " pod="openshift-lws-operator/lws-controller-manager-76cd85c697-vngk7" Apr 16 14:38:32.802244 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:32.802166 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/fcae517d-7d7b-479d-a8fc-97e5bc02022b-manager-config\") pod \"lws-controller-manager-76cd85c697-vngk7\" (UID: \"fcae517d-7d7b-479d-a8fc-97e5bc02022b\") " pod="openshift-lws-operator/lws-controller-manager-76cd85c697-vngk7" Apr 16 14:38:32.802244 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:32.802229 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/fcae517d-7d7b-479d-a8fc-97e5bc02022b-metrics-cert\") pod \"lws-controller-manager-76cd85c697-vngk7\" (UID: \"fcae517d-7d7b-479d-a8fc-97e5bc02022b\") " pod="openshift-lws-operator/lws-controller-manager-76cd85c697-vngk7" Apr 16 14:38:32.903377 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:32.903332 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/fcae517d-7d7b-479d-a8fc-97e5bc02022b-manager-config\") pod \"lws-controller-manager-76cd85c697-vngk7\" (UID: \"fcae517d-7d7b-479d-a8fc-97e5bc02022b\") " pod="openshift-lws-operator/lws-controller-manager-76cd85c697-vngk7" Apr 16 14:38:32.903570 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:32.903402 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/fcae517d-7d7b-479d-a8fc-97e5bc02022b-metrics-cert\") pod \"lws-controller-manager-76cd85c697-vngk7\" (UID: \"fcae517d-7d7b-479d-a8fc-97e5bc02022b\") " pod="openshift-lws-operator/lws-controller-manager-76cd85c697-vngk7" Apr 16 14:38:32.903570 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:32.903459 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fcae517d-7d7b-479d-a8fc-97e5bc02022b-cert\") pod \"lws-controller-manager-76cd85c697-vngk7\" (UID: \"fcae517d-7d7b-479d-a8fc-97e5bc02022b\") " pod="openshift-lws-operator/lws-controller-manager-76cd85c697-vngk7" Apr 16 14:38:32.903570 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:32.903527 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4zc4z\" (UniqueName: \"kubernetes.io/projected/fcae517d-7d7b-479d-a8fc-97e5bc02022b-kube-api-access-4zc4z\") pod \"lws-controller-manager-76cd85c697-vngk7\" (UID: \"fcae517d-7d7b-479d-a8fc-97e5bc02022b\") " pod="openshift-lws-operator/lws-controller-manager-76cd85c697-vngk7" Apr 16 14:38:32.904217 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:32.904191 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/fcae517d-7d7b-479d-a8fc-97e5bc02022b-manager-config\") pod \"lws-controller-manager-76cd85c697-vngk7\" (UID: \"fcae517d-7d7b-479d-a8fc-97e5bc02022b\") " pod="openshift-lws-operator/lws-controller-manager-76cd85c697-vngk7" Apr 16 14:38:32.906325 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:32.906303 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fcae517d-7d7b-479d-a8fc-97e5bc02022b-cert\") pod \"lws-controller-manager-76cd85c697-vngk7\" (UID: \"fcae517d-7d7b-479d-a8fc-97e5bc02022b\") " pod="openshift-lws-operator/lws-controller-manager-76cd85c697-vngk7" Apr 16 14:38:32.906504 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:32.906481 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/fcae517d-7d7b-479d-a8fc-97e5bc02022b-metrics-cert\") pod \"lws-controller-manager-76cd85c697-vngk7\" (UID: \"fcae517d-7d7b-479d-a8fc-97e5bc02022b\") " 
pod="openshift-lws-operator/lws-controller-manager-76cd85c697-vngk7" Apr 16 14:38:32.914550 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:32.914514 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zc4z\" (UniqueName: \"kubernetes.io/projected/fcae517d-7d7b-479d-a8fc-97e5bc02022b-kube-api-access-4zc4z\") pod \"lws-controller-manager-76cd85c697-vngk7\" (UID: \"fcae517d-7d7b-479d-a8fc-97e5bc02022b\") " pod="openshift-lws-operator/lws-controller-manager-76cd85c697-vngk7" Apr 16 14:38:33.055939 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:33.055900 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-76cd85c697-vngk7" Apr 16 14:38:33.152109 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:33.152015 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-6jdtr"] Apr 16 14:38:33.155449 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:38:33.155180 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a2890cc_c938_4ec0_ac01_7eea814cd68f.slice/crio-7f93044c357e28e65cb2272498ff0992832ce216dbdf17ea294830be6d8515e7 WatchSource:0}: Error finding container 7f93044c357e28e65cb2272498ff0992832ce216dbdf17ea294830be6d8515e7: Status 404 returned error can't find the container with id 7f93044c357e28e65cb2272498ff0992832ce216dbdf17ea294830be6d8515e7 Apr 16 14:38:33.215835 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:33.210667 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-76cd85c697-vngk7"] Apr 16 14:38:33.596372 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:33.596267 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-6468c4fc75-pjxlq" 
event={"ID":"d568b9f7-2358-4b4c-8148-a6a3380124ab","Type":"ContainerStarted","Data":"150ee7440e398a5febeef424e36f7198442e6ea4b3901901a024b413c8099009"} Apr 16 14:38:33.597668 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:33.597628 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-76cd85c697-vngk7" event={"ID":"fcae517d-7d7b-479d-a8fc-97e5bc02022b","Type":"ContainerStarted","Data":"39d880125463e6a91d4f9b45e2df78b9e3f7f26d3790535d964398056e21ed60"} Apr 16 14:38:33.599023 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:33.598988 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-6jdtr" event={"ID":"8a2890cc-c938-4ec0-ac01-7eea814cd68f","Type":"ContainerStarted","Data":"7f93044c357e28e65cb2272498ff0992832ce216dbdf17ea294830be6d8515e7"} Apr 16 14:38:33.615051 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:33.614976 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-6468c4fc75-pjxlq" podStartSLOduration=1.384563148 podStartE2EDuration="4.614955014s" podCreationTimestamp="2026-04-16 14:38:29 +0000 UTC" firstStartedPulling="2026-04-16 14:38:29.843335027 +0000 UTC m=+525.670045422" lastFinishedPulling="2026-04-16 14:38:33.073726893 +0000 UTC m=+528.900437288" observedRunningTime="2026-04-16 14:38:33.614652357 +0000 UTC m=+529.441362770" watchObservedRunningTime="2026-04-16 14:38:33.614955014 +0000 UTC m=+529.441665433" Apr 16 14:38:34.607038 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:34.606997 2576 generic.go:358] "Generic (PLEG): container finished" podID="139ce253-6440-4983-afbd-1e1507899439" containerID="a8bc7e45b4b71defd613b631438a35805b183128f2e8137dcf0339fba522fc9d" exitCode=0 Apr 16 14:38:34.607513 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:34.607106 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356zh96" event={"ID":"139ce253-6440-4983-afbd-1e1507899439","Type":"ContainerDied","Data":"a8bc7e45b4b71defd613b631438a35805b183128f2e8137dcf0339fba522fc9d"} Apr 16 14:38:36.615733 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:36.615698 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-76cd85c697-vngk7" event={"ID":"fcae517d-7d7b-479d-a8fc-97e5bc02022b","Type":"ContainerStarted","Data":"38d09c1aa7ce72825b17ccc9b77e665cbe55364542f0d54c6b095af13be1ceef"} Apr 16 14:38:36.616172 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:36.615768 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-76cd85c697-vngk7" Apr 16 14:38:36.617737 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:36.617715 2576 generic.go:358] "Generic (PLEG): container finished" podID="139ce253-6440-4983-afbd-1e1507899439" containerID="a805156c486b499d4b620e0cf31719ea2670cc9be822112b2b04ab00f05df101" exitCode=0 Apr 16 14:38:36.617811 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:36.617778 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356zh96" event={"ID":"139ce253-6440-4983-afbd-1e1507899439","Type":"ContainerDied","Data":"a805156c486b499d4b620e0cf31719ea2670cc9be822112b2b04ab00f05df101"} Apr 16 14:38:36.619004 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:36.618982 2576 generic.go:358] "Generic (PLEG): container finished" podID="8a2890cc-c938-4ec0-ac01-7eea814cd68f" containerID="76f1704b2299c26cf56a7b90dd2217429868415005a63487dd64efd07f979909" exitCode=1 Apr 16 14:38:36.619079 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:36.619060 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-6jdtr" 
event={"ID":"8a2890cc-c938-4ec0-ac01-7eea814cd68f","Type":"ContainerDied","Data":"76f1704b2299c26cf56a7b90dd2217429868415005a63487dd64efd07f979909"} Apr 16 14:38:36.619220 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:36.619207 2576 scope.go:117] "RemoveContainer" containerID="76f1704b2299c26cf56a7b90dd2217429868415005a63487dd64efd07f979909" Apr 16 14:38:36.645946 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:36.645894 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-76cd85c697-vngk7" podStartSLOduration=1.917881977 podStartE2EDuration="4.645877128s" podCreationTimestamp="2026-04-16 14:38:32 +0000 UTC" firstStartedPulling="2026-04-16 14:38:33.217546516 +0000 UTC m=+529.044256911" lastFinishedPulling="2026-04-16 14:38:35.945541652 +0000 UTC m=+531.772252062" observedRunningTime="2026-04-16 14:38:36.644275871 +0000 UTC m=+532.470986323" watchObservedRunningTime="2026-04-16 14:38:36.645877128 +0000 UTC m=+532.472587548" Apr 16 14:38:37.624877 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:37.624840 2576 generic.go:358] "Generic (PLEG): container finished" podID="8a2890cc-c938-4ec0-ac01-7eea814cd68f" containerID="880c5397c66cfee92b4039c79b7951fcae5853a3d6dbe7a6d39537f7f36863b4" exitCode=1 Apr 16 14:38:37.625306 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:37.624921 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-6jdtr" event={"ID":"8a2890cc-c938-4ec0-ac01-7eea814cd68f","Type":"ContainerDied","Data":"880c5397c66cfee92b4039c79b7951fcae5853a3d6dbe7a6d39537f7f36863b4"} Apr 16 14:38:37.625306 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:37.624963 2576 scope.go:117] "RemoveContainer" containerID="76f1704b2299c26cf56a7b90dd2217429868415005a63487dd64efd07f979909" Apr 16 14:38:37.625306 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:37.625201 2576 scope.go:117] "RemoveContainer" 
containerID="880c5397c66cfee92b4039c79b7951fcae5853a3d6dbe7a6d39537f7f36863b4" Apr 16 14:38:37.625853 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:38:37.625830 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-6jdtr_opendatahub(8a2890cc-c938-4ec0-ac01-7eea814cd68f)\"" pod="opendatahub/odh-model-controller-858dbf95b8-6jdtr" podUID="8a2890cc-c938-4ec0-ac01-7eea814cd68f" Apr 16 14:38:37.763610 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:37.763583 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356zh96" Apr 16 14:38:37.851632 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:37.851599 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/139ce253-6440-4983-afbd-1e1507899439-bundle\") pod \"139ce253-6440-4983-afbd-1e1507899439\" (UID: \"139ce253-6440-4983-afbd-1e1507899439\") " Apr 16 14:38:37.851806 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:37.851649 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/139ce253-6440-4983-afbd-1e1507899439-util\") pod \"139ce253-6440-4983-afbd-1e1507899439\" (UID: \"139ce253-6440-4983-afbd-1e1507899439\") " Apr 16 14:38:37.851806 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:37.851670 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsrxc\" (UniqueName: \"kubernetes.io/projected/139ce253-6440-4983-afbd-1e1507899439-kube-api-access-jsrxc\") pod \"139ce253-6440-4983-afbd-1e1507899439\" (UID: \"139ce253-6440-4983-afbd-1e1507899439\") " Apr 16 14:38:37.852580 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:37.852520 2576 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/139ce253-6440-4983-afbd-1e1507899439-bundle" (OuterVolumeSpecName: "bundle") pod "139ce253-6440-4983-afbd-1e1507899439" (UID: "139ce253-6440-4983-afbd-1e1507899439"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:38:37.853870 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:37.853846 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/139ce253-6440-4983-afbd-1e1507899439-kube-api-access-jsrxc" (OuterVolumeSpecName: "kube-api-access-jsrxc") pod "139ce253-6440-4983-afbd-1e1507899439" (UID: "139ce253-6440-4983-afbd-1e1507899439"). InnerVolumeSpecName "kube-api-access-jsrxc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:38:37.856898 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:37.856869 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/139ce253-6440-4983-afbd-1e1507899439-util" (OuterVolumeSpecName: "util") pod "139ce253-6440-4983-afbd-1e1507899439" (UID: "139ce253-6440-4983-afbd-1e1507899439"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:38:37.952714 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:37.952622 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/139ce253-6440-4983-afbd-1e1507899439-bundle\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:38:37.952714 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:37.952652 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/139ce253-6440-4983-afbd-1e1507899439-util\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:38:37.952714 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:37.952665 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jsrxc\" (UniqueName: \"kubernetes.io/projected/139ce253-6440-4983-afbd-1e1507899439-kube-api-access-jsrxc\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:38:38.630825 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:38.630782 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356zh96" event={"ID":"139ce253-6440-4983-afbd-1e1507899439","Type":"ContainerDied","Data":"225376f33409817bbfbf556275dda960a61c30ba0db6e5b8fe57b6facb424210"} Apr 16 14:38:38.631240 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:38.630829 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="225376f33409817bbfbf556275dda960a61c30ba0db6e5b8fe57b6facb424210" Apr 16 14:38:38.631240 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:38.630806 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356zh96" Apr 16 14:38:38.632513 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:38.632489 2576 scope.go:117] "RemoveContainer" containerID="880c5397c66cfee92b4039c79b7951fcae5853a3d6dbe7a6d39537f7f36863b4" Apr 16 14:38:38.632759 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:38:38.632740 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-6jdtr_opendatahub(8a2890cc-c938-4ec0-ac01-7eea814cd68f)\"" pod="opendatahub/odh-model-controller-858dbf95b8-6jdtr" podUID="8a2890cc-c938-4ec0-ac01-7eea814cd68f" Apr 16 14:38:42.397810 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:42.397768 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-6jdtr" Apr 16 14:38:42.398312 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:42.398135 2576 scope.go:117] "RemoveContainer" containerID="880c5397c66cfee92b4039c79b7951fcae5853a3d6dbe7a6d39537f7f36863b4" Apr 16 14:38:42.398363 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:38:42.398344 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-6jdtr_opendatahub(8a2890cc-c938-4ec0-ac01-7eea814cd68f)\"" pod="opendatahub/odh-model-controller-858dbf95b8-6jdtr" podUID="8a2890cc-c938-4ec0-ac01-7eea814cd68f" Apr 16 14:38:45.687111 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:45.687069 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gxwfg"] Apr 16 14:38:45.687487 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:45.687410 2576 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="139ce253-6440-4983-afbd-1e1507899439" containerName="pull" Apr 16 14:38:45.687487 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:45.687421 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="139ce253-6440-4983-afbd-1e1507899439" containerName="pull" Apr 16 14:38:45.687487 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:45.687437 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="139ce253-6440-4983-afbd-1e1507899439" containerName="util" Apr 16 14:38:45.687487 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:45.687444 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="139ce253-6440-4983-afbd-1e1507899439" containerName="util" Apr 16 14:38:45.687487 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:45.687460 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="139ce253-6440-4983-afbd-1e1507899439" containerName="extract" Apr 16 14:38:45.687487 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:45.687467 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="139ce253-6440-4983-afbd-1e1507899439" containerName="extract" Apr 16 14:38:45.687714 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:45.687527 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="139ce253-6440-4983-afbd-1e1507899439" containerName="extract" Apr 16 14:38:45.693542 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:45.693505 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gxwfg" Apr 16 14:38:45.696483 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:45.696458 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ftngz\"" Apr 16 14:38:45.696641 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:45.696458 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 14:38:45.697040 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:45.697026 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 14:38:45.702274 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:45.702251 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gxwfg"] Apr 16 14:38:45.825118 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:45.825075 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bmvg\" (UniqueName: \"kubernetes.io/projected/9d78c0ae-dffe-41f9-b81f-c9d77f3958a2-kube-api-access-6bmvg\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gxwfg\" (UID: \"9d78c0ae-dffe-41f9-b81f-c9d77f3958a2\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gxwfg" Apr 16 14:38:45.825309 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:45.825197 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d78c0ae-dffe-41f9-b81f-c9d77f3958a2-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gxwfg\" (UID: \"9d78c0ae-dffe-41f9-b81f-c9d77f3958a2\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gxwfg" Apr 16 14:38:45.825309 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:45.825245 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d78c0ae-dffe-41f9-b81f-c9d77f3958a2-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gxwfg\" (UID: \"9d78c0ae-dffe-41f9-b81f-c9d77f3958a2\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gxwfg" Apr 16 14:38:45.926030 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:45.925986 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6bmvg\" (UniqueName: \"kubernetes.io/projected/9d78c0ae-dffe-41f9-b81f-c9d77f3958a2-kube-api-access-6bmvg\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gxwfg\" (UID: \"9d78c0ae-dffe-41f9-b81f-c9d77f3958a2\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gxwfg" Apr 16 14:38:45.926210 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:45.926064 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d78c0ae-dffe-41f9-b81f-c9d77f3958a2-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gxwfg\" (UID: \"9d78c0ae-dffe-41f9-b81f-c9d77f3958a2\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gxwfg" Apr 16 14:38:45.926210 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:45.926094 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d78c0ae-dffe-41f9-b81f-c9d77f3958a2-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gxwfg\" (UID: \"9d78c0ae-dffe-41f9-b81f-c9d77f3958a2\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gxwfg" Apr 16 14:38:45.926430 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:45.926411 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d78c0ae-dffe-41f9-b81f-c9d77f3958a2-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gxwfg\" (UID: \"9d78c0ae-dffe-41f9-b81f-c9d77f3958a2\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gxwfg" Apr 16 14:38:45.926467 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:45.926437 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d78c0ae-dffe-41f9-b81f-c9d77f3958a2-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gxwfg\" (UID: \"9d78c0ae-dffe-41f9-b81f-c9d77f3958a2\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gxwfg" Apr 16 14:38:45.942160 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:45.942080 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bmvg\" (UniqueName: \"kubernetes.io/projected/9d78c0ae-dffe-41f9-b81f-c9d77f3958a2-kube-api-access-6bmvg\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gxwfg\" (UID: \"9d78c0ae-dffe-41f9-b81f-c9d77f3958a2\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gxwfg" Apr 16 14:38:46.004394 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:46.004355 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gxwfg" Apr 16 14:38:46.147707 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:46.147665 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gxwfg"] Apr 16 14:38:46.150703 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:38:46.150674 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d78c0ae_dffe_41f9_b81f_c9d77f3958a2.slice/crio-6c0c1d2c38f173f8f70320a1056a69523369be54f94e25f467a34c125ba40c55 WatchSource:0}: Error finding container 6c0c1d2c38f173f8f70320a1056a69523369be54f94e25f467a34c125ba40c55: Status 404 returned error can't find the container with id 6c0c1d2c38f173f8f70320a1056a69523369be54f94e25f467a34c125ba40c55 Apr 16 14:38:46.660508 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:46.660468 2576 generic.go:358] "Generic (PLEG): container finished" podID="9d78c0ae-dffe-41f9-b81f-c9d77f3958a2" containerID="8e7fae42b76068dd03f35b42e314cc73138ccb510500eba7e042878e90b9037d" exitCode=0 Apr 16 14:38:46.660704 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:46.660528 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gxwfg" event={"ID":"9d78c0ae-dffe-41f9-b81f-c9d77f3958a2","Type":"ContainerDied","Data":"8e7fae42b76068dd03f35b42e314cc73138ccb510500eba7e042878e90b9037d"} Apr 16 14:38:46.660704 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:46.660573 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gxwfg" event={"ID":"9d78c0ae-dffe-41f9-b81f-c9d77f3958a2","Type":"ContainerStarted","Data":"6c0c1d2c38f173f8f70320a1056a69523369be54f94e25f467a34c125ba40c55"} Apr 16 14:38:47.628080 ip-10-0-128-173 kubenswrapper[2576]: 
I0416 14:38:47.628050 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-76cd85c697-vngk7" Apr 16 14:38:48.670840 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:48.670806 2576 generic.go:358] "Generic (PLEG): container finished" podID="9d78c0ae-dffe-41f9-b81f-c9d77f3958a2" containerID="d9bd63aa0f3587badd91681d5580913009adc886416ef45577fdc163d38ff401" exitCode=0 Apr 16 14:38:48.671221 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:48.670846 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gxwfg" event={"ID":"9d78c0ae-dffe-41f9-b81f-c9d77f3958a2","Type":"ContainerDied","Data":"d9bd63aa0f3587badd91681d5580913009adc886416ef45577fdc163d38ff401"} Apr 16 14:38:49.677578 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:49.677527 2576 generic.go:358] "Generic (PLEG): container finished" podID="9d78c0ae-dffe-41f9-b81f-c9d77f3958a2" containerID="236790ab61e475170b147c2a83ba71865e87f6dcc24b52bd41f303b5472bc7a5" exitCode=0 Apr 16 14:38:49.677578 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:49.677558 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gxwfg" event={"ID":"9d78c0ae-dffe-41f9-b81f-c9d77f3958a2","Type":"ContainerDied","Data":"236790ab61e475170b147c2a83ba71865e87f6dcc24b52bd41f303b5472bc7a5"} Apr 16 14:38:50.806198 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:50.806172 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gxwfg" Apr 16 14:38:50.972446 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:50.972356 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d78c0ae-dffe-41f9-b81f-c9d77f3958a2-bundle\") pod \"9d78c0ae-dffe-41f9-b81f-c9d77f3958a2\" (UID: \"9d78c0ae-dffe-41f9-b81f-c9d77f3958a2\") " Apr 16 14:38:50.972658 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:50.972445 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bmvg\" (UniqueName: \"kubernetes.io/projected/9d78c0ae-dffe-41f9-b81f-c9d77f3958a2-kube-api-access-6bmvg\") pod \"9d78c0ae-dffe-41f9-b81f-c9d77f3958a2\" (UID: \"9d78c0ae-dffe-41f9-b81f-c9d77f3958a2\") " Apr 16 14:38:50.972658 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:50.972490 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d78c0ae-dffe-41f9-b81f-c9d77f3958a2-util\") pod \"9d78c0ae-dffe-41f9-b81f-c9d77f3958a2\" (UID: \"9d78c0ae-dffe-41f9-b81f-c9d77f3958a2\") " Apr 16 14:38:50.973358 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:50.973329 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d78c0ae-dffe-41f9-b81f-c9d77f3958a2-bundle" (OuterVolumeSpecName: "bundle") pod "9d78c0ae-dffe-41f9-b81f-c9d77f3958a2" (UID: "9d78c0ae-dffe-41f9-b81f-c9d77f3958a2"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:38:50.974607 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:50.974574 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d78c0ae-dffe-41f9-b81f-c9d77f3958a2-kube-api-access-6bmvg" (OuterVolumeSpecName: "kube-api-access-6bmvg") pod "9d78c0ae-dffe-41f9-b81f-c9d77f3958a2" (UID: "9d78c0ae-dffe-41f9-b81f-c9d77f3958a2"). InnerVolumeSpecName "kube-api-access-6bmvg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:38:50.977978 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:50.977957 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d78c0ae-dffe-41f9-b81f-c9d77f3958a2-util" (OuterVolumeSpecName: "util") pod "9d78c0ae-dffe-41f9-b81f-c9d77f3958a2" (UID: "9d78c0ae-dffe-41f9-b81f-c9d77f3958a2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:38:51.074156 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:51.074121 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d78c0ae-dffe-41f9-b81f-c9d77f3958a2-util\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:38:51.074156 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:51.074150 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d78c0ae-dffe-41f9-b81f-c9d77f3958a2-bundle\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:38:51.074156 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:51.074162 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6bmvg\" (UniqueName: \"kubernetes.io/projected/9d78c0ae-dffe-41f9-b81f-c9d77f3958a2-kube-api-access-6bmvg\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:38:51.690787 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:51.690750 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gxwfg" event={"ID":"9d78c0ae-dffe-41f9-b81f-c9d77f3958a2","Type":"ContainerDied","Data":"6c0c1d2c38f173f8f70320a1056a69523369be54f94e25f467a34c125ba40c55"} Apr 16 14:38:51.690787 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:51.690783 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c0c1d2c38f173f8f70320a1056a69523369be54f94e25f467a34c125ba40c55" Apr 16 14:38:51.691037 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:51.690817 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2gxwfg" Apr 16 14:38:52.398224 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:52.398178 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-6jdtr" Apr 16 14:38:52.398646 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:52.398631 2576 scope.go:117] "RemoveContainer" containerID="880c5397c66cfee92b4039c79b7951fcae5853a3d6dbe7a6d39537f7f36863b4" Apr 16 14:38:53.699176 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:53.699139 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-6jdtr" event={"ID":"8a2890cc-c938-4ec0-ac01-7eea814cd68f","Type":"ContainerStarted","Data":"8f9418dc47fe6cc402986d98ec39216c44b39eba64cd9320b31f0254e68d3892"} Apr 16 14:38:53.699617 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:53.699335 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-6jdtr" Apr 16 14:38:53.726287 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:38:53.726221 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-6jdtr" podStartSLOduration=3.19615892 
podStartE2EDuration="22.72619894s" podCreationTimestamp="2026-04-16 14:38:31 +0000 UTC" firstStartedPulling="2026-04-16 14:38:33.157089856 +0000 UTC m=+528.983800263" lastFinishedPulling="2026-04-16 14:38:52.687129884 +0000 UTC m=+548.513840283" observedRunningTime="2026-04-16 14:38:53.7256674 +0000 UTC m=+549.552377841" watchObservedRunningTime="2026-04-16 14:38:53.72619894 +0000 UTC m=+549.552909393" Apr 16 14:39:04.706003 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:04.705972 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-6jdtr" Apr 16 14:39:14.949242 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:14.949207 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6478cddcdb-dwk27"] Apr 16 14:39:39.969595 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:39.969554 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6478cddcdb-dwk27" podUID="45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8" containerName="console" containerID="cri-o://0e81418e0410647c5481bf66519d3dc4d81a920ae0003ddf5d0fc926d49984ee" gracePeriod=15 Apr 16 14:39:40.208296 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:40.208273 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6478cddcdb-dwk27_45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8/console/0.log" Apr 16 14:39:40.208425 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:40.208332 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6478cddcdb-dwk27" Apr 16 14:39:40.275882 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:40.275796 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-trusted-ca-bundle\") pod \"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8\" (UID: \"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8\") " Apr 16 14:39:40.275882 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:40.275850 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-console-oauth-config\") pod \"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8\" (UID: \"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8\") " Apr 16 14:39:40.275882 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:40.275880 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-oauth-serving-cert\") pod \"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8\" (UID: \"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8\") " Apr 16 14:39:40.276164 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:40.275910 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-service-ca\") pod \"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8\" (UID: \"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8\") " Apr 16 14:39:40.276164 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:40.275957 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78hsp\" (UniqueName: \"kubernetes.io/projected/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-kube-api-access-78hsp\") pod \"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8\" (UID: \"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8\") " Apr 16 14:39:40.276164 
ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:40.275980 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-console-serving-cert\") pod \"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8\" (UID: \"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8\") " Apr 16 14:39:40.276164 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:40.276003 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-console-config\") pod \"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8\" (UID: \"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8\") " Apr 16 14:39:40.276366 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:40.276317 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8" (UID: "45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:39:40.276366 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:40.276331 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8" (UID: "45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:39:40.276499 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:40.276463 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-service-ca" (OuterVolumeSpecName: "service-ca") pod "45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8" (UID: "45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:39:40.276632 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:40.276519 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-console-config" (OuterVolumeSpecName: "console-config") pod "45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8" (UID: "45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:39:40.278141 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:40.278110 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8" (UID: "45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:39:40.278141 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:40.278127 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8" (UID: "45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:39:40.278284 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:40.278187 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-kube-api-access-78hsp" (OuterVolumeSpecName: "kube-api-access-78hsp") pod "45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8" (UID: "45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8"). InnerVolumeSpecName "kube-api-access-78hsp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:39:40.377601 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:40.377564 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-trusted-ca-bundle\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:39:40.377601 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:40.377594 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-console-oauth-config\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:39:40.377601 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:40.377604 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-oauth-serving-cert\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:39:40.377601 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:40.377614 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-service-ca\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:39:40.377857 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:40.377623 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-78hsp\" (UniqueName: 
\"kubernetes.io/projected/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-kube-api-access-78hsp\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:39:40.377857 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:40.377633 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-console-serving-cert\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:39:40.377857 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:40.377641 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8-console-config\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:39:40.878180 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:40.878152 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6478cddcdb-dwk27_45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8/console/0.log" Apr 16 14:39:40.878345 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:40.878192 2576 generic.go:358] "Generic (PLEG): container finished" podID="45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8" containerID="0e81418e0410647c5481bf66519d3dc4d81a920ae0003ddf5d0fc926d49984ee" exitCode=2 Apr 16 14:39:40.878345 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:40.878281 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6478cddcdb-dwk27" Apr 16 14:39:40.878345 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:40.878280 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6478cddcdb-dwk27" event={"ID":"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8","Type":"ContainerDied","Data":"0e81418e0410647c5481bf66519d3dc4d81a920ae0003ddf5d0fc926d49984ee"} Apr 16 14:39:40.878345 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:40.878322 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6478cddcdb-dwk27" event={"ID":"45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8","Type":"ContainerDied","Data":"380fcdfd672f11635eb050306de282b4bc5a0066a4f050e54a801d8c17153003"} Apr 16 14:39:40.878345 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:40.878338 2576 scope.go:117] "RemoveContainer" containerID="0e81418e0410647c5481bf66519d3dc4d81a920ae0003ddf5d0fc926d49984ee" Apr 16 14:39:40.887983 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:40.887963 2576 scope.go:117] "RemoveContainer" containerID="0e81418e0410647c5481bf66519d3dc4d81a920ae0003ddf5d0fc926d49984ee" Apr 16 14:39:40.888265 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:39:40.888245 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e81418e0410647c5481bf66519d3dc4d81a920ae0003ddf5d0fc926d49984ee\": container with ID starting with 0e81418e0410647c5481bf66519d3dc4d81a920ae0003ddf5d0fc926d49984ee not found: ID does not exist" containerID="0e81418e0410647c5481bf66519d3dc4d81a920ae0003ddf5d0fc926d49984ee" Apr 16 14:39:40.888367 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:40.888270 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e81418e0410647c5481bf66519d3dc4d81a920ae0003ddf5d0fc926d49984ee"} err="failed to get container status \"0e81418e0410647c5481bf66519d3dc4d81a920ae0003ddf5d0fc926d49984ee\": rpc error: code = 
NotFound desc = could not find container \"0e81418e0410647c5481bf66519d3dc4d81a920ae0003ddf5d0fc926d49984ee\": container with ID starting with 0e81418e0410647c5481bf66519d3dc4d81a920ae0003ddf5d0fc926d49984ee not found: ID does not exist" Apr 16 14:39:40.905492 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:40.905461 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6478cddcdb-dwk27"] Apr 16 14:39:40.915844 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:40.915814 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6478cddcdb-dwk27"] Apr 16 14:39:42.781611 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:42.781577 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8" path="/var/lib/kubelet/pods/45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8/volumes" Apr 16 14:39:44.672156 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:44.672130 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7qpv_312033f4-d50f-4d5d-a1ca-6e77e0428786/ovn-acl-logging/0.log" Apr 16 14:39:44.672607 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:44.672433 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7qpv_312033f4-d50f-4d5d-a1ca-6e77e0428786/ovn-acl-logging/0.log" Apr 16 14:39:44.773780 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:44.773746 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12286n"] Apr 16 14:39:44.774086 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:44.774074 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d78c0ae-dffe-41f9-b81f-c9d77f3958a2" containerName="util" Apr 16 14:39:44.774086 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:44.774087 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9d78c0ae-dffe-41f9-b81f-c9d77f3958a2" containerName="util" Apr 16 14:39:44.774172 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:44.774117 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8" containerName="console" Apr 16 14:39:44.774172 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:44.774123 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8" containerName="console" Apr 16 14:39:44.774172 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:44.774147 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d78c0ae-dffe-41f9-b81f-c9d77f3958a2" containerName="pull" Apr 16 14:39:44.774172 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:44.774153 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d78c0ae-dffe-41f9-b81f-c9d77f3958a2" containerName="pull" Apr 16 14:39:44.774172 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:44.774161 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d78c0ae-dffe-41f9-b81f-c9d77f3958a2" containerName="extract" Apr 16 14:39:44.774172 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:44.774168 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d78c0ae-dffe-41f9-b81f-c9d77f3958a2" containerName="extract" Apr 16 14:39:44.774369 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:44.774214 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="45d27a9c-0fdf-4aba-a99b-14f1baf7c2e8" containerName="console" Apr 16 14:39:44.774369 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:44.774226 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d78c0ae-dffe-41f9-b81f-c9d77f3958a2" containerName="extract" Apr 16 14:39:44.779644 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:44.779620 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12286n" Apr 16 14:39:44.782302 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:44.782280 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 14:39:44.782422 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:44.782281 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 14:39:44.783208 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:44.783188 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-r2tmc\"" Apr 16 14:39:44.791610 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:44.789524 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12286n"] Apr 16 14:39:44.914028 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:44.913987 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/edf5b29e-5d29-47f7-9873-699371d62d3c-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12286n\" (UID: \"edf5b29e-5d29-47f7-9873-699371d62d3c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12286n" Apr 16 14:39:44.914214 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:44.914031 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/edf5b29e-5d29-47f7-9873-699371d62d3c-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12286n\" (UID: \"edf5b29e-5d29-47f7-9873-699371d62d3c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12286n" Apr 16 14:39:44.914214 ip-10-0-128-173 kubenswrapper[2576]: I0416 
14:39:44.914061 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qfcm\" (UniqueName: \"kubernetes.io/projected/edf5b29e-5d29-47f7-9873-699371d62d3c-kube-api-access-9qfcm\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12286n\" (UID: \"edf5b29e-5d29-47f7-9873-699371d62d3c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12286n" Apr 16 14:39:45.015574 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.015442 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/edf5b29e-5d29-47f7-9873-699371d62d3c-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12286n\" (UID: \"edf5b29e-5d29-47f7-9873-699371d62d3c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12286n" Apr 16 14:39:45.015574 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.015489 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/edf5b29e-5d29-47f7-9873-699371d62d3c-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12286n\" (UID: \"edf5b29e-5d29-47f7-9873-699371d62d3c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12286n" Apr 16 14:39:45.015574 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.015512 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qfcm\" (UniqueName: \"kubernetes.io/projected/edf5b29e-5d29-47f7-9873-699371d62d3c-kube-api-access-9qfcm\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12286n\" (UID: \"edf5b29e-5d29-47f7-9873-699371d62d3c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12286n" Apr 16 14:39:45.015903 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.015878 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/edf5b29e-5d29-47f7-9873-699371d62d3c-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12286n\" (UID: \"edf5b29e-5d29-47f7-9873-699371d62d3c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12286n" Apr 16 14:39:45.015966 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.015947 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/edf5b29e-5d29-47f7-9873-699371d62d3c-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12286n\" (UID: \"edf5b29e-5d29-47f7-9873-699371d62d3c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12286n" Apr 16 14:39:45.026675 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.026643 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qfcm\" (UniqueName: \"kubernetes.io/projected/edf5b29e-5d29-47f7-9873-699371d62d3c-kube-api-access-9qfcm\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12286n\" (UID: \"edf5b29e-5d29-47f7-9873-699371d62d3c\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12286n" Apr 16 14:39:45.091931 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.091892 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12286n" Apr 16 14:39:45.171965 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.171566 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328rl7"] Apr 16 14:39:45.177576 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.177546 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328rl7" Apr 16 14:39:45.182033 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.182004 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328rl7"] Apr 16 14:39:45.256561 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.256508 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12286n"] Apr 16 14:39:45.258028 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:39:45.258002 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedf5b29e_5d29_47f7_9873_699371d62d3c.slice/crio-ff3822d8c183ce883812e1c3c659d76fc5e9a1e2040081ea77ea9c0b7dbcc6d9 WatchSource:0}: Error finding container ff3822d8c183ce883812e1c3c659d76fc5e9a1e2040081ea77ea9c0b7dbcc6d9: Status 404 returned error can't find the container with id ff3822d8c183ce883812e1c3c659d76fc5e9a1e2040081ea77ea9c0b7dbcc6d9 Apr 16 14:39:45.318583 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.318558 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43fc5738-3bf5-475d-857f-1f781f2e55a6-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328rl7\" (UID: \"43fc5738-3bf5-475d-857f-1f781f2e55a6\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328rl7" Apr 16 14:39:45.318716 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.318658 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43fc5738-3bf5-475d-857f-1f781f2e55a6-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328rl7\" (UID: \"43fc5738-3bf5-475d-857f-1f781f2e55a6\") 
" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328rl7" Apr 16 14:39:45.318833 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.318734 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlt9c\" (UniqueName: \"kubernetes.io/projected/43fc5738-3bf5-475d-857f-1f781f2e55a6-kube-api-access-zlt9c\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328rl7\" (UID: \"43fc5738-3bf5-475d-857f-1f781f2e55a6\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328rl7" Apr 16 14:39:45.419527 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.419492 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43fc5738-3bf5-475d-857f-1f781f2e55a6-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328rl7\" (UID: \"43fc5738-3bf5-475d-857f-1f781f2e55a6\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328rl7" Apr 16 14:39:45.419734 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.419583 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zlt9c\" (UniqueName: \"kubernetes.io/projected/43fc5738-3bf5-475d-857f-1f781f2e55a6-kube-api-access-zlt9c\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328rl7\" (UID: \"43fc5738-3bf5-475d-857f-1f781f2e55a6\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328rl7" Apr 16 14:39:45.419734 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.419616 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43fc5738-3bf5-475d-857f-1f781f2e55a6-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328rl7\" (UID: \"43fc5738-3bf5-475d-857f-1f781f2e55a6\") " 
pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328rl7" Apr 16 14:39:45.419886 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.419865 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43fc5738-3bf5-475d-857f-1f781f2e55a6-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328rl7\" (UID: \"43fc5738-3bf5-475d-857f-1f781f2e55a6\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328rl7" Apr 16 14:39:45.419944 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.419887 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43fc5738-3bf5-475d-857f-1f781f2e55a6-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328rl7\" (UID: \"43fc5738-3bf5-475d-857f-1f781f2e55a6\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328rl7" Apr 16 14:39:45.429524 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.429500 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlt9c\" (UniqueName: \"kubernetes.io/projected/43fc5738-3bf5-475d-857f-1f781f2e55a6-kube-api-access-zlt9c\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328rl7\" (UID: \"43fc5738-3bf5-475d-857f-1f781f2e55a6\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328rl7" Apr 16 14:39:45.490557 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.490502 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328rl7" Apr 16 14:39:45.571856 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.571812 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595p6pj"] Apr 16 14:39:45.576743 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.576717 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595p6pj" Apr 16 14:39:45.583970 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.583934 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595p6pj"] Apr 16 14:39:45.616643 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.616618 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328rl7"] Apr 16 14:39:45.617807 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:39:45.617784 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43fc5738_3bf5_475d_857f_1f781f2e55a6.slice/crio-b8aae9bb88bf4ca55c3275b833882c77cc20766a7f99ffbdbac736fd62bf9161 WatchSource:0}: Error finding container b8aae9bb88bf4ca55c3275b833882c77cc20766a7f99ffbdbac736fd62bf9161: Status 404 returned error can't find the container with id b8aae9bb88bf4ca55c3275b833882c77cc20766a7f99ffbdbac736fd62bf9161 Apr 16 14:39:45.721700 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.721668 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e6fd606-91a0-489e-967d-379339d856e8-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595p6pj\" (UID: \"2e6fd606-91a0-489e-967d-379339d856e8\") " 
pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595p6pj" Apr 16 14:39:45.722086 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.721758 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e6fd606-91a0-489e-967d-379339d856e8-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595p6pj\" (UID: \"2e6fd606-91a0-489e-967d-379339d856e8\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595p6pj" Apr 16 14:39:45.722086 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.721800 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8r94\" (UniqueName: \"kubernetes.io/projected/2e6fd606-91a0-489e-967d-379339d856e8-kube-api-access-w8r94\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595p6pj\" (UID: \"2e6fd606-91a0-489e-967d-379339d856e8\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595p6pj" Apr 16 14:39:45.822974 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.822884 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e6fd606-91a0-489e-967d-379339d856e8-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595p6pj\" (UID: \"2e6fd606-91a0-489e-967d-379339d856e8\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595p6pj" Apr 16 14:39:45.822974 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.822955 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e6fd606-91a0-489e-967d-379339d856e8-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595p6pj\" (UID: \"2e6fd606-91a0-489e-967d-379339d856e8\") " 
pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595p6pj" Apr 16 14:39:45.823163 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.822993 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w8r94\" (UniqueName: \"kubernetes.io/projected/2e6fd606-91a0-489e-967d-379339d856e8-kube-api-access-w8r94\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595p6pj\" (UID: \"2e6fd606-91a0-489e-967d-379339d856e8\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595p6pj" Apr 16 14:39:45.823336 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.823314 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e6fd606-91a0-489e-967d-379339d856e8-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595p6pj\" (UID: \"2e6fd606-91a0-489e-967d-379339d856e8\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595p6pj" Apr 16 14:39:45.823383 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.823362 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e6fd606-91a0-489e-967d-379339d856e8-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595p6pj\" (UID: \"2e6fd606-91a0-489e-967d-379339d856e8\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595p6pj" Apr 16 14:39:45.832863 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.832838 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8r94\" (UniqueName: \"kubernetes.io/projected/2e6fd606-91a0-489e-967d-379339d856e8-kube-api-access-w8r94\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595p6pj\" (UID: \"2e6fd606-91a0-489e-967d-379339d856e8\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595p6pj" Apr 16 
14:39:45.889778 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.889735 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595p6pj" Apr 16 14:39:45.899434 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.899405 2576 generic.go:358] "Generic (PLEG): container finished" podID="43fc5738-3bf5-475d-857f-1f781f2e55a6" containerID="5c9101d648ca55719b042f884e38a57d5e9c0d737cf559136dd6d67e4efed32b" exitCode=0 Apr 16 14:39:45.899834 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.899465 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328rl7" event={"ID":"43fc5738-3bf5-475d-857f-1f781f2e55a6","Type":"ContainerDied","Data":"5c9101d648ca55719b042f884e38a57d5e9c0d737cf559136dd6d67e4efed32b"} Apr 16 14:39:45.899834 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.899487 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328rl7" event={"ID":"43fc5738-3bf5-475d-857f-1f781f2e55a6","Type":"ContainerStarted","Data":"b8aae9bb88bf4ca55c3275b833882c77cc20766a7f99ffbdbac736fd62bf9161"} Apr 16 14:39:45.901087 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.901062 2576 generic.go:358] "Generic (PLEG): container finished" podID="edf5b29e-5d29-47f7-9873-699371d62d3c" containerID="0bcb2181f94e0701c4e1b724f20474b64cbe11b9fbf2dbb6c04273d1f6a121ab" exitCode=0 Apr 16 14:39:45.901177 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.901135 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12286n" event={"ID":"edf5b29e-5d29-47f7-9873-699371d62d3c","Type":"ContainerDied","Data":"0bcb2181f94e0701c4e1b724f20474b64cbe11b9fbf2dbb6c04273d1f6a121ab"} Apr 16 14:39:45.901177 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.901164 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12286n" event={"ID":"edf5b29e-5d29-47f7-9873-699371d62d3c","Type":"ContainerStarted","Data":"ff3822d8c183ce883812e1c3c659d76fc5e9a1e2040081ea77ea9c0b7dbcc6d9"} Apr 16 14:39:45.975952 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.975920 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qtl7h"] Apr 16 14:39:45.980637 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.980613 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qtl7h" Apr 16 14:39:45.987226 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:45.987197 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qtl7h"] Apr 16 14:39:46.023552 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:46.023511 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595p6pj"] Apr 16 14:39:46.025317 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:39:46.025290 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e6fd606_91a0_489e_967d_379339d856e8.slice/crio-344b91c653e4783295c13bbb68c71f59633e903d89b9281d385dee980962fe01 WatchSource:0}: Error finding container 344b91c653e4783295c13bbb68c71f59633e903d89b9281d385dee980962fe01: Status 404 returned error can't find the container with id 344b91c653e4783295c13bbb68c71f59633e903d89b9281d385dee980962fe01 Apr 16 14:39:46.126715 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:46.126678 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/375b213e-688f-4f46-83df-921c9b71892d-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qtl7h\" (UID: \"375b213e-688f-4f46-83df-921c9b71892d\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qtl7h" Apr 16 14:39:46.126842 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:46.126782 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/375b213e-688f-4f46-83df-921c9b71892d-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qtl7h\" (UID: \"375b213e-688f-4f46-83df-921c9b71892d\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qtl7h" Apr 16 14:39:46.126842 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:46.126836 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn5ls\" (UniqueName: \"kubernetes.io/projected/375b213e-688f-4f46-83df-921c9b71892d-kube-api-access-vn5ls\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qtl7h\" (UID: \"375b213e-688f-4f46-83df-921c9b71892d\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qtl7h" Apr 16 14:39:46.228053 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:46.228020 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/375b213e-688f-4f46-83df-921c9b71892d-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qtl7h\" (UID: \"375b213e-688f-4f46-83df-921c9b71892d\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qtl7h" Apr 16 14:39:46.228254 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:46.228063 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vn5ls\" (UniqueName: 
\"kubernetes.io/projected/375b213e-688f-4f46-83df-921c9b71892d-kube-api-access-vn5ls\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qtl7h\" (UID: \"375b213e-688f-4f46-83df-921c9b71892d\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qtl7h" Apr 16 14:39:46.228254 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:46.228132 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/375b213e-688f-4f46-83df-921c9b71892d-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qtl7h\" (UID: \"375b213e-688f-4f46-83df-921c9b71892d\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qtl7h" Apr 16 14:39:46.228479 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:46.228455 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/375b213e-688f-4f46-83df-921c9b71892d-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qtl7h\" (UID: \"375b213e-688f-4f46-83df-921c9b71892d\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qtl7h" Apr 16 14:39:46.228581 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:46.228463 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/375b213e-688f-4f46-83df-921c9b71892d-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qtl7h\" (UID: \"375b213e-688f-4f46-83df-921c9b71892d\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qtl7h" Apr 16 14:39:46.237962 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:46.237927 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn5ls\" (UniqueName: \"kubernetes.io/projected/375b213e-688f-4f46-83df-921c9b71892d-kube-api-access-vn5ls\") pod 
\"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qtl7h\" (UID: \"375b213e-688f-4f46-83df-921c9b71892d\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qtl7h" Apr 16 14:39:46.293185 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:46.293147 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qtl7h" Apr 16 14:39:46.419036 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:46.419008 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qtl7h"] Apr 16 14:39:46.420231 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:39:46.420201 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod375b213e_688f_4f46_83df_921c9b71892d.slice/crio-a24c67882606042f6f7267f5a9158bb3363049a2eb1a531fdf5d9f823553f5a4 WatchSource:0}: Error finding container a24c67882606042f6f7267f5a9158bb3363049a2eb1a531fdf5d9f823553f5a4: Status 404 returned error can't find the container with id a24c67882606042f6f7267f5a9158bb3363049a2eb1a531fdf5d9f823553f5a4 Apr 16 14:39:46.907726 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:46.907699 2576 generic.go:358] "Generic (PLEG): container finished" podID="2e6fd606-91a0-489e-967d-379339d856e8" containerID="780d384bd03247c16ed6e3b76f544ae06aaf6c084ec2bf60aa20b6837e40afbc" exitCode=0 Apr 16 14:39:46.908048 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:46.907772 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595p6pj" event={"ID":"2e6fd606-91a0-489e-967d-379339d856e8","Type":"ContainerDied","Data":"780d384bd03247c16ed6e3b76f544ae06aaf6c084ec2bf60aa20b6837e40afbc"} Apr 16 14:39:46.908048 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:46.907800 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595p6pj" event={"ID":"2e6fd606-91a0-489e-967d-379339d856e8","Type":"ContainerStarted","Data":"344b91c653e4783295c13bbb68c71f59633e903d89b9281d385dee980962fe01"} Apr 16 14:39:46.909505 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:46.909484 2576 generic.go:358] "Generic (PLEG): container finished" podID="43fc5738-3bf5-475d-857f-1f781f2e55a6" containerID="3043c56f8bed52680e0c1b4ec7eca5881864865451e320fb2d89f3c3e6d170d7" exitCode=0 Apr 16 14:39:46.909672 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:46.909585 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328rl7" event={"ID":"43fc5738-3bf5-475d-857f-1f781f2e55a6","Type":"ContainerDied","Data":"3043c56f8bed52680e0c1b4ec7eca5881864865451e320fb2d89f3c3e6d170d7"} Apr 16 14:39:46.911311 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:46.911276 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12286n" event={"ID":"edf5b29e-5d29-47f7-9873-699371d62d3c","Type":"ContainerStarted","Data":"8a998100419e0444ad9e216daa2c741f44d77832f0e922d918ca742cb1352b4c"} Apr 16 14:39:46.912660 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:46.912635 2576 generic.go:358] "Generic (PLEG): container finished" podID="375b213e-688f-4f46-83df-921c9b71892d" containerID="ddb1c3e7bb7d215583a2b0fb25b1dd72392d2a367238e792085c3cb7cc585f5f" exitCode=0 Apr 16 14:39:46.912735 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:46.912716 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qtl7h" event={"ID":"375b213e-688f-4f46-83df-921c9b71892d","Type":"ContainerDied","Data":"ddb1c3e7bb7d215583a2b0fb25b1dd72392d2a367238e792085c3cb7cc585f5f"} Apr 16 14:39:46.912819 ip-10-0-128-173 kubenswrapper[2576]: I0416 
14:39:46.912744 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qtl7h" event={"ID":"375b213e-688f-4f46-83df-921c9b71892d","Type":"ContainerStarted","Data":"a24c67882606042f6f7267f5a9158bb3363049a2eb1a531fdf5d9f823553f5a4"} Apr 16 14:39:47.918239 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:47.918204 2576 generic.go:358] "Generic (PLEG): container finished" podID="2e6fd606-91a0-489e-967d-379339d856e8" containerID="970208880b428f18938451c10cac98e492a90ce52b0dd6fb6e67cd33f25ec58d" exitCode=0 Apr 16 14:39:47.918670 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:47.918281 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595p6pj" event={"ID":"2e6fd606-91a0-489e-967d-379339d856e8","Type":"ContainerDied","Data":"970208880b428f18938451c10cac98e492a90ce52b0dd6fb6e67cd33f25ec58d"} Apr 16 14:39:47.920352 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:47.920328 2576 generic.go:358] "Generic (PLEG): container finished" podID="43fc5738-3bf5-475d-857f-1f781f2e55a6" containerID="1c4af30f60d30605d88fe184f35465fb36d0e315a8f65dfd208c11a3a700973a" exitCode=0 Apr 16 14:39:47.920477 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:47.920393 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328rl7" event={"ID":"43fc5738-3bf5-475d-857f-1f781f2e55a6","Type":"ContainerDied","Data":"1c4af30f60d30605d88fe184f35465fb36d0e315a8f65dfd208c11a3a700973a"} Apr 16 14:39:47.922179 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:47.922155 2576 generic.go:358] "Generic (PLEG): container finished" podID="edf5b29e-5d29-47f7-9873-699371d62d3c" containerID="8a998100419e0444ad9e216daa2c741f44d77832f0e922d918ca742cb1352b4c" exitCode=0 Apr 16 14:39:47.922179 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:47.922177 2576 generic.go:358] "Generic (PLEG): 
container finished" podID="edf5b29e-5d29-47f7-9873-699371d62d3c" containerID="d302fe1f11d78c450567c1a41374a1a0458a621683b66a76427aecc15f561785" exitCode=0 Apr 16 14:39:47.922343 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:47.922224 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12286n" event={"ID":"edf5b29e-5d29-47f7-9873-699371d62d3c","Type":"ContainerDied","Data":"8a998100419e0444ad9e216daa2c741f44d77832f0e922d918ca742cb1352b4c"} Apr 16 14:39:47.922343 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:47.922247 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12286n" event={"ID":"edf5b29e-5d29-47f7-9873-699371d62d3c","Type":"ContainerDied","Data":"d302fe1f11d78c450567c1a41374a1a0458a621683b66a76427aecc15f561785"} Apr 16 14:39:47.923975 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:47.923956 2576 generic.go:358] "Generic (PLEG): container finished" podID="375b213e-688f-4f46-83df-921c9b71892d" containerID="20119581882f5d47ab5ade2c2c54beed21066a77f0720e9c423fcd95a2c7d4cd" exitCode=0 Apr 16 14:39:47.924075 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:47.924003 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qtl7h" event={"ID":"375b213e-688f-4f46-83df-921c9b71892d","Type":"ContainerDied","Data":"20119581882f5d47ab5ade2c2c54beed21066a77f0720e9c423fcd95a2c7d4cd"} Apr 16 14:39:48.929800 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:48.929763 2576 generic.go:358] "Generic (PLEG): container finished" podID="375b213e-688f-4f46-83df-921c9b71892d" containerID="1c75d01d032be39d780bfc11392017206e4bb4c3bbef56929e37e5607d305f03" exitCode=0 Apr 16 14:39:48.930244 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:48.929842 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qtl7h" event={"ID":"375b213e-688f-4f46-83df-921c9b71892d","Type":"ContainerDied","Data":"1c75d01d032be39d780bfc11392017206e4bb4c3bbef56929e37e5607d305f03"} Apr 16 14:39:48.931711 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:48.931689 2576 generic.go:358] "Generic (PLEG): container finished" podID="2e6fd606-91a0-489e-967d-379339d856e8" containerID="53ac48ceb567e4252a7ed6068101857e8b8a05aae0fe8152d5d2d939400309cc" exitCode=0 Apr 16 14:39:48.931822 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:48.931778 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595p6pj" event={"ID":"2e6fd606-91a0-489e-967d-379339d856e8","Type":"ContainerDied","Data":"53ac48ceb567e4252a7ed6068101857e8b8a05aae0fe8152d5d2d939400309cc"} Apr 16 14:39:49.080098 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:49.080075 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328rl7" Apr 16 14:39:49.083126 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:49.083105 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12286n" Apr 16 14:39:49.152744 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:49.152707 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/edf5b29e-5d29-47f7-9873-699371d62d3c-util\") pod \"edf5b29e-5d29-47f7-9873-699371d62d3c\" (UID: \"edf5b29e-5d29-47f7-9873-699371d62d3c\") " Apr 16 14:39:49.152744 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:49.152745 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43fc5738-3bf5-475d-857f-1f781f2e55a6-util\") pod \"43fc5738-3bf5-475d-857f-1f781f2e55a6\" (UID: \"43fc5738-3bf5-475d-857f-1f781f2e55a6\") " Apr 16 14:39:49.152984 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:49.152779 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlt9c\" (UniqueName: \"kubernetes.io/projected/43fc5738-3bf5-475d-857f-1f781f2e55a6-kube-api-access-zlt9c\") pod \"43fc5738-3bf5-475d-857f-1f781f2e55a6\" (UID: \"43fc5738-3bf5-475d-857f-1f781f2e55a6\") " Apr 16 14:39:49.152984 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:49.152806 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qfcm\" (UniqueName: \"kubernetes.io/projected/edf5b29e-5d29-47f7-9873-699371d62d3c-kube-api-access-9qfcm\") pod \"edf5b29e-5d29-47f7-9873-699371d62d3c\" (UID: \"edf5b29e-5d29-47f7-9873-699371d62d3c\") " Apr 16 14:39:49.152984 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:49.152842 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/edf5b29e-5d29-47f7-9873-699371d62d3c-bundle\") pod \"edf5b29e-5d29-47f7-9873-699371d62d3c\" (UID: \"edf5b29e-5d29-47f7-9873-699371d62d3c\") " Apr 16 14:39:49.152984 ip-10-0-128-173 
kubenswrapper[2576]: I0416 14:39:49.152881 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43fc5738-3bf5-475d-857f-1f781f2e55a6-bundle\") pod \"43fc5738-3bf5-475d-857f-1f781f2e55a6\" (UID: \"43fc5738-3bf5-475d-857f-1f781f2e55a6\") " Apr 16 14:39:49.153457 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:49.153425 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edf5b29e-5d29-47f7-9873-699371d62d3c-bundle" (OuterVolumeSpecName: "bundle") pod "edf5b29e-5d29-47f7-9873-699371d62d3c" (UID: "edf5b29e-5d29-47f7-9873-699371d62d3c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:39:49.153590 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:49.153563 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43fc5738-3bf5-475d-857f-1f781f2e55a6-bundle" (OuterVolumeSpecName: "bundle") pod "43fc5738-3bf5-475d-857f-1f781f2e55a6" (UID: "43fc5738-3bf5-475d-857f-1f781f2e55a6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:39:49.155656 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:49.155624 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43fc5738-3bf5-475d-857f-1f781f2e55a6-kube-api-access-zlt9c" (OuterVolumeSpecName: "kube-api-access-zlt9c") pod "43fc5738-3bf5-475d-857f-1f781f2e55a6" (UID: "43fc5738-3bf5-475d-857f-1f781f2e55a6"). InnerVolumeSpecName "kube-api-access-zlt9c". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:39:49.155656 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:49.155631 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edf5b29e-5d29-47f7-9873-699371d62d3c-kube-api-access-9qfcm" (OuterVolumeSpecName: "kube-api-access-9qfcm") pod "edf5b29e-5d29-47f7-9873-699371d62d3c" (UID: "edf5b29e-5d29-47f7-9873-699371d62d3c"). InnerVolumeSpecName "kube-api-access-9qfcm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:39:49.159217 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:49.159192 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43fc5738-3bf5-475d-857f-1f781f2e55a6-util" (OuterVolumeSpecName: "util") pod "43fc5738-3bf5-475d-857f-1f781f2e55a6" (UID: "43fc5738-3bf5-475d-857f-1f781f2e55a6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:39:49.159961 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:49.159937 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edf5b29e-5d29-47f7-9873-699371d62d3c-util" (OuterVolumeSpecName: "util") pod "edf5b29e-5d29-47f7-9873-699371d62d3c" (UID: "edf5b29e-5d29-47f7-9873-699371d62d3c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:39:49.254496 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:49.254409 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43fc5738-3bf5-475d-857f-1f781f2e55a6-bundle\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:39:49.254496 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:49.254440 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/edf5b29e-5d29-47f7-9873-699371d62d3c-util\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:39:49.254496 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:49.254452 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43fc5738-3bf5-475d-857f-1f781f2e55a6-util\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:39:49.254496 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:49.254461 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zlt9c\" (UniqueName: \"kubernetes.io/projected/43fc5738-3bf5-475d-857f-1f781f2e55a6-kube-api-access-zlt9c\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:39:49.254496 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:49.254470 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9qfcm\" (UniqueName: \"kubernetes.io/projected/edf5b29e-5d29-47f7-9873-699371d62d3c-kube-api-access-9qfcm\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:39:49.254496 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:49.254479 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/edf5b29e-5d29-47f7-9873-699371d62d3c-bundle\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:39:49.940237 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:49.940195 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328rl7" event={"ID":"43fc5738-3bf5-475d-857f-1f781f2e55a6","Type":"ContainerDied","Data":"b8aae9bb88bf4ca55c3275b833882c77cc20766a7f99ffbdbac736fd62bf9161"} Apr 16 14:39:49.940237 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:49.940235 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8aae9bb88bf4ca55c3275b833882c77cc20766a7f99ffbdbac736fd62bf9161" Apr 16 14:39:49.940778 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:49.940253 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328rl7" Apr 16 14:39:49.941951 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:49.941910 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12286n" event={"ID":"edf5b29e-5d29-47f7-9873-699371d62d3c","Type":"ContainerDied","Data":"ff3822d8c183ce883812e1c3c659d76fc5e9a1e2040081ea77ea9c0b7dbcc6d9"} Apr 16 14:39:49.941951 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:49.941946 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12286n" Apr 16 14:39:49.941951 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:49.941953 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff3822d8c183ce883812e1c3c659d76fc5e9a1e2040081ea77ea9c0b7dbcc6d9" Apr 16 14:39:50.091632 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:50.091609 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595p6pj" Apr 16 14:39:50.094768 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:50.094750 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qtl7h" Apr 16 14:39:50.161021 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:50.160988 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e6fd606-91a0-489e-967d-379339d856e8-util\") pod \"2e6fd606-91a0-489e-967d-379339d856e8\" (UID: \"2e6fd606-91a0-489e-967d-379339d856e8\") " Apr 16 14:39:50.161188 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:50.161033 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e6fd606-91a0-489e-967d-379339d856e8-bundle\") pod \"2e6fd606-91a0-489e-967d-379339d856e8\" (UID: \"2e6fd606-91a0-489e-967d-379339d856e8\") " Apr 16 14:39:50.161188 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:50.161077 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8r94\" (UniqueName: \"kubernetes.io/projected/2e6fd606-91a0-489e-967d-379339d856e8-kube-api-access-w8r94\") pod \"2e6fd606-91a0-489e-967d-379339d856e8\" (UID: \"2e6fd606-91a0-489e-967d-379339d856e8\") " Apr 16 14:39:50.161559 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:50.161517 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e6fd606-91a0-489e-967d-379339d856e8-bundle" (OuterVolumeSpecName: "bundle") pod "2e6fd606-91a0-489e-967d-379339d856e8" (UID: "2e6fd606-91a0-489e-967d-379339d856e8"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:39:50.163148 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:50.163113 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e6fd606-91a0-489e-967d-379339d856e8-kube-api-access-w8r94" (OuterVolumeSpecName: "kube-api-access-w8r94") pod "2e6fd606-91a0-489e-967d-379339d856e8" (UID: "2e6fd606-91a0-489e-967d-379339d856e8"). InnerVolumeSpecName "kube-api-access-w8r94". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:39:50.166086 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:50.166064 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e6fd606-91a0-489e-967d-379339d856e8-util" (OuterVolumeSpecName: "util") pod "2e6fd606-91a0-489e-967d-379339d856e8" (UID: "2e6fd606-91a0-489e-967d-379339d856e8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:39:50.262339 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:50.262249 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vn5ls\" (UniqueName: \"kubernetes.io/projected/375b213e-688f-4f46-83df-921c9b71892d-kube-api-access-vn5ls\") pod \"375b213e-688f-4f46-83df-921c9b71892d\" (UID: \"375b213e-688f-4f46-83df-921c9b71892d\") " Apr 16 14:39:50.262339 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:50.262312 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/375b213e-688f-4f46-83df-921c9b71892d-util\") pod \"375b213e-688f-4f46-83df-921c9b71892d\" (UID: \"375b213e-688f-4f46-83df-921c9b71892d\") " Apr 16 14:39:50.262584 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:50.262393 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/375b213e-688f-4f46-83df-921c9b71892d-bundle\") pod 
\"375b213e-688f-4f46-83df-921c9b71892d\" (UID: \"375b213e-688f-4f46-83df-921c9b71892d\") " Apr 16 14:39:50.264026 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:50.263025 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2e6fd606-91a0-489e-967d-379339d856e8-util\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:39:50.264026 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:50.263069 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2e6fd606-91a0-489e-967d-379339d856e8-bundle\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:39:50.264026 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:50.263087 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w8r94\" (UniqueName: \"kubernetes.io/projected/2e6fd606-91a0-489e-967d-379339d856e8-kube-api-access-w8r94\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:39:50.264026 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:50.263297 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/375b213e-688f-4f46-83df-921c9b71892d-bundle" (OuterVolumeSpecName: "bundle") pod "375b213e-688f-4f46-83df-921c9b71892d" (UID: "375b213e-688f-4f46-83df-921c9b71892d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:39:50.269672 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:50.265292 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/375b213e-688f-4f46-83df-921c9b71892d-kube-api-access-vn5ls" (OuterVolumeSpecName: "kube-api-access-vn5ls") pod "375b213e-688f-4f46-83df-921c9b71892d" (UID: "375b213e-688f-4f46-83df-921c9b71892d"). InnerVolumeSpecName "kube-api-access-vn5ls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:39:50.270524 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:50.270495 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/375b213e-688f-4f46-83df-921c9b71892d-util" (OuterVolumeSpecName: "util") pod "375b213e-688f-4f46-83df-921c9b71892d" (UID: "375b213e-688f-4f46-83df-921c9b71892d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:39:50.363750 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:50.363710 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/375b213e-688f-4f46-83df-921c9b71892d-bundle\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:39:50.363750 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:50.363743 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vn5ls\" (UniqueName: \"kubernetes.io/projected/375b213e-688f-4f46-83df-921c9b71892d-kube-api-access-vn5ls\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:39:50.363750 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:50.363753 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/375b213e-688f-4f46-83df-921c9b71892d-util\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:39:50.948743 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:50.948700 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595p6pj" event={"ID":"2e6fd606-91a0-489e-967d-379339d856e8","Type":"ContainerDied","Data":"344b91c653e4783295c13bbb68c71f59633e903d89b9281d385dee980962fe01"} Apr 16 14:39:50.948743 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:50.948739 2576 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="344b91c653e4783295c13bbb68c71f59633e903d89b9281d385dee980962fe01" Apr 16 14:39:50.948743 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:50.948744 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595p6pj" Apr 16 14:39:50.950468 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:50.950441 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qtl7h" event={"ID":"375b213e-688f-4f46-83df-921c9b71892d","Type":"ContainerDied","Data":"a24c67882606042f6f7267f5a9158bb3363049a2eb1a531fdf5d9f823553f5a4"} Apr 16 14:39:50.950468 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:50.950459 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qtl7h" Apr 16 14:39:50.950662 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:39:50.950471 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a24c67882606042f6f7267f5a9158bb3363049a2eb1a531fdf5d9f823553f5a4" Apr 16 14:40:11.020613 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.020507 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-r8lmr"] Apr 16 14:40:11.051371 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.020960 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43fc5738-3bf5-475d-857f-1f781f2e55a6" containerName="util" Apr 16 14:40:11.051371 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.020979 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="43fc5738-3bf5-475d-857f-1f781f2e55a6" containerName="util" Apr 16 14:40:11.051371 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.020992 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="43fc5738-3bf5-475d-857f-1f781f2e55a6" containerName="extract" Apr 16 14:40:11.051371 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.021000 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="43fc5738-3bf5-475d-857f-1f781f2e55a6" containerName="extract" Apr 16 14:40:11.051371 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.021011 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e6fd606-91a0-489e-967d-379339d856e8" containerName="extract" Apr 16 14:40:11.051371 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.021019 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e6fd606-91a0-489e-967d-379339d856e8" containerName="extract" Apr 16 14:40:11.051371 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.021030 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="375b213e-688f-4f46-83df-921c9b71892d" containerName="pull" Apr 16 14:40:11.051371 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.021038 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="375b213e-688f-4f46-83df-921c9b71892d" containerName="pull" Apr 16 14:40:11.051371 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.021046 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="375b213e-688f-4f46-83df-921c9b71892d" containerName="extract" Apr 16 14:40:11.051371 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.021055 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="375b213e-688f-4f46-83df-921c9b71892d" containerName="extract" Apr 16 14:40:11.051371 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.021069 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43fc5738-3bf5-475d-857f-1f781f2e55a6" containerName="pull" Apr 16 14:40:11.051371 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.021077 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="43fc5738-3bf5-475d-857f-1f781f2e55a6" containerName="pull" Apr 16 
14:40:11.051371 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.021096 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="edf5b29e-5d29-47f7-9873-699371d62d3c" containerName="extract" Apr 16 14:40:11.051371 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.021104 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="edf5b29e-5d29-47f7-9873-699371d62d3c" containerName="extract" Apr 16 14:40:11.051371 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.021117 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="edf5b29e-5d29-47f7-9873-699371d62d3c" containerName="util" Apr 16 14:40:11.051371 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.021128 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="edf5b29e-5d29-47f7-9873-699371d62d3c" containerName="util" Apr 16 14:40:11.051371 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.021144 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e6fd606-91a0-489e-967d-379339d856e8" containerName="util" Apr 16 14:40:11.051371 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.021155 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e6fd606-91a0-489e-967d-379339d856e8" containerName="util" Apr 16 14:40:11.051371 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.021175 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="375b213e-688f-4f46-83df-921c9b71892d" containerName="util" Apr 16 14:40:11.051371 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.021187 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="375b213e-688f-4f46-83df-921c9b71892d" containerName="util" Apr 16 14:40:11.051371 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.021204 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e6fd606-91a0-489e-967d-379339d856e8" containerName="pull" Apr 16 14:40:11.051371 ip-10-0-128-173 kubenswrapper[2576]: I0416 
14:40:11.021215 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e6fd606-91a0-489e-967d-379339d856e8" containerName="pull" Apr 16 14:40:11.051371 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.021232 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="edf5b29e-5d29-47f7-9873-699371d62d3c" containerName="pull" Apr 16 14:40:11.051371 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.021248 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="edf5b29e-5d29-47f7-9873-699371d62d3c" containerName="pull" Apr 16 14:40:11.051371 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.021355 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="43fc5738-3bf5-475d-857f-1f781f2e55a6" containerName="extract" Apr 16 14:40:11.051371 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.021370 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2e6fd606-91a0-489e-967d-379339d856e8" containerName="extract" Apr 16 14:40:11.051371 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.021381 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="edf5b29e-5d29-47f7-9873-699371d62d3c" containerName="extract" Apr 16 14:40:11.051371 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.021393 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="375b213e-688f-4f46-83df-921c9b71892d" containerName="extract" Apr 16 14:40:11.051371 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.024736 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-r8lmr" Apr 16 14:40:11.051371 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.027976 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 16 14:40:11.051371 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.028098 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 16 14:40:11.051371 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.028153 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 14:40:11.051371 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.028924 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 14:40:11.051371 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.028933 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-r2tmc\"" Apr 16 14:40:11.051371 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.033995 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-r8lmr"] Apr 16 14:40:11.051371 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.044459 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/bb657765-3944-4f5d-b443-7325b472bdc7-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-r8lmr\" (UID: \"bb657765-3944-4f5d-b443-7325b472bdc7\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-r8lmr" Apr 16 14:40:11.052764 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.044562 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" 
(UniqueName: \"kubernetes.io/configmap/bb657765-3944-4f5d-b443-7325b472bdc7-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-r8lmr\" (UID: \"bb657765-3944-4f5d-b443-7325b472bdc7\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-r8lmr" Apr 16 14:40:11.052764 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.044582 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdhtq\" (UniqueName: \"kubernetes.io/projected/bb657765-3944-4f5d-b443-7325b472bdc7-kube-api-access-hdhtq\") pod \"kuadrant-console-plugin-6cb54b5c86-r8lmr\" (UID: \"bb657765-3944-4f5d-b443-7325b472bdc7\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-r8lmr" Apr 16 14:40:11.145986 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.145950 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/bb657765-3944-4f5d-b443-7325b472bdc7-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-r8lmr\" (UID: \"bb657765-3944-4f5d-b443-7325b472bdc7\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-r8lmr" Apr 16 14:40:11.145986 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.145985 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hdhtq\" (UniqueName: \"kubernetes.io/projected/bb657765-3944-4f5d-b443-7325b472bdc7-kube-api-access-hdhtq\") pod \"kuadrant-console-plugin-6cb54b5c86-r8lmr\" (UID: \"bb657765-3944-4f5d-b443-7325b472bdc7\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-r8lmr" Apr 16 14:40:11.146247 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.146040 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/bb657765-3944-4f5d-b443-7325b472bdc7-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-r8lmr\" (UID: \"bb657765-3944-4f5d-b443-7325b472bdc7\") " 
pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-r8lmr" Apr 16 14:40:11.146247 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:40:11.146138 2576 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 16 14:40:11.146247 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:40:11.146217 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bb657765-3944-4f5d-b443-7325b472bdc7-plugin-serving-cert podName:bb657765-3944-4f5d-b443-7325b472bdc7 nodeName:}" failed. No retries permitted until 2026-04-16 14:40:11.646195643 +0000 UTC m=+627.472906038 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/bb657765-3944-4f5d-b443-7325b472bdc7-plugin-serving-cert") pod "kuadrant-console-plugin-6cb54b5c86-r8lmr" (UID: "bb657765-3944-4f5d-b443-7325b472bdc7") : secret "plugin-serving-cert" not found Apr 16 14:40:11.146698 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.146673 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/bb657765-3944-4f5d-b443-7325b472bdc7-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-r8lmr\" (UID: \"bb657765-3944-4f5d-b443-7325b472bdc7\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-r8lmr" Apr 16 14:40:11.157944 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.157921 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdhtq\" (UniqueName: \"kubernetes.io/projected/bb657765-3944-4f5d-b443-7325b472bdc7-kube-api-access-hdhtq\") pod \"kuadrant-console-plugin-6cb54b5c86-r8lmr\" (UID: \"bb657765-3944-4f5d-b443-7325b472bdc7\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-r8lmr" Apr 16 14:40:11.651589 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.651546 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/bb657765-3944-4f5d-b443-7325b472bdc7-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-r8lmr\" (UID: \"bb657765-3944-4f5d-b443-7325b472bdc7\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-r8lmr" Apr 16 14:40:11.653983 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.653963 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/bb657765-3944-4f5d-b443-7325b472bdc7-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-r8lmr\" (UID: \"bb657765-3944-4f5d-b443-7325b472bdc7\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-r8lmr" Apr 16 14:40:11.934486 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:11.934394 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-r8lmr" Apr 16 14:40:12.063588 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:12.063559 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-r8lmr"] Apr 16 14:40:12.065251 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:40:12.065226 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb657765_3944_4f5d_b443_7325b472bdc7.slice/crio-143b0a15b0c9594efec5bc3873d3e4a3069f5656556690710f00fd07c89a833d WatchSource:0}: Error finding container 143b0a15b0c9594efec5bc3873d3e4a3069f5656556690710f00fd07c89a833d: Status 404 returned error can't find the container with id 143b0a15b0c9594efec5bc3873d3e4a3069f5656556690710f00fd07c89a833d Apr 16 14:40:12.066475 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:12.066455 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:40:13.033680 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:13.033642 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-r8lmr" event={"ID":"bb657765-3944-4f5d-b443-7325b472bdc7","Type":"ContainerStarted","Data":"143b0a15b0c9594efec5bc3873d3e4a3069f5656556690710f00fd07c89a833d"} Apr 16 14:40:38.145995 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:38.145954 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-r8lmr" event={"ID":"bb657765-3944-4f5d-b443-7325b472bdc7","Type":"ContainerStarted","Data":"bce195f8a94611f494014a61c8112135c47224ab3e2f06c775bbf1317f1d3f23"} Apr 16 14:40:38.165937 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:38.165894 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-r8lmr" podStartSLOduration=2.699865596 podStartE2EDuration="28.165880305s" podCreationTimestamp="2026-04-16 14:40:10 +0000 UTC" firstStartedPulling="2026-04-16 14:40:12.06660552 +0000 UTC m=+627.893315915" lastFinishedPulling="2026-04-16 14:40:37.532620225 +0000 UTC m=+653.359330624" observedRunningTime="2026-04-16 14:40:38.163624851 +0000 UTC m=+653.990335268" watchObservedRunningTime="2026-04-16 14:40:38.165880305 +0000 UTC m=+653.992590755" Apr 16 14:40:58.804464 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:58.804430 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-qt8rr"] Apr 16 14:40:58.949234 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:58.949203 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-qt8rr"] Apr 16 14:40:58.949234 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:58.949233 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-hlq5k"] Apr 16 14:40:58.949462 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:58.949369 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-qt8rr" Apr 16 14:40:58.952010 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:58.951986 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-gztnx\"" Apr 16 14:40:58.974738 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:58.974711 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-hlq5k"] Apr 16 14:40:58.974862 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:58.974805 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-hlq5k" Apr 16 14:40:58.980113 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:58.980082 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2n8r\" (UniqueName: \"kubernetes.io/projected/a3885709-cdbe-4e6b-a1c0-34761e2f7ac8-kube-api-access-g2n8r\") pod \"authorino-f99f4b5cd-qt8rr\" (UID: \"a3885709-cdbe-4e6b-a1c0-34761e2f7ac8\") " pod="kuadrant-system/authorino-f99f4b5cd-qt8rr" Apr 16 14:40:59.080726 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:59.080646 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bpdp\" (UniqueName: \"kubernetes.io/projected/901cc156-adbd-4b4a-ba7f-6ec0ec0f9a0f-kube-api-access-5bpdp\") pod \"authorino-7498df8756-hlq5k\" (UID: \"901cc156-adbd-4b4a-ba7f-6ec0ec0f9a0f\") " pod="kuadrant-system/authorino-7498df8756-hlq5k" Apr 16 14:40:59.080726 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:59.080687 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g2n8r\" (UniqueName: \"kubernetes.io/projected/a3885709-cdbe-4e6b-a1c0-34761e2f7ac8-kube-api-access-g2n8r\") pod \"authorino-f99f4b5cd-qt8rr\" (UID: \"a3885709-cdbe-4e6b-a1c0-34761e2f7ac8\") " pod="kuadrant-system/authorino-f99f4b5cd-qt8rr" Apr 
16 14:40:59.089767 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:59.089734 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2n8r\" (UniqueName: \"kubernetes.io/projected/a3885709-cdbe-4e6b-a1c0-34761e2f7ac8-kube-api-access-g2n8r\") pod \"authorino-f99f4b5cd-qt8rr\" (UID: \"a3885709-cdbe-4e6b-a1c0-34761e2f7ac8\") " pod="kuadrant-system/authorino-f99f4b5cd-qt8rr" Apr 16 14:40:59.181757 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:59.181722 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5bpdp\" (UniqueName: \"kubernetes.io/projected/901cc156-adbd-4b4a-ba7f-6ec0ec0f9a0f-kube-api-access-5bpdp\") pod \"authorino-7498df8756-hlq5k\" (UID: \"901cc156-adbd-4b4a-ba7f-6ec0ec0f9a0f\") " pod="kuadrant-system/authorino-7498df8756-hlq5k" Apr 16 14:40:59.190167 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:59.190137 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bpdp\" (UniqueName: \"kubernetes.io/projected/901cc156-adbd-4b4a-ba7f-6ec0ec0f9a0f-kube-api-access-5bpdp\") pod \"authorino-7498df8756-hlq5k\" (UID: \"901cc156-adbd-4b4a-ba7f-6ec0ec0f9a0f\") " pod="kuadrant-system/authorino-7498df8756-hlq5k" Apr 16 14:40:59.259433 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:59.259392 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-qt8rr" Apr 16 14:40:59.283442 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:59.283409 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-hlq5k" Apr 16 14:40:59.418755 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:59.418719 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-qt8rr"] Apr 16 14:40:59.420170 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:40:59.420140 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3885709_cdbe_4e6b_a1c0_34761e2f7ac8.slice/crio-60b2207e4617ab2f00735cf6d93d0609758c9e3aef6e63ef1bd389582e0529aa WatchSource:0}: Error finding container 60b2207e4617ab2f00735cf6d93d0609758c9e3aef6e63ef1bd389582e0529aa: Status 404 returned error can't find the container with id 60b2207e4617ab2f00735cf6d93d0609758c9e3aef6e63ef1bd389582e0529aa Apr 16 14:40:59.436833 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:40:59.436805 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-hlq5k"] Apr 16 14:40:59.438101 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:40:59.438078 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod901cc156_adbd_4b4a_ba7f_6ec0ec0f9a0f.slice/crio-cc0d930effa9c16cdf426457657e4e9f1054969face3c93db7b67ee13915e561 WatchSource:0}: Error finding container cc0d930effa9c16cdf426457657e4e9f1054969face3c93db7b67ee13915e561: Status 404 returned error can't find the container with id cc0d930effa9c16cdf426457657e4e9f1054969face3c93db7b67ee13915e561 Apr 16 14:41:00.228453 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:00.228420 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-hlq5k" event={"ID":"901cc156-adbd-4b4a-ba7f-6ec0ec0f9a0f","Type":"ContainerStarted","Data":"cc0d930effa9c16cdf426457657e4e9f1054969face3c93db7b67ee13915e561"} Apr 16 14:41:00.229683 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:00.229658 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-qt8rr" event={"ID":"a3885709-cdbe-4e6b-a1c0-34761e2f7ac8","Type":"ContainerStarted","Data":"60b2207e4617ab2f00735cf6d93d0609758c9e3aef6e63ef1bd389582e0529aa"} Apr 16 14:41:03.242706 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:03.242661 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-hlq5k" event={"ID":"901cc156-adbd-4b4a-ba7f-6ec0ec0f9a0f","Type":"ContainerStarted","Data":"7d61970b75ce4a130f3179b48cfeec7219fdf0946177f6148ebef168e9451476"} Apr 16 14:41:03.244055 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:03.244028 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-qt8rr" event={"ID":"a3885709-cdbe-4e6b-a1c0-34761e2f7ac8","Type":"ContainerStarted","Data":"d465835081146d3338fa1dd74a70c9b41665df51a9b3d0e5720329e05bd3c367"} Apr 16 14:41:03.263488 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:03.263441 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-hlq5k" podStartSLOduration=2.25124594 podStartE2EDuration="5.263426939s" podCreationTimestamp="2026-04-16 14:40:58 +0000 UTC" firstStartedPulling="2026-04-16 14:40:59.439441818 +0000 UTC m=+675.266152213" lastFinishedPulling="2026-04-16 14:41:02.451622805 +0000 UTC m=+678.278333212" observedRunningTime="2026-04-16 14:41:03.261526829 +0000 UTC m=+679.088237247" watchObservedRunningTime="2026-04-16 14:41:03.263426939 +0000 UTC m=+679.090137356" Apr 16 14:41:03.276695 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:03.276648 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-qt8rr" podStartSLOduration=2.23814171 podStartE2EDuration="5.276631226s" podCreationTimestamp="2026-04-16 14:40:58 +0000 UTC" firstStartedPulling="2026-04-16 14:40:59.421607647 +0000 UTC m=+675.248318043" 
lastFinishedPulling="2026-04-16 14:41:02.460097163 +0000 UTC m=+678.286807559" observedRunningTime="2026-04-16 14:41:03.276493556 +0000 UTC m=+679.103203977" watchObservedRunningTime="2026-04-16 14:41:03.276631226 +0000 UTC m=+679.103341643" Apr 16 14:41:03.309290 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:03.309255 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-qt8rr"] Apr 16 14:41:05.250468 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:05.250430 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-qt8rr" podUID="a3885709-cdbe-4e6b-a1c0-34761e2f7ac8" containerName="authorino" containerID="cri-o://d465835081146d3338fa1dd74a70c9b41665df51a9b3d0e5720329e05bd3c367" gracePeriod=30 Apr 16 14:41:06.032925 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:06.032902 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-qt8rr" Apr 16 14:41:06.147778 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:06.147738 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2n8r\" (UniqueName: \"kubernetes.io/projected/a3885709-cdbe-4e6b-a1c0-34761e2f7ac8-kube-api-access-g2n8r\") pod \"a3885709-cdbe-4e6b-a1c0-34761e2f7ac8\" (UID: \"a3885709-cdbe-4e6b-a1c0-34761e2f7ac8\") " Apr 16 14:41:06.149862 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:06.149835 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3885709-cdbe-4e6b-a1c0-34761e2f7ac8-kube-api-access-g2n8r" (OuterVolumeSpecName: "kube-api-access-g2n8r") pod "a3885709-cdbe-4e6b-a1c0-34761e2f7ac8" (UID: "a3885709-cdbe-4e6b-a1c0-34761e2f7ac8"). InnerVolumeSpecName "kube-api-access-g2n8r". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:41:06.248995 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:06.248912 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g2n8r\" (UniqueName: \"kubernetes.io/projected/a3885709-cdbe-4e6b-a1c0-34761e2f7ac8-kube-api-access-g2n8r\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:41:06.255086 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:06.255058 2576 generic.go:358] "Generic (PLEG): container finished" podID="a3885709-cdbe-4e6b-a1c0-34761e2f7ac8" containerID="d465835081146d3338fa1dd74a70c9b41665df51a9b3d0e5720329e05bd3c367" exitCode=0 Apr 16 14:41:06.255448 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:06.255110 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-qt8rr" Apr 16 14:41:06.255448 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:06.255145 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-qt8rr" event={"ID":"a3885709-cdbe-4e6b-a1c0-34761e2f7ac8","Type":"ContainerDied","Data":"d465835081146d3338fa1dd74a70c9b41665df51a9b3d0e5720329e05bd3c367"} Apr 16 14:41:06.255448 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:06.255180 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-qt8rr" event={"ID":"a3885709-cdbe-4e6b-a1c0-34761e2f7ac8","Type":"ContainerDied","Data":"60b2207e4617ab2f00735cf6d93d0609758c9e3aef6e63ef1bd389582e0529aa"} Apr 16 14:41:06.255448 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:06.255196 2576 scope.go:117] "RemoveContainer" containerID="d465835081146d3338fa1dd74a70c9b41665df51a9b3d0e5720329e05bd3c367" Apr 16 14:41:06.264744 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:06.264724 2576 scope.go:117] "RemoveContainer" containerID="d465835081146d3338fa1dd74a70c9b41665df51a9b3d0e5720329e05bd3c367" Apr 16 14:41:06.264989 ip-10-0-128-173 kubenswrapper[2576]: E0416 
14:41:06.264970 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d465835081146d3338fa1dd74a70c9b41665df51a9b3d0e5720329e05bd3c367\": container with ID starting with d465835081146d3338fa1dd74a70c9b41665df51a9b3d0e5720329e05bd3c367 not found: ID does not exist" containerID="d465835081146d3338fa1dd74a70c9b41665df51a9b3d0e5720329e05bd3c367" Apr 16 14:41:06.265033 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:06.264999 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d465835081146d3338fa1dd74a70c9b41665df51a9b3d0e5720329e05bd3c367"} err="failed to get container status \"d465835081146d3338fa1dd74a70c9b41665df51a9b3d0e5720329e05bd3c367\": rpc error: code = NotFound desc = could not find container \"d465835081146d3338fa1dd74a70c9b41665df51a9b3d0e5720329e05bd3c367\": container with ID starting with d465835081146d3338fa1dd74a70c9b41665df51a9b3d0e5720329e05bd3c367 not found: ID does not exist" Apr 16 14:41:06.277689 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:06.277661 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-qt8rr"] Apr 16 14:41:06.283445 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:06.283425 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-qt8rr"] Apr 16 14:41:06.783066 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:06.783032 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3885709-cdbe-4e6b-a1c0-34761e2f7ac8" path="/var/lib/kubelet/pods/a3885709-cdbe-4e6b-a1c0-34761e2f7ac8/volumes" Apr 16 14:41:32.987269 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:32.987234 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-f5h67"] Apr 16 14:41:32.987671 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:32.987600 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="a3885709-cdbe-4e6b-a1c0-34761e2f7ac8" containerName="authorino" Apr 16 14:41:32.987671 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:32.987613 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3885709-cdbe-4e6b-a1c0-34761e2f7ac8" containerName="authorino" Apr 16 14:41:32.987747 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:32.987674 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a3885709-cdbe-4e6b-a1c0-34761e2f7ac8" containerName="authorino" Apr 16 14:41:33.039089 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:33.039049 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-f5h67"] Apr 16 14:41:33.039254 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:33.039205 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-f5h67" Apr 16 14:41:33.187298 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:33.187258 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcjzk\" (UniqueName: \"kubernetes.io/projected/4ab0e792-1dcf-493b-bb3c-cfc63047f770-kube-api-access-fcjzk\") pod \"authorino-8b475cf9f-f5h67\" (UID: \"4ab0e792-1dcf-493b-bb3c-cfc63047f770\") " pod="kuadrant-system/authorino-8b475cf9f-f5h67" Apr 16 14:41:33.210991 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:33.210949 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-f5h67"] Apr 16 14:41:33.211248 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:41:33.211224 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-fcjzk], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-8b475cf9f-f5h67" podUID="4ab0e792-1dcf-493b-bb3c-cfc63047f770" Apr 16 14:41:33.243455 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:33.243388 2576 kubelet.go:2537] "SyncLoop 
ADD" source="api" pods=["kuadrant-system/authorino-7556f57795-zgwpl"] Apr 16 14:41:33.246249 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:33.246232 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7556f57795-zgwpl" Apr 16 14:41:33.253871 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:33.253848 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7556f57795-zgwpl"] Apr 16 14:41:33.288083 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:33.288041 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fcjzk\" (UniqueName: \"kubernetes.io/projected/4ab0e792-1dcf-493b-bb3c-cfc63047f770-kube-api-access-fcjzk\") pod \"authorino-8b475cf9f-f5h67\" (UID: \"4ab0e792-1dcf-493b-bb3c-cfc63047f770\") " pod="kuadrant-system/authorino-8b475cf9f-f5h67" Apr 16 14:41:33.296785 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:33.296755 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcjzk\" (UniqueName: \"kubernetes.io/projected/4ab0e792-1dcf-493b-bb3c-cfc63047f770-kube-api-access-fcjzk\") pod \"authorino-8b475cf9f-f5h67\" (UID: \"4ab0e792-1dcf-493b-bb3c-cfc63047f770\") " pod="kuadrant-system/authorino-8b475cf9f-f5h67" Apr 16 14:41:33.363791 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:33.363718 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-f5h67" Apr 16 14:41:33.368638 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:33.368611 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-f5h67" Apr 16 14:41:33.389128 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:33.389100 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nglzj\" (UniqueName: \"kubernetes.io/projected/9ea2f0f9-7538-4f42-b50a-4e842a655ec3-kube-api-access-nglzj\") pod \"authorino-7556f57795-zgwpl\" (UID: \"9ea2f0f9-7538-4f42-b50a-4e842a655ec3\") " pod="kuadrant-system/authorino-7556f57795-zgwpl" Apr 16 14:41:33.400963 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:33.400931 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7556f57795-zgwpl"] Apr 16 14:41:33.401160 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:41:33.401142 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-nglzj], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-7556f57795-zgwpl" podUID="9ea2f0f9-7538-4f42-b50a-4e842a655ec3" Apr 16 14:41:33.435339 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:33.435289 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7588c67995-wqwhq"] Apr 16 14:41:33.438171 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:33.438150 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7588c67995-wqwhq" Apr 16 14:41:33.440786 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:33.440764 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 16 14:41:33.451433 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:33.450900 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7588c67995-wqwhq"] Apr 16 14:41:33.489956 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:33.489920 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcjzk\" (UniqueName: \"kubernetes.io/projected/4ab0e792-1dcf-493b-bb3c-cfc63047f770-kube-api-access-fcjzk\") pod \"4ab0e792-1dcf-493b-bb3c-cfc63047f770\" (UID: \"4ab0e792-1dcf-493b-bb3c-cfc63047f770\") " Apr 16 14:41:33.490152 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:33.490074 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nglzj\" (UniqueName: \"kubernetes.io/projected/9ea2f0f9-7538-4f42-b50a-4e842a655ec3-kube-api-access-nglzj\") pod \"authorino-7556f57795-zgwpl\" (UID: \"9ea2f0f9-7538-4f42-b50a-4e842a655ec3\") " pod="kuadrant-system/authorino-7556f57795-zgwpl" Apr 16 14:41:33.492181 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:33.492149 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ab0e792-1dcf-493b-bb3c-cfc63047f770-kube-api-access-fcjzk" (OuterVolumeSpecName: "kube-api-access-fcjzk") pod "4ab0e792-1dcf-493b-bb3c-cfc63047f770" (UID: "4ab0e792-1dcf-493b-bb3c-cfc63047f770"). InnerVolumeSpecName "kube-api-access-fcjzk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:41:33.499188 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:33.499166 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nglzj\" (UniqueName: \"kubernetes.io/projected/9ea2f0f9-7538-4f42-b50a-4e842a655ec3-kube-api-access-nglzj\") pod \"authorino-7556f57795-zgwpl\" (UID: \"9ea2f0f9-7538-4f42-b50a-4e842a655ec3\") " pod="kuadrant-system/authorino-7556f57795-zgwpl" Apr 16 14:41:33.591449 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:33.591411 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xh68\" (UniqueName: \"kubernetes.io/projected/18ba0e62-34a8-4ada-abe9-de9e4362ea15-kube-api-access-9xh68\") pod \"authorino-7588c67995-wqwhq\" (UID: \"18ba0e62-34a8-4ada-abe9-de9e4362ea15\") " pod="kuadrant-system/authorino-7588c67995-wqwhq" Apr 16 14:41:33.591676 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:33.591494 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/18ba0e62-34a8-4ada-abe9-de9e4362ea15-tls-cert\") pod \"authorino-7588c67995-wqwhq\" (UID: \"18ba0e62-34a8-4ada-abe9-de9e4362ea15\") " pod="kuadrant-system/authorino-7588c67995-wqwhq" Apr 16 14:41:33.591676 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:33.591603 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fcjzk\" (UniqueName: \"kubernetes.io/projected/4ab0e792-1dcf-493b-bb3c-cfc63047f770-kube-api-access-fcjzk\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:41:33.692176 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:33.692128 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9xh68\" (UniqueName: \"kubernetes.io/projected/18ba0e62-34a8-4ada-abe9-de9e4362ea15-kube-api-access-9xh68\") pod \"authorino-7588c67995-wqwhq\" (UID: 
\"18ba0e62-34a8-4ada-abe9-de9e4362ea15\") " pod="kuadrant-system/authorino-7588c67995-wqwhq" Apr 16 14:41:33.692381 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:33.692187 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/18ba0e62-34a8-4ada-abe9-de9e4362ea15-tls-cert\") pod \"authorino-7588c67995-wqwhq\" (UID: \"18ba0e62-34a8-4ada-abe9-de9e4362ea15\") " pod="kuadrant-system/authorino-7588c67995-wqwhq" Apr 16 14:41:33.694595 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:33.694574 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/18ba0e62-34a8-4ada-abe9-de9e4362ea15-tls-cert\") pod \"authorino-7588c67995-wqwhq\" (UID: \"18ba0e62-34a8-4ada-abe9-de9e4362ea15\") " pod="kuadrant-system/authorino-7588c67995-wqwhq" Apr 16 14:41:33.701854 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:33.701821 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xh68\" (UniqueName: \"kubernetes.io/projected/18ba0e62-34a8-4ada-abe9-de9e4362ea15-kube-api-access-9xh68\") pod \"authorino-7588c67995-wqwhq\" (UID: \"18ba0e62-34a8-4ada-abe9-de9e4362ea15\") " pod="kuadrant-system/authorino-7588c67995-wqwhq" Apr 16 14:41:33.755765 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:33.755671 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7588c67995-wqwhq" Apr 16 14:41:33.884475 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:33.884447 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7588c67995-wqwhq"] Apr 16 14:41:34.369003 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:34.368968 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7588c67995-wqwhq" event={"ID":"18ba0e62-34a8-4ada-abe9-de9e4362ea15","Type":"ContainerStarted","Data":"a78c68ffd89e0f1cc5f5b24568f4129508e47236e4c1a2b130ea0c6c0addc45c"} Apr 16 14:41:34.369357 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:34.369009 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-f5h67" Apr 16 14:41:34.369357 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:34.369013 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7556f57795-zgwpl" Apr 16 14:41:34.374135 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:34.374111 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7556f57795-zgwpl" Apr 16 14:41:34.408340 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:34.408311 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-f5h67"] Apr 16 14:41:34.414155 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:34.414128 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-f5h67"] Apr 16 14:41:34.500120 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:34.500085 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nglzj\" (UniqueName: \"kubernetes.io/projected/9ea2f0f9-7538-4f42-b50a-4e842a655ec3-kube-api-access-nglzj\") pod \"9ea2f0f9-7538-4f42-b50a-4e842a655ec3\" (UID: \"9ea2f0f9-7538-4f42-b50a-4e842a655ec3\") " Apr 16 14:41:34.501946 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:34.501913 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ea2f0f9-7538-4f42-b50a-4e842a655ec3-kube-api-access-nglzj" (OuterVolumeSpecName: "kube-api-access-nglzj") pod "9ea2f0f9-7538-4f42-b50a-4e842a655ec3" (UID: "9ea2f0f9-7538-4f42-b50a-4e842a655ec3"). InnerVolumeSpecName "kube-api-access-nglzj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:41:34.601633 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:34.601601 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nglzj\" (UniqueName: \"kubernetes.io/projected/9ea2f0f9-7538-4f42-b50a-4e842a655ec3-kube-api-access-nglzj\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:41:34.783816 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:34.783737 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ab0e792-1dcf-493b-bb3c-cfc63047f770" path="/var/lib/kubelet/pods/4ab0e792-1dcf-493b-bb3c-cfc63047f770/volumes" Apr 16 14:41:35.373892 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:35.373860 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7588c67995-wqwhq" event={"ID":"18ba0e62-34a8-4ada-abe9-de9e4362ea15","Type":"ContainerStarted","Data":"c14ad93f5ab2dce2a7fd683ad2a2807ec99ea7c5194dbb585528a1c2fcb8eb4e"} Apr 16 14:41:35.374334 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:35.373914 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7556f57795-zgwpl" Apr 16 14:41:35.391698 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:35.391652 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7588c67995-wqwhq" podStartSLOduration=1.865227128 podStartE2EDuration="2.391636217s" podCreationTimestamp="2026-04-16 14:41:33 +0000 UTC" firstStartedPulling="2026-04-16 14:41:33.889012146 +0000 UTC m=+709.715722543" lastFinishedPulling="2026-04-16 14:41:34.415421222 +0000 UTC m=+710.242131632" observedRunningTime="2026-04-16 14:41:35.389448126 +0000 UTC m=+711.216158540" watchObservedRunningTime="2026-04-16 14:41:35.391636217 +0000 UTC m=+711.218346634" Apr 16 14:41:35.417170 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:35.417136 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7556f57795-zgwpl"] Apr 16 14:41:35.422338 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:35.422295 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-hlq5k"] Apr 16 14:41:35.422591 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:35.422560 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-hlq5k" podUID="901cc156-adbd-4b4a-ba7f-6ec0ec0f9a0f" containerName="authorino" containerID="cri-o://7d61970b75ce4a130f3179b48cfeec7219fdf0946177f6148ebef168e9451476" gracePeriod=30 Apr 16 14:41:35.425466 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:35.425442 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7556f57795-zgwpl"] Apr 16 14:41:35.673751 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:35.673721 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-hlq5k" Apr 16 14:41:35.812152 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:35.812118 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bpdp\" (UniqueName: \"kubernetes.io/projected/901cc156-adbd-4b4a-ba7f-6ec0ec0f9a0f-kube-api-access-5bpdp\") pod \"901cc156-adbd-4b4a-ba7f-6ec0ec0f9a0f\" (UID: \"901cc156-adbd-4b4a-ba7f-6ec0ec0f9a0f\") " Apr 16 14:41:35.814300 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:35.814260 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/901cc156-adbd-4b4a-ba7f-6ec0ec0f9a0f-kube-api-access-5bpdp" (OuterVolumeSpecName: "kube-api-access-5bpdp") pod "901cc156-adbd-4b4a-ba7f-6ec0ec0f9a0f" (UID: "901cc156-adbd-4b4a-ba7f-6ec0ec0f9a0f"). InnerVolumeSpecName "kube-api-access-5bpdp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:41:35.913033 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:35.912941 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5bpdp\" (UniqueName: \"kubernetes.io/projected/901cc156-adbd-4b4a-ba7f-6ec0ec0f9a0f-kube-api-access-5bpdp\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:41:36.379170 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:36.379139 2576 generic.go:358] "Generic (PLEG): container finished" podID="901cc156-adbd-4b4a-ba7f-6ec0ec0f9a0f" containerID="7d61970b75ce4a130f3179b48cfeec7219fdf0946177f6148ebef168e9451476" exitCode=0 Apr 16 14:41:36.379701 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:36.379193 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-hlq5k" Apr 16 14:41:36.379701 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:36.379225 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-hlq5k" event={"ID":"901cc156-adbd-4b4a-ba7f-6ec0ec0f9a0f","Type":"ContainerDied","Data":"7d61970b75ce4a130f3179b48cfeec7219fdf0946177f6148ebef168e9451476"} Apr 16 14:41:36.379701 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:36.379267 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-hlq5k" event={"ID":"901cc156-adbd-4b4a-ba7f-6ec0ec0f9a0f","Type":"ContainerDied","Data":"cc0d930effa9c16cdf426457657e4e9f1054969face3c93db7b67ee13915e561"} Apr 16 14:41:36.379701 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:36.379288 2576 scope.go:117] "RemoveContainer" containerID="7d61970b75ce4a130f3179b48cfeec7219fdf0946177f6148ebef168e9451476" Apr 16 14:41:36.388649 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:36.388629 2576 scope.go:117] "RemoveContainer" containerID="7d61970b75ce4a130f3179b48cfeec7219fdf0946177f6148ebef168e9451476" Apr 16 14:41:36.388907 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:41:36.388890 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d61970b75ce4a130f3179b48cfeec7219fdf0946177f6148ebef168e9451476\": container with ID starting with 7d61970b75ce4a130f3179b48cfeec7219fdf0946177f6148ebef168e9451476 not found: ID does not exist" containerID="7d61970b75ce4a130f3179b48cfeec7219fdf0946177f6148ebef168e9451476" Apr 16 14:41:36.388952 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:36.388916 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d61970b75ce4a130f3179b48cfeec7219fdf0946177f6148ebef168e9451476"} err="failed to get container status \"7d61970b75ce4a130f3179b48cfeec7219fdf0946177f6148ebef168e9451476\": rpc error: code = 
NotFound desc = could not find container \"7d61970b75ce4a130f3179b48cfeec7219fdf0946177f6148ebef168e9451476\": container with ID starting with 7d61970b75ce4a130f3179b48cfeec7219fdf0946177f6148ebef168e9451476 not found: ID does not exist" Apr 16 14:41:36.401689 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:36.401659 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-hlq5k"] Apr 16 14:41:36.405456 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:36.405435 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-hlq5k"] Apr 16 14:41:36.462064 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:36.462033 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-785774b7b6-ps6rz"] Apr 16 14:41:36.462390 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:36.462379 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="901cc156-adbd-4b4a-ba7f-6ec0ec0f9a0f" containerName="authorino" Apr 16 14:41:36.462433 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:36.462392 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="901cc156-adbd-4b4a-ba7f-6ec0ec0f9a0f" containerName="authorino" Apr 16 14:41:36.462475 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:36.462466 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="901cc156-adbd-4b4a-ba7f-6ec0ec0f9a0f" containerName="authorino" Apr 16 14:41:36.466137 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:36.466119 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-785774b7b6-ps6rz" Apr 16 14:41:36.468235 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:36.468218 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-tjcpj\"" Apr 16 14:41:36.474961 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:36.474938 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-785774b7b6-ps6rz"] Apr 16 14:41:36.619231 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:36.619193 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr4fv\" (UniqueName: \"kubernetes.io/projected/08bb2fc6-b7c5-4d6f-854a-2af65efd99c1-kube-api-access-zr4fv\") pod \"maas-controller-785774b7b6-ps6rz\" (UID: \"08bb2fc6-b7c5-4d6f-854a-2af65efd99c1\") " pod="opendatahub/maas-controller-785774b7b6-ps6rz" Apr 16 14:41:36.720342 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:36.720251 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zr4fv\" (UniqueName: \"kubernetes.io/projected/08bb2fc6-b7c5-4d6f-854a-2af65efd99c1-kube-api-access-zr4fv\") pod \"maas-controller-785774b7b6-ps6rz\" (UID: \"08bb2fc6-b7c5-4d6f-854a-2af65efd99c1\") " pod="opendatahub/maas-controller-785774b7b6-ps6rz" Apr 16 14:41:36.729209 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:36.729180 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr4fv\" (UniqueName: \"kubernetes.io/projected/08bb2fc6-b7c5-4d6f-854a-2af65efd99c1-kube-api-access-zr4fv\") pod \"maas-controller-785774b7b6-ps6rz\" (UID: \"08bb2fc6-b7c5-4d6f-854a-2af65efd99c1\") " pod="opendatahub/maas-controller-785774b7b6-ps6rz" Apr 16 14:41:36.777546 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:36.777504 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-785774b7b6-ps6rz" Apr 16 14:41:36.783109 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:36.783069 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="901cc156-adbd-4b4a-ba7f-6ec0ec0f9a0f" path="/var/lib/kubelet/pods/901cc156-adbd-4b4a-ba7f-6ec0ec0f9a0f/volumes" Apr 16 14:41:36.783482 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:36.783462 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ea2f0f9-7538-4f42-b50a-4e842a655ec3" path="/var/lib/kubelet/pods/9ea2f0f9-7538-4f42-b50a-4e842a655ec3/volumes" Apr 16 14:41:36.899737 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:36.899702 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-785774b7b6-ps6rz"] Apr 16 14:41:36.901089 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:41:36.901059 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08bb2fc6_b7c5_4d6f_854a_2af65efd99c1.slice/crio-ecd973e1106f6d4737674e589b6954a3ada39620c7d0ebe4c30498b4dd6117f2 WatchSource:0}: Error finding container ecd973e1106f6d4737674e589b6954a3ada39620c7d0ebe4c30498b4dd6117f2: Status 404 returned error can't find the container with id ecd973e1106f6d4737674e589b6954a3ada39620c7d0ebe4c30498b4dd6117f2 Apr 16 14:41:37.388045 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:37.388012 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-785774b7b6-ps6rz" event={"ID":"08bb2fc6-b7c5-4d6f-854a-2af65efd99c1","Type":"ContainerStarted","Data":"ecd973e1106f6d4737674e589b6954a3ada39620c7d0ebe4c30498b4dd6117f2"} Apr 16 14:41:40.403690 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:40.403656 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-785774b7b6-ps6rz" 
event={"ID":"08bb2fc6-b7c5-4d6f-854a-2af65efd99c1","Type":"ContainerStarted","Data":"7b9f652ae49bd7bd6adc0e088bc15f42e606b23a9712f37179ec064da2a0d0f7"} Apr 16 14:41:40.404081 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:40.403767 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-785774b7b6-ps6rz" Apr 16 14:41:40.422245 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:40.422198 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-785774b7b6-ps6rz" podStartSLOduration=1.9628168750000001 podStartE2EDuration="4.422183517s" podCreationTimestamp="2026-04-16 14:41:36 +0000 UTC" firstStartedPulling="2026-04-16 14:41:36.902408599 +0000 UTC m=+712.729118994" lastFinishedPulling="2026-04-16 14:41:39.361775233 +0000 UTC m=+715.188485636" observedRunningTime="2026-04-16 14:41:40.420703754 +0000 UTC m=+716.247414171" watchObservedRunningTime="2026-04-16 14:41:40.422183517 +0000 UTC m=+716.248893934" Apr 16 14:41:51.415317 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:51.415281 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-785774b7b6-ps6rz" Apr 16 14:41:51.772363 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:51.772274 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-54cfdc6dcf-pv79d"] Apr 16 14:41:51.777302 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:51.777284 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-54cfdc6dcf-pv79d" Apr 16 14:41:51.784001 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:51.783971 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-54cfdc6dcf-pv79d"] Apr 16 14:41:51.841454 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:51.841420 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z2q5\" (UniqueName: \"kubernetes.io/projected/39383956-71e4-486e-bd58-6751cd811608-kube-api-access-2z2q5\") pod \"maas-controller-54cfdc6dcf-pv79d\" (UID: \"39383956-71e4-486e-bd58-6751cd811608\") " pod="opendatahub/maas-controller-54cfdc6dcf-pv79d" Apr 16 14:41:51.942349 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:51.942308 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2z2q5\" (UniqueName: \"kubernetes.io/projected/39383956-71e4-486e-bd58-6751cd811608-kube-api-access-2z2q5\") pod \"maas-controller-54cfdc6dcf-pv79d\" (UID: \"39383956-71e4-486e-bd58-6751cd811608\") " pod="opendatahub/maas-controller-54cfdc6dcf-pv79d" Apr 16 14:41:51.952460 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:51.952434 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z2q5\" (UniqueName: \"kubernetes.io/projected/39383956-71e4-486e-bd58-6751cd811608-kube-api-access-2z2q5\") pod \"maas-controller-54cfdc6dcf-pv79d\" (UID: \"39383956-71e4-486e-bd58-6751cd811608\") " pod="opendatahub/maas-controller-54cfdc6dcf-pv79d" Apr 16 14:41:52.088846 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:52.088747 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-54cfdc6dcf-pv79d" Apr 16 14:41:52.222178 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:52.222147 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-54cfdc6dcf-pv79d"] Apr 16 14:41:52.224366 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:41:52.224337 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39383956_71e4_486e_bd58_6751cd811608.slice/crio-6c98b00355741f1b5d4c4fd5e6ab06acb341e64f98d23b14a45cb6de9db8d008 WatchSource:0}: Error finding container 6c98b00355741f1b5d4c4fd5e6ab06acb341e64f98d23b14a45cb6de9db8d008: Status 404 returned error can't find the container with id 6c98b00355741f1b5d4c4fd5e6ab06acb341e64f98d23b14a45cb6de9db8d008 Apr 16 14:41:52.457984 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:52.457950 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-54cfdc6dcf-pv79d" event={"ID":"39383956-71e4-486e-bd58-6751cd811608","Type":"ContainerStarted","Data":"6c98b00355741f1b5d4c4fd5e6ab06acb341e64f98d23b14a45cb6de9db8d008"} Apr 16 14:41:53.462477 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:53.462441 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-54cfdc6dcf-pv79d" event={"ID":"39383956-71e4-486e-bd58-6751cd811608","Type":"ContainerStarted","Data":"4e73d5ab25677d93da99bb590c3d7977a701e295dcc405e99d5607fc7ab27731"} Apr 16 14:41:53.462906 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:53.462572 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-54cfdc6dcf-pv79d" Apr 16 14:41:53.482119 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:41:53.482067 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-54cfdc6dcf-pv79d" podStartSLOduration=2.147442403 podStartE2EDuration="2.482052214s" 
podCreationTimestamp="2026-04-16 14:41:51 +0000 UTC" firstStartedPulling="2026-04-16 14:41:52.225720878 +0000 UTC m=+728.052431273" lastFinishedPulling="2026-04-16 14:41:52.560330675 +0000 UTC m=+728.387041084" observedRunningTime="2026-04-16 14:41:53.479261309 +0000 UTC m=+729.305971727" watchObservedRunningTime="2026-04-16 14:41:53.482052214 +0000 UTC m=+729.308762630" Apr 16 14:42:04.474168 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:42:04.474134 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-54cfdc6dcf-pv79d" Apr 16 14:42:04.520230 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:42:04.520190 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-785774b7b6-ps6rz"] Apr 16 14:42:04.520435 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:42:04.520412 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-785774b7b6-ps6rz" podUID="08bb2fc6-b7c5-4d6f-854a-2af65efd99c1" containerName="manager" containerID="cri-o://7b9f652ae49bd7bd6adc0e088bc15f42e606b23a9712f37179ec064da2a0d0f7" gracePeriod=10 Apr 16 14:42:04.766799 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:42:04.766774 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-785774b7b6-ps6rz" Apr 16 14:42:04.861165 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:42:04.861123 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zr4fv\" (UniqueName: \"kubernetes.io/projected/08bb2fc6-b7c5-4d6f-854a-2af65efd99c1-kube-api-access-zr4fv\") pod \"08bb2fc6-b7c5-4d6f-854a-2af65efd99c1\" (UID: \"08bb2fc6-b7c5-4d6f-854a-2af65efd99c1\") " Apr 16 14:42:04.863355 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:42:04.863314 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08bb2fc6-b7c5-4d6f-854a-2af65efd99c1-kube-api-access-zr4fv" (OuterVolumeSpecName: "kube-api-access-zr4fv") pod "08bb2fc6-b7c5-4d6f-854a-2af65efd99c1" (UID: "08bb2fc6-b7c5-4d6f-854a-2af65efd99c1"). InnerVolumeSpecName "kube-api-access-zr4fv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:42:04.962333 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:42:04.962294 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zr4fv\" (UniqueName: \"kubernetes.io/projected/08bb2fc6-b7c5-4d6f-854a-2af65efd99c1-kube-api-access-zr4fv\") on node \"ip-10-0-128-173.ec2.internal\" DevicePath \"\"" Apr 16 14:42:05.507661 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:42:05.507625 2576 generic.go:358] "Generic (PLEG): container finished" podID="08bb2fc6-b7c5-4d6f-854a-2af65efd99c1" containerID="7b9f652ae49bd7bd6adc0e088bc15f42e606b23a9712f37179ec064da2a0d0f7" exitCode=0 Apr 16 14:42:05.508115 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:42:05.507692 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-785774b7b6-ps6rz" event={"ID":"08bb2fc6-b7c5-4d6f-854a-2af65efd99c1","Type":"ContainerDied","Data":"7b9f652ae49bd7bd6adc0e088bc15f42e606b23a9712f37179ec064da2a0d0f7"} Apr 16 14:42:05.508115 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:42:05.507719 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="opendatahub/maas-controller-785774b7b6-ps6rz" event={"ID":"08bb2fc6-b7c5-4d6f-854a-2af65efd99c1","Type":"ContainerDied","Data":"ecd973e1106f6d4737674e589b6954a3ada39620c7d0ebe4c30498b4dd6117f2"} Apr 16 14:42:05.508115 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:42:05.507716 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-785774b7b6-ps6rz" Apr 16 14:42:05.508115 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:42:05.507796 2576 scope.go:117] "RemoveContainer" containerID="7b9f652ae49bd7bd6adc0e088bc15f42e606b23a9712f37179ec064da2a0d0f7" Apr 16 14:42:05.516703 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:42:05.516686 2576 scope.go:117] "RemoveContainer" containerID="7b9f652ae49bd7bd6adc0e088bc15f42e606b23a9712f37179ec064da2a0d0f7" Apr 16 14:42:05.516968 ip-10-0-128-173 kubenswrapper[2576]: E0416 14:42:05.516952 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b9f652ae49bd7bd6adc0e088bc15f42e606b23a9712f37179ec064da2a0d0f7\": container with ID starting with 7b9f652ae49bd7bd6adc0e088bc15f42e606b23a9712f37179ec064da2a0d0f7 not found: ID does not exist" containerID="7b9f652ae49bd7bd6adc0e088bc15f42e606b23a9712f37179ec064da2a0d0f7" Apr 16 14:42:05.517011 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:42:05.516975 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b9f652ae49bd7bd6adc0e088bc15f42e606b23a9712f37179ec064da2a0d0f7"} err="failed to get container status \"7b9f652ae49bd7bd6adc0e088bc15f42e606b23a9712f37179ec064da2a0d0f7\": rpc error: code = NotFound desc = could not find container \"7b9f652ae49bd7bd6adc0e088bc15f42e606b23a9712f37179ec064da2a0d0f7\": container with ID starting with 7b9f652ae49bd7bd6adc0e088bc15f42e606b23a9712f37179ec064da2a0d0f7 not found: ID does not exist" Apr 16 14:42:05.531281 ip-10-0-128-173 kubenswrapper[2576]: I0416 
14:42:05.531250 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-785774b7b6-ps6rz"] Apr 16 14:42:05.533565 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:42:05.533523 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-785774b7b6-ps6rz"] Apr 16 14:42:06.782847 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:42:06.782810 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08bb2fc6-b7c5-4d6f-854a-2af65efd99c1" path="/var/lib/kubelet/pods/08bb2fc6-b7c5-4d6f-854a-2af65efd99c1/volumes" Apr 16 14:44:44.704240 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:44:44.704162 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7qpv_312033f4-d50f-4d5d-a1ca-6e77e0428786/ovn-acl-logging/0.log" Apr 16 14:44:44.705053 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:44:44.705030 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7qpv_312033f4-d50f-4d5d-a1ca-6e77e0428786/ovn-acl-logging/0.log" Apr 16 14:49:44.739491 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:49:44.739460 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7qpv_312033f4-d50f-4d5d-a1ca-6e77e0428786/ovn-acl-logging/0.log" Apr 16 14:49:44.740517 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:49:44.740495 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7qpv_312033f4-d50f-4d5d-a1ca-6e77e0428786/ovn-acl-logging/0.log" Apr 16 14:52:12.386364 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:12.386284 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-7588c67995-wqwhq_18ba0e62-34a8-4ada-abe9-de9e4362ea15/authorino/0.log" Apr 16 14:52:15.295715 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:15.295681 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/opendatahub_maas-controller-54cfdc6dcf-pv79d_39383956-71e4-486e-bd58-6751cd811608/manager/0.log" Apr 16 14:52:15.409189 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:15.409158 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-6jdtr_8a2890cc-c938-4ec0-ac01-7eea814cd68f/manager/2.log" Apr 16 14:52:15.774162 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:15.774127 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6f7bb56bb6-j99db_c50824db-3db3-4978-9e36-c8ee6ff1c646/manager/0.log" Apr 16 14:52:16.644710 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:16.644676 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595p6pj_2e6fd606-91a0-489e-967d-379339d856e8/util/0.log" Apr 16 14:52:16.652227 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:16.652190 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595p6pj_2e6fd606-91a0-489e-967d-379339d856e8/pull/0.log" Apr 16 14:52:16.659260 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:16.659236 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595p6pj_2e6fd606-91a0-489e-967d-379339d856e8/extract/0.log" Apr 16 14:52:16.773655 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:16.773603 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qtl7h_375b213e-688f-4f46-83df-921c9b71892d/extract/0.log" Apr 16 14:52:16.781499 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:16.781474 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qtl7h_375b213e-688f-4f46-83df-921c9b71892d/util/0.log" Apr 16 14:52:16.789374 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:16.789353 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qtl7h_375b213e-688f-4f46-83df-921c9b71892d/pull/0.log" Apr 16 14:52:16.898429 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:16.898348 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328rl7_43fc5738-3bf5-475d-857f-1f781f2e55a6/extract/0.log" Apr 16 14:52:16.904990 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:16.904966 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328rl7_43fc5738-3bf5-475d-857f-1f781f2e55a6/util/0.log" Apr 16 14:52:16.911801 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:16.911778 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328rl7_43fc5738-3bf5-475d-857f-1f781f2e55a6/pull/0.log" Apr 16 14:52:17.026453 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:17.026424 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12286n_edf5b29e-5d29-47f7-9873-699371d62d3c/util/0.log" Apr 16 14:52:17.038679 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:17.038652 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12286n_edf5b29e-5d29-47f7-9873-699371d62d3c/pull/0.log" Apr 16 14:52:17.047112 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:17.047080 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12286n_edf5b29e-5d29-47f7-9873-699371d62d3c/extract/0.log" Apr 16 14:52:17.158050 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:17.157965 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-7588c67995-wqwhq_18ba0e62-34a8-4ada-abe9-de9e4362ea15/authorino/0.log" Apr 16 14:52:17.500469 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:17.500387 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-r8lmr_bb657765-3944-4f5d-b443-7325b472bdc7/kuadrant-console-plugin/0.log" Apr 16 14:52:18.642583 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:18.642555 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-6468c4fc75-pjxlq_d568b9f7-2358-4b4c-8148-a6a3380124ab/kube-auth-proxy/0.log" Apr 16 14:52:18.877116 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:18.877088 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7f7fd7cc4b-cbgrm_93dee602-e658-4746-861c-192789af4e9c/router/0.log" Apr 16 14:52:22.960518 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:22.960479 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mcxmz/must-gather-2fpch"] Apr 16 14:52:22.960938 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:22.960859 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="08bb2fc6-b7c5-4d6f-854a-2af65efd99c1" containerName="manager" Apr 16 14:52:22.960938 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:22.960875 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="08bb2fc6-b7c5-4d6f-854a-2af65efd99c1" containerName="manager" Apr 16 14:52:22.961025 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:22.960955 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="08bb2fc6-b7c5-4d6f-854a-2af65efd99c1" 
containerName="manager" Apr 16 14:52:22.964175 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:22.964156 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mcxmz/must-gather-2fpch" Apr 16 14:52:22.966691 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:22.966665 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mcxmz\"/\"kube-root-ca.crt\"" Apr 16 14:52:22.966827 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:22.966768 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-mcxmz\"/\"default-dockercfg-d2ggf\"" Apr 16 14:52:22.967661 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:22.967635 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mcxmz\"/\"openshift-service-ca.crt\"" Apr 16 14:52:22.979908 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:22.979884 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mcxmz/must-gather-2fpch"] Apr 16 14:52:23.094082 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:23.094042 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc7rd\" (UniqueName: \"kubernetes.io/projected/201647ee-d367-48da-bb1d-4e20082bc52a-kube-api-access-lc7rd\") pod \"must-gather-2fpch\" (UID: \"201647ee-d367-48da-bb1d-4e20082bc52a\") " pod="openshift-must-gather-mcxmz/must-gather-2fpch" Apr 16 14:52:23.094271 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:23.094089 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/201647ee-d367-48da-bb1d-4e20082bc52a-must-gather-output\") pod \"must-gather-2fpch\" (UID: \"201647ee-d367-48da-bb1d-4e20082bc52a\") " pod="openshift-must-gather-mcxmz/must-gather-2fpch" Apr 16 14:52:23.194945 ip-10-0-128-173 
kubenswrapper[2576]: I0416 14:52:23.194904 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lc7rd\" (UniqueName: \"kubernetes.io/projected/201647ee-d367-48da-bb1d-4e20082bc52a-kube-api-access-lc7rd\") pod \"must-gather-2fpch\" (UID: \"201647ee-d367-48da-bb1d-4e20082bc52a\") " pod="openshift-must-gather-mcxmz/must-gather-2fpch" Apr 16 14:52:23.195121 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:23.194959 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/201647ee-d367-48da-bb1d-4e20082bc52a-must-gather-output\") pod \"must-gather-2fpch\" (UID: \"201647ee-d367-48da-bb1d-4e20082bc52a\") " pod="openshift-must-gather-mcxmz/must-gather-2fpch" Apr 16 14:52:23.195358 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:23.195333 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/201647ee-d367-48da-bb1d-4e20082bc52a-must-gather-output\") pod \"must-gather-2fpch\" (UID: \"201647ee-d367-48da-bb1d-4e20082bc52a\") " pod="openshift-must-gather-mcxmz/must-gather-2fpch" Apr 16 14:52:23.205327 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:23.205290 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc7rd\" (UniqueName: \"kubernetes.io/projected/201647ee-d367-48da-bb1d-4e20082bc52a-kube-api-access-lc7rd\") pod \"must-gather-2fpch\" (UID: \"201647ee-d367-48da-bb1d-4e20082bc52a\") " pod="openshift-must-gather-mcxmz/must-gather-2fpch" Apr 16 14:52:23.273734 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:23.273649 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mcxmz/must-gather-2fpch" Apr 16 14:52:23.403399 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:23.403365 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mcxmz/must-gather-2fpch"] Apr 16 14:52:23.404605 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:52:23.404577 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod201647ee_d367_48da_bb1d_4e20082bc52a.slice/crio-2c4dfdaf84863d20e800e12838ea419c3371a2f43abf31c96e9787d246fed4a0 WatchSource:0}: Error finding container 2c4dfdaf84863d20e800e12838ea419c3371a2f43abf31c96e9787d246fed4a0: Status 404 returned error can't find the container with id 2c4dfdaf84863d20e800e12838ea419c3371a2f43abf31c96e9787d246fed4a0 Apr 16 14:52:23.406565 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:23.406528 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:52:23.915804 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:23.915760 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mcxmz/must-gather-2fpch" event={"ID":"201647ee-d367-48da-bb1d-4e20082bc52a","Type":"ContainerStarted","Data":"2c4dfdaf84863d20e800e12838ea419c3371a2f43abf31c96e9787d246fed4a0"} Apr 16 14:52:24.923389 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:24.922504 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mcxmz/must-gather-2fpch" event={"ID":"201647ee-d367-48da-bb1d-4e20082bc52a","Type":"ContainerStarted","Data":"ca6395c69da11073f280960906202b6a7167d00f073a0e155708be7b9f8be10f"} Apr 16 14:52:24.923389 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:24.922567 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mcxmz/must-gather-2fpch" 
event={"ID":"201647ee-d367-48da-bb1d-4e20082bc52a","Type":"ContainerStarted","Data":"c6839dfdc9bc4435f5ee875a67877f9fa603ce8340c8cf6207aad4359d4b09d2"} Apr 16 14:52:24.941695 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:24.941616 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mcxmz/must-gather-2fpch" podStartSLOduration=2.214810676 podStartE2EDuration="2.941595302s" podCreationTimestamp="2026-04-16 14:52:22 +0000 UTC" firstStartedPulling="2026-04-16 14:52:23.406686581 +0000 UTC m=+1359.233396977" lastFinishedPulling="2026-04-16 14:52:24.133471205 +0000 UTC m=+1359.960181603" observedRunningTime="2026-04-16 14:52:24.939989503 +0000 UTC m=+1360.766699920" watchObservedRunningTime="2026-04-16 14:52:24.941595302 +0000 UTC m=+1360.768305723" Apr 16 14:52:25.927223 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:25.927187 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-xnzwr_f35ee2aa-f1ac-4d97-bf10-86c8f12bc700/global-pull-secret-syncer/0.log" Apr 16 14:52:26.013128 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:26.013085 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-mn4h5_3cf0a81b-ae4c-46b8-b16c-e93bd4e87102/konnectivity-agent/0.log" Apr 16 14:52:26.063509 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:26.063460 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-173.ec2.internal_a454133692b7d59775381b8452362b38/haproxy/0.log" Apr 16 14:52:29.967805 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:29.967766 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595p6pj_2e6fd606-91a0-489e-967d-379339d856e8/extract/0.log" Apr 16 14:52:29.997440 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:29.997401 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595p6pj_2e6fd606-91a0-489e-967d-379339d856e8/util/0.log" Apr 16 14:52:30.027656 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:30.027605 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595p6pj_2e6fd606-91a0-489e-967d-379339d856e8/pull/0.log" Apr 16 14:52:30.061508 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:30.061461 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qtl7h_375b213e-688f-4f46-83df-921c9b71892d/extract/0.log" Apr 16 14:52:30.087659 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:30.087629 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qtl7h_375b213e-688f-4f46-83df-921c9b71892d/util/0.log" Apr 16 14:52:30.114202 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:30.114168 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qtl7h_375b213e-688f-4f46-83df-921c9b71892d/pull/0.log" Apr 16 14:52:30.150471 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:30.150442 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328rl7_43fc5738-3bf5-475d-857f-1f781f2e55a6/extract/0.log" Apr 16 14:52:30.176301 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:30.176237 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328rl7_43fc5738-3bf5-475d-857f-1f781f2e55a6/util/0.log" Apr 16 14:52:30.205225 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:30.205198 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7328rl7_43fc5738-3bf5-475d-857f-1f781f2e55a6/pull/0.log" Apr 16 14:52:30.233400 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:30.233318 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12286n_edf5b29e-5d29-47f7-9873-699371d62d3c/extract/0.log" Apr 16 14:52:30.262375 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:30.262347 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12286n_edf5b29e-5d29-47f7-9873-699371d62d3c/util/0.log" Apr 16 14:52:30.288177 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:30.288147 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef12286n_edf5b29e-5d29-47f7-9873-699371d62d3c/pull/0.log" Apr 16 14:52:30.319780 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:30.319751 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-7588c67995-wqwhq_18ba0e62-34a8-4ada-abe9-de9e4362ea15/authorino/0.log" Apr 16 14:52:30.426369 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:30.426341 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-r8lmr_bb657765-3944-4f5d-b443-7325b472bdc7/kuadrant-console-plugin/0.log" Apr 16 14:52:32.086738 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:32.086703 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-985g6_9e2f0c20-f27f-4dee-a225-e224c0239ba4/cluster-monitoring-operator/0.log" Apr 16 14:52:32.253895 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:32.253793 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-npx5k_c0e272d7-9472-49b2-bfae-f11a54b2ad97/monitoring-plugin/0.log" Apr 16 14:52:32.288891 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:32.288845 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cmxww_053bb478-a29a-4c10-8bab-0896952ba633/node-exporter/0.log" Apr 16 14:52:32.318604 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:32.318564 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cmxww_053bb478-a29a-4c10-8bab-0896952ba633/kube-rbac-proxy/0.log" Apr 16 14:52:32.349074 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:32.349048 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cmxww_053bb478-a29a-4c10-8bab-0896952ba633/init-textfile/0.log" Apr 16 14:52:32.555130 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:32.555042 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-t5bf8_cb2b182b-d0b0-43b9-b51f-f2ed68482a29/kube-rbac-proxy-main/0.log" Apr 16 14:52:32.591827 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:32.591795 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-t5bf8_cb2b182b-d0b0-43b9-b51f-f2ed68482a29/kube-rbac-proxy-self/0.log" Apr 16 14:52:32.620371 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:32.620333 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-t5bf8_cb2b182b-d0b0-43b9-b51f-f2ed68482a29/openshift-state-metrics/0.log" Apr 16 14:52:32.881567 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:32.881513 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-swf4k_4f1aa63b-175e-4630-a754-83cfe1a1942b/prometheus-operator/0.log" Apr 16 14:52:32.907485 
ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:32.907450 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-swf4k_4f1aa63b-175e-4630-a754-83cfe1a1942b/kube-rbac-proxy/0.log" Apr 16 14:52:32.936197 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:32.936164 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-9cb97cd87-4w2j2_e883d262-4ffe-42cc-847d-aacbe36bf9c5/prometheus-operator-admission-webhook/0.log" Apr 16 14:52:33.065994 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:33.065968 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6db4656d8f-fq84c_4cc933c7-84a7-44d3-85d7-7dd10e8c29e7/thanos-query/0.log" Apr 16 14:52:33.095331 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:33.095291 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6db4656d8f-fq84c_4cc933c7-84a7-44d3-85d7-7dd10e8c29e7/kube-rbac-proxy-web/0.log" Apr 16 14:52:33.123874 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:33.123846 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6db4656d8f-fq84c_4cc933c7-84a7-44d3-85d7-7dd10e8c29e7/kube-rbac-proxy/0.log" Apr 16 14:52:33.148625 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:33.148525 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6db4656d8f-fq84c_4cc933c7-84a7-44d3-85d7-7dd10e8c29e7/prom-label-proxy/0.log" Apr 16 14:52:33.173246 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:33.173214 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6db4656d8f-fq84c_4cc933c7-84a7-44d3-85d7-7dd10e8c29e7/kube-rbac-proxy-rules/0.log" Apr 16 14:52:33.206213 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:33.206183 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-6db4656d8f-fq84c_4cc933c7-84a7-44d3-85d7-7dd10e8c29e7/kube-rbac-proxy-metrics/0.log" Apr 16 14:52:34.625424 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:34.625388 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mcxmz/perf-node-gather-daemonset-94n5v"] Apr 16 14:52:34.632763 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:34.632733 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-94n5v" Apr 16 14:52:34.638993 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:34.638962 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mcxmz/perf-node-gather-daemonset-94n5v"] Apr 16 14:52:34.813201 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:34.813164 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7bd88eaf-edea-461d-a57a-96e943352c2a-sys\") pod \"perf-node-gather-daemonset-94n5v\" (UID: \"7bd88eaf-edea-461d-a57a-96e943352c2a\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-94n5v" Apr 16 14:52:34.813380 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:34.813209 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7bd88eaf-edea-461d-a57a-96e943352c2a-proc\") pod \"perf-node-gather-daemonset-94n5v\" (UID: \"7bd88eaf-edea-461d-a57a-96e943352c2a\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-94n5v" Apr 16 14:52:34.813380 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:34.813298 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7bd88eaf-edea-461d-a57a-96e943352c2a-podres\") pod \"perf-node-gather-daemonset-94n5v\" (UID: 
\"7bd88eaf-edea-461d-a57a-96e943352c2a\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-94n5v" Apr 16 14:52:34.813380 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:34.813358 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7bd88eaf-edea-461d-a57a-96e943352c2a-lib-modules\") pod \"perf-node-gather-daemonset-94n5v\" (UID: \"7bd88eaf-edea-461d-a57a-96e943352c2a\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-94n5v" Apr 16 14:52:34.813502 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:34.813408 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6jc9\" (UniqueName: \"kubernetes.io/projected/7bd88eaf-edea-461d-a57a-96e943352c2a-kube-api-access-m6jc9\") pod \"perf-node-gather-daemonset-94n5v\" (UID: \"7bd88eaf-edea-461d-a57a-96e943352c2a\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-94n5v" Apr 16 14:52:34.914020 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:34.913934 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6jc9\" (UniqueName: \"kubernetes.io/projected/7bd88eaf-edea-461d-a57a-96e943352c2a-kube-api-access-m6jc9\") pod \"perf-node-gather-daemonset-94n5v\" (UID: \"7bd88eaf-edea-461d-a57a-96e943352c2a\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-94n5v" Apr 16 14:52:34.914370 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:34.914349 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7bd88eaf-edea-461d-a57a-96e943352c2a-sys\") pod \"perf-node-gather-daemonset-94n5v\" (UID: \"7bd88eaf-edea-461d-a57a-96e943352c2a\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-94n5v" Apr 16 14:52:34.914828 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:34.914806 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7bd88eaf-edea-461d-a57a-96e943352c2a-proc\") pod \"perf-node-gather-daemonset-94n5v\" (UID: \"7bd88eaf-edea-461d-a57a-96e943352c2a\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-94n5v" Apr 16 14:52:34.914938 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:34.914911 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7bd88eaf-edea-461d-a57a-96e943352c2a-podres\") pod \"perf-node-gather-daemonset-94n5v\" (UID: \"7bd88eaf-edea-461d-a57a-96e943352c2a\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-94n5v" Apr 16 14:52:34.915004 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:34.914976 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7bd88eaf-edea-461d-a57a-96e943352c2a-lib-modules\") pod \"perf-node-gather-daemonset-94n5v\" (UID: \"7bd88eaf-edea-461d-a57a-96e943352c2a\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-94n5v" Apr 16 14:52:34.915060 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:34.914582 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7bd88eaf-edea-461d-a57a-96e943352c2a-sys\") pod \"perf-node-gather-daemonset-94n5v\" (UID: \"7bd88eaf-edea-461d-a57a-96e943352c2a\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-94n5v" Apr 16 14:52:34.915127 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:34.915105 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7bd88eaf-edea-461d-a57a-96e943352c2a-proc\") pod \"perf-node-gather-daemonset-94n5v\" (UID: \"7bd88eaf-edea-461d-a57a-96e943352c2a\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-94n5v" Apr 16 14:52:34.915199 ip-10-0-128-173 
kubenswrapper[2576]: I0416 14:52:34.915135 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7bd88eaf-edea-461d-a57a-96e943352c2a-lib-modules\") pod \"perf-node-gather-daemonset-94n5v\" (UID: \"7bd88eaf-edea-461d-a57a-96e943352c2a\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-94n5v" Apr 16 14:52:34.915259 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:34.915202 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7bd88eaf-edea-461d-a57a-96e943352c2a-podres\") pod \"perf-node-gather-daemonset-94n5v\" (UID: \"7bd88eaf-edea-461d-a57a-96e943352c2a\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-94n5v" Apr 16 14:52:34.925151 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:34.925115 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6jc9\" (UniqueName: \"kubernetes.io/projected/7bd88eaf-edea-461d-a57a-96e943352c2a-kube-api-access-m6jc9\") pod \"perf-node-gather-daemonset-94n5v\" (UID: \"7bd88eaf-edea-461d-a57a-96e943352c2a\") " pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-94n5v" Apr 16 14:52:34.947190 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:34.947149 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-94n5v" Apr 16 14:52:35.111891 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:35.111850 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mcxmz/perf-node-gather-daemonset-94n5v"] Apr 16 14:52:35.116095 ip-10-0-128-173 kubenswrapper[2576]: W0416 14:52:35.116061 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7bd88eaf_edea_461d_a57a_96e943352c2a.slice/crio-eace05e061caad335f19733c1a504d1fefda6dd7ac8dce540d562b30cf7ce794 WatchSource:0}: Error finding container eace05e061caad335f19733c1a504d1fefda6dd7ac8dce540d562b30cf7ce794: Status 404 returned error can't find the container with id eace05e061caad335f19733c1a504d1fefda6dd7ac8dce540d562b30cf7ce794 Apr 16 14:52:35.984341 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:35.984264 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-94n5v" event={"ID":"7bd88eaf-edea-461d-a57a-96e943352c2a","Type":"ContainerStarted","Data":"2389fa870e4d09938324a15a38546eac6334c177c78fcf7f54d2dbe2524e6171"} Apr 16 14:52:35.984341 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:35.984312 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-94n5v" event={"ID":"7bd88eaf-edea-461d-a57a-96e943352c2a","Type":"ContainerStarted","Data":"eace05e061caad335f19733c1a504d1fefda6dd7ac8dce540d562b30cf7ce794"} Apr 16 14:52:35.984896 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:35.984754 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-94n5v" Apr 16 14:52:36.002330 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:36.002267 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-94n5v" 
podStartSLOduration=2.002247879 podStartE2EDuration="2.002247879s" podCreationTimestamp="2026-04-16 14:52:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:52:36.001339851 +0000 UTC m=+1371.828050298" watchObservedRunningTime="2026-04-16 14:52:36.002247879 +0000 UTC m=+1371.828958296" Apr 16 14:52:36.853305 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:36.853274 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-99g2n_c18fae62-a7e8-473e-a375-928abb494bf2/dns/0.log" Apr 16 14:52:36.878163 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:36.878127 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-99g2n_c18fae62-a7e8-473e-a375-928abb494bf2/kube-rbac-proxy/0.log" Apr 16 14:52:37.007340 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:37.007314 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-x7zbw_6079e191-6467-414e-9381-0ffd91e44ab4/dns-node-resolver/0.log" Apr 16 14:52:37.476561 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:37.476513 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-7759f9cdf-skkws_426b5a99-d698-419c-b42f-63c90eadfa2b/registry/0.log" Apr 16 14:52:37.497409 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:37.497381 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-29nkz_1491249e-0f9d-4121-abb3-d99d4022d023/node-ca/0.log" Apr 16 14:52:38.554807 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:38.554776 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-6468c4fc75-pjxlq_d568b9f7-2358-4b4c-8148-a6a3380124ab/kube-auth-proxy/0.log" Apr 16 14:52:38.614554 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:38.614509 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_router-default-7f7fd7cc4b-cbgrm_93dee602-e658-4746-861c-192789af4e9c/router/0.log" Apr 16 14:52:39.206911 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:39.206881 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-2tkjc_8e9a80d4-54ef-459f-b21f-e013f853f63d/serve-healthcheck-canary/0.log" Apr 16 14:52:39.694235 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:39.694203 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-cnkw9_05104d65-51f0-434e-945d-2714c95daadd/insights-operator/1.log" Apr 16 14:52:39.700030 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:39.700005 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-cnkw9_05104d65-51f0-434e-945d-2714c95daadd/insights-operator/0.log" Apr 16 14:52:39.871474 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:39.871441 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8fb7x_eb1c882f-bfac-4890-a59f-3f3d81e30081/kube-rbac-proxy/0.log" Apr 16 14:52:39.895154 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:39.895125 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8fb7x_eb1c882f-bfac-4890-a59f-3f3d81e30081/exporter/0.log" Apr 16 14:52:39.921786 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:39.921754 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8fb7x_eb1c882f-bfac-4890-a59f-3f3d81e30081/extractor/0.log" Apr 16 14:52:41.901126 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:41.901094 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-54cfdc6dcf-pv79d_39383956-71e4-486e-bd58-6751cd811608/manager/0.log" Apr 16 14:52:41.958986 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:41.958958 2576 
log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-6jdtr_8a2890cc-c938-4ec0-ac01-7eea814cd68f/manager/1.log" Apr 16 14:52:41.963659 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:41.963634 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-6jdtr_8a2890cc-c938-4ec0-ac01-7eea814cd68f/manager/2.log" Apr 16 14:52:41.999442 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:41.999414 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-mcxmz/perf-node-gather-daemonset-94n5v" Apr 16 14:52:42.108002 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:42.107975 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6f7bb56bb6-j99db_c50824db-3db3-4978-9e36-c8ee6ff1c646/manager/0.log" Apr 16 14:52:43.516967 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:43.516932 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-76cd85c697-vngk7_fcae517d-7d7b-479d-a8fc-97e5bc02022b/manager/0.log" Apr 16 14:52:43.560014 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:43.559973 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-45br4_7e8f785f-9735-4841-b782-7ff0ca25110a/openshift-lws-operator/0.log" Apr 16 14:52:48.819066 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:48.819033 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-d8v8k_d1ee0bbf-24fd-4084-b536-8d53b20944b9/kube-storage-version-migrator-operator/1.log" Apr 16 14:52:48.820071 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:48.820050 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-d8v8k_d1ee0bbf-24fd-4084-b536-8d53b20944b9/kube-storage-version-migrator-operator/0.log" Apr 16 14:52:49.897547 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:49.897498 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7cn8b_efcece22-1582-4835-82c7-1489ad265dca/kube-multus/0.log" Apr 16 14:52:50.329716 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:50.329644 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vnzc4_0a9bf7c4-4e80-4445-84c8-06bf12cc01b8/kube-multus-additional-cni-plugins/0.log" Apr 16 14:52:50.353624 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:50.353592 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vnzc4_0a9bf7c4-4e80-4445-84c8-06bf12cc01b8/egress-router-binary-copy/0.log" Apr 16 14:52:50.382583 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:50.382557 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vnzc4_0a9bf7c4-4e80-4445-84c8-06bf12cc01b8/cni-plugins/0.log" Apr 16 14:52:50.409010 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:50.408981 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vnzc4_0a9bf7c4-4e80-4445-84c8-06bf12cc01b8/bond-cni-plugin/0.log" Apr 16 14:52:50.434629 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:50.434597 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vnzc4_0a9bf7c4-4e80-4445-84c8-06bf12cc01b8/routeoverride-cni/0.log" Apr 16 14:52:50.457394 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:50.457367 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vnzc4_0a9bf7c4-4e80-4445-84c8-06bf12cc01b8/whereabouts-cni-bincopy/0.log" Apr 16 14:52:50.480813 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:50.480782 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vnzc4_0a9bf7c4-4e80-4445-84c8-06bf12cc01b8/whereabouts-cni/0.log" Apr 16 14:52:50.617550 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:50.617493 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-kbtb7_bfa07533-d734-4829-bde0-6c0327bd79a9/network-metrics-daemon/0.log" Apr 16 14:52:50.641629 ip-10-0-128-173 kubenswrapper[2576]: I0416 14:52:50.641603 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-kbtb7_bfa07533-d734-4829-bde0-6c0327bd79a9/kube-rbac-proxy/0.log"