Apr 22 19:06:31.501073 ip-10-0-134-22 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 19:06:31.964876 ip-10-0-134-22 kubenswrapper[2569]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:06:31.964876 ip-10-0-134-22 kubenswrapper[2569]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 19:06:31.964876 ip-10-0-134-22 kubenswrapper[2569]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:06:31.964876 ip-10-0-134-22 kubenswrapper[2569]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 19:06:31.964876 ip-10-0-134-22 kubenswrapper[2569]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:06:31.969323 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.969232 2569 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 19:06:31.974853 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974829 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:06:31.974853 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974846 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:06:31.974853 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974850 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:06:31.974853 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974854 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:06:31.974853 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974857 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:06:31.974853 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974860 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:06:31.974853 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974863 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:06:31.975082 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974866 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:06:31.975082 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974869 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:06:31.975082 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974872 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:06:31.975082 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974874 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:06:31.975082 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974877 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:06:31.975082 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974879 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:06:31.975082 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974882 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:06:31.975082 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974885 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:06:31.975082 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974887 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:06:31.975082 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974890 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:06:31.975082 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974892 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:06:31.975082 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974895 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:06:31.975082 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974897 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:06:31.975082 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974899 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:06:31.975082 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974902 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:06:31.975082 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974905 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:06:31.975082 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974907 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:06:31.975082 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974910 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:06:31.975082 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974912 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:06:31.975082 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974915 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:06:31.975556 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974917 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:06:31.975556 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974920 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:06:31.975556 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974923 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:06:31.975556 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974927 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:06:31.975556 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974932 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:06:31.975556 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974935 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:06:31.975556 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974938 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:06:31.975556 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974941 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:06:31.975556 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974944 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:06:31.975556 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974946 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:06:31.975556 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974950 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:06:31.975556 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974952 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:06:31.975556 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974955 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:06:31.975556 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974958 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:06:31.975556 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974961 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:06:31.975556 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974964 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:06:31.975556 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974966 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:06:31.975556 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974969 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:06:31.975556 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974972 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:06:31.976040 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974974 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:06:31.976040 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974978 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:06:31.976040 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974980 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:06:31.976040 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974983 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:06:31.976040 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974985 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:06:31.976040 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974988 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:06:31.976040 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974991 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:06:31.976040 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974994 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:06:31.976040 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974997 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:06:31.976040 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.974999 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:06:31.976040 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975002 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:06:31.976040 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975005 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:06:31.976040 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975007 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:06:31.976040 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975010 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:06:31.976040 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975012 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:06:31.976040 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975014 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:06:31.976040 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975017 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:06:31.976040 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975019 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:06:31.976040 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975022 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:06:31.976040 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975024 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:06:31.976517 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975027 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:06:31.976517 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975030 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:06:31.976517 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975032 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:06:31.976517 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975035 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:06:31.976517 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975037 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:06:31.976517 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975039 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:06:31.976517 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975043 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:06:31.976517 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975046 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:06:31.976517 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975048 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:06:31.976517 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975051 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:06:31.976517 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975053 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:06:31.976517 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975056 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:06:31.976517 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975059 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:06:31.976517 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975062 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:06:31.976517 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975064 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:06:31.976517 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975068 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:06:31.976517 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975070 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:06:31.976517 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975074 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:06:31.976517 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975078 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:06:31.976989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975080 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:06:31.976989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975513 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:06:31.976989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975519 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:06:31.976989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975522 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:06:31.976989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975525 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:06:31.976989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975528 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:06:31.976989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975531 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:06:31.976989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975533 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:06:31.976989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975536 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:06:31.976989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975539 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:06:31.976989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975541 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:06:31.976989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975544 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:06:31.976989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975547 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:06:31.976989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975549 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:06:31.976989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975552 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:06:31.976989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975554 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:06:31.976989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975557 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:06:31.976989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975560 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:06:31.976989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975563 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:06:31.976989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975566 2569 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:06:31.977467 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975570 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:06:31.977467 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975574 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:06:31.977467 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975577 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:06:31.977467 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975580 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:06:31.977467 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975584 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:06:31.977467 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975592 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:06:31.977467 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975595 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:06:31.977467 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975598 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:06:31.977467 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975600 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:06:31.977467 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975603 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:06:31.977467 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975605 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:06:31.977467 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975608 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:06:31.977467 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975611 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:06:31.977467 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975613 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:06:31.977467 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975615 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:06:31.977467 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975618 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:06:31.977467 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975620 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:06:31.977467 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975622 2569 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:06:31.977467 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975624 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:06:31.977989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975627 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:06:31.977989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975630 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:06:31.977989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975632 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:06:31.977989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975635 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:06:31.977989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975638 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:06:31.977989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975641 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:06:31.977989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975643 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:06:31.977989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975645 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:06:31.977989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975648 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:06:31.977989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975650 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:06:31.977989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975653 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:06:31.977989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975655 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:06:31.977989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975659 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:06:31.977989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975661 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:06:31.977989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975664 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:06:31.977989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975666 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:06:31.977989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975668 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:06:31.977989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975671 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:06:31.977989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975673 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:06:31.977989 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975676 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:06:31.978550 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975678 2569 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:06:31.978550 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975681 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:06:31.978550 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975683 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:06:31.978550 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975685 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:06:31.978550 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975688 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:06:31.978550 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975690 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:06:31.978550 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975694 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:06:31.978550 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975697 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:06:31.978550 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975699 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:06:31.978550 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975702 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:06:31.978550 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975705 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:06:31.978550 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975707 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:06:31.978550 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975710 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:06:31.978550 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975712 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:06:31.978550 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975715 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:06:31.978550 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975717 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:06:31.978550 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975720 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:06:31.978550 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975722 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:06:31.978550 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975725 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:06:31.978550 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975727 2569 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:06:31.979086 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975730 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:06:31.979086 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975732 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:06:31.979086 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975734 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:06:31.979086 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975737 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:06:31.979086 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975739 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:06:31.979086 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975741 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:06:31.979086 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975744 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:06:31.979086 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.975746 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:06:31.979086 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977129 2569 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 19:06:31.979086 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977144 2569 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 19:06:31.979086 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977152 2569 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 19:06:31.979086 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977156 2569 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 19:06:31.979086 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977161 2569 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 19:06:31.979086 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977164 2569 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 19:06:31.979086 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977170 2569 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 19:06:31.979086 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977174 2569 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 19:06:31.979086 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977177 2569 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 19:06:31.979086 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977180 2569 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 19:06:31.979086 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977184 2569 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 19:06:31.979086 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977187 2569 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 19:06:31.979086 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977190 2569 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 19:06:31.979086 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977193 2569 flags.go:64] FLAG: --cgroup-root=""
Apr 22 19:06:31.979618 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977196 2569 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 19:06:31.979618 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977199 2569 flags.go:64] FLAG: --client-ca-file=""
Apr 22 19:06:31.979618 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977202 2569 flags.go:64] FLAG: --cloud-config=""
Apr 22 19:06:31.979618 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977205 2569 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 19:06:31.979618 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977207 2569 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 19:06:31.979618 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977211 2569 flags.go:64] FLAG: --cluster-domain=""
Apr 22 19:06:31.979618 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977213 2569 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 19:06:31.979618 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977216 2569 flags.go:64] FLAG: --config-dir=""
Apr 22 19:06:31.979618 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977219 2569 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 19:06:31.979618 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977223 2569 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 19:06:31.979618 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977227 2569 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 19:06:31.979618 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977230 2569 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 19:06:31.979618 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977232 2569 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 19:06:31.979618 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977236 2569 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 19:06:31.979618 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977239 2569 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 19:06:31.979618 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977241 2569 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 19:06:31.979618 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977244 2569 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 19:06:31.979618 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977247 2569 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 19:06:31.979618 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977250 2569 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 19:06:31.979618 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977255 2569 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 19:06:31.979618 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977258 2569 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 19:06:31.979618 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977260 2569 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 19:06:31.979618 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977263 2569 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 19:06:31.979618 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977267 2569 flags.go:64] FLAG: --enable-server="true"
Apr 22 19:06:31.979618 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977269 2569 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 19:06:31.980258 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977274 2569 flags.go:64] FLAG: --event-burst="100"
Apr 22 19:06:31.980258 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977277 2569 flags.go:64] FLAG: --event-qps="50"
Apr 22 19:06:31.980258 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977280 2569 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 19:06:31.980258 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977283 2569 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 19:06:31.980258 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977287 2569 flags.go:64] FLAG: --eviction-hard=""
Apr 22 19:06:31.980258 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977291 2569 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 19:06:31.980258 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977294 2569 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 19:06:31.980258 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977297 2569 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 19:06:31.980258 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977299 2569 flags.go:64] FLAG: --eviction-soft=""
Apr 22 19:06:31.980258 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977302 2569 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 19:06:31.980258 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977306 2569 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 19:06:31.980258 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977309 2569 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 19:06:31.980258 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977312 2569 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 19:06:31.980258 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977314 2569 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 19:06:31.980258 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977317 2569 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 19:06:31.980258 ip-10-0-134-22
kubenswrapper[2569]: I0422 19:06:31.977320 2569 flags.go:64] FLAG: --feature-gates="" Apr 22 19:06:31.980258 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977324 2569 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 19:06:31.980258 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977327 2569 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 19:06:31.980258 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977330 2569 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 19:06:31.980258 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977334 2569 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 19:06:31.980258 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977337 2569 flags.go:64] FLAG: --healthz-port="10248" Apr 22 19:06:31.980258 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977340 2569 flags.go:64] FLAG: --help="false" Apr 22 19:06:31.980258 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977342 2569 flags.go:64] FLAG: --hostname-override="ip-10-0-134-22.ec2.internal" Apr 22 19:06:31.980258 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977345 2569 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 19:06:31.980258 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977348 2569 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 19:06:31.980878 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977351 2569 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 19:06:31.980878 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977355 2569 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 19:06:31.980878 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977358 2569 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 19:06:31.980878 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977360 2569 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 
19:06:31.980878 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977363 2569 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 19:06:31.980878 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977366 2569 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 19:06:31.980878 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977369 2569 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 19:06:31.980878 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977372 2569 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 19:06:31.980878 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977375 2569 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 19:06:31.980878 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977378 2569 flags.go:64] FLAG: --kube-reserved="" Apr 22 19:06:31.980878 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977381 2569 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 19:06:31.980878 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977384 2569 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 19:06:31.980878 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977387 2569 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 19:06:31.980878 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977390 2569 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 19:06:31.980878 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977392 2569 flags.go:64] FLAG: --lock-file="" Apr 22 19:06:31.980878 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977395 2569 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 19:06:31.980878 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977398 2569 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 19:06:31.980878 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977401 2569 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 19:06:31.980878 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977406 2569 flags.go:64] FLAG: 
--log-json-split-stream="false" Apr 22 19:06:31.980878 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977409 2569 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 19:06:31.980878 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977412 2569 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 19:06:31.980878 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977415 2569 flags.go:64] FLAG: --logging-format="text" Apr 22 19:06:31.980878 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977417 2569 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 19:06:31.981432 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977421 2569 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 19:06:31.981432 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977423 2569 flags.go:64] FLAG: --manifest-url="" Apr 22 19:06:31.981432 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977429 2569 flags.go:64] FLAG: --manifest-url-header="" Apr 22 19:06:31.981432 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977434 2569 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 19:06:31.981432 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977437 2569 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 19:06:31.981432 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977441 2569 flags.go:64] FLAG: --max-pods="110" Apr 22 19:06:31.981432 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977444 2569 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 19:06:31.981432 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977447 2569 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 19:06:31.981432 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977450 2569 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 19:06:31.981432 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977452 2569 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 19:06:31.981432 ip-10-0-134-22 
kubenswrapper[2569]: I0422 19:06:31.977455 2569 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 19:06:31.981432 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977458 2569 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 19:06:31.981432 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977461 2569 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 19:06:31.981432 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977469 2569 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 19:06:31.981432 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977472 2569 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 19:06:31.981432 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977475 2569 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 19:06:31.981432 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977479 2569 flags.go:64] FLAG: --pod-cidr="" Apr 22 19:06:31.981432 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977482 2569 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 19:06:31.981432 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977487 2569 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 19:06:31.981432 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977490 2569 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 19:06:31.981432 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977493 2569 flags.go:64] FLAG: --pods-per-core="0" Apr 22 19:06:31.981432 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977496 2569 flags.go:64] FLAG: --port="10250" Apr 22 19:06:31.981432 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977499 2569 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 19:06:31.981432 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977502 2569 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-06069b5b452e69d3b" Apr 22 
19:06:31.982100 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977505 2569 flags.go:64] FLAG: --qos-reserved="" Apr 22 19:06:31.982100 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977508 2569 flags.go:64] FLAG: --read-only-port="10255" Apr 22 19:06:31.982100 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977511 2569 flags.go:64] FLAG: --register-node="true" Apr 22 19:06:31.982100 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977514 2569 flags.go:64] FLAG: --register-schedulable="true" Apr 22 19:06:31.982100 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977517 2569 flags.go:64] FLAG: --register-with-taints="" Apr 22 19:06:31.982100 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977520 2569 flags.go:64] FLAG: --registry-burst="10" Apr 22 19:06:31.982100 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977523 2569 flags.go:64] FLAG: --registry-qps="5" Apr 22 19:06:31.982100 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977526 2569 flags.go:64] FLAG: --reserved-cpus="" Apr 22 19:06:31.982100 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977528 2569 flags.go:64] FLAG: --reserved-memory="" Apr 22 19:06:31.982100 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977532 2569 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 19:06:31.982100 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977540 2569 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 19:06:31.982100 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977543 2569 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 19:06:31.982100 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977546 2569 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 19:06:31.982100 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977549 2569 flags.go:64] FLAG: --runonce="false" Apr 22 19:06:31.982100 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977551 2569 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 19:06:31.982100 
ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977554 2569 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 19:06:31.982100 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977557 2569 flags.go:64] FLAG: --seccomp-default="false" Apr 22 19:06:31.982100 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977560 2569 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 19:06:31.982100 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977562 2569 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 19:06:31.982100 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977565 2569 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 19:06:31.982100 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977569 2569 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 19:06:31.982100 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977572 2569 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 19:06:31.982100 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977575 2569 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 19:06:31.982100 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977578 2569 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 19:06:31.982100 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977581 2569 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 19:06:31.982100 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977584 2569 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 19:06:31.982719 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977587 2569 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 19:06:31.982719 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977590 2569 flags.go:64] FLAG: --system-cgroups="" Apr 22 19:06:31.982719 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977593 2569 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 19:06:31.982719 ip-10-0-134-22 kubenswrapper[2569]: I0422 
19:06:31.977598 2569 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 19:06:31.982719 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977601 2569 flags.go:64] FLAG: --tls-cert-file="" Apr 22 19:06:31.982719 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977604 2569 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 19:06:31.982719 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977607 2569 flags.go:64] FLAG: --tls-min-version="" Apr 22 19:06:31.982719 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977610 2569 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 19:06:31.982719 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977613 2569 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 19:06:31.982719 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977615 2569 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 19:06:31.982719 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977618 2569 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 19:06:31.982719 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977621 2569 flags.go:64] FLAG: --v="2" Apr 22 19:06:31.982719 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977625 2569 flags.go:64] FLAG: --version="false" Apr 22 19:06:31.982719 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977629 2569 flags.go:64] FLAG: --vmodule="" Apr 22 19:06:31.982719 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977633 2569 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 19:06:31.982719 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.977637 2569 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 19:06:31.982719 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977733 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:06:31.982719 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977737 2569 feature_gate.go:328] unrecognized feature gate: 
MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:06:31.982719 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977740 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:06:31.982719 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977743 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:06:31.982719 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977745 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:06:31.982719 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977748 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:06:31.982719 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977769 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:06:31.983429 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977773 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:06:31.983429 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977777 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:06:31.983429 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977780 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:06:31.983429 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977783 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:06:31.983429 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977787 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:06:31.983429 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977791 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:06:31.983429 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977794 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:06:31.983429 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977798 2569 feature_gate.go:328] 
unrecognized feature gate: MachineAPIMigration Apr 22 19:06:31.983429 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977801 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:06:31.983429 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977805 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:06:31.983429 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977809 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:06:31.983429 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977813 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:06:31.983429 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977817 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:06:31.983429 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977821 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:06:31.983429 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977825 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:06:31.983429 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977829 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:06:31.983429 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977832 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:06:31.983429 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977836 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:06:31.983429 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977840 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:06:31.983429 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977843 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:06:31.983960 ip-10-0-134-22 kubenswrapper[2569]: W0422 
19:06:31.977847 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:06:31.983960 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977852 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:06:31.983960 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977856 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:06:31.983960 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977859 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:06:31.983960 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977863 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:06:31.983960 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977869 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:06:31.983960 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977873 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:06:31.983960 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977877 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:06:31.983960 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977881 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:06:31.983960 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977885 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:06:31.983960 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977888 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:06:31.983960 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977894 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 19:06:31.983960 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977901 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 19:06:31.983960 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977909 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:06:31.983960 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977915 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:06:31.983960 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977919 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:06:31.983960 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977923 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:06:31.983960 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977927 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:06:31.983960 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977931 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:06:31.984501 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977935 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:06:31.984501 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977938 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:06:31.984501 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977942 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:06:31.984501 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977946 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:06:31.984501 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977950 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:06:31.984501 ip-10-0-134-22 kubenswrapper[2569]: W0422 
19:06:31.977955 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:06:31.984501 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977958 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:06:31.984501 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977963 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:06:31.984501 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977967 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:06:31.984501 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977971 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 19:06:31.984501 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977975 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:06:31.984501 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977980 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:06:31.984501 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977984 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:06:31.984501 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977990 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:06:31.984501 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977994 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:06:31.984501 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.977998 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:06:31.984501 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.978002 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:06:31.984501 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.978007 2569 feature_gate.go:328] unrecognized feature gate: 
AWSClusterHostedDNS Apr 22 19:06:31.984501 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.978014 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:06:31.984501 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.978018 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:06:31.985364 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.978022 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:06:31.985364 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.978027 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:06:31.985364 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.978032 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:06:31.985364 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.978036 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:06:31.985364 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.978040 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:06:31.985364 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.978045 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:06:31.985364 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.978049 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:06:31.985364 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.978053 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:06:31.985364 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.978058 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:06:31.985364 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.978062 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:06:31.985364 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.978066 2569 feature_gate.go:328] 
unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:06:31.985364 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.978070 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:06:31.985364 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.978074 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:06:31.985364 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.978078 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:06:31.985364 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.978082 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:06:31.985364 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.978084 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:06:31.985364 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.978088 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:06:31.985364 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.978096 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:06:31.985364 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.978100 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:06:31.985364 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.978104 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:06:31.986214 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.979101 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false 
UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 19:06:31.986464 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.986442 2569 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 19:06:31.986523 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.986465 2569 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 22 19:06:31.986568 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986538 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:06:31.986568 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986547 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:06:31.986568 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986551 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:06:31.986568 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986557 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:06:31.986568 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986562 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:06:31.986568 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986566 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:06:31.986568 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986570 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:06:31.987006 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986577 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 19:06:31.987006 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986583 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:06:31.987006 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986588 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:06:31.987006 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986591 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:06:31.987006 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986596 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:06:31.987006 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986600 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:06:31.987006 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986604 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:06:31.987006 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986609 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:06:31.987006 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986613 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:06:31.987006 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986617 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:06:31.987006 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986621 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:06:31.987006 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986625 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:06:31.987006 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986641 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:06:31.987006 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986646 2569 feature_gate.go:328] 
unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 19:06:31.987006 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986650 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:06:31.987006 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986655 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:06:31.987006 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986659 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:06:31.987006 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986663 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:06:31.987006 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986667 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:06:31.987819 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986671 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:06:31.987819 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986675 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:06:31.987819 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986679 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:06:31.987819 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986683 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:06:31.987819 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986687 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:06:31.987819 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986693 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:06:31.987819 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986697 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:06:31.987819 ip-10-0-134-22 
kubenswrapper[2569]: W0422 19:06:31.986701 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:06:31.987819 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986705 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:06:31.987819 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986709 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:06:31.987819 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986713 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:06:31.987819 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986717 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:06:31.987819 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986722 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:06:31.987819 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986727 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:06:31.987819 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986731 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:06:31.987819 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986736 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:06:31.987819 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986740 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:06:31.987819 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986745 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:06:31.987819 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986775 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:06:31.987819 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986780 2569 feature_gate.go:328] unrecognized feature gate: 
InsightsOnDemandDataGather Apr 22 19:06:31.988367 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986784 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:06:31.988367 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986787 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 19:06:31.988367 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986791 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:06:31.988367 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986795 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:06:31.988367 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986801 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 19:06:31.988367 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986806 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:06:31.988367 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986811 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:06:31.988367 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986816 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:06:31.988367 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986819 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:06:31.988367 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986823 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:06:31.988367 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986827 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:06:31.988367 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986831 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:06:31.988367 ip-10-0-134-22 
kubenswrapper[2569]: W0422 19:06:31.986834 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:06:31.988367 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986838 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:06:31.988367 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986841 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:06:31.988367 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986845 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:06:31.988367 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986849 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:06:31.988367 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986853 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:06:31.988367 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986858 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:06:31.988367 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986862 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:06:31.988954 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986865 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:06:31.988954 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986869 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:06:31.988954 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986873 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:06:31.988954 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986877 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:06:31.988954 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986881 2569 feature_gate.go:328] unrecognized feature gate: 
MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:06:31.988954 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986886 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:06:31.988954 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986890 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:06:31.988954 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986895 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:06:31.988954 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986899 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:06:31.988954 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986904 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:06:31.988954 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986908 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:06:31.988954 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986912 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:06:31.988954 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986916 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:06:31.988954 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986920 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:06:31.988954 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986925 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:06:31.988954 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986929 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:06:31.988954 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986933 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:06:31.988954 ip-10-0-134-22 
kubenswrapper[2569]: W0422 19:06:31.986938 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:06:31.988954 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986942 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:06:31.989723 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.986946 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:06:31.989723 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.986954 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 19:06:31.989723 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987164 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:06:31.989723 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987175 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:06:31.989723 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987180 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:06:31.989723 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987184 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:06:31.989723 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987188 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:06:31.989723 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987192 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:06:31.989723 ip-10-0-134-22 
kubenswrapper[2569]: W0422 19:06:31.987196 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:06:31.989723 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987201 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:06:31.989723 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987205 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:06:31.989723 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987209 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:06:31.989723 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987214 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:06:31.989723 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987219 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:06:31.989723 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987223 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:06:31.989723 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987227 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:06:31.990418 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987231 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:06:31.990418 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987235 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:06:31.990418 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987239 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:06:31.990418 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987245 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:06:31.990418 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987249 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:06:31.990418 
ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987254 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:06:31.990418 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987258 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:06:31.990418 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987262 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:06:31.990418 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987266 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:06:31.990418 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987271 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:06:31.990418 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987275 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:06:31.990418 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987279 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:06:31.990418 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987283 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:06:31.990418 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987287 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:06:31.990418 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987291 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:06:31.990418 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987296 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:06:31.990418 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987300 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 19:06:31.990418 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987304 2569 feature_gate.go:328] unrecognized feature gate: 
NewOLMWebhookProviderOpenshiftServiceCA Apr 22 19:06:31.990418 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987308 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:06:31.990418 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987313 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:06:31.991048 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987317 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:06:31.991048 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987321 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:06:31.991048 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987324 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:06:31.991048 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987329 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:06:31.991048 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987333 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:06:31.991048 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987337 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:06:31.991048 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987341 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:06:31.991048 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987346 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:06:31.991048 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987350 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:06:31.991048 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987355 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:06:31.991048 ip-10-0-134-22 kubenswrapper[2569]: W0422 
19:06:31.987359 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:06:31.991048 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987363 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:06:31.991048 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987370 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 19:06:31.991048 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987376 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:06:31.991048 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987381 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:06:31.991048 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987386 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:06:31.991048 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987391 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:06:31.991048 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987396 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:06:31.991048 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987401 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:06:31.991633 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987405 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:06:31.991633 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987409 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:06:31.991633 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987414 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:06:31.991633 ip-10-0-134-22 kubenswrapper[2569]: W0422 
19:06:31.987418 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:06:31.991633 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987423 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:06:31.991633 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987427 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:06:31.991633 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987431 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:06:31.991633 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987435 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:06:31.991633 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987440 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:06:31.991633 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987444 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:06:31.991633 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987448 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:06:31.991633 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987452 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:06:31.991633 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987456 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:06:31.991633 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987460 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:06:31.991633 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987464 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:06:31.991633 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987467 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:06:31.991633 ip-10-0-134-22 kubenswrapper[2569]: W0422 
19:06:31.987472 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:06:31.991633 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987476 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:06:31.991633 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987480 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:06:31.991633 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987484 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:06:31.992208 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987488 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:06:31.992208 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987492 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:06:31.992208 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987496 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:06:31.992208 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987501 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:06:31.992208 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987505 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:06:31.992208 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987511 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 19:06:31.992208 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987518 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:06:31.992208 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987522 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:06:31.992208 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987529 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:06:31.992208 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987533 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:06:31.992208 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987537 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:06:31.992208 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987541 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:06:31.992208 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:31.987545 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:06:31.992208 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.987553 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 19:06:31.992208 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.988340 2569 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 19:06:31.992583 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.991106 2569 bootstrap.go:101] "Use the bootstrap credentials to 
request a cert, and set kubeconfig to point to the certificate dir" Apr 22 19:06:31.992583 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.992113 2569 server.go:1019] "Starting client certificate rotation" Apr 22 19:06:31.992583 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.992214 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 19:06:31.992583 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:31.992252 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 19:06:32.019611 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.019559 2569 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 19:06:32.022050 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.022031 2569 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 19:06:32.037922 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.037893 2569 log.go:25] "Validated CRI v1 runtime API" Apr 22 19:06:32.043456 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.043436 2569 log.go:25] "Validated CRI v1 image API" Apr 22 19:06:32.044686 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.044654 2569 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 19:06:32.049066 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.049042 2569 fs.go:135] Filesystem UUIDs: map[6e7a0af3-255d-41bc-bd66-1d4849c66515:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 a15136e8-1afb-4b67-ab59-1cd55bc9482f:/dev/nvme0n1p3] Apr 22 19:06:32.049143 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.049065 2569 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm 
major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 19:06:32.052209 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.052187 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 19:06:32.055735 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.055610 2569 manager.go:217] Machine: {Timestamp:2026-04-22 19:06:32.053123159 +0000 UTC m=+0.418537989 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099468 MemoryCapacity:32812167168 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec276dc398109bbb0e6d869038959a03 SystemUUID:ec276dc3-9810-9bbb-0e6d-869038959a03 BootID:d5f1d549-b48d-4be9-904a-9a9d170b71ed Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:3f:7a:7e:5c:35 Speed:0 Mtu:9001} {Name:ens5 
MacAddress:02:3f:7a:7e:5c:35 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:32:89:df:26:a3:a8 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812167168 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 19:06:32.055735 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.055727 2569 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 22 19:06:32.055858 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.055850 2569 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 19:06:32.056964 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.056932 2569 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 19:06:32.057114 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.056965 2569 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-22.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 19:06:32.057158 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.057124 2569 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 19:06:32.057158 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.057132 2569 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 19:06:32.057158 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.057145 2569 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 19:06:32.057940 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.057928 2569 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 19:06:32.059486 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.059476 2569 state_mem.go:36] "Initialized new in-memory state store" Apr 22 19:06:32.059591 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.059582 2569 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 19:06:32.061748 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.061738 2569 kubelet.go:491] "Attempting to sync node with API server" Apr 22 19:06:32.061805 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.061770 2569 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 19:06:32.061805 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.061783 2569 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 19:06:32.061805 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.061792 2569 kubelet.go:397] "Adding apiserver pod source" Apr 22 19:06:32.061805 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.061800 2569 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 22 19:06:32.062868 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.062855 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 19:06:32.062920 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.062875 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 19:06:32.065810 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.065765 2569 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 19:06:32.067132 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.067116 2569 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 19:06:32.068949 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.068934 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 19:06:32.069036 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.068964 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 19:06:32.069036 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.068974 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 19:06:32.069036 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.068987 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 19:06:32.069036 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.068997 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 19:06:32.069036 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.069006 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 19:06:32.069036 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.069015 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 
19:06:32.069036 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.069023 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 19:06:32.069036 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.069033 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 19:06:32.069289 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.069043 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 19:06:32.069289 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.069072 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 19:06:32.069289 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.069086 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 19:06:32.069957 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.069947 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 19:06:32.070002 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.069960 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 19:06:32.073412 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.073392 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-qcdrc" Apr 22 19:06:32.073846 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.073831 2569 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-22.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 19:06:32.073901 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:32.073882 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 19:06:32.073933 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:32.073884 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-22.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 19:06:32.074027 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.074014 2569 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 19:06:32.074076 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.074060 2569 server.go:1295] "Started kubelet" Apr 22 19:06:32.074236 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.074202 2569 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 19:06:32.074236 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.074194 2569 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 19:06:32.074352 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.074260 2569 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 19:06:32.076283 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.076258 2569 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 19:06:32.076368 ip-10-0-134-22 systemd[1]: Started Kubernetes Kubelet. 
Apr 22 19:06:32.078645 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.078629 2569 server.go:317] "Adding debug handlers to kubelet server" Apr 22 19:06:32.081156 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.081135 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-qcdrc" Apr 22 19:06:32.083023 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.083004 2569 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 19:06:32.083023 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.083008 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 19:06:32.083728 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.083709 2569 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 19:06:32.083843 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.083732 2569 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 19:06:32.083843 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.083781 2569 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 19:06:32.083843 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:32.082993 2569 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-22.ec2.internal.18a8c34dd65dc684 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-22.ec2.internal,UID:ip-10-0-134-22.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-134-22.ec2.internal,},FirstTimestamp:2026-04-22 19:06:32.074028676 +0000 UTC m=+0.439443505,LastTimestamp:2026-04-22 19:06:32.074028676 +0000 UTC 
m=+0.439443505,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-22.ec2.internal,}" Apr 22 19:06:32.084011 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.083949 2569 reconstruct.go:97] "Volume reconstruction finished" Apr 22 19:06:32.084011 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.083958 2569 reconciler.go:26] "Reconciler: start to sync state" Apr 22 19:06:32.084124 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.084065 2569 factory.go:55] Registering systemd factory Apr 22 19:06:32.084124 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.084103 2569 factory.go:223] Registration of the systemd container factory successfully Apr 22 19:06:32.084283 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:32.084262 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-22.ec2.internal\" not found" Apr 22 19:06:32.084364 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.084316 2569 factory.go:153] Registering CRI-O factory Apr 22 19:06:32.084364 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.084333 2569 factory.go:223] Registration of the crio container factory successfully Apr 22 19:06:32.084447 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.084385 2569 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 19:06:32.084447 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.084422 2569 factory.go:103] Registering Raw factory Apr 22 19:06:32.084447 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.084443 2569 manager.go:1196] Started watching for new ooms in manager Apr 22 19:06:32.085118 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.085101 2569 manager.go:319] Starting recovery of all containers Apr 22 
19:06:32.085686 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:32.085664 2569 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 19:06:32.094380 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.094358 2569 manager.go:324] Recovery completed Apr 22 19:06:32.096877 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.096857 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:06:32.099260 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.099248 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:06:32.099520 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:32.099504 2569 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-134-22.ec2.internal\" not found" node="ip-10-0-134-22.ec2.internal" Apr 22 19:06:32.101447 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.101432 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:06:32.101519 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.101462 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:06:32.101519 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.101472 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:06:32.101925 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.101913 2569 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 19:06:32.101972 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.101926 2569 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 19:06:32.101972 ip-10-0-134-22 
kubenswrapper[2569]: I0422 19:06:32.101945 2569 state_mem.go:36] "Initialized new in-memory state store" Apr 22 19:06:32.104440 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.104424 2569 policy_none.go:49] "None policy: Start" Apr 22 19:06:32.104440 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.104440 2569 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 19:06:32.104731 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.104449 2569 state_mem.go:35] "Initializing new in-memory state store" Apr 22 19:06:32.144385 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.144356 2569 manager.go:341] "Starting Device Plugin manager" Apr 22 19:06:32.148863 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:32.144420 2569 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 19:06:32.148863 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.144440 2569 server.go:85] "Starting device plugin registration server" Apr 22 19:06:32.148863 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.144707 2569 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 19:06:32.148863 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.144718 2569 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 19:06:32.148863 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.144818 2569 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 19:06:32.148863 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.144906 2569 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 19:06:32.148863 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.144914 2569 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 19:06:32.148863 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:32.145497 2569 eviction_manager.go:267] "eviction manager: failed to check if we 
have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 19:06:32.148863 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:32.145540 2569 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-22.ec2.internal\" not found" Apr 22 19:06:32.216396 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.216310 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 19:06:32.217959 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.217936 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 19:06:32.218057 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.217976 2569 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 19:06:32.218057 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.218005 2569 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 22 19:06:32.218057 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.218015 2569 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 19:06:32.218187 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:32.218063 2569 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 19:06:32.221249 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.221228 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:06:32.244862 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.244837 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:06:32.245932 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.245914 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:06:32.245986 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.245950 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:06:32.245986 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.245961 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:06:32.246046 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.245987 2569 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-22.ec2.internal" Apr 22 19:06:32.253671 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.253654 2569 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-22.ec2.internal" Apr 22 19:06:32.253733 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:32.253682 2569 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-22.ec2.internal\": node \"ip-10-0-134-22.ec2.internal\" not found" Apr 22 19:06:32.271436 
ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:32.271402 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-22.ec2.internal\" not found" Apr 22 19:06:32.318185 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.318151 2569 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-134-22.ec2.internal"] Apr 22 19:06:32.318293 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.318249 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:06:32.319268 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.319250 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:06:32.319371 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.319284 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:06:32.319371 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.319297 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:06:32.321031 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.321014 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:06:32.321160 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.321143 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal" Apr 22 19:06:32.321209 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.321177 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:06:32.321767 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.321736 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:06:32.321851 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.321740 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:06:32.321851 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.321805 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:06:32.321851 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.321818 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:06:32.321851 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.321779 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:06:32.322026 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.321856 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:06:32.322916 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.322901 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-22.ec2.internal" Apr 22 19:06:32.322995 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.322932 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:06:32.323678 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.323662 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:06:32.323744 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.323686 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:06:32.323744 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.323696 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:06:32.348652 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:32.348620 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-22.ec2.internal\" not found" node="ip-10-0-134-22.ec2.internal" Apr 22 19:06:32.353270 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:32.353250 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-22.ec2.internal\" not found" node="ip-10-0-134-22.ec2.internal" Apr 22 19:06:32.371829 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:32.371784 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-22.ec2.internal\" not found" Apr 22 19:06:32.472667 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:32.472591 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-22.ec2.internal\" not found" Apr 22 19:06:32.487098 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.487069 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f546dccbfe88d958c8bad79dd015e11c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal\" (UID: \"f546dccbfe88d958c8bad79dd015e11c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal" Apr 22 19:06:32.487098 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.487098 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f546dccbfe88d958c8bad79dd015e11c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal\" (UID: \"f546dccbfe88d958c8bad79dd015e11c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal" Apr 22 19:06:32.487259 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.487117 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1b90ee820fd4186f1e6cd40d24ef3276-config\") pod \"kube-apiserver-proxy-ip-10-0-134-22.ec2.internal\" (UID: \"1b90ee820fd4186f1e6cd40d24ef3276\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-22.ec2.internal" Apr 22 19:06:32.573531 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:32.573499 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-22.ec2.internal\" not found" Apr 22 19:06:32.587971 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.587929 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f546dccbfe88d958c8bad79dd015e11c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal\" (UID: \"f546dccbfe88d958c8bad79dd015e11c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal" Apr 22 19:06:32.588048 ip-10-0-134-22 
kubenswrapper[2569]: I0422 19:06:32.587978 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f546dccbfe88d958c8bad79dd015e11c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal\" (UID: \"f546dccbfe88d958c8bad79dd015e11c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal" Apr 22 19:06:32.588048 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.587997 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1b90ee820fd4186f1e6cd40d24ef3276-config\") pod \"kube-apiserver-proxy-ip-10-0-134-22.ec2.internal\" (UID: \"1b90ee820fd4186f1e6cd40d24ef3276\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-22.ec2.internal" Apr 22 19:06:32.588048 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.588038 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1b90ee820fd4186f1e6cd40d24ef3276-config\") pod \"kube-apiserver-proxy-ip-10-0-134-22.ec2.internal\" (UID: \"1b90ee820fd4186f1e6cd40d24ef3276\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-22.ec2.internal" Apr 22 19:06:32.588159 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.588047 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f546dccbfe88d958c8bad79dd015e11c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal\" (UID: \"f546dccbfe88d958c8bad79dd015e11c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal" Apr 22 19:06:32.588159 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.588053 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/f546dccbfe88d958c8bad79dd015e11c-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal\" (UID: \"f546dccbfe88d958c8bad79dd015e11c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal" Apr 22 19:06:32.651102 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.651073 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal" Apr 22 19:06:32.655659 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.655636 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-22.ec2.internal" Apr 22 19:06:32.674364 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:32.674330 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-22.ec2.internal\" not found" Apr 22 19:06:32.774820 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:32.774707 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-22.ec2.internal\" not found" Apr 22 19:06:32.875276 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:32.875231 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-22.ec2.internal\" not found" Apr 22 19:06:32.975969 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:32.975933 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-22.ec2.internal\" not found" Apr 22 19:06:32.992436 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.992394 2569 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 19:06:32.992577 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.992556 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected 
watch close - watch lasted less than a second and no items received" Apr 22 19:06:32.992614 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:32.992593 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 19:06:33.076460 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:33.076433 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-22.ec2.internal\" not found" Apr 22 19:06:33.082946 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:33.082909 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 19:01:32 +0000 UTC" deadline="2028-02-07 03:59:00.240075624 +0000 UTC" Apr 22 19:06:33.082946 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:33.082940 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15728h52m27.157139537s" Apr 22 19:06:33.083503 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:33.083485 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 19:06:33.085996 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:33.085968 2569 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:06:33.094485 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:33.094453 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 19:06:33.115246 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:33.115216 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" 
logger="kubernetes.io/kubelet-serving" csr="csr-rttq7" Apr 22 19:06:33.116906 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:33.116886 2569 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:06:33.123910 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:33.123883 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-rttq7" Apr 22 19:06:33.177815 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:33.177743 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b90ee820fd4186f1e6cd40d24ef3276.slice/crio-86cddbbf290863514d83643488b0e6fd10b75cd6358660226c755f4b5c68d31b WatchSource:0}: Error finding container 86cddbbf290863514d83643488b0e6fd10b75cd6358660226c755f4b5c68d31b: Status 404 returned error can't find the container with id 86cddbbf290863514d83643488b0e6fd10b75cd6358660226c755f4b5c68d31b Apr 22 19:06:33.178258 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:33.178234 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf546dccbfe88d958c8bad79dd015e11c.slice/crio-5b504b56628f1e10c988b8864b32aee37bdfbe94493e5edf30c3f5b78dba999e WatchSource:0}: Error finding container 5b504b56628f1e10c988b8864b32aee37bdfbe94493e5edf30c3f5b78dba999e: Status 404 returned error can't find the container with id 5b504b56628f1e10c988b8864b32aee37bdfbe94493e5edf30c3f5b78dba999e Apr 22 19:06:33.181693 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:33.181677 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:06:33.183390 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:33.183375 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal" Apr 22 19:06:33.194635 ip-10-0-134-22 
kubenswrapper[2569]: I0422 19:06:33.194609 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 19:06:33.195621 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:33.195603 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-22.ec2.internal" Apr 22 19:06:33.205053 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:33.205030 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 19:06:33.221190 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:33.221139 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal" event={"ID":"f546dccbfe88d958c8bad79dd015e11c","Type":"ContainerStarted","Data":"5b504b56628f1e10c988b8864b32aee37bdfbe94493e5edf30c3f5b78dba999e"} Apr 22 19:06:33.222154 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:33.222130 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-22.ec2.internal" event={"ID":"1b90ee820fd4186f1e6cd40d24ef3276","Type":"ContainerStarted","Data":"86cddbbf290863514d83643488b0e6fd10b75cd6358660226c755f4b5c68d31b"} Apr 22 19:06:34.063581 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.063549 2569 apiserver.go:52] "Watching apiserver" Apr 22 19:06:34.078859 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.078632 2569 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 19:06:34.080805 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.080773 2569 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-multus/multus-zgnnh","openshift-multus/network-metrics-daemon-2r8qp","openshift-network-diagnostics/network-check-target-bc5ws","openshift-network-operator/iptables-alerter-vnq95","kube-system/konnectivity-agent-xjfbh","kube-system/kube-apiserver-proxy-ip-10-0-134-22.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pnm2w","openshift-cluster-node-tuning-operator/tuned-rbn8q","openshift-dns/node-resolver-2wjct","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal","openshift-multus/multus-additional-cni-plugins-x7gv6","openshift-ovn-kubernetes/ovnkube-node-tfk46","openshift-image-registry/node-ca-hdwcg"] Apr 22 19:06:34.084295 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.084267 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pnm2w" Apr 22 19:06:34.086564 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.086540 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2r8qp" Apr 22 19:06:34.086719 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:34.086620 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2r8qp" podUID="46a3468d-b017-471c-a0df-a07b1c183ff4" Apr 22 19:06:34.086949 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.086928 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 19:06:34.087013 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.086931 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 19:06:34.087013 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.086992 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 19:06:34.087139 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.086999 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-5zjtj\"" Apr 22 19:06:34.091364 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.090935 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc5ws" Apr 22 19:06:34.091364 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:34.091014 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bc5ws" podUID="bfdf183b-244f-461d-96f1-6416cbc8cf68" Apr 22 19:06:34.093171 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.093155 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-vnq95" Apr 22 19:06:34.094657 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.094562 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7lvc\" (UniqueName: \"kubernetes.io/projected/46a3468d-b017-471c-a0df-a07b1c183ff4-kube-api-access-n7lvc\") pod \"network-metrics-daemon-2r8qp\" (UID: \"46a3468d-b017-471c-a0df-a07b1c183ff4\") " pod="openshift-multus/network-metrics-daemon-2r8qp" Apr 22 19:06:34.094657 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.094622 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e124a62-5e48-4542-b1f3-a08b56fc7221-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pnm2w\" (UID: \"8e124a62-5e48-4542-b1f3-a08b56fc7221\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pnm2w" Apr 22 19:06:34.094657 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.094649 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8e124a62-5e48-4542-b1f3-a08b56fc7221-socket-dir\") pod \"aws-ebs-csi-driver-node-pnm2w\" (UID: \"8e124a62-5e48-4542-b1f3-a08b56fc7221\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pnm2w" Apr 22 19:06:34.094851 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.094674 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8e124a62-5e48-4542-b1f3-a08b56fc7221-device-dir\") pod \"aws-ebs-csi-driver-node-pnm2w\" (UID: \"8e124a62-5e48-4542-b1f3-a08b56fc7221\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pnm2w" Apr 22 19:06:34.094851 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.094721 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8e124a62-5e48-4542-b1f3-a08b56fc7221-etc-selinux\") pod \"aws-ebs-csi-driver-node-pnm2w\" (UID: \"8e124a62-5e48-4542-b1f3-a08b56fc7221\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pnm2w" Apr 22 19:06:34.094851 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.094782 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46a3468d-b017-471c-a0df-a07b1c183ff4-metrics-certs\") pod \"network-metrics-daemon-2r8qp\" (UID: \"46a3468d-b017-471c-a0df-a07b1c183ff4\") " pod="openshift-multus/network-metrics-daemon-2r8qp" Apr 22 19:06:34.094851 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.094811 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8e124a62-5e48-4542-b1f3-a08b56fc7221-registration-dir\") pod \"aws-ebs-csi-driver-node-pnm2w\" (UID: \"8e124a62-5e48-4542-b1f3-a08b56fc7221\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pnm2w" Apr 22 19:06:34.094851 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.094839 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8e124a62-5e48-4542-b1f3-a08b56fc7221-sys-fs\") pod \"aws-ebs-csi-driver-node-pnm2w\" (UID: \"8e124a62-5e48-4542-b1f3-a08b56fc7221\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pnm2w" Apr 22 19:06:34.095035 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.094866 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lblqx\" (UniqueName: \"kubernetes.io/projected/8e124a62-5e48-4542-b1f3-a08b56fc7221-kube-api-access-lblqx\") pod \"aws-ebs-csi-driver-node-pnm2w\" (UID: 
\"8e124a62-5e48-4542-b1f3-a08b56fc7221\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pnm2w" Apr 22 19:06:34.095342 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.095304 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-xjfbh" Apr 22 19:06:34.095564 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.095547 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-ddjjq\"" Apr 22 19:06:34.095837 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.095819 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:06:34.095913 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.095905 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 19:06:34.096041 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.096017 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 19:06:34.097571 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.097395 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 19:06:34.097571 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.097523 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-zgnnh" Apr 22 19:06:34.097799 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.097609 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-nd968\"" Apr 22 19:06:34.097989 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.097963 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 19:06:34.099783 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.099531 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 19:06:34.100073 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.100046 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-jp9nb\"" Apr 22 19:06:34.100197 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.100179 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-2wjct" Apr 22 19:06:34.100429 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.100400 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 19:06:34.100612 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.100592 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 19:06:34.100682 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.100649 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 19:06:34.102179 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.102162 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 19:06:34.102782 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.102614 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-546cr\"" Apr 22 19:06:34.103068 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.103048 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-rbn8q" Apr 22 19:06:34.103154 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.103145 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-x7gv6" Apr 22 19:06:34.103627 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.103148 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 19:06:34.105348 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.105332 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 19:06:34.105438 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.105363 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-9rstv\"" Apr 22 19:06:34.105579 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.105563 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:06:34.105579 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.105575 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 19:06:34.107423 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.105782 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 19:06:34.107423 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.106288 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-qwmh9\"" Apr 22 19:06:34.108645 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.108625 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-hdwcg" Apr 22 19:06:34.111041 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.109080 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" Apr 22 19:06:34.116036 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.113745 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 19:06:34.116036 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.113793 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 19:06:34.116036 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.113880 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-8j5vf\"" Apr 22 19:06:34.116036 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.114254 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 19:06:34.116036 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.114479 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 19:06:34.116036 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.114708 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 19:06:34.116036 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.114723 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 19:06:34.116036 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.115147 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 19:06:34.116036 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.115449 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 19:06:34.116036 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.115463 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 19:06:34.116036 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.115596 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-zh7lz\"" Apr 22 19:06:34.124653 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.124628 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:01:33 +0000 UTC" deadline="2027-12-25 06:32:11.301280924 +0000 UTC" Apr 22 19:06:34.124653 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.124652 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14675h25m37.176632193s" Apr 22 19:06:34.144520 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.144488 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:06:34.185309 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.185279 2569 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 19:06:34.195075 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.195042 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-etc-kubernetes\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh" Apr 22 19:06:34.195241 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.195089 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lblqx\" 
(UniqueName: \"kubernetes.io/projected/8e124a62-5e48-4542-b1f3-a08b56fc7221-kube-api-access-lblqx\") pod \"aws-ebs-csi-driver-node-pnm2w\" (UID: \"8e124a62-5e48-4542-b1f3-a08b56fc7221\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pnm2w" Apr 22 19:06:34.195241 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.195114 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-os-release\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh" Apr 22 19:06:34.195241 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.195167 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-multus-socket-dir-parent\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh" Apr 22 19:06:34.195241 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.195213 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2f15bc14-85c1-4370-8e8c-dfc474a5636b-hosts-file\") pod \"node-resolver-2wjct\" (UID: \"2f15bc14-85c1-4370-8e8c-dfc474a5636b\") " pod="openshift-dns/node-resolver-2wjct" Apr 22 19:06:34.195429 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.195243 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-node-log\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" Apr 22 19:06:34.195429 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.195281 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8e124a62-5e48-4542-b1f3-a08b56fc7221-socket-dir\") pod \"aws-ebs-csi-driver-node-pnm2w\" (UID: \"8e124a62-5e48-4542-b1f3-a08b56fc7221\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pnm2w" Apr 22 19:06:34.195429 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.195307 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/538109ae-d500-4808-ad16-e32a5799d18d-etc-modprobe-d\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q" Apr 22 19:06:34.195429 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.195349 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/538109ae-d500-4808-ad16-e32a5799d18d-etc-sysconfig\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q" Apr 22 19:06:34.195429 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.195375 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/61b5731d-8883-44c4-a6de-2a90288f2d58-env-overrides\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" Apr 22 19:06:34.195429 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.195395 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/61b5731d-8883-44c4-a6de-2a90288f2d58-ovn-node-metrics-cert\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" 
Apr 22 19:06:34.195661 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.195437 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7lvc\" (UniqueName: \"kubernetes.io/projected/46a3468d-b017-471c-a0df-a07b1c183ff4-kube-api-access-n7lvc\") pod \"network-metrics-daemon-2r8qp\" (UID: \"46a3468d-b017-471c-a0df-a07b1c183ff4\") " pod="openshift-multus/network-metrics-daemon-2r8qp" Apr 22 19:06:34.195661 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.195471 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8e124a62-5e48-4542-b1f3-a08b56fc7221-socket-dir\") pod \"aws-ebs-csi-driver-node-pnm2w\" (UID: \"8e124a62-5e48-4542-b1f3-a08b56fc7221\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pnm2w" Apr 22 19:06:34.195661 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.195499 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/538109ae-d500-4808-ad16-e32a5799d18d-etc-kubernetes\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q" Apr 22 19:06:34.195661 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.195625 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-host-kubelet\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" Apr 22 19:06:34.195841 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.195666 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ccd0ced6-c30a-4fe6-8fc3-356740ce7c61-konnectivity-ca\") pod 
\"konnectivity-agent-xjfbh\" (UID: \"ccd0ced6-c30a-4fe6-8fc3-356740ce7c61\") " pod="kube-system/konnectivity-agent-xjfbh" Apr 22 19:06:34.195841 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.195692 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-multus-conf-dir\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh" Apr 22 19:06:34.195841 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.195710 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/538109ae-d500-4808-ad16-e32a5799d18d-lib-modules\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q" Apr 22 19:06:34.195841 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.195725 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-host-cni-netd\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" Apr 22 19:06:34.195841 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.195749 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1e9b0a71-0187-42db-855a-762dfaa227aa-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x7gv6\" (UID: \"1e9b0a71-0187-42db-855a-762dfaa227aa\") " pod="openshift-multus/multus-additional-cni-plugins-x7gv6" Apr 22 19:06:34.195841 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.195818 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-q7972\" (UniqueName: \"kubernetes.io/projected/bfdf183b-244f-461d-96f1-6416cbc8cf68-kube-api-access-q7972\") pod \"network-check-target-bc5ws\" (UID: \"bfdf183b-244f-461d-96f1-6416cbc8cf68\") " pod="openshift-network-diagnostics/network-check-target-bc5ws"
Apr 22 19:06:34.196078 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.195847 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-host-run-k8s-cni-cncf-io\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh"
Apr 22 19:06:34.196078 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.195895 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-hostroot\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh"
Apr 22 19:06:34.196078 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.195915 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-host-run-multus-certs\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh"
Apr 22 19:06:34.196078 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.195933 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/538109ae-d500-4808-ad16-e32a5799d18d-etc-systemd\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q"
Apr 22 19:06:34.196078 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.195954 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1e9b0a71-0187-42db-855a-762dfaa227aa-os-release\") pod \"multus-additional-cni-plugins-x7gv6\" (UID: \"1e9b0a71-0187-42db-855a-762dfaa227aa\") " pod="openshift-multus/multus-additional-cni-plugins-x7gv6"
Apr 22 19:06:34.196078 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.195978 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1e9b0a71-0187-42db-855a-762dfaa227aa-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x7gv6\" (UID: \"1e9b0a71-0187-42db-855a-762dfaa227aa\") " pod="openshift-multus/multus-additional-cni-plugins-x7gv6"
Apr 22 19:06:34.196078 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196003 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1e9b0a71-0187-42db-855a-762dfaa227aa-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-x7gv6\" (UID: \"1e9b0a71-0187-42db-855a-762dfaa227aa\") " pod="openshift-multus/multus-additional-cni-plugins-x7gv6"
Apr 22 19:06:34.196078 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196028 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-run-systemd\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46"
Apr 22 19:06:34.196078 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196051 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-run-ovn\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46"
Apr 22 19:06:34.196078 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196073 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8ab2b075-d3d5-4d3a-848e-89344c4f11b6-host\") pod \"node-ca-hdwcg\" (UID: \"8ab2b075-d3d5-4d3a-848e-89344c4f11b6\") " pod="openshift-image-registry/node-ca-hdwcg"
Apr 22 19:06:34.196456 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196088 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p9rr\" (UniqueName: \"kubernetes.io/projected/20f9a637-74d1-4828-8ab2-1bd5515e85ba-kube-api-access-2p9rr\") pod \"iptables-alerter-vnq95\" (UID: \"20f9a637-74d1-4828-8ab2-1bd5515e85ba\") " pod="openshift-network-operator/iptables-alerter-vnq95"
Apr 22 19:06:34.196456 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196110 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/538109ae-d500-4808-ad16-e32a5799d18d-var-lib-kubelet\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q"
Apr 22 19:06:34.196456 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196131 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/538109ae-d500-4808-ad16-e32a5799d18d-etc-tuned\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q"
Apr 22 19:06:34.196456 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196165 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8e124a62-5e48-4542-b1f3-a08b56fc7221-device-dir\") pod \"aws-ebs-csi-driver-node-pnm2w\" (UID: \"8e124a62-5e48-4542-b1f3-a08b56fc7221\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pnm2w"
Apr 22 19:06:34.196456 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196206 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8dd6df0f-e645-41a7-b974-0454616bb56e-cni-binary-copy\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh"
Apr 22 19:06:34.196456 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196211 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8e124a62-5e48-4542-b1f3-a08b56fc7221-device-dir\") pod \"aws-ebs-csi-driver-node-pnm2w\" (UID: \"8e124a62-5e48-4542-b1f3-a08b56fc7221\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pnm2w"
Apr 22 19:06:34.196456 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196246 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qg4r\" (UniqueName: \"kubernetes.io/projected/538109ae-d500-4808-ad16-e32a5799d18d-kube-api-access-9qg4r\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q"
Apr 22 19:06:34.196456 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196271 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-host-run-netns\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46"
Apr 22 19:06:34.196456 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196327 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1e9b0a71-0187-42db-855a-762dfaa227aa-system-cni-dir\") pod \"multus-additional-cni-plugins-x7gv6\" (UID: \"1e9b0a71-0187-42db-855a-762dfaa227aa\") " pod="openshift-multus/multus-additional-cni-plugins-x7gv6"
Apr 22 19:06:34.196456 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196343 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1e9b0a71-0187-42db-855a-762dfaa227aa-cni-binary-copy\") pod \"multus-additional-cni-plugins-x7gv6\" (UID: \"1e9b0a71-0187-42db-855a-762dfaa227aa\") " pod="openshift-multus/multus-additional-cni-plugins-x7gv6"
Apr 22 19:06:34.196456 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196364 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8e124a62-5e48-4542-b1f3-a08b56fc7221-registration-dir\") pod \"aws-ebs-csi-driver-node-pnm2w\" (UID: \"8e124a62-5e48-4542-b1f3-a08b56fc7221\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pnm2w"
Apr 22 19:06:34.196456 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196391 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-multus-cni-dir\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh"
Apr 22 19:06:34.196456 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196414 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8dd6df0f-e645-41a7-b974-0454616bb56e-multus-daemon-config\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh"
Apr 22 19:06:34.196456 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196437 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/538109ae-d500-4808-ad16-e32a5799d18d-host\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q"
Apr 22 19:06:34.196456 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196453 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8e124a62-5e48-4542-b1f3-a08b56fc7221-registration-dir\") pod \"aws-ebs-csi-driver-node-pnm2w\" (UID: \"8e124a62-5e48-4542-b1f3-a08b56fc7221\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pnm2w"
Apr 22 19:06:34.197088 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196481 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ks2n\" (UniqueName: \"kubernetes.io/projected/2f15bc14-85c1-4370-8e8c-dfc474a5636b-kube-api-access-9ks2n\") pod \"node-resolver-2wjct\" (UID: \"2f15bc14-85c1-4370-8e8c-dfc474a5636b\") " pod="openshift-dns/node-resolver-2wjct"
Apr 22 19:06:34.197088 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196505 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-host-slash\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46"
Apr 22 19:06:34.197088 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196520 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-var-lib-openvswitch\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46"
Apr 22 19:06:34.197088 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196534 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/61b5731d-8883-44c4-a6de-2a90288f2d58-ovnkube-script-lib\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46"
Apr 22 19:06:34.197088 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196549 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-cnibin\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh"
Apr 22 19:06:34.197088 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196564 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8e124a62-5e48-4542-b1f3-a08b56fc7221-sys-fs\") pod \"aws-ebs-csi-driver-node-pnm2w\" (UID: \"8e124a62-5e48-4542-b1f3-a08b56fc7221\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pnm2w"
Apr 22 19:06:34.197088 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196580 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/538109ae-d500-4808-ad16-e32a5799d18d-etc-sysctl-d\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q"
Apr 22 19:06:34.197088 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196612 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-log-socket\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46"
Apr 22 19:06:34.197088 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196637 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8e124a62-5e48-4542-b1f3-a08b56fc7221-sys-fs\") pod \"aws-ebs-csi-driver-node-pnm2w\" (UID: \"8e124a62-5e48-4542-b1f3-a08b56fc7221\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pnm2w"
Apr 22 19:06:34.197088 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196638 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-host-run-ovn-kubernetes\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46"
Apr 22 19:06:34.197088 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196662 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-host-cni-bin\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46"
Apr 22 19:06:34.197088 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196680 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8ab2b075-d3d5-4d3a-848e-89344c4f11b6-serviceca\") pod \"node-ca-hdwcg\" (UID: \"8ab2b075-d3d5-4d3a-848e-89344c4f11b6\") " pod="openshift-image-registry/node-ca-hdwcg"
Apr 22 19:06:34.197088 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196705 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5v4c\" (UniqueName: \"kubernetes.io/projected/8ab2b075-d3d5-4d3a-848e-89344c4f11b6-kube-api-access-m5v4c\") pod \"node-ca-hdwcg\" (UID: \"8ab2b075-d3d5-4d3a-848e-89344c4f11b6\") " pod="openshift-image-registry/node-ca-hdwcg"
Apr 22 19:06:34.197088 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196721 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ccd0ced6-c30a-4fe6-8fc3-356740ce7c61-agent-certs\") pod \"konnectivity-agent-xjfbh\" (UID: \"ccd0ced6-c30a-4fe6-8fc3-356740ce7c61\") " pod="kube-system/konnectivity-agent-xjfbh"
Apr 22 19:06:34.197088 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196738 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e124a62-5e48-4542-b1f3-a08b56fc7221-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pnm2w\" (UID: \"8e124a62-5e48-4542-b1f3-a08b56fc7221\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pnm2w"
Apr 22 19:06:34.197088 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196774 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-host-var-lib-cni-multus\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh"
Apr 22 19:06:34.197088 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196800 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/538109ae-d500-4808-ad16-e32a5799d18d-etc-sysctl-conf\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q"
Apr 22 19:06:34.197659 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196820 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/20f9a637-74d1-4828-8ab2-1bd5515e85ba-host-slash\") pod \"iptables-alerter-vnq95\" (UID: \"20f9a637-74d1-4828-8ab2-1bd5515e85ba\") " pod="openshift-network-operator/iptables-alerter-vnq95"
Apr 22 19:06:34.197659 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196836 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-host-var-lib-kubelet\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh"
Apr 22 19:06:34.197659 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196834 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8e124a62-5e48-4542-b1f3-a08b56fc7221-kubelet-dir\") pod \"aws-ebs-csi-driver-node-pnm2w\" (UID: \"8e124a62-5e48-4542-b1f3-a08b56fc7221\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pnm2w"
Apr 22 19:06:34.197659 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196856 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq49x\" (UniqueName: \"kubernetes.io/projected/8dd6df0f-e645-41a7-b974-0454616bb56e-kube-api-access-cq49x\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh"
Apr 22 19:06:34.197659 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196875 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-etc-openvswitch\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46"
Apr 22 19:06:34.197659 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196891 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46a3468d-b017-471c-a0df-a07b1c183ff4-metrics-certs\") pod \"network-metrics-daemon-2r8qp\" (UID: \"46a3468d-b017-471c-a0df-a07b1c183ff4\") " pod="openshift-multus/network-metrics-daemon-2r8qp"
Apr 22 19:06:34.197659 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196917 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-host-run-netns\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh"
Apr 22 19:06:34.197659 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196939 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-host-var-lib-cni-bin\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh"
Apr 22 19:06:34.197659 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196959 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/538109ae-d500-4808-ad16-e32a5799d18d-run\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q"
Apr 22 19:06:34.197659 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.196980 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/538109ae-d500-4808-ad16-e32a5799d18d-tmp\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q"
Apr 22 19:06:34.197659 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:34.196985 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:06:34.197659 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.197003 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-run-openvswitch\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46"
Apr 22 19:06:34.197659 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.197028 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46"
Apr 22 19:06:34.197659 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:34.197040 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46a3468d-b017-471c-a0df-a07b1c183ff4-metrics-certs podName:46a3468d-b017-471c-a0df-a07b1c183ff4 nodeName:}" failed. No retries permitted until 2026-04-22 19:06:34.69702442 +0000 UTC m=+3.062439234 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46a3468d-b017-471c-a0df-a07b1c183ff4-metrics-certs") pod "network-metrics-daemon-2r8qp" (UID: "46a3468d-b017-471c-a0df-a07b1c183ff4") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:06:34.197659 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.197092 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1e9b0a71-0187-42db-855a-762dfaa227aa-cnibin\") pod \"multus-additional-cni-plugins-x7gv6\" (UID: \"1e9b0a71-0187-42db-855a-762dfaa227aa\") " pod="openshift-multus/multus-additional-cni-plugins-x7gv6"
Apr 22 19:06:34.197659 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.197135 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-system-cni-dir\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh"
Apr 22 19:06:34.198211 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.197150 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/538109ae-d500-4808-ad16-e32a5799d18d-sys\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q"
Apr 22 19:06:34.198211 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.197165 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-systemd-units\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46"
Apr 22 19:06:34.198211 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.197183 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/61b5731d-8883-44c4-a6de-2a90288f2d58-ovnkube-config\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46"
Apr 22 19:06:34.198211 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.197208 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdxp9\" (UniqueName: \"kubernetes.io/projected/61b5731d-8883-44c4-a6de-2a90288f2d58-kube-api-access-jdxp9\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46"
Apr 22 19:06:34.198211 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.197224 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r87gx\" (UniqueName: \"kubernetes.io/projected/1e9b0a71-0187-42db-855a-762dfaa227aa-kube-api-access-r87gx\") pod \"multus-additional-cni-plugins-x7gv6\" (UID: \"1e9b0a71-0187-42db-855a-762dfaa227aa\") " pod="openshift-multus/multus-additional-cni-plugins-x7gv6"
Apr 22 19:06:34.198211 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.197239 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/20f9a637-74d1-4828-8ab2-1bd5515e85ba-iptables-alerter-script\") pod \"iptables-alerter-vnq95\" (UID: \"20f9a637-74d1-4828-8ab2-1bd5515e85ba\") " pod="openshift-network-operator/iptables-alerter-vnq95"
Apr 22 19:06:34.198211 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.197265 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8e124a62-5e48-4542-b1f3-a08b56fc7221-etc-selinux\") pod \"aws-ebs-csi-driver-node-pnm2w\" (UID: \"8e124a62-5e48-4542-b1f3-a08b56fc7221\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pnm2w"
Apr 22 19:06:34.198211 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.197285 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2f15bc14-85c1-4370-8e8c-dfc474a5636b-tmp-dir\") pod \"node-resolver-2wjct\" (UID: \"2f15bc14-85c1-4370-8e8c-dfc474a5636b\") " pod="openshift-dns/node-resolver-2wjct"
Apr 22 19:06:34.198211 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.197459 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8e124a62-5e48-4542-b1f3-a08b56fc7221-etc-selinux\") pod \"aws-ebs-csi-driver-node-pnm2w\" (UID: \"8e124a62-5e48-4542-b1f3-a08b56fc7221\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pnm2w"
Apr 22 19:06:34.200589 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.200564 2569 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 22 19:06:34.204174 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.204152 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lblqx\" (UniqueName: \"kubernetes.io/projected/8e124a62-5e48-4542-b1f3-a08b56fc7221-kube-api-access-lblqx\") pod \"aws-ebs-csi-driver-node-pnm2w\" (UID: \"8e124a62-5e48-4542-b1f3-a08b56fc7221\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pnm2w"
Apr 22 19:06:34.204276 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.204182 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7lvc\" (UniqueName: \"kubernetes.io/projected/46a3468d-b017-471c-a0df-a07b1c183ff4-kube-api-access-n7lvc\") pod \"network-metrics-daemon-2r8qp\" (UID: \"46a3468d-b017-471c-a0df-a07b1c183ff4\") " pod="openshift-multus/network-metrics-daemon-2r8qp"
Apr 22 19:06:34.212857 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.212836 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:06:34.297608 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.297575 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-multus-conf-dir\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh"
Apr 22 19:06:34.297608 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.297609 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/538109ae-d500-4808-ad16-e32a5799d18d-lib-modules\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q"
Apr 22 19:06:34.297854 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.297626 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-host-cni-netd\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46"
Apr 22 19:06:34.297854 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.297650 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1e9b0a71-0187-42db-855a-762dfaa227aa-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x7gv6\" (UID: \"1e9b0a71-0187-42db-855a-762dfaa227aa\") " pod="openshift-multus/multus-additional-cni-plugins-x7gv6"
Apr 22 19:06:34.297854 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.297702 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-multus-conf-dir\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh"
Apr 22 19:06:34.297854 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.297773 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-host-cni-netd\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46"
Apr 22 19:06:34.297854 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.297816 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q7972\" (UniqueName: \"kubernetes.io/projected/bfdf183b-244f-461d-96f1-6416cbc8cf68-kube-api-access-q7972\") pod \"network-check-target-bc5ws\" (UID: \"bfdf183b-244f-461d-96f1-6416cbc8cf68\") " pod="openshift-network-diagnostics/network-check-target-bc5ws"
Apr 22 19:06:34.297854 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.297823 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/538109ae-d500-4808-ad16-e32a5799d18d-lib-modules\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q"
Apr 22 19:06:34.297854 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.297827 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1e9b0a71-0187-42db-855a-762dfaa227aa-tuning-conf-dir\") pod \"multus-additional-cni-plugins-x7gv6\" (UID: \"1e9b0a71-0187-42db-855a-762dfaa227aa\") " pod="openshift-multus/multus-additional-cni-plugins-x7gv6"
Apr 22 19:06:34.298132 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.297895 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-host-run-k8s-cni-cncf-io\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh"
Apr 22 19:06:34.298132 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.297927 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-hostroot\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh"
Apr 22 19:06:34.298132 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.297946 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-host-run-multus-certs\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh"
Apr 22 19:06:34.298132 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.297978 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-host-run-multus-certs\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh"
Apr 22 19:06:34.298132 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.297983 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-host-run-k8s-cni-cncf-io\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh"
Apr 22 19:06:34.298132 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.297996 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/538109ae-d500-4808-ad16-e32a5799d18d-etc-systemd\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q"
Apr 22 19:06:34.298132 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.298008 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-hostroot\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh"
Apr 22 19:06:34.298132 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.298013 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1e9b0a71-0187-42db-855a-762dfaa227aa-os-release\") pod \"multus-additional-cni-plugins-x7gv6\" (UID: \"1e9b0a71-0187-42db-855a-762dfaa227aa\") " pod="openshift-multus/multus-additional-cni-plugins-x7gv6"
Apr 22 19:06:34.298132 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.298027 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/538109ae-d500-4808-ad16-e32a5799d18d-etc-systemd\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q"
Apr 22 19:06:34.298132 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.298051 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1e9b0a71-0187-42db-855a-762dfaa227aa-os-release\") pod \"multus-additional-cni-plugins-x7gv6\" (UID: \"1e9b0a71-0187-42db-855a-762dfaa227aa\") " pod="openshift-multus/multus-additional-cni-plugins-x7gv6"
Apr 22 19:06:34.298132 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.298055 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1e9b0a71-0187-42db-855a-762dfaa227aa-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x7gv6\" (UID: \"1e9b0a71-0187-42db-855a-762dfaa227aa\") " pod="openshift-multus/multus-additional-cni-plugins-x7gv6"
Apr 22 19:06:34.298132 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.298084 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1e9b0a71-0187-42db-855a-762dfaa227aa-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-x7gv6\" (UID: \"1e9b0a71-0187-42db-855a-762dfaa227aa\") " pod="openshift-multus/multus-additional-cni-plugins-x7gv6"
Apr 22 19:06:34.298672 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.298237 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-run-systemd\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46"
Apr 22 19:06:34.298672 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.298277 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-run-ovn\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46"
Apr 22 19:06:34.298672 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.298299 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-run-systemd\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46"
Apr 22 19:06:34.298672 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.298337 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-run-ovn\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46"
Apr 22 19:06:34.298672 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.298338 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8ab2b075-d3d5-4d3a-848e-89344c4f11b6-host\") pod \"node-ca-hdwcg\" (UID: \"8ab2b075-d3d5-4d3a-848e-89344c4f11b6\") " pod="openshift-image-registry/node-ca-hdwcg"
Apr 22 19:06:34.298672 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.298303 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8ab2b075-d3d5-4d3a-848e-89344c4f11b6-host\") pod \"node-ca-hdwcg\" (UID: \"8ab2b075-d3d5-4d3a-848e-89344c4f11b6\") " pod="openshift-image-registry/node-ca-hdwcg"
Apr 22
19:06:34.298672 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.298375 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2p9rr\" (UniqueName: \"kubernetes.io/projected/20f9a637-74d1-4828-8ab2-1bd5515e85ba-kube-api-access-2p9rr\") pod \"iptables-alerter-vnq95\" (UID: \"20f9a637-74d1-4828-8ab2-1bd5515e85ba\") " pod="openshift-network-operator/iptables-alerter-vnq95" Apr 22 19:06:34.298672 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.298398 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/538109ae-d500-4808-ad16-e32a5799d18d-var-lib-kubelet\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q" Apr 22 19:06:34.298672 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.298422 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/538109ae-d500-4808-ad16-e32a5799d18d-etc-tuned\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q" Apr 22 19:06:34.298672 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.298445 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8dd6df0f-e645-41a7-b974-0454616bb56e-cni-binary-copy\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh" Apr 22 19:06:34.298672 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.298470 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qg4r\" (UniqueName: \"kubernetes.io/projected/538109ae-d500-4808-ad16-e32a5799d18d-kube-api-access-9qg4r\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " 
pod="openshift-cluster-node-tuning-operator/tuned-rbn8q" Apr 22 19:06:34.298672 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.298644 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1e9b0a71-0187-42db-855a-762dfaa227aa-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-x7gv6\" (UID: \"1e9b0a71-0187-42db-855a-762dfaa227aa\") " pod="openshift-multus/multus-additional-cni-plugins-x7gv6" Apr 22 19:06:34.299187 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.298680 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1e9b0a71-0187-42db-855a-762dfaa227aa-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-x7gv6\" (UID: \"1e9b0a71-0187-42db-855a-762dfaa227aa\") " pod="openshift-multus/multus-additional-cni-plugins-x7gv6" Apr 22 19:06:34.299187 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.298706 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-host-run-netns\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" Apr 22 19:06:34.299187 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.298737 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1e9b0a71-0187-42db-855a-762dfaa227aa-system-cni-dir\") pod \"multus-additional-cni-plugins-x7gv6\" (UID: \"1e9b0a71-0187-42db-855a-762dfaa227aa\") " pod="openshift-multus/multus-additional-cni-plugins-x7gv6" Apr 22 19:06:34.299187 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.298740 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/538109ae-d500-4808-ad16-e32a5799d18d-var-lib-kubelet\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q" Apr 22 19:06:34.299187 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.298790 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1e9b0a71-0187-42db-855a-762dfaa227aa-cni-binary-copy\") pod \"multus-additional-cni-plugins-x7gv6\" (UID: \"1e9b0a71-0187-42db-855a-762dfaa227aa\") " pod="openshift-multus/multus-additional-cni-plugins-x7gv6" Apr 22 19:06:34.299187 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.298799 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-host-run-netns\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" Apr 22 19:06:34.299187 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.298821 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-multus-cni-dir\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh" Apr 22 19:06:34.299187 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.298845 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8dd6df0f-e645-41a7-b974-0454616bb56e-multus-daemon-config\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh" Apr 22 19:06:34.299187 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.298868 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/538109ae-d500-4808-ad16-e32a5799d18d-host\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q" Apr 22 19:06:34.299187 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.298884 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1e9b0a71-0187-42db-855a-762dfaa227aa-system-cni-dir\") pod \"multus-additional-cni-plugins-x7gv6\" (UID: \"1e9b0a71-0187-42db-855a-762dfaa227aa\") " pod="openshift-multus/multus-additional-cni-plugins-x7gv6" Apr 22 19:06:34.299187 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.298888 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9ks2n\" (UniqueName: \"kubernetes.io/projected/2f15bc14-85c1-4370-8e8c-dfc474a5636b-kube-api-access-9ks2n\") pod \"node-resolver-2wjct\" (UID: \"2f15bc14-85c1-4370-8e8c-dfc474a5636b\") " pod="openshift-dns/node-resolver-2wjct" Apr 22 19:06:34.299187 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.298926 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-host-slash\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" Apr 22 19:06:34.299187 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.298933 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/538109ae-d500-4808-ad16-e32a5799d18d-host\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q" Apr 22 19:06:34.299187 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.298953 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-var-lib-openvswitch\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" Apr 22 19:06:34.299187 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.298963 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-host-slash\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" Apr 22 19:06:34.299187 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.298978 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/61b5731d-8883-44c4-a6de-2a90288f2d58-ovnkube-script-lib\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" Apr 22 19:06:34.299187 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299002 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-cnibin\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh" Apr 22 19:06:34.300118 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299027 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/538109ae-d500-4808-ad16-e32a5799d18d-etc-sysctl-d\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q" Apr 22 19:06:34.300118 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299050 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-log-socket\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" Apr 22 19:06:34.300118 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299074 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-host-run-ovn-kubernetes\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" Apr 22 19:06:34.300118 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299104 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-host-cni-bin\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" Apr 22 19:06:34.300118 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299128 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8ab2b075-d3d5-4d3a-848e-89344c4f11b6-serviceca\") pod \"node-ca-hdwcg\" (UID: \"8ab2b075-d3d5-4d3a-848e-89344c4f11b6\") " pod="openshift-image-registry/node-ca-hdwcg" Apr 22 19:06:34.300118 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299151 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5v4c\" (UniqueName: \"kubernetes.io/projected/8ab2b075-d3d5-4d3a-848e-89344c4f11b6-kube-api-access-m5v4c\") pod \"node-ca-hdwcg\" (UID: \"8ab2b075-d3d5-4d3a-848e-89344c4f11b6\") " pod="openshift-image-registry/node-ca-hdwcg" Apr 22 19:06:34.300118 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299107 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-cnibin\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh" Apr 22 19:06:34.300118 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299002 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-var-lib-openvswitch\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" Apr 22 19:06:34.300118 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299174 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ccd0ced6-c30a-4fe6-8fc3-356740ce7c61-agent-certs\") pod \"konnectivity-agent-xjfbh\" (UID: \"ccd0ced6-c30a-4fe6-8fc3-356740ce7c61\") " pod="kube-system/konnectivity-agent-xjfbh" Apr 22 19:06:34.300118 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299194 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8dd6df0f-e645-41a7-b974-0454616bb56e-cni-binary-copy\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh" Apr 22 19:06:34.300118 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299215 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-host-var-lib-cni-multus\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh" Apr 22 19:06:34.300118 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299243 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/538109ae-d500-4808-ad16-e32a5799d18d-etc-sysctl-conf\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q" Apr 22 19:06:34.300118 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299253 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-host-cni-bin\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" Apr 22 19:06:34.300118 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299249 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/538109ae-d500-4808-ad16-e32a5799d18d-etc-sysctl-d\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q" Apr 22 19:06:34.300118 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299270 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/20f9a637-74d1-4828-8ab2-1bd5515e85ba-host-slash\") pod \"iptables-alerter-vnq95\" (UID: \"20f9a637-74d1-4828-8ab2-1bd5515e85ba\") " pod="openshift-network-operator/iptables-alerter-vnq95" Apr 22 19:06:34.300118 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299295 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-host-var-lib-kubelet\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh" Apr 22 19:06:34.300118 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299300 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-host-var-lib-cni-multus\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh" Apr 22 19:06:34.300118 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299314 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-host-run-ovn-kubernetes\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" Apr 22 19:06:34.300960 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299321 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cq49x\" (UniqueName: \"kubernetes.io/projected/8dd6df0f-e645-41a7-b974-0454616bb56e-kube-api-access-cq49x\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh" Apr 22 19:06:34.300960 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299351 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-etc-openvswitch\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" Apr 22 19:06:34.300960 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299388 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-host-run-netns\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh" Apr 22 19:06:34.300960 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299413 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-host-var-lib-cni-bin\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh" Apr 22 19:06:34.300960 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299437 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/538109ae-d500-4808-ad16-e32a5799d18d-run\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q" Apr 22 19:06:34.300960 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299462 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/538109ae-d500-4808-ad16-e32a5799d18d-tmp\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q" Apr 22 19:06:34.300960 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299475 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8dd6df0f-e645-41a7-b974-0454616bb56e-multus-daemon-config\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh" Apr 22 19:06:34.300960 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299485 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-run-openvswitch\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" Apr 22 19:06:34.300960 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299514 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" Apr 22 19:06:34.300960 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299520 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-run-openvswitch\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" Apr 22 19:06:34.300960 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299535 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1e9b0a71-0187-42db-855a-762dfaa227aa-cnibin\") pod \"multus-additional-cni-plugins-x7gv6\" (UID: \"1e9b0a71-0187-42db-855a-762dfaa227aa\") " pod="openshift-multus/multus-additional-cni-plugins-x7gv6" Apr 22 19:06:34.300960 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299555 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-system-cni-dir\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh" Apr 22 19:06:34.300960 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299570 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/538109ae-d500-4808-ad16-e32a5799d18d-sys\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q" Apr 22 19:06:34.300960 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299575 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-host-run-netns\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh" Apr 22 19:06:34.300960 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299587 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-systemd-units\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" Apr 22 19:06:34.300960 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299605 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/61b5731d-8883-44c4-a6de-2a90288f2d58-ovnkube-config\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" Apr 22 19:06:34.300960 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299606 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-host-var-lib-cni-bin\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh" Apr 22 19:06:34.300960 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299621 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdxp9\" (UniqueName: \"kubernetes.io/projected/61b5731d-8883-44c4-a6de-2a90288f2d58-kube-api-access-jdxp9\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" Apr 22 19:06:34.301780 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299624 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/538109ae-d500-4808-ad16-e32a5799d18d-etc-sysctl-conf\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q" Apr 22 19:06:34.301780 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299638 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r87gx\" (UniqueName: \"kubernetes.io/projected/1e9b0a71-0187-42db-855a-762dfaa227aa-kube-api-access-r87gx\") pod \"multus-additional-cni-plugins-x7gv6\" (UID: \"1e9b0a71-0187-42db-855a-762dfaa227aa\") " pod="openshift-multus/multus-additional-cni-plugins-x7gv6" Apr 22 19:06:34.301780 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299649 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/538109ae-d500-4808-ad16-e32a5799d18d-run\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q" Apr 22 19:06:34.301780 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299654 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/20f9a637-74d1-4828-8ab2-1bd5515e85ba-iptables-alerter-script\") pod \"iptables-alerter-vnq95\" (UID: \"20f9a637-74d1-4828-8ab2-1bd5515e85ba\") " pod="openshift-network-operator/iptables-alerter-vnq95" Apr 22 19:06:34.301780 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299676 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/20f9a637-74d1-4828-8ab2-1bd5515e85ba-host-slash\") pod \"iptables-alerter-vnq95\" (UID: \"20f9a637-74d1-4828-8ab2-1bd5515e85ba\") " pod="openshift-network-operator/iptables-alerter-vnq95" Apr 22 19:06:34.301780 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299680 2569 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2f15bc14-85c1-4370-8e8c-dfc474a5636b-tmp-dir\") pod \"node-resolver-2wjct\" (UID: \"2f15bc14-85c1-4370-8e8c-dfc474a5636b\") " pod="openshift-dns/node-resolver-2wjct"
Apr 22 19:06:34.301780 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299717 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-etc-kubernetes\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh"
Apr 22 19:06:34.301780 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299748 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-os-release\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh"
Apr 22 19:06:34.301780 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299806 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-multus-socket-dir-parent\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh"
Apr 22 19:06:34.301780 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299834 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2f15bc14-85c1-4370-8e8c-dfc474a5636b-hosts-file\") pod \"node-resolver-2wjct\" (UID: \"2f15bc14-85c1-4370-8e8c-dfc474a5636b\") " pod="openshift-dns/node-resolver-2wjct"
Apr 22 19:06:34.301780 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299859 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-node-log\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46"
Apr 22 19:06:34.301780 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299887 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/538109ae-d500-4808-ad16-e32a5799d18d-etc-modprobe-d\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q"
Apr 22 19:06:34.301780 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299912 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/538109ae-d500-4808-ad16-e32a5799d18d-etc-sysconfig\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q"
Apr 22 19:06:34.301780 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299944 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/61b5731d-8883-44c4-a6de-2a90288f2d58-env-overrides\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46"
Apr 22 19:06:34.301780 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299967 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/61b5731d-8883-44c4-a6de-2a90288f2d58-ovn-node-metrics-cert\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46"
Apr 22 19:06:34.301780 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299977 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1e9b0a71-0187-42db-855a-762dfaa227aa-cni-binary-copy\") pod \"multus-additional-cni-plugins-x7gv6\" (UID: \"1e9b0a71-0187-42db-855a-762dfaa227aa\") " pod="openshift-multus/multus-additional-cni-plugins-x7gv6"
Apr 22 19:06:34.301780 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299992 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/538109ae-d500-4808-ad16-e32a5799d18d-etc-kubernetes\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q"
Apr 22 19:06:34.302485 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299554 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-etc-openvswitch\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46"
Apr 22 19:06:34.302485 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.300060 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-host-var-lib-kubelet\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh"
Apr 22 19:06:34.302485 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.300061 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-host-kubelet\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46"
Apr 22 19:06:34.302485 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.300057 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46"
Apr 22 19:06:34.302485 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299998 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2f15bc14-85c1-4370-8e8c-dfc474a5636b-tmp-dir\") pod \"node-resolver-2wjct\" (UID: \"2f15bc14-85c1-4370-8e8c-dfc474a5636b\") " pod="openshift-dns/node-resolver-2wjct"
Apr 22 19:06:34.302485 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.300088 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/538109ae-d500-4808-ad16-e32a5799d18d-sys\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q"
Apr 22 19:06:34.302485 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.300107 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-system-cni-dir\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh"
Apr 22 19:06:34.302485 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.300109 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ccd0ced6-c30a-4fe6-8fc3-356740ce7c61-konnectivity-ca\") pod \"konnectivity-agent-xjfbh\" (UID: \"ccd0ced6-c30a-4fe6-8fc3-356740ce7c61\") " pod="kube-system/konnectivity-agent-xjfbh"
Apr 22 19:06:34.302485 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.299689 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName:
\"kubernetes.io/configmap/8ab2b075-d3d5-4d3a-848e-89344c4f11b6-serviceca\") pod \"node-ca-hdwcg\" (UID: \"8ab2b075-d3d5-4d3a-848e-89344c4f11b6\") " pod="openshift-image-registry/node-ca-hdwcg"
Apr 22 19:06:34.302485 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.300137 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-systemd-units\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46"
Apr 22 19:06:34.302485 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.300170 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-log-socket\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46"
Apr 22 19:06:34.302485 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.300173 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/61b5731d-8883-44c4-a6de-2a90288f2d58-ovnkube-script-lib\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46"
Apr 22 19:06:34.302485 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.300203 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2f15bc14-85c1-4370-8e8c-dfc474a5636b-hosts-file\") pod \"node-resolver-2wjct\" (UID: \"2f15bc14-85c1-4370-8e8c-dfc474a5636b\") " pod="openshift-dns/node-resolver-2wjct"
Apr 22 19:06:34.302485 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.300221 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-multus-socket-dir-parent\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh"
Apr 22 19:06:34.302485 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.300223 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1e9b0a71-0187-42db-855a-762dfaa227aa-cnibin\") pod \"multus-additional-cni-plugins-x7gv6\" (UID: \"1e9b0a71-0187-42db-855a-762dfaa227aa\") " pod="openshift-multus/multus-additional-cni-plugins-x7gv6"
Apr 22 19:06:34.302485 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.300258 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/538109ae-d500-4808-ad16-e32a5799d18d-etc-modprobe-d\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q"
Apr 22 19:06:34.302485 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.300267 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/538109ae-d500-4808-ad16-e32a5799d18d-etc-sysconfig\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q"
Apr 22 19:06:34.302485 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.300301 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-os-release\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh"
Apr 22 19:06:34.303195 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.300320 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-node-log\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46"
Apr 22 19:06:34.303195 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.300668 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/61b5731d-8883-44c4-a6de-2a90288f2d58-env-overrides\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46"
Apr 22 19:06:34.303195 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.300714 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/61b5731d-8883-44c4-a6de-2a90288f2d58-host-kubelet\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46"
Apr 22 19:06:34.303195 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.300779 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-etc-kubernetes\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh"
Apr 22 19:06:34.303195 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.300828 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/538109ae-d500-4808-ad16-e32a5799d18d-etc-kubernetes\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q"
Apr 22 19:06:34.303195 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.300857 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ccd0ced6-c30a-4fe6-8fc3-356740ce7c61-konnectivity-ca\") pod \"konnectivity-agent-xjfbh\" (UID: \"ccd0ced6-c30a-4fe6-8fc3-356740ce7c61\") " pod="kube-system/konnectivity-agent-xjfbh"
Apr 22 19:06:34.303195 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.301133 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/61b5731d-8883-44c4-a6de-2a90288f2d58-ovnkube-config\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46"
Apr 22 19:06:34.303195 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.301303 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/20f9a637-74d1-4828-8ab2-1bd5515e85ba-iptables-alerter-script\") pod \"iptables-alerter-vnq95\" (UID: \"20f9a637-74d1-4828-8ab2-1bd5515e85ba\") " pod="openshift-network-operator/iptables-alerter-vnq95"
Apr 22 19:06:34.303195 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.301701 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8dd6df0f-e645-41a7-b974-0454616bb56e-multus-cni-dir\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh"
Apr 22 19:06:34.303195 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.302363 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ccd0ced6-c30a-4fe6-8fc3-356740ce7c61-agent-certs\") pod \"konnectivity-agent-xjfbh\" (UID: \"ccd0ced6-c30a-4fe6-8fc3-356740ce7c61\") " pod="kube-system/konnectivity-agent-xjfbh"
Apr 22 19:06:34.303195 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.302577 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName:
\"kubernetes.io/empty-dir/538109ae-d500-4808-ad16-e32a5799d18d-tmp\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q"
Apr 22 19:06:34.303195 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.302708 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/61b5731d-8883-44c4-a6de-2a90288f2d58-ovn-node-metrics-cert\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46"
Apr 22 19:06:34.303649 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.303228 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/538109ae-d500-4808-ad16-e32a5799d18d-etc-tuned\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q"
Apr 22 19:06:34.304728 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:34.304703 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:06:34.304728 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:34.304729 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:06:34.304917 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:34.304742 2569 projected.go:194] Error preparing data for projected volume kube-api-access-q7972 for pod openshift-network-diagnostics/network-check-target-bc5ws: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:06:34.304917 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:34.304827 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bfdf183b-244f-461d-96f1-6416cbc8cf68-kube-api-access-q7972 podName:bfdf183b-244f-461d-96f1-6416cbc8cf68 nodeName:}" failed. No retries permitted until 2026-04-22 19:06:34.804808089 +0000 UTC m=+3.170222903 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-q7972" (UniqueName: "kubernetes.io/projected/bfdf183b-244f-461d-96f1-6416cbc8cf68-kube-api-access-q7972") pod "network-check-target-bc5ws" (UID: "bfdf183b-244f-461d-96f1-6416cbc8cf68") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:06:34.307149 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.307124 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qg4r\" (UniqueName: \"kubernetes.io/projected/538109ae-d500-4808-ad16-e32a5799d18d-kube-api-access-9qg4r\") pod \"tuned-rbn8q\" (UID: \"538109ae-d500-4808-ad16-e32a5799d18d\") " pod="openshift-cluster-node-tuning-operator/tuned-rbn8q"
Apr 22 19:06:34.307459 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.307412 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ks2n\" (UniqueName: \"kubernetes.io/projected/2f15bc14-85c1-4370-8e8c-dfc474a5636b-kube-api-access-9ks2n\") pod \"node-resolver-2wjct\" (UID: \"2f15bc14-85c1-4370-8e8c-dfc474a5636b\") " pod="openshift-dns/node-resolver-2wjct"
Apr 22 19:06:34.308680 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.308641 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p9rr\" (UniqueName: \"kubernetes.io/projected/20f9a637-74d1-4828-8ab2-1bd5515e85ba-kube-api-access-2p9rr\") pod \"iptables-alerter-vnq95\" (UID: \"20f9a637-74d1-4828-8ab2-1bd5515e85ba\") " pod="openshift-network-operator/iptables-alerter-vnq95"
Apr 22 19:06:34.308820 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.308711 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5v4c\" (UniqueName: \"kubernetes.io/projected/8ab2b075-d3d5-4d3a-848e-89344c4f11b6-kube-api-access-m5v4c\") pod \"node-ca-hdwcg\" (UID: \"8ab2b075-d3d5-4d3a-848e-89344c4f11b6\") " pod="openshift-image-registry/node-ca-hdwcg"
Apr 22 19:06:34.309127 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.309105 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r87gx\" (UniqueName: \"kubernetes.io/projected/1e9b0a71-0187-42db-855a-762dfaa227aa-kube-api-access-r87gx\") pod \"multus-additional-cni-plugins-x7gv6\" (UID: \"1e9b0a71-0187-42db-855a-762dfaa227aa\") " pod="openshift-multus/multus-additional-cni-plugins-x7gv6"
Apr 22 19:06:34.310729 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.310709 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq49x\" (UniqueName: \"kubernetes.io/projected/8dd6df0f-e645-41a7-b974-0454616bb56e-kube-api-access-cq49x\") pod \"multus-zgnnh\" (UID: \"8dd6df0f-e645-41a7-b974-0454616bb56e\") " pod="openshift-multus/multus-zgnnh"
Apr 22 19:06:34.311057 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.311033 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdxp9\" (UniqueName: \"kubernetes.io/projected/61b5731d-8883-44c4-a6de-2a90288f2d58-kube-api-access-jdxp9\") pod \"ovnkube-node-tfk46\" (UID: \"61b5731d-8883-44c4-a6de-2a90288f2d58\") " pod="openshift-ovn-kubernetes/ovnkube-node-tfk46"
Apr 22 19:06:34.397893 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.397812 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pnm2w"
Apr 22 19:06:34.414711 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.414676 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-vnq95"
Apr 22 19:06:34.422434 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.422408 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-xjfbh"
Apr 22 19:06:34.428138 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.428118 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zgnnh"
Apr 22 19:06:34.433722 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.433698 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2wjct"
Apr 22 19:06:34.440340 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.440316 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-rbn8q"
Apr 22 19:06:34.446920 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.446896 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-x7gv6"
Apr 22 19:06:34.453503 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.453478 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-hdwcg"
Apr 22 19:06:34.458193 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.458170 2569 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tfk46"
Apr 22 19:06:34.702275 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.702234 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46a3468d-b017-471c-a0df-a07b1c183ff4-metrics-certs\") pod \"network-metrics-daemon-2r8qp\" (UID: \"46a3468d-b017-471c-a0df-a07b1c183ff4\") " pod="openshift-multus/network-metrics-daemon-2r8qp"
Apr 22 19:06:34.702457 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:34.702399 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:06:34.702457 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:34.702455 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46a3468d-b017-471c-a0df-a07b1c183ff4-metrics-certs podName:46a3468d-b017-471c-a0df-a07b1c183ff4 nodeName:}" failed. No retries permitted until 2026-04-22 19:06:35.702440339 +0000 UTC m=+4.067855153 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46a3468d-b017-471c-a0df-a07b1c183ff4-metrics-certs") pod "network-metrics-daemon-2r8qp" (UID: "46a3468d-b017-471c-a0df-a07b1c183ff4") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:06:34.904708 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:34.904670 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q7972\" (UniqueName: \"kubernetes.io/projected/bfdf183b-244f-461d-96f1-6416cbc8cf68-kube-api-access-q7972\") pod \"network-check-target-bc5ws\" (UID: \"bfdf183b-244f-461d-96f1-6416cbc8cf68\") " pod="openshift-network-diagnostics/network-check-target-bc5ws"
Apr 22 19:06:34.904901 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:34.904853 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:06:34.904901 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:34.904876 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:06:34.904901 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:34.904888 2569 projected.go:194] Error preparing data for projected volume kube-api-access-q7972 for pod openshift-network-diagnostics/network-check-target-bc5ws: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:06:34.905036 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:34.904948 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bfdf183b-244f-461d-96f1-6416cbc8cf68-kube-api-access-q7972 podName:bfdf183b-244f-461d-96f1-6416cbc8cf68 nodeName:}" failed. No retries permitted until 2026-04-22 19:06:35.904931468 +0000 UTC m=+4.270346281 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-q7972" (UniqueName: "kubernetes.io/projected/bfdf183b-244f-461d-96f1-6416cbc8cf68-kube-api-access-q7972") pod "network-check-target-bc5ws" (UID: "bfdf183b-244f-461d-96f1-6416cbc8cf68") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:06:34.931780 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:34.931709 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61b5731d_8883_44c4_a6de_2a90288f2d58.slice/crio-c2bf1758b0cb9aff08bb9d007886e9a764c3ddac7bda2d2a30a36cb4f5f3e56b WatchSource:0}: Error finding container c2bf1758b0cb9aff08bb9d007886e9a764c3ddac7bda2d2a30a36cb4f5f3e56b: Status 404 returned error can't find the container with id c2bf1758b0cb9aff08bb9d007886e9a764c3ddac7bda2d2a30a36cb4f5f3e56b
Apr 22 19:06:34.933684 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:34.933655 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e124a62_5e48_4542_b1f3_a08b56fc7221.slice/crio-432349ca3ef453808aef254af5c88f903ae8801ad1e36772def4a93ce07012b8 WatchSource:0}: Error finding container 432349ca3ef453808aef254af5c88f903ae8801ad1e36772def4a93ce07012b8: Status 404 returned error can't find the container with id 432349ca3ef453808aef254af5c88f903ae8801ad1e36772def4a93ce07012b8
Apr 22 19:06:34.936355 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:34.935684 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod538109ae_d500_4808_ad16_e32a5799d18d.slice/crio-9025d11a2c04f762cdbf1972e2a7eb98caec125dab8b5ee034e04865851b2a09 WatchSource:0}: Error finding container 9025d11a2c04f762cdbf1972e2a7eb98caec125dab8b5ee034e04865851b2a09: Status 404 returned error can't find the container with id 9025d11a2c04f762cdbf1972e2a7eb98caec125dab8b5ee034e04865851b2a09
Apr 22 19:06:34.936715 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:34.936675 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f15bc14_85c1_4370_8e8c_dfc474a5636b.slice/crio-f6682f29901b1b8a41d23d61c477be53651a0621f3b1c48f0063e12b4ad67aa4 WatchSource:0}: Error finding container f6682f29901b1b8a41d23d61c477be53651a0621f3b1c48f0063e12b4ad67aa4: Status 404 returned error can't find the container with id f6682f29901b1b8a41d23d61c477be53651a0621f3b1c48f0063e12b4ad67aa4
Apr 22 19:06:34.940104 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:34.938426 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e9b0a71_0187_42db_855a_762dfaa227aa.slice/crio-0365da6d7f9e7e7138012b5b04b1d9d9a648ad59dce20412ccc7bfd5102f54a1 WatchSource:0}: Error finding container 0365da6d7f9e7e7138012b5b04b1d9d9a648ad59dce20412ccc7bfd5102f54a1: Status 404 returned error can't find the container with id 0365da6d7f9e7e7138012b5b04b1d9d9a648ad59dce20412ccc7bfd5102f54a1
Apr 22 19:06:34.940104 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:34.939430 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ab2b075_d3d5_4d3a_848e_89344c4f11b6.slice/crio-04b7758a264552e218710f4062f49995d364bff1a8b98ff419d1d420df75ec37 WatchSource:0}: Error finding container 04b7758a264552e218710f4062f49995d364bff1a8b98ff419d1d420df75ec37: Status 404 returned error can't find the container with id 04b7758a264552e218710f4062f49995d364bff1a8b98ff419d1d420df75ec37
Apr 22 19:06:34.940270 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:34.940172 2569 manager.go:1169] Failed to process watch event
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccd0ced6_c30a_4fe6_8fc3_356740ce7c61.slice/crio-c223e94d9021b44d862fb5e23078c9d53f233f74ad8904ce210183d86a7f86fd WatchSource:0}: Error finding container c223e94d9021b44d862fb5e23078c9d53f233f74ad8904ce210183d86a7f86fd: Status 404 returned error can't find the container with id c223e94d9021b44d862fb5e23078c9d53f233f74ad8904ce210183d86a7f86fd
Apr 22 19:06:34.942016 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:34.941448 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20f9a637_74d1_4828_8ab2_1bd5515e85ba.slice/crio-489f568f3f0a1629ed7107e5c17bf1bdc9e0c34850f42f71e7099e21d6f7d098 WatchSource:0}: Error finding container 489f568f3f0a1629ed7107e5c17bf1bdc9e0c34850f42f71e7099e21d6f7d098: Status 404 returned error can't find the container with id 489f568f3f0a1629ed7107e5c17bf1bdc9e0c34850f42f71e7099e21d6f7d098
Apr 22 19:06:34.942845 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:06:34.942475 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8dd6df0f_e645_41a7_b974_0454616bb56e.slice/crio-e3850efb070a521d1420635ff16d40a09f46371dedf772d997de04b50c21729d WatchSource:0}: Error finding container e3850efb070a521d1420635ff16d40a09f46371dedf772d997de04b50c21729d: Status 404 returned error can't find the container with id e3850efb070a521d1420635ff16d40a09f46371dedf772d997de04b50c21729d
Apr 22 19:06:35.125721 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:35.125486 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:01:33 +0000 UTC" deadline="2028-01-13 11:14:48.131768502 +0000 UTC"
Apr 22 19:06:35.125721 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:35.125717 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15136h8m13.006055998s"
Apr 22 19:06:35.219150 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:35.219028 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc5ws"
Apr 22 19:06:35.219150 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:35.219046 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2r8qp"
Apr 22 19:06:35.219150 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:35.219135 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bc5ws" podUID="bfdf183b-244f-461d-96f1-6416cbc8cf68"
Apr 22 19:06:35.219369 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:35.219249 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2r8qp" podUID="46a3468d-b017-471c-a0df-a07b1c183ff4"
Apr 22 19:06:35.226339 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:35.226312 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xjfbh" event={"ID":"ccd0ced6-c30a-4fe6-8fc3-356740ce7c61","Type":"ContainerStarted","Data":"c223e94d9021b44d862fb5e23078c9d53f233f74ad8904ce210183d86a7f86fd"}
Apr 22 19:06:35.227484 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:35.227462 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hdwcg" event={"ID":"8ab2b075-d3d5-4d3a-848e-89344c4f11b6","Type":"ContainerStarted","Data":"04b7758a264552e218710f4062f49995d364bff1a8b98ff419d1d420df75ec37"}
Apr 22 19:06:35.228507 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:35.228485 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x7gv6" event={"ID":"1e9b0a71-0187-42db-855a-762dfaa227aa","Type":"ContainerStarted","Data":"0365da6d7f9e7e7138012b5b04b1d9d9a648ad59dce20412ccc7bfd5102f54a1"}
Apr 22 19:06:35.229503 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:35.229450 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pnm2w" event={"ID":"8e124a62-5e48-4542-b1f3-a08b56fc7221","Type":"ContainerStarted","Data":"432349ca3ef453808aef254af5c88f903ae8801ad1e36772def4a93ce07012b8"}
Apr 22 19:06:35.230529 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:35.230510 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" event={"ID":"61b5731d-8883-44c4-a6de-2a90288f2d58","Type":"ContainerStarted","Data":"c2bf1758b0cb9aff08bb9d007886e9a764c3ddac7bda2d2a30a36cb4f5f3e56b"}
Apr 22 19:06:35.232066 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:35.232048 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-22.ec2.internal" event={"ID":"1b90ee820fd4186f1e6cd40d24ef3276","Type":"ContainerStarted","Data":"ff5d207713ebf4f1549b60abe41b5b3e4a9148a763254ba0ef11de21cbd48668"}
Apr 22 19:06:35.233194 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:35.233176 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zgnnh" event={"ID":"8dd6df0f-e645-41a7-b974-0454616bb56e","Type":"ContainerStarted","Data":"e3850efb070a521d1420635ff16d40a09f46371dedf772d997de04b50c21729d"}
Apr 22 19:06:35.234147 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:35.234128 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2wjct" event={"ID":"2f15bc14-85c1-4370-8e8c-dfc474a5636b","Type":"ContainerStarted","Data":"f6682f29901b1b8a41d23d61c477be53651a0621f3b1c48f0063e12b4ad67aa4"}
Apr 22 19:06:35.235808 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:35.235780 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-rbn8q" event={"ID":"538109ae-d500-4808-ad16-e32a5799d18d","Type":"ContainerStarted","Data":"9025d11a2c04f762cdbf1972e2a7eb98caec125dab8b5ee034e04865851b2a09"}
Apr 22 19:06:35.237330 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:35.237203 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-vnq95" event={"ID":"20f9a637-74d1-4828-8ab2-1bd5515e85ba","Type":"ContainerStarted","Data":"489f568f3f0a1629ed7107e5c17bf1bdc9e0c34850f42f71e7099e21d6f7d098"}
Apr 22 19:06:35.711169 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:35.710779 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46a3468d-b017-471c-a0df-a07b1c183ff4-metrics-certs\") pod \"network-metrics-daemon-2r8qp\" (UID: \"46a3468d-b017-471c-a0df-a07b1c183ff4\") " pod="openshift-multus/network-metrics-daemon-2r8qp"
Apr 22 19:06:35.711169
ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:35.710948 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:06:35.711169 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:35.711058 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46a3468d-b017-471c-a0df-a07b1c183ff4-metrics-certs podName:46a3468d-b017-471c-a0df-a07b1c183ff4 nodeName:}" failed. No retries permitted until 2026-04-22 19:06:37.710994934 +0000 UTC m=+6.076409753 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46a3468d-b017-471c-a0df-a07b1c183ff4-metrics-certs") pod "network-metrics-daemon-2r8qp" (UID: "46a3468d-b017-471c-a0df-a07b1c183ff4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:06:35.914065 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:35.913267 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q7972\" (UniqueName: \"kubernetes.io/projected/bfdf183b-244f-461d-96f1-6416cbc8cf68-kube-api-access-q7972\") pod \"network-check-target-bc5ws\" (UID: \"bfdf183b-244f-461d-96f1-6416cbc8cf68\") " pod="openshift-network-diagnostics/network-check-target-bc5ws" Apr 22 19:06:35.914065 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:35.913481 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:06:35.914065 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:35.913503 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:06:35.914065 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:35.913517 2569 projected.go:194] Error preparing data for projected volume 
kube-api-access-q7972 for pod openshift-network-diagnostics/network-check-target-bc5ws: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:06:35.914065 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:35.913579 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bfdf183b-244f-461d-96f1-6416cbc8cf68-kube-api-access-q7972 podName:bfdf183b-244f-461d-96f1-6416cbc8cf68 nodeName:}" failed. No retries permitted until 2026-04-22 19:06:37.913560595 +0000 UTC m=+6.278975423 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-q7972" (UniqueName: "kubernetes.io/projected/bfdf183b-244f-461d-96f1-6416cbc8cf68-kube-api-access-q7972") pod "network-check-target-bc5ws" (UID: "bfdf183b-244f-461d-96f1-6416cbc8cf68") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:06:36.262543 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:36.261395 2569 generic.go:358] "Generic (PLEG): container finished" podID="f546dccbfe88d958c8bad79dd015e11c" containerID="fa63ac42883217b862716005ca9ae2e6b1517450b226ea724ddfff64dc9b046a" exitCode=0 Apr 22 19:06:36.262543 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:36.262483 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal" event={"ID":"f546dccbfe88d958c8bad79dd015e11c","Type":"ContainerDied","Data":"fa63ac42883217b862716005ca9ae2e6b1517450b226ea724ddfff64dc9b046a"} Apr 22 19:06:36.278922 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:36.277912 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-22.ec2.internal" podStartSLOduration=3.277891006 podStartE2EDuration="3.277891006s" 
podCreationTimestamp="2026-04-22 19:06:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:06:35.258254666 +0000 UTC m=+3.623669501" watchObservedRunningTime="2026-04-22 19:06:36.277891006 +0000 UTC m=+4.643305842" Apr 22 19:06:37.218258 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:37.218217 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc5ws" Apr 22 19:06:37.218429 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:37.218347 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bc5ws" podUID="bfdf183b-244f-461d-96f1-6416cbc8cf68" Apr 22 19:06:37.218493 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:37.218447 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2r8qp" Apr 22 19:06:37.218556 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:37.218537 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2r8qp" podUID="46a3468d-b017-471c-a0df-a07b1c183ff4" Apr 22 19:06:37.278883 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:37.278848 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal" event={"ID":"f546dccbfe88d958c8bad79dd015e11c","Type":"ContainerStarted","Data":"bef3b990bd51835dce7347061147b74d25b2be19b92b4065e4261de4494ac488"} Apr 22 19:06:37.729538 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:37.729503 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46a3468d-b017-471c-a0df-a07b1c183ff4-metrics-certs\") pod \"network-metrics-daemon-2r8qp\" (UID: \"46a3468d-b017-471c-a0df-a07b1c183ff4\") " pod="openshift-multus/network-metrics-daemon-2r8qp" Apr 22 19:06:37.729772 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:37.729691 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:06:37.729772 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:37.729769 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46a3468d-b017-471c-a0df-a07b1c183ff4-metrics-certs podName:46a3468d-b017-471c-a0df-a07b1c183ff4 nodeName:}" failed. No retries permitted until 2026-04-22 19:06:41.729733519 +0000 UTC m=+10.095148336 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46a3468d-b017-471c-a0df-a07b1c183ff4-metrics-certs") pod "network-metrics-daemon-2r8qp" (UID: "46a3468d-b017-471c-a0df-a07b1c183ff4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:06:37.932096 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:37.931518 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q7972\" (UniqueName: \"kubernetes.io/projected/bfdf183b-244f-461d-96f1-6416cbc8cf68-kube-api-access-q7972\") pod \"network-check-target-bc5ws\" (UID: \"bfdf183b-244f-461d-96f1-6416cbc8cf68\") " pod="openshift-network-diagnostics/network-check-target-bc5ws" Apr 22 19:06:37.932096 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:37.931642 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:06:37.932096 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:37.931655 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:06:37.932096 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:37.931668 2569 projected.go:194] Error preparing data for projected volume kube-api-access-q7972 for pod openshift-network-diagnostics/network-check-target-bc5ws: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:06:37.932096 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:37.931724 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bfdf183b-244f-461d-96f1-6416cbc8cf68-kube-api-access-q7972 podName:bfdf183b-244f-461d-96f1-6416cbc8cf68 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:06:41.931706236 +0000 UTC m=+10.297121052 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-q7972" (UniqueName: "kubernetes.io/projected/bfdf183b-244f-461d-96f1-6416cbc8cf68-kube-api-access-q7972") pod "network-check-target-bc5ws" (UID: "bfdf183b-244f-461d-96f1-6416cbc8cf68") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:06:39.218940 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:39.218775 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2r8qp" Apr 22 19:06:39.218940 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:39.218912 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2r8qp" podUID="46a3468d-b017-471c-a0df-a07b1c183ff4" Apr 22 19:06:39.219478 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:39.219091 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc5ws" Apr 22 19:06:39.219478 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:39.219170 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bc5ws" podUID="bfdf183b-244f-461d-96f1-6416cbc8cf68" Apr 22 19:06:41.218927 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:41.218892 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc5ws" Apr 22 19:06:41.219296 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:41.219022 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bc5ws" podUID="bfdf183b-244f-461d-96f1-6416cbc8cf68" Apr 22 19:06:41.219338 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:41.219328 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2r8qp" Apr 22 19:06:41.219410 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:41.219397 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2r8qp" podUID="46a3468d-b017-471c-a0df-a07b1c183ff4" Apr 22 19:06:41.761296 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:41.761261 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46a3468d-b017-471c-a0df-a07b1c183ff4-metrics-certs\") pod \"network-metrics-daemon-2r8qp\" (UID: \"46a3468d-b017-471c-a0df-a07b1c183ff4\") " pod="openshift-multus/network-metrics-daemon-2r8qp" Apr 22 19:06:41.761452 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:41.761426 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:06:41.761517 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:41.761504 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46a3468d-b017-471c-a0df-a07b1c183ff4-metrics-certs podName:46a3468d-b017-471c-a0df-a07b1c183ff4 nodeName:}" failed. No retries permitted until 2026-04-22 19:06:49.761483619 +0000 UTC m=+18.126898451 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46a3468d-b017-471c-a0df-a07b1c183ff4-metrics-certs") pod "network-metrics-daemon-2r8qp" (UID: "46a3468d-b017-471c-a0df-a07b1c183ff4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:06:41.963331 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:41.963292 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q7972\" (UniqueName: \"kubernetes.io/projected/bfdf183b-244f-461d-96f1-6416cbc8cf68-kube-api-access-q7972\") pod \"network-check-target-bc5ws\" (UID: \"bfdf183b-244f-461d-96f1-6416cbc8cf68\") " pod="openshift-network-diagnostics/network-check-target-bc5ws" Apr 22 19:06:41.963521 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:41.963441 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:06:41.963521 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:41.963459 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:06:41.963521 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:41.963471 2569 projected.go:194] Error preparing data for projected volume kube-api-access-q7972 for pod openshift-network-diagnostics/network-check-target-bc5ws: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:06:41.963677 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:41.963525 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bfdf183b-244f-461d-96f1-6416cbc8cf68-kube-api-access-q7972 podName:bfdf183b-244f-461d-96f1-6416cbc8cf68 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:06:49.963507222 +0000 UTC m=+18.328922040 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-q7972" (UniqueName: "kubernetes.io/projected/bfdf183b-244f-461d-96f1-6416cbc8cf68-kube-api-access-q7972") pod "network-check-target-bc5ws" (UID: "bfdf183b-244f-461d-96f1-6416cbc8cf68") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:06:43.218618 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:43.218577 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc5ws" Apr 22 19:06:43.219117 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:43.218708 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bc5ws" podUID="bfdf183b-244f-461d-96f1-6416cbc8cf68" Apr 22 19:06:43.219117 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:43.219100 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2r8qp" Apr 22 19:06:43.219242 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:43.219187 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2r8qp" podUID="46a3468d-b017-471c-a0df-a07b1c183ff4" Apr 22 19:06:45.218351 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:45.218317 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2r8qp" Apr 22 19:06:45.218892 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:45.218317 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc5ws" Apr 22 19:06:45.218892 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:45.218468 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2r8qp" podUID="46a3468d-b017-471c-a0df-a07b1c183ff4" Apr 22 19:06:45.218892 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:45.218514 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bc5ws" podUID="bfdf183b-244f-461d-96f1-6416cbc8cf68" Apr 22 19:06:47.218785 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:47.218740 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2r8qp" Apr 22 19:06:47.219237 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:47.218740 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc5ws" Apr 22 19:06:47.219237 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:47.218921 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2r8qp" podUID="46a3468d-b017-471c-a0df-a07b1c183ff4" Apr 22 19:06:47.219237 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:47.218936 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bc5ws" podUID="bfdf183b-244f-461d-96f1-6416cbc8cf68" Apr 22 19:06:49.218334 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:49.218296 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc5ws" Apr 22 19:06:49.218334 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:49.218321 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2r8qp" Apr 22 19:06:49.218866 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:49.218405 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bc5ws" podUID="bfdf183b-244f-461d-96f1-6416cbc8cf68" Apr 22 19:06:49.218866 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:49.218551 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2r8qp" podUID="46a3468d-b017-471c-a0df-a07b1c183ff4" Apr 22 19:06:49.826281 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:49.826244 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46a3468d-b017-471c-a0df-a07b1c183ff4-metrics-certs\") pod \"network-metrics-daemon-2r8qp\" (UID: \"46a3468d-b017-471c-a0df-a07b1c183ff4\") " pod="openshift-multus/network-metrics-daemon-2r8qp" Apr 22 19:06:49.826472 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:49.826371 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:06:49.826472 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:49.826458 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46a3468d-b017-471c-a0df-a07b1c183ff4-metrics-certs podName:46a3468d-b017-471c-a0df-a07b1c183ff4 nodeName:}" failed. No retries permitted until 2026-04-22 19:07:05.826437107 +0000 UTC m=+34.191851926 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46a3468d-b017-471c-a0df-a07b1c183ff4-metrics-certs") pod "network-metrics-daemon-2r8qp" (UID: "46a3468d-b017-471c-a0df-a07b1c183ff4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:06:50.027674 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:50.027636 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q7972\" (UniqueName: \"kubernetes.io/projected/bfdf183b-244f-461d-96f1-6416cbc8cf68-kube-api-access-q7972\") pod \"network-check-target-bc5ws\" (UID: \"bfdf183b-244f-461d-96f1-6416cbc8cf68\") " pod="openshift-network-diagnostics/network-check-target-bc5ws" Apr 22 19:06:50.027862 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:50.027820 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:06:50.027862 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:50.027835 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:06:50.027862 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:50.027848 2569 projected.go:194] Error preparing data for projected volume kube-api-access-q7972 for pod openshift-network-diagnostics/network-check-target-bc5ws: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:06:50.028028 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:50.027914 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bfdf183b-244f-461d-96f1-6416cbc8cf68-kube-api-access-q7972 podName:bfdf183b-244f-461d-96f1-6416cbc8cf68 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:07:06.027899641 +0000 UTC m=+34.393314456 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-q7972" (UniqueName: "kubernetes.io/projected/bfdf183b-244f-461d-96f1-6416cbc8cf68-kube-api-access-q7972") pod "network-check-target-bc5ws" (UID: "bfdf183b-244f-461d-96f1-6416cbc8cf68") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:06:51.218882 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:51.218849 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc5ws" Apr 22 19:06:51.219288 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:51.218849 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2r8qp" Apr 22 19:06:51.219288 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:51.218972 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bc5ws" podUID="bfdf183b-244f-461d-96f1-6416cbc8cf68" Apr 22 19:06:51.219288 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:51.219088 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2r8qp" podUID="46a3468d-b017-471c-a0df-a07b1c183ff4" Apr 22 19:06:53.218873 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:53.218527 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2r8qp" Apr 22 19:06:53.219489 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:53.218527 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc5ws" Apr 22 19:06:53.219489 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:53.218970 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2r8qp" podUID="46a3468d-b017-471c-a0df-a07b1c183ff4" Apr 22 19:06:53.219489 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:53.219067 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bc5ws" podUID="bfdf183b-244f-461d-96f1-6416cbc8cf68" Apr 22 19:06:53.308739 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:53.308654 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xjfbh" event={"ID":"ccd0ced6-c30a-4fe6-8fc3-356740ce7c61","Type":"ContainerStarted","Data":"257b725225bcac90bd782f9bf10642bc779b046a4b117931d59e683e87f360a9"} Apr 22 19:06:53.310044 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:53.310020 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hdwcg" event={"ID":"8ab2b075-d3d5-4d3a-848e-89344c4f11b6","Type":"ContainerStarted","Data":"6f9ea60bf262ccc13946df273822849e3b78e11746561ef265263b653056ed0a"} Apr 22 19:06:53.311393 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:53.311366 2569 generic.go:358] "Generic (PLEG): container finished" podID="1e9b0a71-0187-42db-855a-762dfaa227aa" containerID="0c71c5efc259b4a5d6db7a9f232c26992bff5be0f03ebdfc2e0e7c4ffa140c71" exitCode=0 Apr 22 19:06:53.311467 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:53.311448 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x7gv6" event={"ID":"1e9b0a71-0187-42db-855a-762dfaa227aa","Type":"ContainerDied","Data":"0c71c5efc259b4a5d6db7a9f232c26992bff5be0f03ebdfc2e0e7c4ffa140c71"} Apr 22 19:06:53.312863 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:53.312807 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pnm2w" event={"ID":"8e124a62-5e48-4542-b1f3-a08b56fc7221","Type":"ContainerStarted","Data":"14623b67e8dc42a1a275559a367e1ab62e8d19a2feda8fba58178d8896550871"} Apr 22 19:06:53.315344 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:53.315320 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfk46_61b5731d-8883-44c4-a6de-2a90288f2d58/ovn-acl-logging/0.log" Apr 
22 19:06:53.315745 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:53.315722 2569 generic.go:358] "Generic (PLEG): container finished" podID="61b5731d-8883-44c4-a6de-2a90288f2d58" containerID="e09c1025690799b39727d0efc211e417c1b55699b1fca9bff100c0a96d2d1b11" exitCode=1 Apr 22 19:06:53.315859 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:53.315809 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" event={"ID":"61b5731d-8883-44c4-a6de-2a90288f2d58","Type":"ContainerStarted","Data":"5fb7882fd51377023536fefee3e13833fdd504d055167e271ea0bf09970a8212"} Apr 22 19:06:53.315859 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:53.315839 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" event={"ID":"61b5731d-8883-44c4-a6de-2a90288f2d58","Type":"ContainerStarted","Data":"ef2722b6afd4f033756340cd704be2b7875f8e9245495e7a48bee1c0675c5026"} Apr 22 19:06:53.315859 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:53.315853 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" event={"ID":"61b5731d-8883-44c4-a6de-2a90288f2d58","Type":"ContainerStarted","Data":"1dfedb1269a649ddbf074d8f75b76bc90d12c04c74529a3fd37ec164c860814b"} Apr 22 19:06:53.315972 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:53.315878 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" event={"ID":"61b5731d-8883-44c4-a6de-2a90288f2d58","Type":"ContainerStarted","Data":"3daa779ff1425328e4c2cf9803d1b41df9e5ab887024823e53c0302bdcdd8e82"} Apr 22 19:06:53.315972 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:53.315891 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" event={"ID":"61b5731d-8883-44c4-a6de-2a90288f2d58","Type":"ContainerDied","Data":"e09c1025690799b39727d0efc211e417c1b55699b1fca9bff100c0a96d2d1b11"} Apr 22 19:06:53.315972 ip-10-0-134-22 
kubenswrapper[2569]: I0422 19:06:53.315905 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" event={"ID":"61b5731d-8883-44c4-a6de-2a90288f2d58","Type":"ContainerStarted","Data":"1cb6a6f6df61f1fb2cda6c46a23e02f7acbd475cd603d35c290a04c9bd4177a8"} Apr 22 19:06:53.317179 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:53.317156 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zgnnh" event={"ID":"8dd6df0f-e645-41a7-b974-0454616bb56e","Type":"ContainerStarted","Data":"331d3541bc112f786347fa016777ee02aa4e88bdffef001b20ede12b90beb11a"} Apr 22 19:06:53.318648 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:53.318628 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2wjct" event={"ID":"2f15bc14-85c1-4370-8e8c-dfc474a5636b","Type":"ContainerStarted","Data":"b2a7719e390f674e4358867c0ecadd17d646465ca6785522bc5ff8abacf61e88"} Apr 22 19:06:53.320144 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:53.320121 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-rbn8q" event={"ID":"538109ae-d500-4808-ad16-e32a5799d18d","Type":"ContainerStarted","Data":"f698574a17473569132d29c134401cde7abbb3ca98e3df40a39e59970b3cbc68"} Apr 22 19:06:53.323449 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:53.323405 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-xjfbh" podStartSLOduration=11.978652255 podStartE2EDuration="21.323394942s" podCreationTimestamp="2026-04-22 19:06:32 +0000 UTC" firstStartedPulling="2026-04-22 19:06:34.942330268 +0000 UTC m=+3.307745084" lastFinishedPulling="2026-04-22 19:06:44.287072951 +0000 UTC m=+12.652487771" observedRunningTime="2026-04-22 19:06:53.322912858 +0000 UTC m=+21.688327691" watchObservedRunningTime="2026-04-22 19:06:53.323394942 +0000 UTC m=+21.688809777" Apr 22 19:06:53.323655 ip-10-0-134-22 kubenswrapper[2569]: I0422 
19:06:53.323635 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-22.ec2.internal" podStartSLOduration=20.32362916 podStartE2EDuration="20.32362916s" podCreationTimestamp="2026-04-22 19:06:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:06:37.295019929 +0000 UTC m=+5.660434766" watchObservedRunningTime="2026-04-22 19:06:53.32362916 +0000 UTC m=+21.689043998" Apr 22 19:06:53.340094 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:53.340054 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-zgnnh" podStartSLOduration=3.821430527 podStartE2EDuration="21.340041815s" podCreationTimestamp="2026-04-22 19:06:32 +0000 UTC" firstStartedPulling="2026-04-22 19:06:34.944800018 +0000 UTC m=+3.310214832" lastFinishedPulling="2026-04-22 19:06:52.463411297 +0000 UTC m=+20.828826120" observedRunningTime="2026-04-22 19:06:53.33953855 +0000 UTC m=+21.704953390" watchObservedRunningTime="2026-04-22 19:06:53.340041815 +0000 UTC m=+21.705456652" Apr 22 19:06:53.353373 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:53.353333 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-2wjct" podStartSLOduration=3.842159757 podStartE2EDuration="21.353319942s" podCreationTimestamp="2026-04-22 19:06:32 +0000 UTC" firstStartedPulling="2026-04-22 19:06:34.938970428 +0000 UTC m=+3.304385250" lastFinishedPulling="2026-04-22 19:06:52.450130605 +0000 UTC m=+20.815545435" observedRunningTime="2026-04-22 19:06:53.353073083 +0000 UTC m=+21.718487919" watchObservedRunningTime="2026-04-22 19:06:53.353319942 +0000 UTC m=+21.718734779" Apr 22 19:06:53.370893 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:53.370860 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-node-tuning-operator/tuned-rbn8q" podStartSLOduration=3.855489327 podStartE2EDuration="21.370850113s" podCreationTimestamp="2026-04-22 19:06:32 +0000 UTC" firstStartedPulling="2026-04-22 19:06:34.936874386 +0000 UTC m=+3.302289204" lastFinishedPulling="2026-04-22 19:06:52.452235163 +0000 UTC m=+20.817649990" observedRunningTime="2026-04-22 19:06:53.370361197 +0000 UTC m=+21.735776036" watchObservedRunningTime="2026-04-22 19:06:53.370850113 +0000 UTC m=+21.736264983" Apr 22 19:06:53.407131 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:53.407080 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-hdwcg" podStartSLOduration=3.9416321500000002 podStartE2EDuration="21.407067153s" podCreationTimestamp="2026-04-22 19:06:32 +0000 UTC" firstStartedPulling="2026-04-22 19:06:34.941660441 +0000 UTC m=+3.307075261" lastFinishedPulling="2026-04-22 19:06:52.407095436 +0000 UTC m=+20.772510264" observedRunningTime="2026-04-22 19:06:53.406776235 +0000 UTC m=+21.772191070" watchObservedRunningTime="2026-04-22 19:06:53.407067153 +0000 UTC m=+21.772481988" Apr 22 19:06:53.697296 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:53.697270 2569 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 19:06:54.162960 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:54.162814 2569 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T19:06:53.697290785Z","UUID":"25b34860-450e-4362-ba8e-5a1e774abf02","Handler":null,"Name":"","Endpoint":""} Apr 22 19:06:54.164915 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:54.164892 2569 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 
1.0.0 Apr 22 19:06:54.164915 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:54.164921 2569 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 19:06:54.323694 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:54.323656 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-vnq95" event={"ID":"20f9a637-74d1-4828-8ab2-1bd5515e85ba","Type":"ContainerStarted","Data":"a1f0cf697c4b2202d150ae039b2d388070146dff51fddbeb316dbfeb3bf5ed5f"} Apr 22 19:06:54.330937 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:54.330895 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pnm2w" event={"ID":"8e124a62-5e48-4542-b1f3-a08b56fc7221","Type":"ContainerStarted","Data":"7c90dd7f92f312e141dd604a830ccb2f7c1ba36454586f342da669c16cea665c"} Apr 22 19:06:54.344640 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:54.344598 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-vnq95" podStartSLOduration=4.838922889 podStartE2EDuration="22.344585834s" podCreationTimestamp="2026-04-22 19:06:32 +0000 UTC" firstStartedPulling="2026-04-22 19:06:34.944847495 +0000 UTC m=+3.310262315" lastFinishedPulling="2026-04-22 19:06:52.450510445 +0000 UTC m=+20.815925260" observedRunningTime="2026-04-22 19:06:54.344079217 +0000 UTC m=+22.709494054" watchObservedRunningTime="2026-04-22 19:06:54.344585834 +0000 UTC m=+22.710000681" Apr 22 19:06:55.219186 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:55.219155 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc5ws" Apr 22 19:06:55.219360 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:55.219266 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bc5ws" podUID="bfdf183b-244f-461d-96f1-6416cbc8cf68" Apr 22 19:06:55.219531 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:55.219155 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2r8qp" Apr 22 19:06:55.219644 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:55.219619 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2r8qp" podUID="46a3468d-b017-471c-a0df-a07b1c183ff4" Apr 22 19:06:55.334391 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:55.334357 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pnm2w" event={"ID":"8e124a62-5e48-4542-b1f3-a08b56fc7221","Type":"ContainerStarted","Data":"619d0ae099fd9b6ff78d39daa8ad1d8869512a75af0a5a995f9af46c7f439735"} Apr 22 19:06:55.337379 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:55.337357 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfk46_61b5731d-8883-44c4-a6de-2a90288f2d58/ovn-acl-logging/0.log" Apr 22 19:06:55.338171 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:55.338148 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" event={"ID":"61b5731d-8883-44c4-a6de-2a90288f2d58","Type":"ContainerStarted","Data":"e8a35c0d8b4ba03c5fc2eac5d57d977f478d223b3ff6154f21132ac4963c9f0e"} Apr 22 19:06:55.350187 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:55.350109 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-pnm2w" podStartSLOduration=3.701352774 podStartE2EDuration="23.350092069s" podCreationTimestamp="2026-04-22 19:06:32 +0000 UTC" firstStartedPulling="2026-04-22 19:06:34.93578538 +0000 UTC m=+3.301200196" lastFinishedPulling="2026-04-22 19:06:54.584524675 +0000 UTC m=+22.949939491" observedRunningTime="2026-04-22 19:06:55.35006208 +0000 UTC m=+23.715476915" watchObservedRunningTime="2026-04-22 19:06:55.350092069 +0000 UTC m=+23.715506901" Apr 22 19:06:57.143227 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:57.143027 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-xjfbh" Apr 22 19:06:57.143722 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:57.143709 2569 kubelet.go:2658] 
"SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-xjfbh" Apr 22 19:06:57.218937 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:57.218901 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc5ws" Apr 22 19:06:57.219093 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:57.219035 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bc5ws" podUID="bfdf183b-244f-461d-96f1-6416cbc8cf68" Apr 22 19:06:57.219337 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:57.218911 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2r8qp" Apr 22 19:06:57.219337 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:57.219297 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2r8qp" podUID="46a3468d-b017-471c-a0df-a07b1c183ff4" Apr 22 19:06:57.340728 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:57.340696 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-xjfbh" Apr 22 19:06:57.341191 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:57.341170 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-xjfbh" Apr 22 19:06:58.344020 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:58.343824 2569 generic.go:358] "Generic (PLEG): container finished" podID="1e9b0a71-0187-42db-855a-762dfaa227aa" containerID="aedb65e8f7487052217046e6feb64648b0d1423c7bd26c6a57810028ba670994" exitCode=0 Apr 22 19:06:58.344733 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:58.343916 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x7gv6" event={"ID":"1e9b0a71-0187-42db-855a-762dfaa227aa","Type":"ContainerDied","Data":"aedb65e8f7487052217046e6feb64648b0d1423c7bd26c6a57810028ba670994"} Apr 22 19:06:58.347135 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:58.347114 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfk46_61b5731d-8883-44c4-a6de-2a90288f2d58/ovn-acl-logging/0.log" Apr 22 19:06:58.347514 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:58.347486 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" event={"ID":"61b5731d-8883-44c4-a6de-2a90288f2d58","Type":"ContainerStarted","Data":"6ed43101af66156a9a66482a25b6f541d755f5281730915da916fcd9f44a82f4"} Apr 22 19:06:58.347745 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:58.347728 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" Apr 22 19:06:58.347901 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:58.347776 2569 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" Apr 22 19:06:58.347901 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:58.347789 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" Apr 22 19:06:58.347966 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:58.347913 2569 scope.go:117] "RemoveContainer" containerID="e09c1025690799b39727d0efc211e417c1b55699b1fca9bff100c0a96d2d1b11" Apr 22 19:06:58.363262 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:58.363242 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" Apr 22 19:06:58.364463 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:58.364445 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" Apr 22 19:06:59.218921 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:59.218891 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2r8qp" Apr 22 19:06:59.219080 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:59.218997 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2r8qp" podUID="46a3468d-b017-471c-a0df-a07b1c183ff4" Apr 22 19:06:59.219080 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:59.219057 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc5ws" Apr 22 19:06:59.219194 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:59.219161 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bc5ws" podUID="bfdf183b-244f-461d-96f1-6416cbc8cf68" Apr 22 19:06:59.353202 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:59.353176 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfk46_61b5731d-8883-44c4-a6de-2a90288f2d58/ovn-acl-logging/0.log" Apr 22 19:06:59.353614 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:59.353588 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" event={"ID":"61b5731d-8883-44c4-a6de-2a90288f2d58","Type":"ContainerStarted","Data":"1e3a8530a10326ea55935f6b9ff45745bb972aae40763e785a238003528d01fa"} Apr 22 19:06:59.385339 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:59.385295 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" podStartSLOduration=9.796892247 podStartE2EDuration="27.385280019s" podCreationTimestamp="2026-04-22 19:06:32 +0000 UTC" firstStartedPulling="2026-04-22 19:06:34.933537821 +0000 UTC m=+3.298952638" lastFinishedPulling="2026-04-22 19:06:52.521925593 +0000 UTC m=+20.887340410" observedRunningTime="2026-04-22 19:06:59.383289885 +0000 UTC m=+27.748704744" watchObservedRunningTime="2026-04-22 19:06:59.385280019 +0000 UTC m=+27.750694854" Apr 22 19:06:59.606577 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:59.606539 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-network-diagnostics/network-check-target-bc5ws"] Apr 22 19:06:59.606726 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:59.606683 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc5ws" Apr 22 19:06:59.606850 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:59.606812 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bc5ws" podUID="bfdf183b-244f-461d-96f1-6416cbc8cf68" Apr 22 19:06:59.609418 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:59.609365 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2r8qp"] Apr 22 19:06:59.609523 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:06:59.609510 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2r8qp" Apr 22 19:06:59.609652 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:06:59.609625 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2r8qp" podUID="46a3468d-b017-471c-a0df-a07b1c183ff4" Apr 22 19:07:00.356388 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:00.356354 2569 generic.go:358] "Generic (PLEG): container finished" podID="1e9b0a71-0187-42db-855a-762dfaa227aa" containerID="495babc793f1cd35107f4fbb56329cb1baea1733e6d2dfc8cb3bcdcd57c6e262" exitCode=0 Apr 22 19:07:00.356795 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:00.356438 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x7gv6" event={"ID":"1e9b0a71-0187-42db-855a-762dfaa227aa","Type":"ContainerDied","Data":"495babc793f1cd35107f4fbb56329cb1baea1733e6d2dfc8cb3bcdcd57c6e262"} Apr 22 19:07:01.219138 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:01.218963 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2r8qp" Apr 22 19:07:01.219270 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:01.218970 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc5ws" Apr 22 19:07:01.219270 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:01.219231 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2r8qp" podUID="46a3468d-b017-471c-a0df-a07b1c183ff4" Apr 22 19:07:01.219350 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:01.219309 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bc5ws" podUID="bfdf183b-244f-461d-96f1-6416cbc8cf68" Apr 22 19:07:02.361327 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:02.361230 2569 generic.go:358] "Generic (PLEG): container finished" podID="1e9b0a71-0187-42db-855a-762dfaa227aa" containerID="f61b6fdaf5082cc972874d4821a3997f7a35a0298af369c799403101a49808b1" exitCode=0 Apr 22 19:07:02.361327 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:02.361288 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x7gv6" event={"ID":"1e9b0a71-0187-42db-855a-762dfaa227aa","Type":"ContainerDied","Data":"f61b6fdaf5082cc972874d4821a3997f7a35a0298af369c799403101a49808b1"} Apr 22 19:07:03.218275 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:03.218237 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc5ws" Apr 22 19:07:03.218444 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:03.218249 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2r8qp" Apr 22 19:07:03.218444 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:03.218377 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bc5ws" podUID="bfdf183b-244f-461d-96f1-6416cbc8cf68" Apr 22 19:07:03.218533 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:03.218458 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2r8qp" podUID="46a3468d-b017-471c-a0df-a07b1c183ff4" Apr 22 19:07:05.218830 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:05.218797 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2r8qp" Apr 22 19:07:05.219386 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:05.218844 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc5ws" Apr 22 19:07:05.219386 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:05.218954 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2r8qp" podUID="46a3468d-b017-471c-a0df-a07b1c183ff4" Apr 22 19:07:05.219386 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:05.219052 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bc5ws" podUID="bfdf183b-244f-461d-96f1-6416cbc8cf68" Apr 22 19:07:05.390786 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:05.390693 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-22.ec2.internal" event="NodeReady" Apr 22 19:07:05.390973 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:05.390854 2569 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 19:07:05.434471 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:05.434437 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-lw7xs"] Apr 22 19:07:05.454668 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:05.454644 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-b5ggr"] Apr 22 19:07:05.454820 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:05.454795 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lw7xs" Apr 22 19:07:05.457323 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:05.457081 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 19:07:05.457323 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:05.457113 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 19:07:05.457323 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:05.457177 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-2cl86\"" Apr 22 19:07:05.476784 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:05.476742 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lw7xs"] Apr 22 19:07:05.476784 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:05.476786 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-b5ggr"] 
Apr 22 19:07:05.476974 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:05.476901 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b5ggr" Apr 22 19:07:05.479293 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:05.479262 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 19:07:05.479398 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:05.479296 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 19:07:05.479670 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:05.479647 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-lr5f4\"" Apr 22 19:07:05.479670 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:05.479660 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 19:07:05.546897 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:05.546865 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/86f8ea02-993b-4c12-b611-355d4b6cd91c-tmp-dir\") pod \"dns-default-lw7xs\" (UID: \"86f8ea02-993b-4c12-b611-355d4b6cd91c\") " pod="openshift-dns/dns-default-lw7xs" Apr 22 19:07:05.547060 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:05.546916 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86f8ea02-993b-4c12-b611-355d4b6cd91c-metrics-tls\") pod \"dns-default-lw7xs\" (UID: \"86f8ea02-993b-4c12-b611-355d4b6cd91c\") " pod="openshift-dns/dns-default-lw7xs" Apr 22 19:07:05.547060 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:05.546947 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86f8ea02-993b-4c12-b611-355d4b6cd91c-config-volume\") pod \"dns-default-lw7xs\" (UID: \"86f8ea02-993b-4c12-b611-355d4b6cd91c\") " pod="openshift-dns/dns-default-lw7xs" Apr 22 19:07:05.547060 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:05.547046 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq4vf\" (UniqueName: \"kubernetes.io/projected/86f8ea02-993b-4c12-b611-355d4b6cd91c-kube-api-access-qq4vf\") pod \"dns-default-lw7xs\" (UID: \"86f8ea02-993b-4c12-b611-355d4b6cd91c\") " pod="openshift-dns/dns-default-lw7xs" Apr 22 19:07:05.648288 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:05.648208 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qq4vf\" (UniqueName: \"kubernetes.io/projected/86f8ea02-993b-4c12-b611-355d4b6cd91c-kube-api-access-qq4vf\") pod \"dns-default-lw7xs\" (UID: \"86f8ea02-993b-4c12-b611-355d4b6cd91c\") " pod="openshift-dns/dns-default-lw7xs" Apr 22 19:07:05.648288 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:05.648264 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/86f8ea02-993b-4c12-b611-355d4b6cd91c-tmp-dir\") pod \"dns-default-lw7xs\" (UID: \"86f8ea02-993b-4c12-b611-355d4b6cd91c\") " pod="openshift-dns/dns-default-lw7xs" Apr 22 19:07:05.648533 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:05.648292 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjwrh\" (UniqueName: \"kubernetes.io/projected/21e125c2-4036-4304-91d4-c0370710d4af-kube-api-access-mjwrh\") pod \"ingress-canary-b5ggr\" (UID: \"21e125c2-4036-4304-91d4-c0370710d4af\") " pod="openshift-ingress-canary/ingress-canary-b5ggr" Apr 22 19:07:05.648533 ip-10-0-134-22 kubenswrapper[2569]: I0422 
19:07:05.648335 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86f8ea02-993b-4c12-b611-355d4b6cd91c-metrics-tls\") pod \"dns-default-lw7xs\" (UID: \"86f8ea02-993b-4c12-b611-355d4b6cd91c\") " pod="openshift-dns/dns-default-lw7xs" Apr 22 19:07:05.648533 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:05.648364 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86f8ea02-993b-4c12-b611-355d4b6cd91c-config-volume\") pod \"dns-default-lw7xs\" (UID: \"86f8ea02-993b-4c12-b611-355d4b6cd91c\") " pod="openshift-dns/dns-default-lw7xs" Apr 22 19:07:05.648533 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:05.648383 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21e125c2-4036-4304-91d4-c0370710d4af-cert\") pod \"ingress-canary-b5ggr\" (UID: \"21e125c2-4036-4304-91d4-c0370710d4af\") " pod="openshift-ingress-canary/ingress-canary-b5ggr" Apr 22 19:07:05.648533 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:05.648477 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:07:05.648742 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:05.648542 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86f8ea02-993b-4c12-b611-355d4b6cd91c-metrics-tls podName:86f8ea02-993b-4c12-b611-355d4b6cd91c nodeName:}" failed. No retries permitted until 2026-04-22 19:07:06.148520926 +0000 UTC m=+34.513935740 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86f8ea02-993b-4c12-b611-355d4b6cd91c-metrics-tls") pod "dns-default-lw7xs" (UID: "86f8ea02-993b-4c12-b611-355d4b6cd91c") : secret "dns-default-metrics-tls" not found Apr 22 19:07:05.648742 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:05.648705 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/86f8ea02-993b-4c12-b611-355d4b6cd91c-tmp-dir\") pod \"dns-default-lw7xs\" (UID: \"86f8ea02-993b-4c12-b611-355d4b6cd91c\") " pod="openshift-dns/dns-default-lw7xs" Apr 22 19:07:05.649011 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:05.648991 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86f8ea02-993b-4c12-b611-355d4b6cd91c-config-volume\") pod \"dns-default-lw7xs\" (UID: \"86f8ea02-993b-4c12-b611-355d4b6cd91c\") " pod="openshift-dns/dns-default-lw7xs" Apr 22 19:07:05.661883 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:05.661856 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq4vf\" (UniqueName: \"kubernetes.io/projected/86f8ea02-993b-4c12-b611-355d4b6cd91c-kube-api-access-qq4vf\") pod \"dns-default-lw7xs\" (UID: \"86f8ea02-993b-4c12-b611-355d4b6cd91c\") " pod="openshift-dns/dns-default-lw7xs" Apr 22 19:07:05.749062 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:05.749032 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21e125c2-4036-4304-91d4-c0370710d4af-cert\") pod \"ingress-canary-b5ggr\" (UID: \"21e125c2-4036-4304-91d4-c0370710d4af\") " pod="openshift-ingress-canary/ingress-canary-b5ggr" Apr 22 19:07:05.749220 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:05.749098 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mjwrh\" (UniqueName: 
\"kubernetes.io/projected/21e125c2-4036-4304-91d4-c0370710d4af-kube-api-access-mjwrh\") pod \"ingress-canary-b5ggr\" (UID: \"21e125c2-4036-4304-91d4-c0370710d4af\") " pod="openshift-ingress-canary/ingress-canary-b5ggr" Apr 22 19:07:05.749220 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:05.749196 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:07:05.749318 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:05.749285 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21e125c2-4036-4304-91d4-c0370710d4af-cert podName:21e125c2-4036-4304-91d4-c0370710d4af nodeName:}" failed. No retries permitted until 2026-04-22 19:07:06.249257285 +0000 UTC m=+34.614672103 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/21e125c2-4036-4304-91d4-c0370710d4af-cert") pod "ingress-canary-b5ggr" (UID: "21e125c2-4036-4304-91d4-c0370710d4af") : secret "canary-serving-cert" not found Apr 22 19:07:05.758221 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:05.758196 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjwrh\" (UniqueName: \"kubernetes.io/projected/21e125c2-4036-4304-91d4-c0370710d4af-kube-api-access-mjwrh\") pod \"ingress-canary-b5ggr\" (UID: \"21e125c2-4036-4304-91d4-c0370710d4af\") " pod="openshift-ingress-canary/ingress-canary-b5ggr" Apr 22 19:07:05.849419 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:05.849392 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46a3468d-b017-471c-a0df-a07b1c183ff4-metrics-certs\") pod \"network-metrics-daemon-2r8qp\" (UID: \"46a3468d-b017-471c-a0df-a07b1c183ff4\") " pod="openshift-multus/network-metrics-daemon-2r8qp" Apr 22 19:07:05.849529 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:05.849496 2569 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:07:05.849567 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:05.849542 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46a3468d-b017-471c-a0df-a07b1c183ff4-metrics-certs podName:46a3468d-b017-471c-a0df-a07b1c183ff4 nodeName:}" failed. No retries permitted until 2026-04-22 19:07:37.849530412 +0000 UTC m=+66.214945226 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46a3468d-b017-471c-a0df-a07b1c183ff4-metrics-certs") pod "network-metrics-daemon-2r8qp" (UID: "46a3468d-b017-471c-a0df-a07b1c183ff4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:07:06.050599 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:06.050565 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q7972\" (UniqueName: \"kubernetes.io/projected/bfdf183b-244f-461d-96f1-6416cbc8cf68-kube-api-access-q7972\") pod \"network-check-target-bc5ws\" (UID: \"bfdf183b-244f-461d-96f1-6416cbc8cf68\") " pod="openshift-network-diagnostics/network-check-target-bc5ws" Apr 22 19:07:06.050800 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:06.050744 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:07:06.050800 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:06.050787 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:07:06.050800 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:06.050797 2569 projected.go:194] Error preparing data for projected volume kube-api-access-q7972 for pod openshift-network-diagnostics/network-check-target-bc5ws: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:07:06.050931 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:06.050864 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bfdf183b-244f-461d-96f1-6416cbc8cf68-kube-api-access-q7972 podName:bfdf183b-244f-461d-96f1-6416cbc8cf68 nodeName:}" failed. No retries permitted until 2026-04-22 19:07:38.050847429 +0000 UTC m=+66.416262267 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-q7972" (UniqueName: "kubernetes.io/projected/bfdf183b-244f-461d-96f1-6416cbc8cf68-kube-api-access-q7972") pod "network-check-target-bc5ws" (UID: "bfdf183b-244f-461d-96f1-6416cbc8cf68") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:07:06.150935 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:06.150906 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86f8ea02-993b-4c12-b611-355d4b6cd91c-metrics-tls\") pod \"dns-default-lw7xs\" (UID: \"86f8ea02-993b-4c12-b611-355d4b6cd91c\") " pod="openshift-dns/dns-default-lw7xs" Apr 22 19:07:06.151114 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:06.151028 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:07:06.151114 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:06.151082 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86f8ea02-993b-4c12-b611-355d4b6cd91c-metrics-tls podName:86f8ea02-993b-4c12-b611-355d4b6cd91c nodeName:}" failed. No retries permitted until 2026-04-22 19:07:07.151067411 +0000 UTC m=+35.516482224 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86f8ea02-993b-4c12-b611-355d4b6cd91c-metrics-tls") pod "dns-default-lw7xs" (UID: "86f8ea02-993b-4c12-b611-355d4b6cd91c") : secret "dns-default-metrics-tls" not found Apr 22 19:07:06.252129 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:06.252095 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21e125c2-4036-4304-91d4-c0370710d4af-cert\") pod \"ingress-canary-b5ggr\" (UID: \"21e125c2-4036-4304-91d4-c0370710d4af\") " pod="openshift-ingress-canary/ingress-canary-b5ggr" Apr 22 19:07:06.252778 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:06.252255 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:07:06.252778 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:06.252328 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21e125c2-4036-4304-91d4-c0370710d4af-cert podName:21e125c2-4036-4304-91d4-c0370710d4af nodeName:}" failed. No retries permitted until 2026-04-22 19:07:07.252308386 +0000 UTC m=+35.617723201 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/21e125c2-4036-4304-91d4-c0370710d4af-cert") pod "ingress-canary-b5ggr" (UID: "21e125c2-4036-4304-91d4-c0370710d4af") : secret "canary-serving-cert" not found Apr 22 19:07:07.158254 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:07.158226 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86f8ea02-993b-4c12-b611-355d4b6cd91c-metrics-tls\") pod \"dns-default-lw7xs\" (UID: \"86f8ea02-993b-4c12-b611-355d4b6cd91c\") " pod="openshift-dns/dns-default-lw7xs" Apr 22 19:07:07.158440 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:07.158338 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:07:07.158440 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:07.158407 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86f8ea02-993b-4c12-b611-355d4b6cd91c-metrics-tls podName:86f8ea02-993b-4c12-b611-355d4b6cd91c nodeName:}" failed. No retries permitted until 2026-04-22 19:07:09.158388326 +0000 UTC m=+37.523803141 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86f8ea02-993b-4c12-b611-355d4b6cd91c-metrics-tls") pod "dns-default-lw7xs" (UID: "86f8ea02-993b-4c12-b611-355d4b6cd91c") : secret "dns-default-metrics-tls" not found Apr 22 19:07:07.218492 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:07.218451 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2r8qp" Apr 22 19:07:07.218670 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:07.218451 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc5ws" Apr 22 19:07:07.222572 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:07.222261 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 19:07:07.222572 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:07.222300 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 19:07:07.222572 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:07.222281 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zjzgr\"" Apr 22 19:07:07.222572 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:07.222305 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-6t9wx\"" Apr 22 19:07:07.222572 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:07.222267 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 19:07:07.258979 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:07.258946 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21e125c2-4036-4304-91d4-c0370710d4af-cert\") pod \"ingress-canary-b5ggr\" (UID: \"21e125c2-4036-4304-91d4-c0370710d4af\") " pod="openshift-ingress-canary/ingress-canary-b5ggr" Apr 22 19:07:07.259350 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:07.259121 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:07:07.259350 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:07.259196 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21e125c2-4036-4304-91d4-c0370710d4af-cert 
podName:21e125c2-4036-4304-91d4-c0370710d4af nodeName:}" failed. No retries permitted until 2026-04-22 19:07:09.259180569 +0000 UTC m=+37.624595382 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/21e125c2-4036-4304-91d4-c0370710d4af-cert") pod "ingress-canary-b5ggr" (UID: "21e125c2-4036-4304-91d4-c0370710d4af") : secret "canary-serving-cert" not found Apr 22 19:07:09.174835 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:09.174654 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86f8ea02-993b-4c12-b611-355d4b6cd91c-metrics-tls\") pod \"dns-default-lw7xs\" (UID: \"86f8ea02-993b-4c12-b611-355d4b6cd91c\") " pod="openshift-dns/dns-default-lw7xs" Apr 22 19:07:09.174835 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:09.174818 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:07:09.175294 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:09.174896 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86f8ea02-993b-4c12-b611-355d4b6cd91c-metrics-tls podName:86f8ea02-993b-4c12-b611-355d4b6cd91c nodeName:}" failed. No retries permitted until 2026-04-22 19:07:13.174872763 +0000 UTC m=+41.540287576 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86f8ea02-993b-4c12-b611-355d4b6cd91c-metrics-tls") pod "dns-default-lw7xs" (UID: "86f8ea02-993b-4c12-b611-355d4b6cd91c") : secret "dns-default-metrics-tls" not found Apr 22 19:07:09.275519 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:09.275440 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21e125c2-4036-4304-91d4-c0370710d4af-cert\") pod \"ingress-canary-b5ggr\" (UID: \"21e125c2-4036-4304-91d4-c0370710d4af\") " pod="openshift-ingress-canary/ingress-canary-b5ggr" Apr 22 19:07:09.275648 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:09.275581 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:07:09.275648 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:09.275641 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21e125c2-4036-4304-91d4-c0370710d4af-cert podName:21e125c2-4036-4304-91d4-c0370710d4af nodeName:}" failed. No retries permitted until 2026-04-22 19:07:13.275626146 +0000 UTC m=+41.641040959 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/21e125c2-4036-4304-91d4-c0370710d4af-cert") pod "ingress-canary-b5ggr" (UID: "21e125c2-4036-4304-91d4-c0370710d4af") : secret "canary-serving-cert" not found Apr 22 19:07:09.376536 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:09.376503 2569 generic.go:358] "Generic (PLEG): container finished" podID="1e9b0a71-0187-42db-855a-762dfaa227aa" containerID="db4bc84989699ed3ec0799548676366078e1542567b7bb9dd3ba4a87e478d9d2" exitCode=0 Apr 22 19:07:09.376688 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:09.376559 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x7gv6" event={"ID":"1e9b0a71-0187-42db-855a-762dfaa227aa","Type":"ContainerDied","Data":"db4bc84989699ed3ec0799548676366078e1542567b7bb9dd3ba4a87e478d9d2"} Apr 22 19:07:10.380617 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:10.380582 2569 generic.go:358] "Generic (PLEG): container finished" podID="1e9b0a71-0187-42db-855a-762dfaa227aa" containerID="af185200f6a8635e8755feb63fc6d05ff2d7b41fcbb10ca908ac417bd0c0fe76" exitCode=0 Apr 22 19:07:10.380617 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:10.380619 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x7gv6" event={"ID":"1e9b0a71-0187-42db-855a-762dfaa227aa","Type":"ContainerDied","Data":"af185200f6a8635e8755feb63fc6d05ff2d7b41fcbb10ca908ac417bd0c0fe76"} Apr 22 19:07:11.384980 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:11.384947 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-x7gv6" event={"ID":"1e9b0a71-0187-42db-855a-762dfaa227aa","Type":"ContainerStarted","Data":"89edeb5be4d8866dcfa7ad2819d12c32ef399dec8c6c7998ae890b38406b8b40"} Apr 22 19:07:11.407588 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:11.407543 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-additional-cni-plugins-x7gv6" podStartSLOduration=6.087467976 podStartE2EDuration="39.40752916s" podCreationTimestamp="2026-04-22 19:06:32 +0000 UTC" firstStartedPulling="2026-04-22 19:06:34.941216442 +0000 UTC m=+3.306631258" lastFinishedPulling="2026-04-22 19:07:08.261277629 +0000 UTC m=+36.626692442" observedRunningTime="2026-04-22 19:07:11.406036138 +0000 UTC m=+39.771450973" watchObservedRunningTime="2026-04-22 19:07:11.40752916 +0000 UTC m=+39.772944016" Apr 22 19:07:13.197175 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:13.197139 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86f8ea02-993b-4c12-b611-355d4b6cd91c-metrics-tls\") pod \"dns-default-lw7xs\" (UID: \"86f8ea02-993b-4c12-b611-355d4b6cd91c\") " pod="openshift-dns/dns-default-lw7xs" Apr 22 19:07:13.197514 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:13.197256 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:07:13.197514 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:13.197340 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86f8ea02-993b-4c12-b611-355d4b6cd91c-metrics-tls podName:86f8ea02-993b-4c12-b611-355d4b6cd91c nodeName:}" failed. No retries permitted until 2026-04-22 19:07:21.197327368 +0000 UTC m=+49.562742182 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86f8ea02-993b-4c12-b611-355d4b6cd91c-metrics-tls") pod "dns-default-lw7xs" (UID: "86f8ea02-993b-4c12-b611-355d4b6cd91c") : secret "dns-default-metrics-tls" not found Apr 22 19:07:13.297684 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:13.297653 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21e125c2-4036-4304-91d4-c0370710d4af-cert\") pod \"ingress-canary-b5ggr\" (UID: \"21e125c2-4036-4304-91d4-c0370710d4af\") " pod="openshift-ingress-canary/ingress-canary-b5ggr" Apr 22 19:07:13.297824 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:13.297808 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:07:13.297880 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:13.297862 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21e125c2-4036-4304-91d4-c0370710d4af-cert podName:21e125c2-4036-4304-91d4-c0370710d4af nodeName:}" failed. No retries permitted until 2026-04-22 19:07:21.297847464 +0000 UTC m=+49.663262278 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/21e125c2-4036-4304-91d4-c0370710d4af-cert") pod "ingress-canary-b5ggr" (UID: "21e125c2-4036-4304-91d4-c0370710d4af") : secret "canary-serving-cert" not found Apr 22 19:07:19.949301 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:19.949272 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-788dd9b5f-cg7md"] Apr 22 19:07:19.961062 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:19.961038 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-788dd9b5f-cg7md"] Apr 22 19:07:19.961178 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:19.961161 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-788dd9b5f-cg7md" Apr 22 19:07:19.964848 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:19.964824 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 22 19:07:19.964991 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:19.964869 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-rwj7c\"" Apr 22 19:07:19.964991 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:19.964825 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 22 19:07:19.964991 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:19.964938 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 22 19:07:19.964991 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:19.964905 2569 reflector.go:430] 
"Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 22 19:07:20.140730 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:20.140696 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chzl6\" (UniqueName: \"kubernetes.io/projected/c73375b5-d6be-48ff-ac60-7b2e0cc9ef1d-kube-api-access-chzl6\") pod \"managed-serviceaccount-addon-agent-788dd9b5f-cg7md\" (UID: \"c73375b5-d6be-48ff-ac60-7b2e0cc9ef1d\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-788dd9b5f-cg7md" Apr 22 19:07:20.140730 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:20.140738 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c73375b5-d6be-48ff-ac60-7b2e0cc9ef1d-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-788dd9b5f-cg7md\" (UID: \"c73375b5-d6be-48ff-ac60-7b2e0cc9ef1d\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-788dd9b5f-cg7md" Apr 22 19:07:20.241246 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:20.241175 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-chzl6\" (UniqueName: \"kubernetes.io/projected/c73375b5-d6be-48ff-ac60-7b2e0cc9ef1d-kube-api-access-chzl6\") pod \"managed-serviceaccount-addon-agent-788dd9b5f-cg7md\" (UID: \"c73375b5-d6be-48ff-ac60-7b2e0cc9ef1d\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-788dd9b5f-cg7md" Apr 22 19:07:20.241246 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:20.241205 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c73375b5-d6be-48ff-ac60-7b2e0cc9ef1d-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-788dd9b5f-cg7md\" (UID: 
\"c73375b5-d6be-48ff-ac60-7b2e0cc9ef1d\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-788dd9b5f-cg7md" Apr 22 19:07:20.244523 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:20.244501 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c73375b5-d6be-48ff-ac60-7b2e0cc9ef1d-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-788dd9b5f-cg7md\" (UID: \"c73375b5-d6be-48ff-ac60-7b2e0cc9ef1d\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-788dd9b5f-cg7md" Apr 22 19:07:20.250282 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:20.250258 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-chzl6\" (UniqueName: \"kubernetes.io/projected/c73375b5-d6be-48ff-ac60-7b2e0cc9ef1d-kube-api-access-chzl6\") pod \"managed-serviceaccount-addon-agent-788dd9b5f-cg7md\" (UID: \"c73375b5-d6be-48ff-ac60-7b2e0cc9ef1d\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-788dd9b5f-cg7md" Apr 22 19:07:20.279261 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:20.279237 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-788dd9b5f-cg7md" Apr 22 19:07:20.405152 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:20.405114 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-788dd9b5f-cg7md"] Apr 22 19:07:20.408774 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:07:20.408724 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc73375b5_d6be_48ff_ac60_7b2e0cc9ef1d.slice/crio-af7bc8615806da6f4b689facc0fd2fc2a10e66552254d5898e4464620357ffeb WatchSource:0}: Error finding container af7bc8615806da6f4b689facc0fd2fc2a10e66552254d5898e4464620357ffeb: Status 404 returned error can't find the container with id af7bc8615806da6f4b689facc0fd2fc2a10e66552254d5898e4464620357ffeb Apr 22 19:07:21.248173 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:21.248143 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86f8ea02-993b-4c12-b611-355d4b6cd91c-metrics-tls\") pod \"dns-default-lw7xs\" (UID: \"86f8ea02-993b-4c12-b611-355d4b6cd91c\") " pod="openshift-dns/dns-default-lw7xs" Apr 22 19:07:21.248550 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:21.248291 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:07:21.248550 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:21.248372 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86f8ea02-993b-4c12-b611-355d4b6cd91c-metrics-tls podName:86f8ea02-993b-4c12-b611-355d4b6cd91c nodeName:}" failed. No retries permitted until 2026-04-22 19:07:37.24835164 +0000 UTC m=+65.613766458 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86f8ea02-993b-4c12-b611-355d4b6cd91c-metrics-tls") pod "dns-default-lw7xs" (UID: "86f8ea02-993b-4c12-b611-355d4b6cd91c") : secret "dns-default-metrics-tls" not found Apr 22 19:07:21.349249 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:21.349214 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21e125c2-4036-4304-91d4-c0370710d4af-cert\") pod \"ingress-canary-b5ggr\" (UID: \"21e125c2-4036-4304-91d4-c0370710d4af\") " pod="openshift-ingress-canary/ingress-canary-b5ggr" Apr 22 19:07:21.349438 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:21.349382 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:07:21.349507 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:21.349452 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21e125c2-4036-4304-91d4-c0370710d4af-cert podName:21e125c2-4036-4304-91d4-c0370710d4af nodeName:}" failed. No retries permitted until 2026-04-22 19:07:37.349428821 +0000 UTC m=+65.714843640 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/21e125c2-4036-4304-91d4-c0370710d4af-cert") pod "ingress-canary-b5ggr" (UID: "21e125c2-4036-4304-91d4-c0370710d4af") : secret "canary-serving-cert" not found Apr 22 19:07:21.405451 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:21.405398 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-788dd9b5f-cg7md" event={"ID":"c73375b5-d6be-48ff-ac60-7b2e0cc9ef1d","Type":"ContainerStarted","Data":"af7bc8615806da6f4b689facc0fd2fc2a10e66552254d5898e4464620357ffeb"} Apr 22 19:07:24.412352 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:24.412316 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-788dd9b5f-cg7md" event={"ID":"c73375b5-d6be-48ff-ac60-7b2e0cc9ef1d","Type":"ContainerStarted","Data":"5dc92bd2a933b842f7bf67cf70ed541ecea32b137f0d60770bbcfee916a86f3c"} Apr 22 19:07:24.426870 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:24.426826 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-788dd9b5f-cg7md" podStartSLOduration=2.424081863 podStartE2EDuration="5.426813176s" podCreationTimestamp="2026-04-22 19:07:19 +0000 UTC" firstStartedPulling="2026-04-22 19:07:20.41061944 +0000 UTC m=+48.776034254" lastFinishedPulling="2026-04-22 19:07:23.413350749 +0000 UTC m=+51.778765567" observedRunningTime="2026-04-22 19:07:24.426627067 +0000 UTC m=+52.792041904" watchObservedRunningTime="2026-04-22 19:07:24.426813176 +0000 UTC m=+52.792228015" Apr 22 19:07:30.376131 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:30.376102 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tfk46" Apr 22 19:07:37.260853 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:37.260810 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86f8ea02-993b-4c12-b611-355d4b6cd91c-metrics-tls\") pod \"dns-default-lw7xs\" (UID: \"86f8ea02-993b-4c12-b611-355d4b6cd91c\") " pod="openshift-dns/dns-default-lw7xs" Apr 22 19:07:37.261237 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:37.260967 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:07:37.261237 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:37.261047 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86f8ea02-993b-4c12-b611-355d4b6cd91c-metrics-tls podName:86f8ea02-993b-4c12-b611-355d4b6cd91c nodeName:}" failed. No retries permitted until 2026-04-22 19:08:09.261029055 +0000 UTC m=+97.626443873 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86f8ea02-993b-4c12-b611-355d4b6cd91c-metrics-tls") pod "dns-default-lw7xs" (UID: "86f8ea02-993b-4c12-b611-355d4b6cd91c") : secret "dns-default-metrics-tls" not found Apr 22 19:07:37.361313 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:37.361278 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21e125c2-4036-4304-91d4-c0370710d4af-cert\") pod \"ingress-canary-b5ggr\" (UID: \"21e125c2-4036-4304-91d4-c0370710d4af\") " pod="openshift-ingress-canary/ingress-canary-b5ggr" Apr 22 19:07:37.361478 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:37.361388 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:07:37.361478 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:37.361447 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21e125c2-4036-4304-91d4-c0370710d4af-cert podName:21e125c2-4036-4304-91d4-c0370710d4af nodeName:}" 
failed. No retries permitted until 2026-04-22 19:08:09.361433687 +0000 UTC m=+97.726848501 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/21e125c2-4036-4304-91d4-c0370710d4af-cert") pod "ingress-canary-b5ggr" (UID: "21e125c2-4036-4304-91d4-c0370710d4af") : secret "canary-serving-cert" not found Apr 22 19:07:37.864916 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:37.864882 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46a3468d-b017-471c-a0df-a07b1c183ff4-metrics-certs\") pod \"network-metrics-daemon-2r8qp\" (UID: \"46a3468d-b017-471c-a0df-a07b1c183ff4\") " pod="openshift-multus/network-metrics-daemon-2r8qp" Apr 22 19:07:37.868016 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:37.867996 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 19:07:37.875505 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:37.875491 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 19:07:37.875578 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:07:37.875569 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46a3468d-b017-471c-a0df-a07b1c183ff4-metrics-certs podName:46a3468d-b017-471c-a0df-a07b1c183ff4 nodeName:}" failed. No retries permitted until 2026-04-22 19:08:41.875553236 +0000 UTC m=+130.240968049 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46a3468d-b017-471c-a0df-a07b1c183ff4-metrics-certs") pod "network-metrics-daemon-2r8qp" (UID: "46a3468d-b017-471c-a0df-a07b1c183ff4") : secret "metrics-daemon-secret" not found Apr 22 19:07:38.065700 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:38.065662 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q7972\" (UniqueName: \"kubernetes.io/projected/bfdf183b-244f-461d-96f1-6416cbc8cf68-kube-api-access-q7972\") pod \"network-check-target-bc5ws\" (UID: \"bfdf183b-244f-461d-96f1-6416cbc8cf68\") " pod="openshift-network-diagnostics/network-check-target-bc5ws" Apr 22 19:07:38.068645 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:38.068627 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 19:07:38.079282 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:38.079264 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 19:07:38.089934 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:38.089905 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7972\" (UniqueName: \"kubernetes.io/projected/bfdf183b-244f-461d-96f1-6416cbc8cf68-kube-api-access-q7972\") pod \"network-check-target-bc5ws\" (UID: \"bfdf183b-244f-461d-96f1-6416cbc8cf68\") " pod="openshift-network-diagnostics/network-check-target-bc5ws" Apr 22 19:07:38.139470 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:38.139413 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-6t9wx\"" Apr 22 19:07:38.147404 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:38.147388 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bc5ws" Apr 22 19:07:38.277163 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:38.277138 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bc5ws"] Apr 22 19:07:38.279810 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:07:38.279779 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfdf183b_244f_461d_96f1_6416cbc8cf68.slice/crio-187d0661b1132a4e163fb175594df375f70a99a3cec6b8cfa1e6ef894ac5ec11 WatchSource:0}: Error finding container 187d0661b1132a4e163fb175594df375f70a99a3cec6b8cfa1e6ef894ac5ec11: Status 404 returned error can't find the container with id 187d0661b1132a4e163fb175594df375f70a99a3cec6b8cfa1e6ef894ac5ec11 Apr 22 19:07:38.439367 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:38.439337 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bc5ws" event={"ID":"bfdf183b-244f-461d-96f1-6416cbc8cf68","Type":"ContainerStarted","Data":"187d0661b1132a4e163fb175594df375f70a99a3cec6b8cfa1e6ef894ac5ec11"} Apr 22 19:07:41.445741 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:41.445710 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bc5ws" event={"ID":"bfdf183b-244f-461d-96f1-6416cbc8cf68","Type":"ContainerStarted","Data":"088e696b1360588bae9c712b5b18ea81194ec5d26fb9f75b5e32a281576b9952"} Apr 22 19:07:41.446105 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:07:41.445824 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-bc5ws" Apr 22 19:08:09.274708 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:09.274620 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/86f8ea02-993b-4c12-b611-355d4b6cd91c-metrics-tls\") pod \"dns-default-lw7xs\" (UID: \"86f8ea02-993b-4c12-b611-355d4b6cd91c\") " pod="openshift-dns/dns-default-lw7xs" Apr 22 19:08:09.275187 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:08:09.274793 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:08:09.275888 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:08:09.275864 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86f8ea02-993b-4c12-b611-355d4b6cd91c-metrics-tls podName:86f8ea02-993b-4c12-b611-355d4b6cd91c nodeName:}" failed. No retries permitted until 2026-04-22 19:09:13.275824422 +0000 UTC m=+161.641239265 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/86f8ea02-993b-4c12-b611-355d4b6cd91c-metrics-tls") pod "dns-default-lw7xs" (UID: "86f8ea02-993b-4c12-b611-355d4b6cd91c") : secret "dns-default-metrics-tls" not found Apr 22 19:08:09.375947 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:09.375917 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21e125c2-4036-4304-91d4-c0370710d4af-cert\") pod \"ingress-canary-b5ggr\" (UID: \"21e125c2-4036-4304-91d4-c0370710d4af\") " pod="openshift-ingress-canary/ingress-canary-b5ggr" Apr 22 19:08:09.376089 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:08:09.376015 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:08:09.376089 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:08:09.376071 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21e125c2-4036-4304-91d4-c0370710d4af-cert podName:21e125c2-4036-4304-91d4-c0370710d4af nodeName:}" failed. 
No retries permitted until 2026-04-22 19:09:13.376058741 +0000 UTC m=+161.741473554 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/21e125c2-4036-4304-91d4-c0370710d4af-cert") pod "ingress-canary-b5ggr" (UID: "21e125c2-4036-4304-91d4-c0370710d4af") : secret "canary-serving-cert" not found Apr 22 19:08:12.450595 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:12.450568 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-bc5ws" Apr 22 19:08:12.469206 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:12.469160 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-bc5ws" podStartSLOduration=97.874653615 podStartE2EDuration="1m40.469147231s" podCreationTimestamp="2026-04-22 19:06:32 +0000 UTC" firstStartedPulling="2026-04-22 19:07:38.28207868 +0000 UTC m=+66.647493494" lastFinishedPulling="2026-04-22 19:07:40.876572292 +0000 UTC m=+69.241987110" observedRunningTime="2026-04-22 19:07:41.473304025 +0000 UTC m=+69.838718862" watchObservedRunningTime="2026-04-22 19:08:12.469147231 +0000 UTC m=+100.834562111" Apr 22 19:08:41.894597 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:41.894542 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46a3468d-b017-471c-a0df-a07b1c183ff4-metrics-certs\") pod \"network-metrics-daemon-2r8qp\" (UID: \"46a3468d-b017-471c-a0df-a07b1c183ff4\") " pod="openshift-multus/network-metrics-daemon-2r8qp" Apr 22 19:08:41.895115 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:08:41.894665 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 19:08:41.895115 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:08:41.894726 2569 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/46a3468d-b017-471c-a0df-a07b1c183ff4-metrics-certs podName:46a3468d-b017-471c-a0df-a07b1c183ff4 nodeName:}" failed. No retries permitted until 2026-04-22 19:10:43.894711326 +0000 UTC m=+252.260126144 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/46a3468d-b017-471c-a0df-a07b1c183ff4-metrics-certs") pod "network-metrics-daemon-2r8qp" (UID: "46a3468d-b017-471c-a0df-a07b1c183ff4") : secret "metrics-daemon-secret" not found Apr 22 19:08:42.169028 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.168942 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-fl4n2"] Apr 22 19:08:42.171646 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.171631 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-fl4n2" Apr 22 19:08:42.174289 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.174267 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 22 19:08:42.174289 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.174286 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 22 19:08:42.175414 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.175367 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-9w595\"" Apr 22 19:08:42.175608 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.175439 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:08:42.175716 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.175476 2569 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 22 19:08:42.185157 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.185025 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 22 19:08:42.186366 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.186344 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-fl4n2"] Apr 22 19:08:42.268314 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.268286 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k6rkf"] Apr 22 19:08:42.271359 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.271336 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-w489z"] Apr 22 19:08:42.271501 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.271483 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k6rkf" Apr 22 19:08:42.274122 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.274100 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-w489z" Apr 22 19:08:42.274645 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.274622 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 22 19:08:42.274645 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.274639 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-qk6kj\"" Apr 22 19:08:42.274853 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.274692 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 22 19:08:42.275000 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.274986 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:08:42.275057 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.275044 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 22 19:08:42.275850 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.275837 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qhjdq"] Apr 22 19:08:42.276955 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.276937 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-fsk9w\"" Apr 22 19:08:42.277508 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.277487 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 22 19:08:42.277572 ip-10-0-134-22 kubenswrapper[2569]: I0422 
19:08:42.277510 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 19:08:42.277606 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.277497 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 22 19:08:42.277815 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.277803 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 19:08:42.278316 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.278264 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5cdfc8c88c-qgsqc"] Apr 22 19:08:42.278560 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.278543 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qhjdq" Apr 22 19:08:42.280919 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.280904 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-zglx8\"" Apr 22 19:08:42.281305 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.281284 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 22 19:08:42.281405 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.281320 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 22 19:08:42.281405 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.281337 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 22 19:08:42.281722 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.281708 2569 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 22 19:08:42.281891 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.281877 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc" Apr 22 19:08:42.284790 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.284748 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 19:08:42.284989 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.284966 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 19:08:42.285087 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.285030 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 19:08:42.285087 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.285052 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-px248\"" Apr 22 19:08:42.286831 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.286814 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 22 19:08:42.290365 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.290348 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 19:08:42.294642 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.294424 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qhjdq"] Apr 22 19:08:42.295838 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.295815 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k6rkf"] Apr 22 19:08:42.296649 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.296629 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bmsc\" (UniqueName: \"kubernetes.io/projected/c9d5594b-5dc2-461d-bd58-496386ced33b-kube-api-access-8bmsc\") pod \"console-operator-9d4b6777b-fl4n2\" (UID: \"c9d5594b-5dc2-461d-bd58-496386ced33b\") " pod="openshift-console-operator/console-operator-9d4b6777b-fl4n2" Apr 22 19:08:42.296771 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.296663 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9d5594b-5dc2-461d-bd58-496386ced33b-config\") pod \"console-operator-9d4b6777b-fl4n2\" (UID: \"c9d5594b-5dc2-461d-bd58-496386ced33b\") " pod="openshift-console-operator/console-operator-9d4b6777b-fl4n2" Apr 22 19:08:42.296771 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.296741 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9d5594b-5dc2-461d-bd58-496386ced33b-serving-cert\") pod \"console-operator-9d4b6777b-fl4n2\" (UID: \"c9d5594b-5dc2-461d-bd58-496386ced33b\") " pod="openshift-console-operator/console-operator-9d4b6777b-fl4n2" Apr 22 19:08:42.296890 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.296788 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9d5594b-5dc2-461d-bd58-496386ced33b-trusted-ca\") pod \"console-operator-9d4b6777b-fl4n2\" (UID: \"c9d5594b-5dc2-461d-bd58-496386ced33b\") " pod="openshift-console-operator/console-operator-9d4b6777b-fl4n2" Apr 22 19:08:42.297283 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.297266 2569 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-w489z"] Apr 22 19:08:42.297573 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.297558 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5cdfc8c88c-qgsqc"] Apr 22 19:08:42.397124 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.397091 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-registry-certificates\") pod \"image-registry-5cdfc8c88c-qgsqc\" (UID: \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\") " pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc" Apr 22 19:08:42.397124 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.397123 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m7zn\" (UniqueName: \"kubernetes.io/projected/0886168b-fb42-4ca3-81f5-2dabb41537e9-kube-api-access-7m7zn\") pod \"insights-operator-585dfdc468-w489z\" (UID: \"0886168b-fb42-4ca3-81f5-2dabb41537e9\") " pod="openshift-insights/insights-operator-585dfdc468-w489z" Apr 22 19:08:42.397387 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.397145 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5676ee9e-cd52-496c-a3cc-32f120c108d4-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-k6rkf\" (UID: \"5676ee9e-cd52-496c-a3cc-32f120c108d4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k6rkf" Apr 22 19:08:42.397387 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.397172 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0886168b-fb42-4ca3-81f5-2dabb41537e9-serving-cert\") pod \"insights-operator-585dfdc468-w489z\" (UID: \"0886168b-fb42-4ca3-81f5-2dabb41537e9\") " pod="openshift-insights/insights-operator-585dfdc468-w489z" Apr 22 19:08:42.397387 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.397213 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5676ee9e-cd52-496c-a3cc-32f120c108d4-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-k6rkf\" (UID: \"5676ee9e-cd52-496c-a3cc-32f120c108d4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k6rkf" Apr 22 19:08:42.397387 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.397250 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-installation-pull-secrets\") pod \"image-registry-5cdfc8c88c-qgsqc\" (UID: \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\") " pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc" Apr 22 19:08:42.397387 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.397276 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-image-registry-private-configuration\") pod \"image-registry-5cdfc8c88c-qgsqc\" (UID: \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\") " pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc" Apr 22 19:08:42.397387 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.397294 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gmrk\" (UniqueName: 
\"kubernetes.io/projected/64cf0725-ca9c-41d6-a4dd-8d9f1b74ed71-kube-api-access-4gmrk\") pod \"service-ca-operator-d6fc45fc5-qhjdq\" (UID: \"64cf0725-ca9c-41d6-a4dd-8d9f1b74ed71\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qhjdq" Apr 22 19:08:42.397387 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.397329 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-trusted-ca\") pod \"image-registry-5cdfc8c88c-qgsqc\" (UID: \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\") " pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc" Apr 22 19:08:42.397387 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.397361 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0886168b-fb42-4ca3-81f5-2dabb41537e9-service-ca-bundle\") pod \"insights-operator-585dfdc468-w489z\" (UID: \"0886168b-fb42-4ca3-81f5-2dabb41537e9\") " pod="openshift-insights/insights-operator-585dfdc468-w489z" Apr 22 19:08:42.397387 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.397381 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0886168b-fb42-4ca3-81f5-2dabb41537e9-tmp\") pod \"insights-operator-585dfdc468-w489z\" (UID: \"0886168b-fb42-4ca3-81f5-2dabb41537e9\") " pod="openshift-insights/insights-operator-585dfdc468-w489z" Apr 22 19:08:42.397816 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.397398 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-bound-sa-token\") pod \"image-registry-5cdfc8c88c-qgsqc\" (UID: \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\") " 
pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc" Apr 22 19:08:42.397816 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.397423 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-registry-tls\") pod \"image-registry-5cdfc8c88c-qgsqc\" (UID: \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\") " pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc" Apr 22 19:08:42.397816 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.397481 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8bmsc\" (UniqueName: \"kubernetes.io/projected/c9d5594b-5dc2-461d-bd58-496386ced33b-kube-api-access-8bmsc\") pod \"console-operator-9d4b6777b-fl4n2\" (UID: \"c9d5594b-5dc2-461d-bd58-496386ced33b\") " pod="openshift-console-operator/console-operator-9d4b6777b-fl4n2" Apr 22 19:08:42.397816 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.397533 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-ca-trust-extracted\") pod \"image-registry-5cdfc8c88c-qgsqc\" (UID: \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\") " pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc" Apr 22 19:08:42.397816 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.397564 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9d5594b-5dc2-461d-bd58-496386ced33b-config\") pod \"console-operator-9d4b6777b-fl4n2\" (UID: \"c9d5594b-5dc2-461d-bd58-496386ced33b\") " pod="openshift-console-operator/console-operator-9d4b6777b-fl4n2" Apr 22 19:08:42.397816 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.397594 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-q9lmv\" (UniqueName: \"kubernetes.io/projected/5676ee9e-cd52-496c-a3cc-32f120c108d4-kube-api-access-q9lmv\") pod \"kube-storage-version-migrator-operator-6769c5d45-k6rkf\" (UID: \"5676ee9e-cd52-496c-a3cc-32f120c108d4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k6rkf" Apr 22 19:08:42.397816 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.397629 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/0886168b-fb42-4ca3-81f5-2dabb41537e9-snapshots\") pod \"insights-operator-585dfdc468-w489z\" (UID: \"0886168b-fb42-4ca3-81f5-2dabb41537e9\") " pod="openshift-insights/insights-operator-585dfdc468-w489z" Apr 22 19:08:42.397816 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.397656 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64cf0725-ca9c-41d6-a4dd-8d9f1b74ed71-config\") pod \"service-ca-operator-d6fc45fc5-qhjdq\" (UID: \"64cf0725-ca9c-41d6-a4dd-8d9f1b74ed71\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qhjdq" Apr 22 19:08:42.397816 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.397705 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9d5594b-5dc2-461d-bd58-496386ced33b-serving-cert\") pod \"console-operator-9d4b6777b-fl4n2\" (UID: \"c9d5594b-5dc2-461d-bd58-496386ced33b\") " pod="openshift-console-operator/console-operator-9d4b6777b-fl4n2" Apr 22 19:08:42.397816 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.397731 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64cf0725-ca9c-41d6-a4dd-8d9f1b74ed71-serving-cert\") pod \"service-ca-operator-d6fc45fc5-qhjdq\" (UID: 
\"64cf0725-ca9c-41d6-a4dd-8d9f1b74ed71\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qhjdq"
Apr 22 19:08:42.397816 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.397776 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrr5f\" (UniqueName: \"kubernetes.io/projected/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-kube-api-access-zrr5f\") pod \"image-registry-5cdfc8c88c-qgsqc\" (UID: \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\") " pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc"
Apr 22 19:08:42.398337 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.397832 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9d5594b-5dc2-461d-bd58-496386ced33b-trusted-ca\") pod \"console-operator-9d4b6777b-fl4n2\" (UID: \"c9d5594b-5dc2-461d-bd58-496386ced33b\") " pod="openshift-console-operator/console-operator-9d4b6777b-fl4n2"
Apr 22 19:08:42.398337 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.397867 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0886168b-fb42-4ca3-81f5-2dabb41537e9-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-w489z\" (UID: \"0886168b-fb42-4ca3-81f5-2dabb41537e9\") " pod="openshift-insights/insights-operator-585dfdc468-w489z"
Apr 22 19:08:42.398337 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.398225 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9d5594b-5dc2-461d-bd58-496386ced33b-config\") pod \"console-operator-9d4b6777b-fl4n2\" (UID: \"c9d5594b-5dc2-461d-bd58-496386ced33b\") " pod="openshift-console-operator/console-operator-9d4b6777b-fl4n2"
Apr 22 19:08:42.398630 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.398609 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c9d5594b-5dc2-461d-bd58-496386ced33b-trusted-ca\") pod \"console-operator-9d4b6777b-fl4n2\" (UID: \"c9d5594b-5dc2-461d-bd58-496386ced33b\") " pod="openshift-console-operator/console-operator-9d4b6777b-fl4n2"
Apr 22 19:08:42.400081 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.400062 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9d5594b-5dc2-461d-bd58-496386ced33b-serving-cert\") pod \"console-operator-9d4b6777b-fl4n2\" (UID: \"c9d5594b-5dc2-461d-bd58-496386ced33b\") " pod="openshift-console-operator/console-operator-9d4b6777b-fl4n2"
Apr 22 19:08:42.405651 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.405632 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bmsc\" (UniqueName: \"kubernetes.io/projected/c9d5594b-5dc2-461d-bd58-496386ced33b-kube-api-access-8bmsc\") pod \"console-operator-9d4b6777b-fl4n2\" (UID: \"c9d5594b-5dc2-461d-bd58-496386ced33b\") " pod="openshift-console-operator/console-operator-9d4b6777b-fl4n2"
Apr 22 19:08:42.480903 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.480858 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-fl4n2"
Apr 22 19:08:42.498822 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.498790 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0886168b-fb42-4ca3-81f5-2dabb41537e9-serving-cert\") pod \"insights-operator-585dfdc468-w489z\" (UID: \"0886168b-fb42-4ca3-81f5-2dabb41537e9\") " pod="openshift-insights/insights-operator-585dfdc468-w489z"
Apr 22 19:08:42.498955 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.498843 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5676ee9e-cd52-496c-a3cc-32f120c108d4-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-k6rkf\" (UID: \"5676ee9e-cd52-496c-a3cc-32f120c108d4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k6rkf"
Apr 22 19:08:42.498955 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.498877 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-installation-pull-secrets\") pod \"image-registry-5cdfc8c88c-qgsqc\" (UID: \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\") " pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc"
Apr 22 19:08:42.498955 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.498906 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-image-registry-private-configuration\") pod \"image-registry-5cdfc8c88c-qgsqc\" (UID: \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\") " pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc"
Apr 22 19:08:42.499116 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.499086 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gmrk\" (UniqueName: \"kubernetes.io/projected/64cf0725-ca9c-41d6-a4dd-8d9f1b74ed71-kube-api-access-4gmrk\") pod \"service-ca-operator-d6fc45fc5-qhjdq\" (UID: \"64cf0725-ca9c-41d6-a4dd-8d9f1b74ed71\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qhjdq"
Apr 22 19:08:42.499183 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.499158 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-trusted-ca\") pod \"image-registry-5cdfc8c88c-qgsqc\" (UID: \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\") " pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc"
Apr 22 19:08:42.499237 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.499200 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0886168b-fb42-4ca3-81f5-2dabb41537e9-service-ca-bundle\") pod \"insights-operator-585dfdc468-w489z\" (UID: \"0886168b-fb42-4ca3-81f5-2dabb41537e9\") " pod="openshift-insights/insights-operator-585dfdc468-w489z"
Apr 22 19:08:42.499300 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.499233 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0886168b-fb42-4ca3-81f5-2dabb41537e9-tmp\") pod \"insights-operator-585dfdc468-w489z\" (UID: \"0886168b-fb42-4ca3-81f5-2dabb41537e9\") " pod="openshift-insights/insights-operator-585dfdc468-w489z"
Apr 22 19:08:42.499300 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.499264 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-bound-sa-token\") pod \"image-registry-5cdfc8c88c-qgsqc\" (UID: \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\") " pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc"
Apr 22 19:08:42.499300 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.499296 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-registry-tls\") pod \"image-registry-5cdfc8c88c-qgsqc\" (UID: \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\") " pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc"
Apr 22 19:08:42.499443 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.499331 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-ca-trust-extracted\") pod \"image-registry-5cdfc8c88c-qgsqc\" (UID: \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\") " pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc"
Apr 22 19:08:42.499443 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.499362 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q9lmv\" (UniqueName: \"kubernetes.io/projected/5676ee9e-cd52-496c-a3cc-32f120c108d4-kube-api-access-q9lmv\") pod \"kube-storage-version-migrator-operator-6769c5d45-k6rkf\" (UID: \"5676ee9e-cd52-496c-a3cc-32f120c108d4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k6rkf"
Apr 22 19:08:42.499443 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.499392 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/0886168b-fb42-4ca3-81f5-2dabb41537e9-snapshots\") pod \"insights-operator-585dfdc468-w489z\" (UID: \"0886168b-fb42-4ca3-81f5-2dabb41537e9\") " pod="openshift-insights/insights-operator-585dfdc468-w489z"
Apr 22 19:08:42.499783 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.499505 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64cf0725-ca9c-41d6-a4dd-8d9f1b74ed71-config\") pod \"service-ca-operator-d6fc45fc5-qhjdq\" (UID: \"64cf0725-ca9c-41d6-a4dd-8d9f1b74ed71\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qhjdq"
Apr 22 19:08:42.499783 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.499711 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64cf0725-ca9c-41d6-a4dd-8d9f1b74ed71-serving-cert\") pod \"service-ca-operator-d6fc45fc5-qhjdq\" (UID: \"64cf0725-ca9c-41d6-a4dd-8d9f1b74ed71\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qhjdq"
Apr 22 19:08:42.499783 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.499770 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zrr5f\" (UniqueName: \"kubernetes.io/projected/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-kube-api-access-zrr5f\") pod \"image-registry-5cdfc8c88c-qgsqc\" (UID: \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\") " pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc"
Apr 22 19:08:42.499950 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.499802 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0886168b-fb42-4ca3-81f5-2dabb41537e9-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-w489z\" (UID: \"0886168b-fb42-4ca3-81f5-2dabb41537e9\") " pod="openshift-insights/insights-operator-585dfdc468-w489z"
Apr 22 19:08:42.499950 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.499848 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-registry-certificates\") pod \"image-registry-5cdfc8c88c-qgsqc\" (UID: \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\") " pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc"
Apr 22 19:08:42.499950 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.499883 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7m7zn\" (UniqueName: \"kubernetes.io/projected/0886168b-fb42-4ca3-81f5-2dabb41537e9-kube-api-access-7m7zn\") pod \"insights-operator-585dfdc468-w489z\" (UID: \"0886168b-fb42-4ca3-81f5-2dabb41537e9\") " pod="openshift-insights/insights-operator-585dfdc468-w489z"
Apr 22 19:08:42.499950 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.499917 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5676ee9e-cd52-496c-a3cc-32f120c108d4-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-k6rkf\" (UID: \"5676ee9e-cd52-496c-a3cc-32f120c108d4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k6rkf"
Apr 22 19:08:42.500798 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.500772 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/0886168b-fb42-4ca3-81f5-2dabb41537e9-snapshots\") pod \"insights-operator-585dfdc468-w489z\" (UID: \"0886168b-fb42-4ca3-81f5-2dabb41537e9\") " pod="openshift-insights/insights-operator-585dfdc468-w489z"
Apr 22 19:08:42.500986 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.500959 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64cf0725-ca9c-41d6-a4dd-8d9f1b74ed71-config\") pod \"service-ca-operator-d6fc45fc5-qhjdq\" (UID: \"64cf0725-ca9c-41d6-a4dd-8d9f1b74ed71\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qhjdq"
Apr 22 19:08:42.501353 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.501327 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5676ee9e-cd52-496c-a3cc-32f120c108d4-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-k6rkf\" (UID: \"5676ee9e-cd52-496c-a3cc-32f120c108d4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k6rkf"
Apr 22 19:08:42.501657 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.501636 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0886168b-fb42-4ca3-81f5-2dabb41537e9-service-ca-bundle\") pod \"insights-operator-585dfdc468-w489z\" (UID: \"0886168b-fb42-4ca3-81f5-2dabb41537e9\") " pod="openshift-insights/insights-operator-585dfdc468-w489z"
Apr 22 19:08:42.501742 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.501653 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0886168b-fb42-4ca3-81f5-2dabb41537e9-serving-cert\") pod \"insights-operator-585dfdc468-w489z\" (UID: \"0886168b-fb42-4ca3-81f5-2dabb41537e9\") " pod="openshift-insights/insights-operator-585dfdc468-w489z"
Apr 22 19:08:42.501742 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:08:42.501733 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 19:08:42.501928 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:08:42.501770 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5cdfc8c88c-qgsqc: secret "image-registry-tls" not found
Apr 22 19:08:42.501928 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:08:42.501828 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-registry-tls podName:732f281f-7aeb-4e6e-b46f-a1b52deacf5e nodeName:}" failed. No retries permitted until 2026-04-22 19:08:43.001807243 +0000 UTC m=+131.367222071 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-registry-tls") pod "image-registry-5cdfc8c88c-qgsqc" (UID: "732f281f-7aeb-4e6e-b46f-a1b52deacf5e") : secret "image-registry-tls" not found
Apr 22 19:08:42.502091 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.502073 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-ca-trust-extracted\") pod \"image-registry-5cdfc8c88c-qgsqc\" (UID: \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\") " pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc"
Apr 22 19:08:42.502573 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.502284 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0886168b-fb42-4ca3-81f5-2dabb41537e9-tmp\") pod \"insights-operator-585dfdc468-w489z\" (UID: \"0886168b-fb42-4ca3-81f5-2dabb41537e9\") " pod="openshift-insights/insights-operator-585dfdc468-w489z"
Apr 22 19:08:42.502573 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.502323 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0886168b-fb42-4ca3-81f5-2dabb41537e9-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-w489z\" (UID: \"0886168b-fb42-4ca3-81f5-2dabb41537e9\") " pod="openshift-insights/insights-operator-585dfdc468-w489z"
Apr 22 19:08:42.502573 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.502513 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-image-registry-private-configuration\") pod \"image-registry-5cdfc8c88c-qgsqc\" (UID: \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\") " pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc"
Apr 22 19:08:42.502573 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.502535 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-installation-pull-secrets\") pod \"image-registry-5cdfc8c88c-qgsqc\" (UID: \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\") " pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc"
Apr 22 19:08:42.502853 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.502655 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-trusted-ca\") pod \"image-registry-5cdfc8c88c-qgsqc\" (UID: \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\") " pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc"
Apr 22 19:08:42.502853 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.502776 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-registry-certificates\") pod \"image-registry-5cdfc8c88c-qgsqc\" (UID: \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\") " pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc"
Apr 22 19:08:42.503242 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.503219 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5676ee9e-cd52-496c-a3cc-32f120c108d4-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-k6rkf\" (UID: \"5676ee9e-cd52-496c-a3cc-32f120c108d4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k6rkf"
Apr 22 19:08:42.504306 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.504285 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64cf0725-ca9c-41d6-a4dd-8d9f1b74ed71-serving-cert\") pod \"service-ca-operator-d6fc45fc5-qhjdq\" (UID: \"64cf0725-ca9c-41d6-a4dd-8d9f1b74ed71\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qhjdq"
Apr 22 19:08:42.508521 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.508501 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gmrk\" (UniqueName: \"kubernetes.io/projected/64cf0725-ca9c-41d6-a4dd-8d9f1b74ed71-kube-api-access-4gmrk\") pod \"service-ca-operator-d6fc45fc5-qhjdq\" (UID: \"64cf0725-ca9c-41d6-a4dd-8d9f1b74ed71\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qhjdq"
Apr 22 19:08:42.508735 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.508711 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrr5f\" (UniqueName: \"kubernetes.io/projected/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-kube-api-access-zrr5f\") pod \"image-registry-5cdfc8c88c-qgsqc\" (UID: \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\") " pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc"
Apr 22 19:08:42.510579 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.510551 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9lmv\" (UniqueName: \"kubernetes.io/projected/5676ee9e-cd52-496c-a3cc-32f120c108d4-kube-api-access-q9lmv\") pod \"kube-storage-version-migrator-operator-6769c5d45-k6rkf\" (UID: \"5676ee9e-cd52-496c-a3cc-32f120c108d4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k6rkf"
Apr 22 19:08:42.511327 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.511308 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-bound-sa-token\") pod \"image-registry-5cdfc8c88c-qgsqc\" (UID: \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\") " pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc"
Apr 22 19:08:42.511406 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.511314 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m7zn\" (UniqueName: \"kubernetes.io/projected/0886168b-fb42-4ca3-81f5-2dabb41537e9-kube-api-access-7m7zn\") pod \"insights-operator-585dfdc468-w489z\" (UID: \"0886168b-fb42-4ca3-81f5-2dabb41537e9\") " pod="openshift-insights/insights-operator-585dfdc468-w489z"
Apr 22 19:08:42.582010 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.581982 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k6rkf"
Apr 22 19:08:42.589293 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.589264 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-w489z"
Apr 22 19:08:42.601022 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.600887 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qhjdq"
Apr 22 19:08:42.601316 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.601297 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-fl4n2"]
Apr 22 19:08:42.604918 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:08:42.604889 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9d5594b_5dc2_461d_bd58_496386ced33b.slice/crio-ecfebaa29586f188b82d9ae9246cd3becdbc7096f40d3e3431d5e3bc18e405a0 WatchSource:0}: Error finding container ecfebaa29586f188b82d9ae9246cd3becdbc7096f40d3e3431d5e3bc18e405a0: Status 404 returned error can't find the container with id ecfebaa29586f188b82d9ae9246cd3becdbc7096f40d3e3431d5e3bc18e405a0
Apr 22 19:08:42.718912 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.718881 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k6rkf"]
Apr 22 19:08:42.723970 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:08:42.723940 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5676ee9e_cd52_496c_a3cc_32f120c108d4.slice/crio-4700dd5f9540d1d88ba98e306afee7b86c43cfa31456733184b87b87bb9ddee7 WatchSource:0}: Error finding container 4700dd5f9540d1d88ba98e306afee7b86c43cfa31456733184b87b87bb9ddee7: Status 404 returned error can't find the container with id 4700dd5f9540d1d88ba98e306afee7b86c43cfa31456733184b87b87bb9ddee7
Apr 22 19:08:42.737102 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.737046 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-w489z"]
Apr 22 19:08:42.740204 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:08:42.740174 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0886168b_fb42_4ca3_81f5_2dabb41537e9.slice/crio-72765404a599f4ade91399d9895cbbb133cbf2548ebb4a0914ff9c89b71e3105 WatchSource:0}: Error finding container 72765404a599f4ade91399d9895cbbb133cbf2548ebb4a0914ff9c89b71e3105: Status 404 returned error can't find the container with id 72765404a599f4ade91399d9895cbbb133cbf2548ebb4a0914ff9c89b71e3105
Apr 22 19:08:42.755842 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:42.755816 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qhjdq"]
Apr 22 19:08:42.759950 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:08:42.759927 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64cf0725_ca9c_41d6_a4dd_8d9f1b74ed71.slice/crio-f60199a7189e44b8c36fabe0dfddb7af8a0ff7956f857dc4b0a277c0dc5627c0 WatchSource:0}: Error finding container f60199a7189e44b8c36fabe0dfddb7af8a0ff7956f857dc4b0a277c0dc5627c0: Status 404 returned error can't find the container with id f60199a7189e44b8c36fabe0dfddb7af8a0ff7956f857dc4b0a277c0dc5627c0
Apr 22 19:08:43.004843 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:43.004732 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-registry-tls\") pod \"image-registry-5cdfc8c88c-qgsqc\" (UID: \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\") " pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc"
Apr 22 19:08:43.005183 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:08:43.004903 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 19:08:43.005183 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:08:43.004924 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5cdfc8c88c-qgsqc: secret "image-registry-tls" not found
Apr 22 19:08:43.005183 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:08:43.004983 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-registry-tls podName:732f281f-7aeb-4e6e-b46f-a1b52deacf5e nodeName:}" failed. No retries permitted until 2026-04-22 19:08:44.004965072 +0000 UTC m=+132.370379892 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-registry-tls") pod "image-registry-5cdfc8c88c-qgsqc" (UID: "732f281f-7aeb-4e6e-b46f-a1b52deacf5e") : secret "image-registry-tls" not found
Apr 22 19:08:43.566497 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:43.566458 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k6rkf" event={"ID":"5676ee9e-cd52-496c-a3cc-32f120c108d4","Type":"ContainerStarted","Data":"4700dd5f9540d1d88ba98e306afee7b86c43cfa31456733184b87b87bb9ddee7"}
Apr 22 19:08:43.568667 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:43.568632 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qhjdq" event={"ID":"64cf0725-ca9c-41d6-a4dd-8d9f1b74ed71","Type":"ContainerStarted","Data":"f60199a7189e44b8c36fabe0dfddb7af8a0ff7956f857dc4b0a277c0dc5627c0"}
Apr 22 19:08:43.570093 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:43.570056 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-fl4n2" event={"ID":"c9d5594b-5dc2-461d-bd58-496386ced33b","Type":"ContainerStarted","Data":"ecfebaa29586f188b82d9ae9246cd3becdbc7096f40d3e3431d5e3bc18e405a0"}
Apr 22 19:08:43.571623 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:43.571595 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-w489z" event={"ID":"0886168b-fb42-4ca3-81f5-2dabb41537e9","Type":"ContainerStarted","Data":"72765404a599f4ade91399d9895cbbb133cbf2548ebb4a0914ff9c89b71e3105"}
Apr 22 19:08:44.013654 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:44.013525 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-registry-tls\") pod \"image-registry-5cdfc8c88c-qgsqc\" (UID: \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\") " pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc"
Apr 22 19:08:44.014089 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:08:44.013793 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 19:08:44.014089 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:08:44.013813 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5cdfc8c88c-qgsqc: secret "image-registry-tls" not found
Apr 22 19:08:44.014089 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:08:44.013872 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-registry-tls podName:732f281f-7aeb-4e6e-b46f-a1b52deacf5e nodeName:}" failed. No retries permitted until 2026-04-22 19:08:46.013853573 +0000 UTC m=+134.379268402 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-registry-tls") pod "image-registry-5cdfc8c88c-qgsqc" (UID: "732f281f-7aeb-4e6e-b46f-a1b52deacf5e") : secret "image-registry-tls" not found
Apr 22 19:08:46.032686 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:46.032649 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-registry-tls\") pod \"image-registry-5cdfc8c88c-qgsqc\" (UID: \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\") " pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc"
Apr 22 19:08:46.033066 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:08:46.032782 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 19:08:46.033066 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:08:46.032795 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5cdfc8c88c-qgsqc: secret "image-registry-tls" not found
Apr 22 19:08:46.033066 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:08:46.032846 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-registry-tls podName:732f281f-7aeb-4e6e-b46f-a1b52deacf5e nodeName:}" failed. No retries permitted until 2026-04-22 19:08:50.032830356 +0000 UTC m=+138.398245181 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-registry-tls") pod "image-registry-5cdfc8c88c-qgsqc" (UID: "732f281f-7aeb-4e6e-b46f-a1b52deacf5e") : secret "image-registry-tls" not found
Apr 22 19:08:46.579839 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:46.579803 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-w489z" event={"ID":"0886168b-fb42-4ca3-81f5-2dabb41537e9","Type":"ContainerStarted","Data":"5bcb243678b43e74160ccf21b44ab0c6b68a217558d88dd65380603a029cb31c"}
Apr 22 19:08:46.581117 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:46.581085 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k6rkf" event={"ID":"5676ee9e-cd52-496c-a3cc-32f120c108d4","Type":"ContainerStarted","Data":"02f9f0cfb358771fa22690438d84c815450a16728bf441b531f0b9b933107691"}
Apr 22 19:08:46.582435 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:46.582410 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qhjdq" event={"ID":"64cf0725-ca9c-41d6-a4dd-8d9f1b74ed71","Type":"ContainerStarted","Data":"c5b5f5e696cd81b68240ce5b4d5b91884616a1226909ced520533e99358b6cbc"}
Apr 22 19:08:46.584046 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:46.584023 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fl4n2_c9d5594b-5dc2-461d-bd58-496386ced33b/console-operator/0.log"
Apr 22 19:08:46.584155 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:46.584064 2569 generic.go:358] "Generic (PLEG): container finished" podID="c9d5594b-5dc2-461d-bd58-496386ced33b" containerID="6c7eb90d168b89fe276f75d49a5008062cd64069912c594c028484acc8c228f8" exitCode=255
Apr 22 19:08:46.584155 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:46.584112 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-fl4n2" event={"ID":"c9d5594b-5dc2-461d-bd58-496386ced33b","Type":"ContainerDied","Data":"6c7eb90d168b89fe276f75d49a5008062cd64069912c594c028484acc8c228f8"}
Apr 22 19:08:46.584357 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:46.584340 2569 scope.go:117] "RemoveContainer" containerID="6c7eb90d168b89fe276f75d49a5008062cd64069912c594c028484acc8c228f8"
Apr 22 19:08:46.599512 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:46.599461 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-w489z" podStartSLOduration=1.246931366 podStartE2EDuration="4.599445217s" podCreationTimestamp="2026-04-22 19:08:42 +0000 UTC" firstStartedPulling="2026-04-22 19:08:42.74202261 +0000 UTC m=+131.107437427" lastFinishedPulling="2026-04-22 19:08:46.094536462 +0000 UTC m=+134.459951278" observedRunningTime="2026-04-22 19:08:46.598980899 +0000 UTC m=+134.964395739" watchObservedRunningTime="2026-04-22 19:08:46.599445217 +0000 UTC m=+134.964860054"
Apr 22 19:08:46.620944 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:46.620299 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qhjdq" podStartSLOduration=1.281400953 podStartE2EDuration="4.620279837s" podCreationTimestamp="2026-04-22 19:08:42 +0000 UTC" firstStartedPulling="2026-04-22 19:08:42.761628989 +0000 UTC m=+131.127043803" lastFinishedPulling="2026-04-22 19:08:46.100507868 +0000 UTC m=+134.465922687" observedRunningTime="2026-04-22 19:08:46.619309658 +0000 UTC m=+134.984724498" watchObservedRunningTime="2026-04-22 19:08:46.620279837 +0000 UTC m=+134.985694676"
Apr 22 19:08:46.642364 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:46.642315 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k6rkf" podStartSLOduration=1.272505709 podStartE2EDuration="4.642300579s" podCreationTimestamp="2026-04-22 19:08:42 +0000 UTC" firstStartedPulling="2026-04-22 19:08:42.727479368 +0000 UTC m=+131.092894187" lastFinishedPulling="2026-04-22 19:08:46.097274242 +0000 UTC m=+134.462689057" observedRunningTime="2026-04-22 19:08:46.640890086 +0000 UTC m=+135.006304934" watchObservedRunningTime="2026-04-22 19:08:46.642300579 +0000 UTC m=+135.007715448"
Apr 22 19:08:47.589709 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:47.589684 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fl4n2_c9d5594b-5dc2-461d-bd58-496386ced33b/console-operator/1.log"
Apr 22 19:08:47.590199 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:47.590123 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fl4n2_c9d5594b-5dc2-461d-bd58-496386ced33b/console-operator/0.log"
Apr 22 19:08:47.590199 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:47.590155 2569 generic.go:358] "Generic (PLEG): container finished" podID="c9d5594b-5dc2-461d-bd58-496386ced33b" containerID="b6b74bb3073921af6bfba3b7ac331560330f40eae01f872292ec2e842aa5f2a4" exitCode=255
Apr 22 19:08:47.590199 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:47.590186 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-fl4n2" event={"ID":"c9d5594b-5dc2-461d-bd58-496386ced33b","Type":"ContainerDied","Data":"b6b74bb3073921af6bfba3b7ac331560330f40eae01f872292ec2e842aa5f2a4"}
Apr 22 19:08:47.590366 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:47.590243 2569 scope.go:117] "RemoveContainer" containerID="6c7eb90d168b89fe276f75d49a5008062cd64069912c594c028484acc8c228f8"
Apr 22 19:08:47.590528 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:47.590500 2569 scope.go:117] "RemoveContainer" containerID="b6b74bb3073921af6bfba3b7ac331560330f40eae01f872292ec2e842aa5f2a4"
Apr 22 19:08:47.590740 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:08:47.590717 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-fl4n2_openshift-console-operator(c9d5594b-5dc2-461d-bd58-496386ced33b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-fl4n2" podUID="c9d5594b-5dc2-461d-bd58-496386ced33b"
Apr 22 19:08:48.594124 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:48.594094 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fl4n2_c9d5594b-5dc2-461d-bd58-496386ced33b/console-operator/1.log"
Apr 22 19:08:48.594483 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:48.594435 2569 scope.go:117] "RemoveContainer" containerID="b6b74bb3073921af6bfba3b7ac331560330f40eae01f872292ec2e842aa5f2a4"
Apr 22 19:08:48.594621 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:08:48.594603 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-fl4n2_openshift-console-operator(c9d5594b-5dc2-461d-bd58-496386ced33b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-fl4n2" podUID="c9d5594b-5dc2-461d-bd58-496386ced33b"
Apr 22 19:08:48.932911 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:48.932882 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-2wjct_2f15bc14-85c1-4370-8e8c-dfc474a5636b/dns-node-resolver/0.log"
Apr 22 19:08:49.933159 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:49.933132 2569 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-image-registry_node-ca-hdwcg_8ab2b075-d3d5-4d3a-848e-89344c4f11b6/node-ca/0.log" Apr 22 19:08:50.066340 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:50.066295 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-registry-tls\") pod \"image-registry-5cdfc8c88c-qgsqc\" (UID: \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\") " pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc" Apr 22 19:08:50.066483 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:08:50.066438 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:08:50.066483 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:08:50.066455 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5cdfc8c88c-qgsqc: secret "image-registry-tls" not found Apr 22 19:08:50.066562 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:08:50.066508 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-registry-tls podName:732f281f-7aeb-4e6e-b46f-a1b52deacf5e nodeName:}" failed. No retries permitted until 2026-04-22 19:08:58.066493808 +0000 UTC m=+146.431908621 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-registry-tls") pod "image-registry-5cdfc8c88c-qgsqc" (UID: "732f281f-7aeb-4e6e-b46f-a1b52deacf5e") : secret "image-registry-tls" not found Apr 22 19:08:52.481786 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:52.481735 2569 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-fl4n2" Apr 22 19:08:52.481786 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:52.481792 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-fl4n2" Apr 22 19:08:52.482283 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:52.482201 2569 scope.go:117] "RemoveContainer" containerID="b6b74bb3073921af6bfba3b7ac331560330f40eae01f872292ec2e842aa5f2a4" Apr 22 19:08:52.482412 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:08:52.482389 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-fl4n2_openshift-console-operator(c9d5594b-5dc2-461d-bd58-496386ced33b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-fl4n2" podUID="c9d5594b-5dc2-461d-bd58-496386ced33b" Apr 22 19:08:58.128577 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:58.128531 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-registry-tls\") pod \"image-registry-5cdfc8c88c-qgsqc\" (UID: \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\") " pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc" Apr 22 19:08:58.131092 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:58.131060 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"registry-tls\" (UniqueName: \"kubernetes.io/projected/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-registry-tls\") pod \"image-registry-5cdfc8c88c-qgsqc\" (UID: \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\") " pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc" Apr 22 19:08:58.196980 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:58.196956 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc" Apr 22 19:08:58.320358 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:58.320331 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5cdfc8c88c-qgsqc"] Apr 22 19:08:58.323406 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:08:58.323359 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod732f281f_7aeb_4e6e_b46f_a1b52deacf5e.slice/crio-554e25aee777ea173aa4e82962a181c7ddc8314ba11c59a7055c6afffad1460e WatchSource:0}: Error finding container 554e25aee777ea173aa4e82962a181c7ddc8314ba11c59a7055c6afffad1460e: Status 404 returned error can't find the container with id 554e25aee777ea173aa4e82962a181c7ddc8314ba11c59a7055c6afffad1460e Apr 22 19:08:58.622642 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:58.622610 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc" event={"ID":"732f281f-7aeb-4e6e-b46f-a1b52deacf5e","Type":"ContainerStarted","Data":"198cfce2242d90f750b81b48ac7d48cfd441d00d0d60b9f5edf02f900f838978"} Apr 22 19:08:58.622642 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:58.622648 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc" event={"ID":"732f281f-7aeb-4e6e-b46f-a1b52deacf5e","Type":"ContainerStarted","Data":"554e25aee777ea173aa4e82962a181c7ddc8314ba11c59a7055c6afffad1460e"} Apr 22 19:08:58.622876 ip-10-0-134-22 kubenswrapper[2569]: 
I0422 19:08:58.622781 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc" Apr 22 19:08:58.642544 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:08:58.642455 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc" podStartSLOduration=16.642439853 podStartE2EDuration="16.642439853s" podCreationTimestamp="2026-04-22 19:08:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:08:58.641813334 +0000 UTC m=+147.007228170" watchObservedRunningTime="2026-04-22 19:08:58.642439853 +0000 UTC m=+147.007854689" Apr 22 19:09:06.219525 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:06.219494 2569 scope.go:117] "RemoveContainer" containerID="b6b74bb3073921af6bfba3b7ac331560330f40eae01f872292ec2e842aa5f2a4" Apr 22 19:09:06.642581 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:06.642506 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fl4n2_c9d5594b-5dc2-461d-bd58-496386ced33b/console-operator/2.log" Apr 22 19:09:06.642893 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:06.642877 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fl4n2_c9d5594b-5dc2-461d-bd58-496386ced33b/console-operator/1.log" Apr 22 19:09:06.642942 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:06.642914 2569 generic.go:358] "Generic (PLEG): container finished" podID="c9d5594b-5dc2-461d-bd58-496386ced33b" containerID="b6db44c319a7c13ca181a4786f33a932cf93b696963dbc4b4af2f372d2fa329e" exitCode=255 Apr 22 19:09:06.642977 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:06.642957 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-fl4n2" 
event={"ID":"c9d5594b-5dc2-461d-bd58-496386ced33b","Type":"ContainerDied","Data":"b6db44c319a7c13ca181a4786f33a932cf93b696963dbc4b4af2f372d2fa329e"} Apr 22 19:09:06.643008 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:06.642988 2569 scope.go:117] "RemoveContainer" containerID="b6b74bb3073921af6bfba3b7ac331560330f40eae01f872292ec2e842aa5f2a4" Apr 22 19:09:06.643323 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:06.643304 2569 scope.go:117] "RemoveContainer" containerID="b6db44c319a7c13ca181a4786f33a932cf93b696963dbc4b4af2f372d2fa329e" Apr 22 19:09:06.643508 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:09:06.643486 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-fl4n2_openshift-console-operator(c9d5594b-5dc2-461d-bd58-496386ced33b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-fl4n2" podUID="c9d5594b-5dc2-461d-bd58-496386ced33b" Apr 22 19:09:07.647070 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:07.647043 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fl4n2_c9d5594b-5dc2-461d-bd58-496386ced33b/console-operator/2.log" Apr 22 19:09:08.463913 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:09:08.463858 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-lw7xs" podUID="86f8ea02-993b-4c12-b611-355d4b6cd91c" Apr 22 19:09:08.487041 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:09:08.487001 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-b5ggr" podUID="21e125c2-4036-4304-91d4-c0370710d4af" Apr 22 
19:09:08.651329 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:08.651292 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lw7xs" Apr 22 19:09:09.119978 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.119943 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-mj44z"] Apr 22 19:09:09.124197 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.124179 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-mj44z" Apr 22 19:09:09.127651 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.127630 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 19:09:09.129608 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.129590 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5cdfc8c88c-qgsqc"] Apr 22 19:09:09.131682 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.131663 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-vg9n6\"" Apr 22 19:09:09.132226 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.132212 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 19:09:09.143788 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.143741 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-mj44z"] Apr 22 19:09:09.170050 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.170023 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6d6f7d67b8-qg2xj"] Apr 22 19:09:09.173105 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.173077 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6d6f7d67b8-qg2xj" Apr 22 19:09:09.190793 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.190726 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6d6f7d67b8-qg2xj"] Apr 22 19:09:09.312069 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.312025 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a53b0aba-7b03-4959-8f7f-085132bd83fa-installation-pull-secrets\") pod \"image-registry-6d6f7d67b8-qg2xj\" (UID: \"a53b0aba-7b03-4959-8f7f-085132bd83fa\") " pod="openshift-image-registry/image-registry-6d6f7d67b8-qg2xj" Apr 22 19:09:09.312069 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.312068 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hkkg\" (UniqueName: \"kubernetes.io/projected/a53b0aba-7b03-4959-8f7f-085132bd83fa-kube-api-access-5hkkg\") pod \"image-registry-6d6f7d67b8-qg2xj\" (UID: \"a53b0aba-7b03-4959-8f7f-085132bd83fa\") " pod="openshift-image-registry/image-registry-6d6f7d67b8-qg2xj" Apr 22 19:09:09.312258 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.312097 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsp6s\" (UniqueName: \"kubernetes.io/projected/926eba52-2ec6-43ad-9b90-958efbe70d95-kube-api-access-jsp6s\") pod \"insights-runtime-extractor-mj44z\" (UID: \"926eba52-2ec6-43ad-9b90-958efbe70d95\") " pod="openshift-insights/insights-runtime-extractor-mj44z" Apr 22 19:09:09.312258 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.312144 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/926eba52-2ec6-43ad-9b90-958efbe70d95-crio-socket\") pod 
\"insights-runtime-extractor-mj44z\" (UID: \"926eba52-2ec6-43ad-9b90-958efbe70d95\") " pod="openshift-insights/insights-runtime-extractor-mj44z" Apr 22 19:09:09.312258 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.312210 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a53b0aba-7b03-4959-8f7f-085132bd83fa-registry-certificates\") pod \"image-registry-6d6f7d67b8-qg2xj\" (UID: \"a53b0aba-7b03-4959-8f7f-085132bd83fa\") " pod="openshift-image-registry/image-registry-6d6f7d67b8-qg2xj" Apr 22 19:09:09.312258 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.312245 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/926eba52-2ec6-43ad-9b90-958efbe70d95-data-volume\") pod \"insights-runtime-extractor-mj44z\" (UID: \"926eba52-2ec6-43ad-9b90-958efbe70d95\") " pod="openshift-insights/insights-runtime-extractor-mj44z" Apr 22 19:09:09.312418 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.312265 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/926eba52-2ec6-43ad-9b90-958efbe70d95-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mj44z\" (UID: \"926eba52-2ec6-43ad-9b90-958efbe70d95\") " pod="openshift-insights/insights-runtime-extractor-mj44z" Apr 22 19:09:09.312418 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.312290 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a53b0aba-7b03-4959-8f7f-085132bd83fa-trusted-ca\") pod \"image-registry-6d6f7d67b8-qg2xj\" (UID: \"a53b0aba-7b03-4959-8f7f-085132bd83fa\") " pod="openshift-image-registry/image-registry-6d6f7d67b8-qg2xj" Apr 22 19:09:09.312418 ip-10-0-134-22 
kubenswrapper[2569]: I0422 19:09:09.312322 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a53b0aba-7b03-4959-8f7f-085132bd83fa-image-registry-private-configuration\") pod \"image-registry-6d6f7d67b8-qg2xj\" (UID: \"a53b0aba-7b03-4959-8f7f-085132bd83fa\") " pod="openshift-image-registry/image-registry-6d6f7d67b8-qg2xj" Apr 22 19:09:09.312418 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.312364 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a53b0aba-7b03-4959-8f7f-085132bd83fa-registry-tls\") pod \"image-registry-6d6f7d67b8-qg2xj\" (UID: \"a53b0aba-7b03-4959-8f7f-085132bd83fa\") " pod="openshift-image-registry/image-registry-6d6f7d67b8-qg2xj" Apr 22 19:09:09.312418 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.312409 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/926eba52-2ec6-43ad-9b90-958efbe70d95-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-mj44z\" (UID: \"926eba52-2ec6-43ad-9b90-958efbe70d95\") " pod="openshift-insights/insights-runtime-extractor-mj44z" Apr 22 19:09:09.312586 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.312441 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a53b0aba-7b03-4959-8f7f-085132bd83fa-ca-trust-extracted\") pod \"image-registry-6d6f7d67b8-qg2xj\" (UID: \"a53b0aba-7b03-4959-8f7f-085132bd83fa\") " pod="openshift-image-registry/image-registry-6d6f7d67b8-qg2xj" Apr 22 19:09:09.312586 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.312469 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/a53b0aba-7b03-4959-8f7f-085132bd83fa-bound-sa-token\") pod \"image-registry-6d6f7d67b8-qg2xj\" (UID: \"a53b0aba-7b03-4959-8f7f-085132bd83fa\") " pod="openshift-image-registry/image-registry-6d6f7d67b8-qg2xj" Apr 22 19:09:09.413056 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.413016 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/926eba52-2ec6-43ad-9b90-958efbe70d95-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mj44z\" (UID: \"926eba52-2ec6-43ad-9b90-958efbe70d95\") " pod="openshift-insights/insights-runtime-extractor-mj44z" Apr 22 19:09:09.413257 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.413071 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a53b0aba-7b03-4959-8f7f-085132bd83fa-trusted-ca\") pod \"image-registry-6d6f7d67b8-qg2xj\" (UID: \"a53b0aba-7b03-4959-8f7f-085132bd83fa\") " pod="openshift-image-registry/image-registry-6d6f7d67b8-qg2xj" Apr 22 19:09:09.413257 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.413098 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a53b0aba-7b03-4959-8f7f-085132bd83fa-image-registry-private-configuration\") pod \"image-registry-6d6f7d67b8-qg2xj\" (UID: \"a53b0aba-7b03-4959-8f7f-085132bd83fa\") " pod="openshift-image-registry/image-registry-6d6f7d67b8-qg2xj" Apr 22 19:09:09.413257 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.413123 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a53b0aba-7b03-4959-8f7f-085132bd83fa-registry-tls\") pod \"image-registry-6d6f7d67b8-qg2xj\" (UID: \"a53b0aba-7b03-4959-8f7f-085132bd83fa\") " 
pod="openshift-image-registry/image-registry-6d6f7d67b8-qg2xj" Apr 22 19:09:09.413257 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.413166 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/926eba52-2ec6-43ad-9b90-958efbe70d95-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-mj44z\" (UID: \"926eba52-2ec6-43ad-9b90-958efbe70d95\") " pod="openshift-insights/insights-runtime-extractor-mj44z" Apr 22 19:09:09.413257 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.413190 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a53b0aba-7b03-4959-8f7f-085132bd83fa-ca-trust-extracted\") pod \"image-registry-6d6f7d67b8-qg2xj\" (UID: \"a53b0aba-7b03-4959-8f7f-085132bd83fa\") " pod="openshift-image-registry/image-registry-6d6f7d67b8-qg2xj" Apr 22 19:09:09.413257 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.413227 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a53b0aba-7b03-4959-8f7f-085132bd83fa-bound-sa-token\") pod \"image-registry-6d6f7d67b8-qg2xj\" (UID: \"a53b0aba-7b03-4959-8f7f-085132bd83fa\") " pod="openshift-image-registry/image-registry-6d6f7d67b8-qg2xj" Apr 22 19:09:09.413257 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.413251 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a53b0aba-7b03-4959-8f7f-085132bd83fa-installation-pull-secrets\") pod \"image-registry-6d6f7d67b8-qg2xj\" (UID: \"a53b0aba-7b03-4959-8f7f-085132bd83fa\") " pod="openshift-image-registry/image-registry-6d6f7d67b8-qg2xj" Apr 22 19:09:09.413594 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.413276 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5hkkg\" 
(UniqueName: \"kubernetes.io/projected/a53b0aba-7b03-4959-8f7f-085132bd83fa-kube-api-access-5hkkg\") pod \"image-registry-6d6f7d67b8-qg2xj\" (UID: \"a53b0aba-7b03-4959-8f7f-085132bd83fa\") " pod="openshift-image-registry/image-registry-6d6f7d67b8-qg2xj" Apr 22 19:09:09.413594 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.413322 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jsp6s\" (UniqueName: \"kubernetes.io/projected/926eba52-2ec6-43ad-9b90-958efbe70d95-kube-api-access-jsp6s\") pod \"insights-runtime-extractor-mj44z\" (UID: \"926eba52-2ec6-43ad-9b90-958efbe70d95\") " pod="openshift-insights/insights-runtime-extractor-mj44z" Apr 22 19:09:09.413594 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.413360 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/926eba52-2ec6-43ad-9b90-958efbe70d95-crio-socket\") pod \"insights-runtime-extractor-mj44z\" (UID: \"926eba52-2ec6-43ad-9b90-958efbe70d95\") " pod="openshift-insights/insights-runtime-extractor-mj44z" Apr 22 19:09:09.413594 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.413410 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a53b0aba-7b03-4959-8f7f-085132bd83fa-registry-certificates\") pod \"image-registry-6d6f7d67b8-qg2xj\" (UID: \"a53b0aba-7b03-4959-8f7f-085132bd83fa\") " pod="openshift-image-registry/image-registry-6d6f7d67b8-qg2xj" Apr 22 19:09:09.413594 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.413445 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/926eba52-2ec6-43ad-9b90-958efbe70d95-data-volume\") pod \"insights-runtime-extractor-mj44z\" (UID: \"926eba52-2ec6-43ad-9b90-958efbe70d95\") " pod="openshift-insights/insights-runtime-extractor-mj44z" Apr 22 19:09:09.413868 
ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.413804 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/926eba52-2ec6-43ad-9b90-958efbe70d95-data-volume\") pod \"insights-runtime-extractor-mj44z\" (UID: \"926eba52-2ec6-43ad-9b90-958efbe70d95\") " pod="openshift-insights/insights-runtime-extractor-mj44z" Apr 22 19:09:09.413949 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.413924 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/926eba52-2ec6-43ad-9b90-958efbe70d95-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-mj44z\" (UID: \"926eba52-2ec6-43ad-9b90-958efbe70d95\") " pod="openshift-insights/insights-runtime-extractor-mj44z" Apr 22 19:09:09.414145 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.414077 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/926eba52-2ec6-43ad-9b90-958efbe70d95-crio-socket\") pod \"insights-runtime-extractor-mj44z\" (UID: \"926eba52-2ec6-43ad-9b90-958efbe70d95\") " pod="openshift-insights/insights-runtime-extractor-mj44z" Apr 22 19:09:09.414145 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.414101 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a53b0aba-7b03-4959-8f7f-085132bd83fa-ca-trust-extracted\") pod \"image-registry-6d6f7d67b8-qg2xj\" (UID: \"a53b0aba-7b03-4959-8f7f-085132bd83fa\") " pod="openshift-image-registry/image-registry-6d6f7d67b8-qg2xj" Apr 22 19:09:09.414396 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.414370 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a53b0aba-7b03-4959-8f7f-085132bd83fa-trusted-ca\") pod \"image-registry-6d6f7d67b8-qg2xj\" (UID: \"a53b0aba-7b03-4959-8f7f-085132bd83fa\") " 
pod="openshift-image-registry/image-registry-6d6f7d67b8-qg2xj" Apr 22 19:09:09.414581 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.414518 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a53b0aba-7b03-4959-8f7f-085132bd83fa-registry-certificates\") pod \"image-registry-6d6f7d67b8-qg2xj\" (UID: \"a53b0aba-7b03-4959-8f7f-085132bd83fa\") " pod="openshift-image-registry/image-registry-6d6f7d67b8-qg2xj" Apr 22 19:09:09.415898 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.415875 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a53b0aba-7b03-4959-8f7f-085132bd83fa-registry-tls\") pod \"image-registry-6d6f7d67b8-qg2xj\" (UID: \"a53b0aba-7b03-4959-8f7f-085132bd83fa\") " pod="openshift-image-registry/image-registry-6d6f7d67b8-qg2xj" Apr 22 19:09:09.415984 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.415883 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a53b0aba-7b03-4959-8f7f-085132bd83fa-image-registry-private-configuration\") pod \"image-registry-6d6f7d67b8-qg2xj\" (UID: \"a53b0aba-7b03-4959-8f7f-085132bd83fa\") " pod="openshift-image-registry/image-registry-6d6f7d67b8-qg2xj" Apr 22 19:09:09.416362 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.416338 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/926eba52-2ec6-43ad-9b90-958efbe70d95-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-mj44z\" (UID: \"926eba52-2ec6-43ad-9b90-958efbe70d95\") " pod="openshift-insights/insights-runtime-extractor-mj44z" Apr 22 19:09:09.416362 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.416347 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a53b0aba-7b03-4959-8f7f-085132bd83fa-installation-pull-secrets\") pod \"image-registry-6d6f7d67b8-qg2xj\" (UID: \"a53b0aba-7b03-4959-8f7f-085132bd83fa\") " pod="openshift-image-registry/image-registry-6d6f7d67b8-qg2xj" Apr 22 19:09:09.423139 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.423115 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsp6s\" (UniqueName: \"kubernetes.io/projected/926eba52-2ec6-43ad-9b90-958efbe70d95-kube-api-access-jsp6s\") pod \"insights-runtime-extractor-mj44z\" (UID: \"926eba52-2ec6-43ad-9b90-958efbe70d95\") " pod="openshift-insights/insights-runtime-extractor-mj44z" Apr 22 19:09:09.423596 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.423578 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hkkg\" (UniqueName: \"kubernetes.io/projected/a53b0aba-7b03-4959-8f7f-085132bd83fa-kube-api-access-5hkkg\") pod \"image-registry-6d6f7d67b8-qg2xj\" (UID: \"a53b0aba-7b03-4959-8f7f-085132bd83fa\") " pod="openshift-image-registry/image-registry-6d6f7d67b8-qg2xj" Apr 22 19:09:09.423911 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.423892 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a53b0aba-7b03-4959-8f7f-085132bd83fa-bound-sa-token\") pod \"image-registry-6d6f7d67b8-qg2xj\" (UID: \"a53b0aba-7b03-4959-8f7f-085132bd83fa\") " pod="openshift-image-registry/image-registry-6d6f7d67b8-qg2xj" Apr 22 19:09:09.434509 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.434484 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-mj44z" Apr 22 19:09:09.481550 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.481399 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6d6f7d67b8-qg2xj" Apr 22 19:09:09.573252 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.573219 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-mj44z"] Apr 22 19:09:09.609439 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.609413 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6d6f7d67b8-qg2xj"] Apr 22 19:09:09.612409 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:09:09.612387 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda53b0aba_7b03_4959_8f7f_085132bd83fa.slice/crio-918f0660e2232b775c84945739f12c68621708589411b489970d8914f3da6e67 WatchSource:0}: Error finding container 918f0660e2232b775c84945739f12c68621708589411b489970d8914f3da6e67: Status 404 returned error can't find the container with id 918f0660e2232b775c84945739f12c68621708589411b489970d8914f3da6e67 Apr 22 19:09:09.655518 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.655489 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mj44z" event={"ID":"926eba52-2ec6-43ad-9b90-958efbe70d95","Type":"ContainerStarted","Data":"7025ba1a74398a01714f356ba23921741a35941cfb4fe558e9d7840c8f2a0b66"} Apr 22 19:09:09.655914 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.655530 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mj44z" event={"ID":"926eba52-2ec6-43ad-9b90-958efbe70d95","Type":"ContainerStarted","Data":"39764e00fac761d0d61ca52aff8d9c490fe23b677fcf7fb4a06ee5411b789015"} Apr 22 19:09:09.656580 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:09.656546 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d6f7d67b8-qg2xj" 
event={"ID":"a53b0aba-7b03-4959-8f7f-085132bd83fa","Type":"ContainerStarted","Data":"918f0660e2232b775c84945739f12c68621708589411b489970d8914f3da6e67"} Apr 22 19:09:10.232339 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:09:10.232306 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-2r8qp" podUID="46a3468d-b017-471c-a0df-a07b1c183ff4" Apr 22 19:09:10.660003 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:10.659906 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d6f7d67b8-qg2xj" event={"ID":"a53b0aba-7b03-4959-8f7f-085132bd83fa","Type":"ContainerStarted","Data":"4f1c0e9551dcde3a281a9334cfa49710538d717119886b3a9725c50db5804936"} Apr 22 19:09:10.660365 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:10.660060 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6d6f7d67b8-qg2xj" Apr 22 19:09:10.661456 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:10.661432 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mj44z" event={"ID":"926eba52-2ec6-43ad-9b90-958efbe70d95","Type":"ContainerStarted","Data":"2d21bfff702d102d45590aee9d665427c64578d3e6187284689faa514177ba87"} Apr 22 19:09:10.681930 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:10.681880 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6d6f7d67b8-qg2xj" podStartSLOduration=1.6818682090000001 podStartE2EDuration="1.681868209s" podCreationTimestamp="2026-04-22 19:09:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:09:10.680602625 +0000 UTC m=+159.046017484" watchObservedRunningTime="2026-04-22 
19:09:10.681868209 +0000 UTC m=+159.047283044" Apr 22 19:09:12.481232 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:12.481195 2569 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-fl4n2" Apr 22 19:09:12.481232 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:12.481237 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-fl4n2" Apr 22 19:09:12.481638 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:12.481566 2569 scope.go:117] "RemoveContainer" containerID="b6db44c319a7c13ca181a4786f33a932cf93b696963dbc4b4af2f372d2fa329e" Apr 22 19:09:12.481741 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:09:12.481724 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-fl4n2_openshift-console-operator(c9d5594b-5dc2-461d-bd58-496386ced33b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-fl4n2" podUID="c9d5594b-5dc2-461d-bd58-496386ced33b" Apr 22 19:09:12.667800 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:12.667748 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-mj44z" event={"ID":"926eba52-2ec6-43ad-9b90-958efbe70d95","Type":"ContainerStarted","Data":"392695e1fb4f400bdd15a57a56788f4b6b5e542eb082e62864099a3fc0e72add"} Apr 22 19:09:12.689384 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:12.689339 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-mj44z" podStartSLOduration=1.428076326 podStartE2EDuration="3.689322299s" podCreationTimestamp="2026-04-22 19:09:09 +0000 UTC" firstStartedPulling="2026-04-22 19:09:09.631186858 +0000 UTC m=+157.996601688" lastFinishedPulling="2026-04-22 19:09:11.892432843 
+0000 UTC m=+160.257847661" observedRunningTime="2026-04-22 19:09:12.689108512 +0000 UTC m=+161.054523371" watchObservedRunningTime="2026-04-22 19:09:12.689322299 +0000 UTC m=+161.054737134" Apr 22 19:09:13.343167 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:13.343124 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86f8ea02-993b-4c12-b611-355d4b6cd91c-metrics-tls\") pod \"dns-default-lw7xs\" (UID: \"86f8ea02-993b-4c12-b611-355d4b6cd91c\") " pod="openshift-dns/dns-default-lw7xs" Apr 22 19:09:13.345522 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:13.345502 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86f8ea02-993b-4c12-b611-355d4b6cd91c-metrics-tls\") pod \"dns-default-lw7xs\" (UID: \"86f8ea02-993b-4c12-b611-355d4b6cd91c\") " pod="openshift-dns/dns-default-lw7xs" Apr 22 19:09:13.443882 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:13.443838 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21e125c2-4036-4304-91d4-c0370710d4af-cert\") pod \"ingress-canary-b5ggr\" (UID: \"21e125c2-4036-4304-91d4-c0370710d4af\") " pod="openshift-ingress-canary/ingress-canary-b5ggr" Apr 22 19:09:13.446318 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:13.446298 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/21e125c2-4036-4304-91d4-c0370710d4af-cert\") pod \"ingress-canary-b5ggr\" (UID: \"21e125c2-4036-4304-91d4-c0370710d4af\") " pod="openshift-ingress-canary/ingress-canary-b5ggr" Apr 22 19:09:13.455446 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:13.455417 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-2cl86\"" Apr 22 19:09:13.462848 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:13.462812 2569 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lw7xs" Apr 22 19:09:13.591004 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:13.590969 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lw7xs"] Apr 22 19:09:13.594035 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:09:13.593959 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86f8ea02_993b_4c12_b611_355d4b6cd91c.slice/crio-d25986925b13493939dbdb9fb615420581471cbf4f1604981c9dcc0e98bf54de WatchSource:0}: Error finding container d25986925b13493939dbdb9fb615420581471cbf4f1604981c9dcc0e98bf54de: Status 404 returned error can't find the container with id d25986925b13493939dbdb9fb615420581471cbf4f1604981c9dcc0e98bf54de Apr 22 19:09:13.670920 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:13.670875 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lw7xs" event={"ID":"86f8ea02-993b-4c12-b611-355d4b6cd91c","Type":"ContainerStarted","Data":"d25986925b13493939dbdb9fb615420581471cbf4f1604981c9dcc0e98bf54de"} Apr 22 19:09:15.679808 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:15.679773 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lw7xs" event={"ID":"86f8ea02-993b-4c12-b611-355d4b6cd91c","Type":"ContainerStarted","Data":"d0a1efd0adc4f5f046a2672dd8c4276e792216c1105f2c841f0140616ac3642f"} Apr 22 19:09:15.679808 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:15.679813 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lw7xs" event={"ID":"86f8ea02-993b-4c12-b611-355d4b6cd91c","Type":"ContainerStarted","Data":"bcee08254248d457e54c3b7151497624ab2c4bb8dde8cb9287558cc5c926ee59"} Apr 22 19:09:15.680291 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:15.679927 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-dns/dns-default-lw7xs" Apr 22 19:09:15.700152 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:15.700115 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-lw7xs" podStartSLOduration=128.965760413 podStartE2EDuration="2m10.700103501s" podCreationTimestamp="2026-04-22 19:07:05 +0000 UTC" firstStartedPulling="2026-04-22 19:09:13.596238351 +0000 UTC m=+161.961653172" lastFinishedPulling="2026-04-22 19:09:15.330581446 +0000 UTC m=+163.695996260" observedRunningTime="2026-04-22 19:09:15.699364454 +0000 UTC m=+164.064779290" watchObservedRunningTime="2026-04-22 19:09:15.700103501 +0000 UTC m=+164.065518365" Apr 22 19:09:19.134645 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:19.134608 2569 patch_prober.go:28] interesting pod/image-registry-5cdfc8c88c-qgsqc container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 19:09:19.135021 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:19.134664 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc" podUID="732f281f-7aeb-4e6e-b46f-a1b52deacf5e" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:09:20.088734 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:20.088705 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-84qx5"] Apr 22 19:09:20.101051 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:20.101020 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-84qx5"] Apr 22 19:09:20.101194 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:20.101154 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-84qx5" Apr 22 19:09:20.103578 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:20.103558 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 22 19:09:20.103903 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:20.103858 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 19:09:20.103998 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:20.103972 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 22 19:09:20.104062 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:20.103996 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 19:09:20.104866 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:20.104850 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-dflk4\"" Apr 22 19:09:20.104972 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:20.104958 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 19:09:20.189863 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:20.189835 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/139b6a93-7c35-47b0-a302-c247308fb15a-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-84qx5\" (UID: \"139b6a93-7c35-47b0-a302-c247308fb15a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-84qx5" Apr 22 19:09:20.190220 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:20.189872 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/139b6a93-7c35-47b0-a302-c247308fb15a-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-84qx5\" (UID: \"139b6a93-7c35-47b0-a302-c247308fb15a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-84qx5" Apr 22 19:09:20.190220 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:20.189897 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/139b6a93-7c35-47b0-a302-c247308fb15a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-84qx5\" (UID: \"139b6a93-7c35-47b0-a302-c247308fb15a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-84qx5" Apr 22 19:09:20.190220 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:20.189981 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sxc9\" (UniqueName: \"kubernetes.io/projected/139b6a93-7c35-47b0-a302-c247308fb15a-kube-api-access-2sxc9\") pod \"prometheus-operator-5676c8c784-84qx5\" (UID: \"139b6a93-7c35-47b0-a302-c247308fb15a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-84qx5" Apr 22 19:09:20.290853 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:20.290821 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/139b6a93-7c35-47b0-a302-c247308fb15a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-84qx5\" (UID: \"139b6a93-7c35-47b0-a302-c247308fb15a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-84qx5" Apr 22 19:09:20.290988 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:20.290871 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2sxc9\" (UniqueName: \"kubernetes.io/projected/139b6a93-7c35-47b0-a302-c247308fb15a-kube-api-access-2sxc9\") pod \"prometheus-operator-5676c8c784-84qx5\" (UID: \"139b6a93-7c35-47b0-a302-c247308fb15a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-84qx5" Apr 22 19:09:20.290988 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:20.290937 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/139b6a93-7c35-47b0-a302-c247308fb15a-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-84qx5\" (UID: \"139b6a93-7c35-47b0-a302-c247308fb15a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-84qx5" Apr 22 19:09:20.290988 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:20.290963 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/139b6a93-7c35-47b0-a302-c247308fb15a-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-84qx5\" (UID: \"139b6a93-7c35-47b0-a302-c247308fb15a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-84qx5" Apr 22 19:09:20.291174 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:09:20.291158 2569 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 22 19:09:20.291250 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:09:20.291238 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/139b6a93-7c35-47b0-a302-c247308fb15a-prometheus-operator-tls podName:139b6a93-7c35-47b0-a302-c247308fb15a nodeName:}" failed. No retries permitted until 2026-04-22 19:09:20.791218829 +0000 UTC m=+169.156633643 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/139b6a93-7c35-47b0-a302-c247308fb15a-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-84qx5" (UID: "139b6a93-7c35-47b0-a302-c247308fb15a") : secret "prometheus-operator-tls" not found Apr 22 19:09:20.291556 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:20.291538 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/139b6a93-7c35-47b0-a302-c247308fb15a-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-84qx5\" (UID: \"139b6a93-7c35-47b0-a302-c247308fb15a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-84qx5" Apr 22 19:09:20.302657 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:20.302637 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sxc9\" (UniqueName: \"kubernetes.io/projected/139b6a93-7c35-47b0-a302-c247308fb15a-kube-api-access-2sxc9\") pod \"prometheus-operator-5676c8c784-84qx5\" (UID: \"139b6a93-7c35-47b0-a302-c247308fb15a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-84qx5" Apr 22 19:09:20.304635 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:20.304618 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/139b6a93-7c35-47b0-a302-c247308fb15a-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-84qx5\" (UID: \"139b6a93-7c35-47b0-a302-c247308fb15a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-84qx5" Apr 22 19:09:20.796212 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:20.796137 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/139b6a93-7c35-47b0-a302-c247308fb15a-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-84qx5\" (UID: 
\"139b6a93-7c35-47b0-a302-c247308fb15a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-84qx5" Apr 22 19:09:20.798529 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:20.798499 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/139b6a93-7c35-47b0-a302-c247308fb15a-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-84qx5\" (UID: \"139b6a93-7c35-47b0-a302-c247308fb15a\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-84qx5" Apr 22 19:09:21.011066 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:21.011033 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-84qx5" Apr 22 19:09:21.141504 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:21.141469 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-84qx5"] Apr 22 19:09:21.145225 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:09:21.145198 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod139b6a93_7c35_47b0_a302_c247308fb15a.slice/crio-456d90446af672f973c483c628502c20502c7190e74617f5d4cf06634ffaab1b WatchSource:0}: Error finding container 456d90446af672f973c483c628502c20502c7190e74617f5d4cf06634ffaab1b: Status 404 returned error can't find the container with id 456d90446af672f973c483c628502c20502c7190e74617f5d4cf06634ffaab1b Apr 22 19:09:21.694535 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:21.694502 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-84qx5" event={"ID":"139b6a93-7c35-47b0-a302-c247308fb15a","Type":"ContainerStarted","Data":"456d90446af672f973c483c628502c20502c7190e74617f5d4cf06634ffaab1b"} Apr 22 19:09:22.221322 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:22.221290 2569 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b5ggr" Apr 22 19:09:22.224361 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:22.224340 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-lr5f4\"" Apr 22 19:09:22.232671 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:22.232649 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-b5ggr" Apr 22 19:09:22.481819 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:22.481795 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-b5ggr"] Apr 22 19:09:22.484705 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:09:22.484683 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21e125c2_4036_4304_91d4_c0370710d4af.slice/crio-3db7e67fce49b1101bb851b2fa3313b97cc46b602095c69fa0b37de700cf2437 WatchSource:0}: Error finding container 3db7e67fce49b1101bb851b2fa3313b97cc46b602095c69fa0b37de700cf2437: Status 404 returned error can't find the container with id 3db7e67fce49b1101bb851b2fa3313b97cc46b602095c69fa0b37de700cf2437 Apr 22 19:09:22.698670 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:22.698631 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-84qx5" event={"ID":"139b6a93-7c35-47b0-a302-c247308fb15a","Type":"ContainerStarted","Data":"cbe3c3ac63d917b977a3a6704bc6dfa90c7e62917535243e593dbecb272c0576"} Apr 22 19:09:22.698670 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:22.698670 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-84qx5" event={"ID":"139b6a93-7c35-47b0-a302-c247308fb15a","Type":"ContainerStarted","Data":"847e01529d61146dcce2b1d6a572ea448e608f05fc46c0ddb39312af3b13364e"} Apr 22 19:09:22.699613 
ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:22.699589 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-b5ggr" event={"ID":"21e125c2-4036-4304-91d4-c0370710d4af","Type":"ContainerStarted","Data":"3db7e67fce49b1101bb851b2fa3313b97cc46b602095c69fa0b37de700cf2437"} Apr 22 19:09:22.722914 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:22.722870 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-84qx5" podStartSLOduration=1.454634423 podStartE2EDuration="2.722858585s" podCreationTimestamp="2026-04-22 19:09:20 +0000 UTC" firstStartedPulling="2026-04-22 19:09:21.147121043 +0000 UTC m=+169.512535857" lastFinishedPulling="2026-04-22 19:09:22.415345191 +0000 UTC m=+170.780760019" observedRunningTime="2026-04-22 19:09:22.722053955 +0000 UTC m=+171.087468789" watchObservedRunningTime="2026-04-22 19:09:22.722858585 +0000 UTC m=+171.088273420" Apr 22 19:09:23.704726 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:23.704692 2569 generic.go:358] "Generic (PLEG): container finished" podID="c73375b5-d6be-48ff-ac60-7b2e0cc9ef1d" containerID="5dc92bd2a933b842f7bf67cf70ed541ecea32b137f0d60770bbcfee916a86f3c" exitCode=255 Apr 22 19:09:23.705143 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:23.704770 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-788dd9b5f-cg7md" event={"ID":"c73375b5-d6be-48ff-ac60-7b2e0cc9ef1d","Type":"ContainerDied","Data":"5dc92bd2a933b842f7bf67cf70ed541ecea32b137f0d60770bbcfee916a86f3c"} Apr 22 19:09:23.713305 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:23.713283 2569 scope.go:117] "RemoveContainer" containerID="5dc92bd2a933b842f7bf67cf70ed541ecea32b137f0d60770bbcfee916a86f3c" Apr 22 19:09:24.219170 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:24.219146 2569 scope.go:117] "RemoveContainer" 
containerID="b6db44c319a7c13ca181a4786f33a932cf93b696963dbc4b4af2f372d2fa329e" Apr 22 19:09:24.219380 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:09:24.219358 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-fl4n2_openshift-console-operator(c9d5594b-5dc2-461d-bd58-496386ced33b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-fl4n2" podUID="c9d5594b-5dc2-461d-bd58-496386ced33b" Apr 22 19:09:24.709254 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:24.709213 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-b5ggr" event={"ID":"21e125c2-4036-4304-91d4-c0370710d4af","Type":"ContainerStarted","Data":"071a24be90546df4389c7b17399eb2612a04c24c9d2b45c8e7e3beb43b30473f"} Apr 22 19:09:24.710784 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:24.710741 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-788dd9b5f-cg7md" event={"ID":"c73375b5-d6be-48ff-ac60-7b2e0cc9ef1d","Type":"ContainerStarted","Data":"92a8aa0d7c2fa85650c139f222b90836ad0f491390a1ec9ee3c1f307b26d7ce0"} Apr 22 19:09:24.727735 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:24.727689 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-b5ggr" podStartSLOduration=138.015173469 podStartE2EDuration="2m19.727676509s" podCreationTimestamp="2026-04-22 19:07:05 +0000 UTC" firstStartedPulling="2026-04-22 19:09:22.487071102 +0000 UTC m=+170.852485931" lastFinishedPulling="2026-04-22 19:09:24.199574154 +0000 UTC m=+172.564988971" observedRunningTime="2026-04-22 19:09:24.726378335 +0000 UTC m=+173.091793171" watchObservedRunningTime="2026-04-22 19:09:24.727676509 +0000 UTC m=+173.093091345" Apr 22 19:09:25.218938 ip-10-0-134-22 
kubenswrapper[2569]: I0422 19:09:25.218902 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2r8qp" Apr 22 19:09:25.684747 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:25.684720 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-lw7xs" Apr 22 19:09:28.664234 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:28.664203 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-zr94k"] Apr 22 19:09:28.667592 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:28.667568 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-zr94k" Apr 22 19:09:28.670943 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:28.670918 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 19:09:28.671093 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:28.671075 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-mncrp\"" Apr 22 19:09:28.671239 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:28.671222 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 19:09:28.671325 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:28.671309 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 19:09:28.755048 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:28.755016 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/dd8a63da-f501-4aa6-b5a6-6fa86c970f57-node-exporter-tls\") pod \"node-exporter-zr94k\" (UID: 
\"dd8a63da-f501-4aa6-b5a6-6fa86c970f57\") " pod="openshift-monitoring/node-exporter-zr94k" Apr 22 19:09:28.755048 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:28.755054 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/dd8a63da-f501-4aa6-b5a6-6fa86c970f57-root\") pod \"node-exporter-zr94k\" (UID: \"dd8a63da-f501-4aa6-b5a6-6fa86c970f57\") " pod="openshift-monitoring/node-exporter-zr94k" Apr 22 19:09:28.755278 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:28.755087 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/dd8a63da-f501-4aa6-b5a6-6fa86c970f57-node-exporter-textfile\") pod \"node-exporter-zr94k\" (UID: \"dd8a63da-f501-4aa6-b5a6-6fa86c970f57\") " pod="openshift-monitoring/node-exporter-zr94k" Apr 22 19:09:28.755278 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:28.755109 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/dd8a63da-f501-4aa6-b5a6-6fa86c970f57-node-exporter-accelerators-collector-config\") pod \"node-exporter-zr94k\" (UID: \"dd8a63da-f501-4aa6-b5a6-6fa86c970f57\") " pod="openshift-monitoring/node-exporter-zr94k" Apr 22 19:09:28.755278 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:28.755127 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dd8a63da-f501-4aa6-b5a6-6fa86c970f57-sys\") pod \"node-exporter-zr94k\" (UID: \"dd8a63da-f501-4aa6-b5a6-6fa86c970f57\") " pod="openshift-monitoring/node-exporter-zr94k" Apr 22 19:09:28.755278 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:28.755158 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd8a63da-f501-4aa6-b5a6-6fa86c970f57-metrics-client-ca\") pod \"node-exporter-zr94k\" (UID: \"dd8a63da-f501-4aa6-b5a6-6fa86c970f57\") " pod="openshift-monitoring/node-exporter-zr94k" Apr 22 19:09:28.755278 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:28.755240 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dd8a63da-f501-4aa6-b5a6-6fa86c970f57-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zr94k\" (UID: \"dd8a63da-f501-4aa6-b5a6-6fa86c970f57\") " pod="openshift-monitoring/node-exporter-zr94k" Apr 22 19:09:28.755451 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:28.755280 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/dd8a63da-f501-4aa6-b5a6-6fa86c970f57-node-exporter-wtmp\") pod \"node-exporter-zr94k\" (UID: \"dd8a63da-f501-4aa6-b5a6-6fa86c970f57\") " pod="openshift-monitoring/node-exporter-zr94k" Apr 22 19:09:28.755451 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:28.755358 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlm99\" (UniqueName: \"kubernetes.io/projected/dd8a63da-f501-4aa6-b5a6-6fa86c970f57-kube-api-access-rlm99\") pod \"node-exporter-zr94k\" (UID: \"dd8a63da-f501-4aa6-b5a6-6fa86c970f57\") " pod="openshift-monitoring/node-exporter-zr94k" Apr 22 19:09:28.856062 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:28.856015 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rlm99\" (UniqueName: \"kubernetes.io/projected/dd8a63da-f501-4aa6-b5a6-6fa86c970f57-kube-api-access-rlm99\") pod \"node-exporter-zr94k\" (UID: \"dd8a63da-f501-4aa6-b5a6-6fa86c970f57\") " pod="openshift-monitoring/node-exporter-zr94k" Apr 22 
19:09:28.856222 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:28.856080 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/dd8a63da-f501-4aa6-b5a6-6fa86c970f57-node-exporter-tls\") pod \"node-exporter-zr94k\" (UID: \"dd8a63da-f501-4aa6-b5a6-6fa86c970f57\") " pod="openshift-monitoring/node-exporter-zr94k" Apr 22 19:09:28.856222 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:28.856114 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/dd8a63da-f501-4aa6-b5a6-6fa86c970f57-root\") pod \"node-exporter-zr94k\" (UID: \"dd8a63da-f501-4aa6-b5a6-6fa86c970f57\") " pod="openshift-monitoring/node-exporter-zr94k" Apr 22 19:09:28.856222 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:28.856143 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/dd8a63da-f501-4aa6-b5a6-6fa86c970f57-node-exporter-textfile\") pod \"node-exporter-zr94k\" (UID: \"dd8a63da-f501-4aa6-b5a6-6fa86c970f57\") " pod="openshift-monitoring/node-exporter-zr94k" Apr 22 19:09:28.856222 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:28.856170 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/dd8a63da-f501-4aa6-b5a6-6fa86c970f57-node-exporter-accelerators-collector-config\") pod \"node-exporter-zr94k\" (UID: \"dd8a63da-f501-4aa6-b5a6-6fa86c970f57\") " pod="openshift-monitoring/node-exporter-zr94k" Apr 22 19:09:28.856393 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:28.856238 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/dd8a63da-f501-4aa6-b5a6-6fa86c970f57-root\") pod \"node-exporter-zr94k\" (UID: \"dd8a63da-f501-4aa6-b5a6-6fa86c970f57\") " 
pod="openshift-monitoring/node-exporter-zr94k" Apr 22 19:09:28.856393 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:28.856304 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dd8a63da-f501-4aa6-b5a6-6fa86c970f57-sys\") pod \"node-exporter-zr94k\" (UID: \"dd8a63da-f501-4aa6-b5a6-6fa86c970f57\") " pod="openshift-monitoring/node-exporter-zr94k" Apr 22 19:09:28.856393 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:28.856348 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd8a63da-f501-4aa6-b5a6-6fa86c970f57-metrics-client-ca\") pod \"node-exporter-zr94k\" (UID: \"dd8a63da-f501-4aa6-b5a6-6fa86c970f57\") " pod="openshift-monitoring/node-exporter-zr94k" Apr 22 19:09:28.856393 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:09:28.856358 2569 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 19:09:28.856393 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:28.856386 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dd8a63da-f501-4aa6-b5a6-6fa86c970f57-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zr94k\" (UID: \"dd8a63da-f501-4aa6-b5a6-6fa86c970f57\") " pod="openshift-monitoring/node-exporter-zr94k" Apr 22 19:09:28.856619 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:09:28.856418 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd8a63da-f501-4aa6-b5a6-6fa86c970f57-node-exporter-tls podName:dd8a63da-f501-4aa6-b5a6-6fa86c970f57 nodeName:}" failed. No retries permitted until 2026-04-22 19:09:29.35639552 +0000 UTC m=+177.721810333 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/dd8a63da-f501-4aa6-b5a6-6fa86c970f57-node-exporter-tls") pod "node-exporter-zr94k" (UID: "dd8a63da-f501-4aa6-b5a6-6fa86c970f57") : secret "node-exporter-tls" not found Apr 22 19:09:28.856619 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:28.856419 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/dd8a63da-f501-4aa6-b5a6-6fa86c970f57-sys\") pod \"node-exporter-zr94k\" (UID: \"dd8a63da-f501-4aa6-b5a6-6fa86c970f57\") " pod="openshift-monitoring/node-exporter-zr94k" Apr 22 19:09:28.856619 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:28.856453 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/dd8a63da-f501-4aa6-b5a6-6fa86c970f57-node-exporter-wtmp\") pod \"node-exporter-zr94k\" (UID: \"dd8a63da-f501-4aa6-b5a6-6fa86c970f57\") " pod="openshift-monitoring/node-exporter-zr94k" Apr 22 19:09:28.856619 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:28.856561 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/dd8a63da-f501-4aa6-b5a6-6fa86c970f57-node-exporter-textfile\") pod \"node-exporter-zr94k\" (UID: \"dd8a63da-f501-4aa6-b5a6-6fa86c970f57\") " pod="openshift-monitoring/node-exporter-zr94k" Apr 22 19:09:28.856619 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:28.856611 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/dd8a63da-f501-4aa6-b5a6-6fa86c970f57-node-exporter-wtmp\") pod \"node-exporter-zr94k\" (UID: \"dd8a63da-f501-4aa6-b5a6-6fa86c970f57\") " pod="openshift-monitoring/node-exporter-zr94k" Apr 22 19:09:28.856816 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:28.856801 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/dd8a63da-f501-4aa6-b5a6-6fa86c970f57-node-exporter-accelerators-collector-config\") pod \"node-exporter-zr94k\" (UID: \"dd8a63da-f501-4aa6-b5a6-6fa86c970f57\") " pod="openshift-monitoring/node-exporter-zr94k" Apr 22 19:09:28.856878 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:28.856861 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dd8a63da-f501-4aa6-b5a6-6fa86c970f57-metrics-client-ca\") pod \"node-exporter-zr94k\" (UID: \"dd8a63da-f501-4aa6-b5a6-6fa86c970f57\") " pod="openshift-monitoring/node-exporter-zr94k" Apr 22 19:09:28.858803 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:28.858783 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dd8a63da-f501-4aa6-b5a6-6fa86c970f57-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zr94k\" (UID: \"dd8a63da-f501-4aa6-b5a6-6fa86c970f57\") " pod="openshift-monitoring/node-exporter-zr94k" Apr 22 19:09:28.865226 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:28.865205 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlm99\" (UniqueName: \"kubernetes.io/projected/dd8a63da-f501-4aa6-b5a6-6fa86c970f57-kube-api-access-rlm99\") pod \"node-exporter-zr94k\" (UID: \"dd8a63da-f501-4aa6-b5a6-6fa86c970f57\") " pod="openshift-monitoring/node-exporter-zr94k" Apr 22 19:09:29.134048 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.134024 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc" Apr 22 19:09:29.360397 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.360365 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/dd8a63da-f501-4aa6-b5a6-6fa86c970f57-node-exporter-tls\") pod \"node-exporter-zr94k\" (UID: \"dd8a63da-f501-4aa6-b5a6-6fa86c970f57\") " pod="openshift-monitoring/node-exporter-zr94k" Apr 22 19:09:29.362840 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.362815 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/dd8a63da-f501-4aa6-b5a6-6fa86c970f57-node-exporter-tls\") pod \"node-exporter-zr94k\" (UID: \"dd8a63da-f501-4aa6-b5a6-6fa86c970f57\") " pod="openshift-monitoring/node-exporter-zr94k" Apr 22 19:09:29.485954 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.485925 2569 patch_prober.go:28] interesting pod/image-registry-6d6f7d67b8-qg2xj container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 19:09:29.486111 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.485973 2569 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6d6f7d67b8-qg2xj" podUID="a53b0aba-7b03-4959-8f7f-085132bd83fa" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:09:29.577414 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.577387 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-zr94k" Apr 22 19:09:29.586092 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:09:29.586064 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd8a63da_f501_4aa6_b5a6_6fa86c970f57.slice/crio-0c0fbd40152836f19bde2525cc776736c4ec38ddbd13a6871ad19e781a67d42a WatchSource:0}: Error finding container 0c0fbd40152836f19bde2525cc776736c4ec38ddbd13a6871ad19e781a67d42a: Status 404 returned error can't find the container with id 0c0fbd40152836f19bde2525cc776736c4ec38ddbd13a6871ad19e781a67d42a Apr 22 19:09:29.724019 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.723982 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zr94k" event={"ID":"dd8a63da-f501-4aa6-b5a6-6fa86c970f57","Type":"ContainerStarted","Data":"0c0fbd40152836f19bde2525cc776736c4ec38ddbd13a6871ad19e781a67d42a"} Apr 22 19:09:29.745903 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.745842 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 19:09:29.750791 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.750771 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.754245 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.754224 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 22 19:09:29.754637 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.754620 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 19:09:29.754967 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.754950 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 22 19:09:29.755247 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.755014 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 22 19:09:29.755247 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.755079 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 19:09:29.755951 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.755525 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 22 19:09:29.755951 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.755546 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 22 19:09:29.755951 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.755534 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-xmvg7\"" Apr 22 19:09:29.755951 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.755534 2569 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 22 19:09:29.755951 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.755820 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 19:09:29.767217 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.767192 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 19:09:29.864572 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.864541 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-config-volume\") pod \"alertmanager-main-0\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.864572 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.864571 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.864763 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.864599 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f467273d-4319-4e2b-8284-913a4c5ddd7c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.864763 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.864619 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.864763 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.864706 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.864763 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.864736 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-web-config\") pod \"alertmanager-main-0\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.864902 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.864796 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f467273d-4319-4e2b-8284-913a4c5ddd7c-config-out\") pod \"alertmanager-main-0\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.864902 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.864845 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f467273d-4319-4e2b-8284-913a4c5ddd7c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.864902 ip-10-0-134-22 kubenswrapper[2569]: 
I0422 19:09:29.864879 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f467273d-4319-4e2b-8284-913a4c5ddd7c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.864902 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.864895 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f467273d-4319-4e2b-8284-913a4c5ddd7c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.865020 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.864922 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.865020 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.864947 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.865020 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.864963 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwfqg\" (UniqueName: 
\"kubernetes.io/projected/f467273d-4319-4e2b-8284-913a4c5ddd7c-kube-api-access-dwfqg\") pod \"alertmanager-main-0\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.965557 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.965523 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.965734 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.965571 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-web-config\") pod \"alertmanager-main-0\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.965734 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.965636 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f467273d-4319-4e2b-8284-913a4c5ddd7c-config-out\") pod \"alertmanager-main-0\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.965734 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.965670 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f467273d-4319-4e2b-8284-913a4c5ddd7c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.965734 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.965710 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/f467273d-4319-4e2b-8284-913a4c5ddd7c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.966017 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.965736 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f467273d-4319-4e2b-8284-913a4c5ddd7c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.966017 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.965792 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.966017 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.965818 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.966017 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.965844 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dwfqg\" (UniqueName: \"kubernetes.io/projected/f467273d-4319-4e2b-8284-913a4c5ddd7c-kube-api-access-dwfqg\") pod \"alertmanager-main-0\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.966017 ip-10-0-134-22 kubenswrapper[2569]: I0422 
19:09:29.965876 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-config-volume\") pod \"alertmanager-main-0\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.966017 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.965902 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.966017 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.965942 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f467273d-4319-4e2b-8284-913a4c5ddd7c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.966017 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.965969 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.967886 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.967625 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f467273d-4319-4e2b-8284-913a4c5ddd7c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: 
\"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.967886 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.967646 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f467273d-4319-4e2b-8284-913a4c5ddd7c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.968399 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.968371 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f467273d-4319-4e2b-8284-913a4c5ddd7c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.969567 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.969434 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-web-config\") pod \"alertmanager-main-0\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.970563 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.970534 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.971718 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.971673 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f467273d-4319-4e2b-8284-913a4c5ddd7c-tls-assets\") pod \"alertmanager-main-0\" (UID: 
\"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.971880 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.971852 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.972181 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.972124 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.972286 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.972250 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.972588 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.972399 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-config-volume\") pod \"alertmanager-main-0\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.972588 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.972432 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.972588 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.972545 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f467273d-4319-4e2b-8284-913a4c5ddd7c-config-out\") pod \"alertmanager-main-0\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:29.975442 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:29.975420 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwfqg\" (UniqueName: \"kubernetes.io/projected/f467273d-4319-4e2b-8284-913a4c5ddd7c-kube-api-access-dwfqg\") pod \"alertmanager-main-0\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:09:30.061994 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:30.061907 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 19:09:30.208722 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:30.208686 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 19:09:30.270252 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:09:30.270209 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf467273d_4319_4e2b_8284_913a4c5ddd7c.slice/crio-944bf9f318fc71a72b22f5e15108353bb1b43de82b4f2177ac7e26883ca804d6 WatchSource:0}: Error finding container 944bf9f318fc71a72b22f5e15108353bb1b43de82b4f2177ac7e26883ca804d6: Status 404 returned error can't find the container with id 944bf9f318fc71a72b22f5e15108353bb1b43de82b4f2177ac7e26883ca804d6
Apr 22 19:09:30.727994 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:30.727964 2569 generic.go:358] "Generic (PLEG): container finished" podID="dd8a63da-f501-4aa6-b5a6-6fa86c970f57" containerID="054ab00f4af6c5eefbc20b8f232179307865843e9a841de267c3b20afdbe96f1" exitCode=0
Apr 22 19:09:30.728398 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:30.728047 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zr94k" event={"ID":"dd8a63da-f501-4aa6-b5a6-6fa86c970f57","Type":"ContainerDied","Data":"054ab00f4af6c5eefbc20b8f232179307865843e9a841de267c3b20afdbe96f1"}
Apr 22 19:09:30.729134 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:30.729115 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f467273d-4319-4e2b-8284-913a4c5ddd7c","Type":"ContainerStarted","Data":"944bf9f318fc71a72b22f5e15108353bb1b43de82b4f2177ac7e26883ca804d6"}
Apr 22 19:09:31.654257 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.654235 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-6897866f4b-mhjxd"]
Apr 22 19:09:31.657845 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.657825 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6897866f4b-mhjxd"
Apr 22 19:09:31.660791 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.660771 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 22 19:09:31.660886 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.660807 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 22 19:09:31.661435 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.661418 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-fzxlr\""
Apr 22 19:09:31.661987 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.661969 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 22 19:09:31.662072 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.661986 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-flpucd2djl8lh\""
Apr 22 19:09:31.662072 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.662053 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 22 19:09:31.662142 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.662094 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 22 19:09:31.668371 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.668354 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6d6f7d67b8-qg2xj"
Apr 22 19:09:31.671911 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.671890 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6897866f4b-mhjxd"]
Apr 22 19:09:31.681702 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.681684 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/900eddfd-5ec6-4ec0-93ed-4fd24fe6326c-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6897866f4b-mhjxd\" (UID: \"900eddfd-5ec6-4ec0-93ed-4fd24fe6326c\") " pod="openshift-monitoring/thanos-querier-6897866f4b-mhjxd"
Apr 22 19:09:31.681806 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.681741 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/900eddfd-5ec6-4ec0-93ed-4fd24fe6326c-secret-grpc-tls\") pod \"thanos-querier-6897866f4b-mhjxd\" (UID: \"900eddfd-5ec6-4ec0-93ed-4fd24fe6326c\") " pod="openshift-monitoring/thanos-querier-6897866f4b-mhjxd"
Apr 22 19:09:31.681872 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.681822 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/900eddfd-5ec6-4ec0-93ed-4fd24fe6326c-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6897866f4b-mhjxd\" (UID: \"900eddfd-5ec6-4ec0-93ed-4fd24fe6326c\") " pod="openshift-monitoring/thanos-querier-6897866f4b-mhjxd"
Apr 22 19:09:31.681872 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.681859 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6ccz\" (UniqueName: \"kubernetes.io/projected/900eddfd-5ec6-4ec0-93ed-4fd24fe6326c-kube-api-access-v6ccz\") pod \"thanos-querier-6897866f4b-mhjxd\" (UID: \"900eddfd-5ec6-4ec0-93ed-4fd24fe6326c\") " pod="openshift-monitoring/thanos-querier-6897866f4b-mhjxd"
Apr 22 19:09:31.681944 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.681883 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/900eddfd-5ec6-4ec0-93ed-4fd24fe6326c-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6897866f4b-mhjxd\" (UID: \"900eddfd-5ec6-4ec0-93ed-4fd24fe6326c\") " pod="openshift-monitoring/thanos-querier-6897866f4b-mhjxd"
Apr 22 19:09:31.681944 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.681904 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/900eddfd-5ec6-4ec0-93ed-4fd24fe6326c-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6897866f4b-mhjxd\" (UID: \"900eddfd-5ec6-4ec0-93ed-4fd24fe6326c\") " pod="openshift-monitoring/thanos-querier-6897866f4b-mhjxd"
Apr 22 19:09:31.682032 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.681953 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/900eddfd-5ec6-4ec0-93ed-4fd24fe6326c-secret-thanos-querier-tls\") pod \"thanos-querier-6897866f4b-mhjxd\" (UID: \"900eddfd-5ec6-4ec0-93ed-4fd24fe6326c\") " pod="openshift-monitoring/thanos-querier-6897866f4b-mhjxd"
Apr 22 19:09:31.682032 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.681999 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/900eddfd-5ec6-4ec0-93ed-4fd24fe6326c-metrics-client-ca\") pod \"thanos-querier-6897866f4b-mhjxd\" (UID: \"900eddfd-5ec6-4ec0-93ed-4fd24fe6326c\") " pod="openshift-monitoring/thanos-querier-6897866f4b-mhjxd"
Apr 22 19:09:31.734190 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.734151 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zr94k" event={"ID":"dd8a63da-f501-4aa6-b5a6-6fa86c970f57","Type":"ContainerStarted","Data":"ad23b14260ef9c929a1bf688f7ce3901c52d0ed3253e9929880ef143302c3d80"}
Apr 22 19:09:31.734583 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.734197 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zr94k" event={"ID":"dd8a63da-f501-4aa6-b5a6-6fa86c970f57","Type":"ContainerStarted","Data":"fadfb0355f6276485f376c5917fea9cc99b999d2c67ee77c5d8cc0a89cc60368"}
Apr 22 19:09:31.735717 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.735689 2569 generic.go:358] "Generic (PLEG): container finished" podID="f467273d-4319-4e2b-8284-913a4c5ddd7c" containerID="71a7d8ee18d02f9784ab508fd75da2ad3e0c305be97b29e8741ee384e685d8b3" exitCode=0
Apr 22 19:09:31.735848 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.735766 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f467273d-4319-4e2b-8284-913a4c5ddd7c","Type":"ContainerDied","Data":"71a7d8ee18d02f9784ab508fd75da2ad3e0c305be97b29e8741ee384e685d8b3"}
Apr 22 19:09:31.758375 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.758285 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-zr94k" podStartSLOduration=3.03284944 podStartE2EDuration="3.758272394s" podCreationTimestamp="2026-04-22 19:09:28 +0000 UTC" firstStartedPulling="2026-04-22 19:09:29.587818111 +0000 UTC m=+177.953232925" lastFinishedPulling="2026-04-22 19:09:30.31324105 +0000 UTC m=+178.678655879" observedRunningTime="2026-04-22 19:09:31.758155025 +0000 UTC m=+180.123569862" watchObservedRunningTime="2026-04-22 19:09:31.758272394 +0000 UTC m=+180.123687232"
Apr 22 19:09:31.783096 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.783073 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/900eddfd-5ec6-4ec0-93ed-4fd24fe6326c-secret-grpc-tls\") pod \"thanos-querier-6897866f4b-mhjxd\" (UID: \"900eddfd-5ec6-4ec0-93ed-4fd24fe6326c\") " pod="openshift-monitoring/thanos-querier-6897866f4b-mhjxd"
Apr 22 19:09:31.783189 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.783113 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/900eddfd-5ec6-4ec0-93ed-4fd24fe6326c-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6897866f4b-mhjxd\" (UID: \"900eddfd-5ec6-4ec0-93ed-4fd24fe6326c\") " pod="openshift-monitoring/thanos-querier-6897866f4b-mhjxd"
Apr 22 19:09:31.783369 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.783342 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v6ccz\" (UniqueName: \"kubernetes.io/projected/900eddfd-5ec6-4ec0-93ed-4fd24fe6326c-kube-api-access-v6ccz\") pod \"thanos-querier-6897866f4b-mhjxd\" (UID: \"900eddfd-5ec6-4ec0-93ed-4fd24fe6326c\") " pod="openshift-monitoring/thanos-querier-6897866f4b-mhjxd"
Apr 22 19:09:31.783453 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.783405 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/900eddfd-5ec6-4ec0-93ed-4fd24fe6326c-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6897866f4b-mhjxd\" (UID: \"900eddfd-5ec6-4ec0-93ed-4fd24fe6326c\") " pod="openshift-monitoring/thanos-querier-6897866f4b-mhjxd"
Apr 22 19:09:31.783453 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.783448 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/900eddfd-5ec6-4ec0-93ed-4fd24fe6326c-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6897866f4b-mhjxd\" (UID: \"900eddfd-5ec6-4ec0-93ed-4fd24fe6326c\") " pod="openshift-monitoring/thanos-querier-6897866f4b-mhjxd"
Apr 22 19:09:31.783897 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.783864 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/900eddfd-5ec6-4ec0-93ed-4fd24fe6326c-secret-thanos-querier-tls\") pod \"thanos-querier-6897866f4b-mhjxd\" (UID: \"900eddfd-5ec6-4ec0-93ed-4fd24fe6326c\") " pod="openshift-monitoring/thanos-querier-6897866f4b-mhjxd"
Apr 22 19:09:31.784037 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.784017 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/900eddfd-5ec6-4ec0-93ed-4fd24fe6326c-metrics-client-ca\") pod \"thanos-querier-6897866f4b-mhjxd\" (UID: \"900eddfd-5ec6-4ec0-93ed-4fd24fe6326c\") " pod="openshift-monitoring/thanos-querier-6897866f4b-mhjxd"
Apr 22 19:09:31.784164 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.784138 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/900eddfd-5ec6-4ec0-93ed-4fd24fe6326c-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6897866f4b-mhjxd\" (UID: \"900eddfd-5ec6-4ec0-93ed-4fd24fe6326c\") " pod="openshift-monitoring/thanos-querier-6897866f4b-mhjxd"
Apr 22 19:09:31.785234 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.785208 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/900eddfd-5ec6-4ec0-93ed-4fd24fe6326c-metrics-client-ca\") pod \"thanos-querier-6897866f4b-mhjxd\" (UID: \"900eddfd-5ec6-4ec0-93ed-4fd24fe6326c\") " pod="openshift-monitoring/thanos-querier-6897866f4b-mhjxd"
Apr 22 19:09:31.786669 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.786641 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/900eddfd-5ec6-4ec0-93ed-4fd24fe6326c-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6897866f4b-mhjxd\" (UID: \"900eddfd-5ec6-4ec0-93ed-4fd24fe6326c\") " pod="openshift-monitoring/thanos-querier-6897866f4b-mhjxd"
Apr 22 19:09:31.786742 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.786642 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/900eddfd-5ec6-4ec0-93ed-4fd24fe6326c-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6897866f4b-mhjxd\" (UID: \"900eddfd-5ec6-4ec0-93ed-4fd24fe6326c\") " pod="openshift-monitoring/thanos-querier-6897866f4b-mhjxd"
Apr 22 19:09:31.786742 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.786687 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/900eddfd-5ec6-4ec0-93ed-4fd24fe6326c-secret-thanos-querier-tls\") pod \"thanos-querier-6897866f4b-mhjxd\" (UID: \"900eddfd-5ec6-4ec0-93ed-4fd24fe6326c\") " pod="openshift-monitoring/thanos-querier-6897866f4b-mhjxd"
Apr 22 19:09:31.786851 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.786832 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/900eddfd-5ec6-4ec0-93ed-4fd24fe6326c-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6897866f4b-mhjxd\" (UID: \"900eddfd-5ec6-4ec0-93ed-4fd24fe6326c\") " pod="openshift-monitoring/thanos-querier-6897866f4b-mhjxd"
Apr 22 19:09:31.786883 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.786845 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/900eddfd-5ec6-4ec0-93ed-4fd24fe6326c-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6897866f4b-mhjxd\" (UID: \"900eddfd-5ec6-4ec0-93ed-4fd24fe6326c\") " pod="openshift-monitoring/thanos-querier-6897866f4b-mhjxd"
Apr 22 19:09:31.787007 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.786991 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/900eddfd-5ec6-4ec0-93ed-4fd24fe6326c-secret-grpc-tls\") pod \"thanos-querier-6897866f4b-mhjxd\" (UID: \"900eddfd-5ec6-4ec0-93ed-4fd24fe6326c\") " pod="openshift-monitoring/thanos-querier-6897866f4b-mhjxd"
Apr 22 19:09:31.806698 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:31.806682 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6ccz\" (UniqueName: \"kubernetes.io/projected/900eddfd-5ec6-4ec0-93ed-4fd24fe6326c-kube-api-access-v6ccz\") pod \"thanos-querier-6897866f4b-mhjxd\" (UID: \"900eddfd-5ec6-4ec0-93ed-4fd24fe6326c\") " pod="openshift-monitoring/thanos-querier-6897866f4b-mhjxd"
Apr 22 19:09:32.008284 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:32.008255 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6897866f4b-mhjxd"
Apr 22 19:09:32.136986 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:32.136954 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6897866f4b-mhjxd"]
Apr 22 19:09:32.139922 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:09:32.139892 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod900eddfd_5ec6_4ec0_93ed_4fd24fe6326c.slice/crio-7d93abbdc4b6c4aa709ba98202998da566eb441c9ad0db0c8058b58eb2ef4bfd WatchSource:0}: Error finding container 7d93abbdc4b6c4aa709ba98202998da566eb441c9ad0db0c8058b58eb2ef4bfd: Status 404 returned error can't find the container with id 7d93abbdc4b6c4aa709ba98202998da566eb441c9ad0db0c8058b58eb2ef4bfd
Apr 22 19:09:32.741560 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:32.741519 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6897866f4b-mhjxd" event={"ID":"900eddfd-5ec6-4ec0-93ed-4fd24fe6326c","Type":"ContainerStarted","Data":"7d93abbdc4b6c4aa709ba98202998da566eb441c9ad0db0c8058b58eb2ef4bfd"}
Apr 22 19:09:33.747439 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:33.747411 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f467273d-4319-4e2b-8284-913a4c5ddd7c","Type":"ContainerStarted","Data":"18dfc85fb1063861979fe1008e5914c44da9f25aa8658b1652270cd5055bb61f"}
Apr 22 19:09:33.747439 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:33.747443 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f467273d-4319-4e2b-8284-913a4c5ddd7c","Type":"ContainerStarted","Data":"11de86327313697b3e69ad892e8264c00cbbe33b683d76708c158eba4649c0b0"}
Apr 22 19:09:33.747803 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:33.747453 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f467273d-4319-4e2b-8284-913a4c5ddd7c","Type":"ContainerStarted","Data":"84f57462a9478a1643a764253aaf6dbe154596c3cf28d80d5cef47036d6493f4"}
Apr 22 19:09:33.747803 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:33.747462 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f467273d-4319-4e2b-8284-913a4c5ddd7c","Type":"ContainerStarted","Data":"753f8a2320fca29ce14bc4a0de8186c531389275c93f89ddc98884804a80bca6"}
Apr 22 19:09:33.747803 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:33.747469 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f467273d-4319-4e2b-8284-913a4c5ddd7c","Type":"ContainerStarted","Data":"f06356a56b6859b6621cab2ce7c030987dc57eabda027cca39ca8e5d6730162f"}
Apr 22 19:09:34.150364 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:34.150317 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc" podUID="732f281f-7aeb-4e6e-b46f-a1b52deacf5e" containerName="registry" containerID="cri-o://198cfce2242d90f750b81b48ac7d48cfd441d00d0d60b9f5edf02f900f838978" gracePeriod=30
Apr 22 19:09:34.693490 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:34.693425 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc"
Apr 22 19:09:34.751506 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:34.751473 2569 generic.go:358] "Generic (PLEG): container finished" podID="732f281f-7aeb-4e6e-b46f-a1b52deacf5e" containerID="198cfce2242d90f750b81b48ac7d48cfd441d00d0d60b9f5edf02f900f838978" exitCode=0
Apr 22 19:09:34.751916 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:34.751518 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc" event={"ID":"732f281f-7aeb-4e6e-b46f-a1b52deacf5e","Type":"ContainerDied","Data":"198cfce2242d90f750b81b48ac7d48cfd441d00d0d60b9f5edf02f900f838978"}
Apr 22 19:09:34.751916 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:34.751571 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc" event={"ID":"732f281f-7aeb-4e6e-b46f-a1b52deacf5e","Type":"ContainerDied","Data":"554e25aee777ea173aa4e82962a181c7ddc8314ba11c59a7055c6afffad1460e"}
Apr 22 19:09:34.751916 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:34.751571 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5cdfc8c88c-qgsqc"
Apr 22 19:09:34.751916 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:34.751587 2569 scope.go:117] "RemoveContainer" containerID="198cfce2242d90f750b81b48ac7d48cfd441d00d0d60b9f5edf02f900f838978"
Apr 22 19:09:34.753696 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:34.753669 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6897866f4b-mhjxd" event={"ID":"900eddfd-5ec6-4ec0-93ed-4fd24fe6326c","Type":"ContainerStarted","Data":"8e0e4e9ff4e1c57b444261375743eb712b387f89c34cedd7f4d51a5ee64cfe82"}
Apr 22 19:09:34.753813 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:34.753697 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6897866f4b-mhjxd" event={"ID":"900eddfd-5ec6-4ec0-93ed-4fd24fe6326c","Type":"ContainerStarted","Data":"9c3b0291aa4439d88116ae0a234a584152d4318ca339598fda38236984083c4c"}
Apr 22 19:09:34.762893 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:34.762712 2569 scope.go:117] "RemoveContainer" containerID="198cfce2242d90f750b81b48ac7d48cfd441d00d0d60b9f5edf02f900f838978"
Apr 22 19:09:34.763100 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:09:34.763058 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"198cfce2242d90f750b81b48ac7d48cfd441d00d0d60b9f5edf02f900f838978\": container with ID starting with 198cfce2242d90f750b81b48ac7d48cfd441d00d0d60b9f5edf02f900f838978 not found: ID does not exist" containerID="198cfce2242d90f750b81b48ac7d48cfd441d00d0d60b9f5edf02f900f838978"
Apr 22 19:09:34.763204 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:34.763102 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"198cfce2242d90f750b81b48ac7d48cfd441d00d0d60b9f5edf02f900f838978"} err="failed to get container status \"198cfce2242d90f750b81b48ac7d48cfd441d00d0d60b9f5edf02f900f838978\": rpc error: code = NotFound desc = could not find container \"198cfce2242d90f750b81b48ac7d48cfd441d00d0d60b9f5edf02f900f838978\": container with ID starting with 198cfce2242d90f750b81b48ac7d48cfd441d00d0d60b9f5edf02f900f838978 not found: ID does not exist"
Apr 22 19:09:34.813022 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:34.812946 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-image-registry-private-configuration\") pod \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\" (UID: \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\") "
Apr 22 19:09:34.813022 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:34.813020 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-trusted-ca\") pod \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\" (UID: \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\") "
Apr 22 19:09:34.813211 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:34.813069 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-ca-trust-extracted\") pod \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\" (UID: \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\") "
Apr 22 19:09:34.813211 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:34.813094 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-registry-certificates\") pod \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\" (UID: \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\") "
Apr 22 19:09:34.813539 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:34.813507 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "732f281f-7aeb-4e6e-b46f-a1b52deacf5e" (UID: "732f281f-7aeb-4e6e-b46f-a1b52deacf5e"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:09:34.813681 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:34.813606 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-registry-tls\") pod \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\" (UID: \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\") "
Apr 22 19:09:34.813739 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:34.813721 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-installation-pull-secrets\") pod \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\" (UID: \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\") "
Apr 22 19:09:34.813831 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:34.813786 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrr5f\" (UniqueName: \"kubernetes.io/projected/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-kube-api-access-zrr5f\") pod \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\" (UID: \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\") "
Apr 22 19:09:34.813892 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:34.813835 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-bound-sa-token\") pod \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\" (UID: \"732f281f-7aeb-4e6e-b46f-a1b52deacf5e\") "
Apr 22 19:09:34.814314 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:34.814074 2569 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-registry-certificates\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\""
Apr 22 19:09:34.814495 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:34.814468 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "732f281f-7aeb-4e6e-b46f-a1b52deacf5e" (UID: "732f281f-7aeb-4e6e-b46f-a1b52deacf5e"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:09:34.816033 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:34.816007 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "732f281f-7aeb-4e6e-b46f-a1b52deacf5e" (UID: "732f281f-7aeb-4e6e-b46f-a1b52deacf5e"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:09:34.817237 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:34.817211 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "732f281f-7aeb-4e6e-b46f-a1b52deacf5e" (UID: "732f281f-7aeb-4e6e-b46f-a1b52deacf5e"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:09:34.817783 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:34.817734 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "732f281f-7aeb-4e6e-b46f-a1b52deacf5e" (UID: "732f281f-7aeb-4e6e-b46f-a1b52deacf5e"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:09:34.817932 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:34.817911 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-kube-api-access-zrr5f" (OuterVolumeSpecName: "kube-api-access-zrr5f") pod "732f281f-7aeb-4e6e-b46f-a1b52deacf5e" (UID: "732f281f-7aeb-4e6e-b46f-a1b52deacf5e"). InnerVolumeSpecName "kube-api-access-zrr5f". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:09:34.819274 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:34.819218 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "732f281f-7aeb-4e6e-b46f-a1b52deacf5e" (UID: "732f281f-7aeb-4e6e-b46f-a1b52deacf5e"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:09:34.824800 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:34.824774 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "732f281f-7aeb-4e6e-b46f-a1b52deacf5e" (UID: "732f281f-7aeb-4e6e-b46f-a1b52deacf5e"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:09:34.915078 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:34.915045 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-trusted-ca\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\""
Apr 22 19:09:34.915078 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:34.915077 2569 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-ca-trust-extracted\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\""
Apr 22 19:09:34.915078 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:34.915093 2569 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-registry-tls\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\""
Apr 22 19:09:34.915366 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:34.915107 2569 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-installation-pull-secrets\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\""
Apr 22 19:09:34.915366 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:34.915119 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zrr5f\" (UniqueName: \"kubernetes.io/projected/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-kube-api-access-zrr5f\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\""
Apr 22 19:09:34.915366 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:34.915133 2569 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-bound-sa-token\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\""
Apr 22 19:09:34.915366 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:34.915147 2569 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/732f281f-7aeb-4e6e-b46f-a1b52deacf5e-image-registry-private-configuration\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\""
Apr 22 19:09:35.078069 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:35.077967 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5cdfc8c88c-qgsqc"]
Apr 22 19:09:35.082358 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:35.082329 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5cdfc8c88c-qgsqc"]
Apr 22 19:09:35.763045 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:35.763012 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f467273d-4319-4e2b-8284-913a4c5ddd7c","Type":"ContainerStarted","Data":"ca64eeb9d70f52dcdd1fe7b27714a885fa860d789643a419b4d51192b1dbba99"}
Apr 22 19:09:35.766229 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:35.766202 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6897866f4b-mhjxd" event={"ID":"900eddfd-5ec6-4ec0-93ed-4fd24fe6326c","Type":"ContainerStarted","Data":"77302cf7be64a8363e2c1cb8baebbf46a6e49fc0afa1a2fbaf25468066228f4a"}
Apr 22 19:09:35.766229 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:35.766233 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6897866f4b-mhjxd" event={"ID":"900eddfd-5ec6-4ec0-93ed-4fd24fe6326c","Type":"ContainerStarted","Data":"d2e91533f35a61e731384e06ae63d8a8ca627abf09154c3f0882457d1a4ab6ac"}
Apr 22 19:09:35.766426 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:35.766242 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6897866f4b-mhjxd" event={"ID":"900eddfd-5ec6-4ec0-93ed-4fd24fe6326c","Type":"ContainerStarted","Data":"a2fb99d9584a5c0dbd8543632191a3bcc8d6a7ca9ec909063b570ee31b187ff6"}
Apr 22 19:09:35.766426 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:35.766251 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6897866f4b-mhjxd" event={"ID":"900eddfd-5ec6-4ec0-93ed-4fd24fe6326c","Type":"ContainerStarted","Data":"031a8cf060afdd10f671396ace7ab9b02363af8855b31140d146e6b3d0777016"}
Apr 22 19:09:35.766525 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:35.766426 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-6897866f4b-mhjxd"
Apr 22 19:09:35.793539 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:35.793494 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.906108919 podStartE2EDuration="6.793481743s" podCreationTimestamp="2026-04-22 19:09:29 +0000 UTC" firstStartedPulling="2026-04-22 19:09:30.272039561 +0000 UTC m=+178.637454375" lastFinishedPulling="2026-04-22 19:09:35.159412382 +0000 UTC m=+183.524827199" observedRunningTime="2026-04-22 19:09:35.791645826 +0000 UTC m=+184.157060672" watchObservedRunningTime="2026-04-22 19:09:35.793481743 +0000 UTC m=+184.158896636"
Apr 22 19:09:35.816298 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:35.816256 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-6897866f4b-mhjxd" podStartSLOduration=1.797119796 podStartE2EDuration="4.816240089s" podCreationTimestamp="2026-04-22 19:09:31 +0000 UTC" firstStartedPulling="2026-04-22 19:09:32.141567746 +0000 UTC m=+180.506982559" lastFinishedPulling="2026-04-22 19:09:35.160688024 +0000 UTC m=+183.526102852" observedRunningTime="2026-04-22 19:09:35.814678918 +0000 UTC m=+184.180093798" watchObservedRunningTime="2026-04-22 19:09:35.816240089 +0000 UTC m=+184.181654927"
Apr 22 19:09:36.222495 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:36.222455 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="732f281f-7aeb-4e6e-b46f-a1b52deacf5e" path="/var/lib/kubelet/pods/732f281f-7aeb-4e6e-b46f-a1b52deacf5e/volumes"
Apr 22 19:09:39.219223 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:39.219195 2569 scope.go:117] "RemoveContainer" containerID="b6db44c319a7c13ca181a4786f33a932cf93b696963dbc4b4af2f372d2fa329e"
Apr 22 19:09:39.780392 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:39.780358 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fl4n2_c9d5594b-5dc2-461d-bd58-496386ced33b/console-operator/2.log"
Apr 22 19:09:39.780584 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:39.780433 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-fl4n2" event={"ID":"c9d5594b-5dc2-461d-bd58-496386ced33b","Type":"ContainerStarted","Data":"b5d42b7b7efe04454271e71dd1c3ff2f2c149bc18d35920a73ae9c5ab4a1dc5a"}
Apr 22 19:09:39.781313 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:39.781287 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-fl4n2"
Apr 22 19:09:39.802540 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:39.802491 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-fl4n2" podStartSLOduration=54.315068216 podStartE2EDuration="57.802478396s" podCreationTimestamp="2026-04-22 19:08:42 +0000 UTC" firstStartedPulling="2026-04-22 19:08:42.606713938 +0000 UTC m=+130.972128752" lastFinishedPulling="2026-04-22 19:08:46.094124111 +0000 UTC m=+134.459538932" observedRunningTime="2026-04-22 19:09:39.801480655 +0000 UTC m=+188.166895491" watchObservedRunningTime="2026-04-22 19:09:39.802478396 +0000 UTC m=+188.167893232"
Apr 22 19:09:39.921285 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:39.921254 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-fl4n2"
Apr 22 19:09:40.111263 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:40.111190 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-xjm45"]
Apr 22 19:09:40.111577 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:40.111562 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="732f281f-7aeb-4e6e-b46f-a1b52deacf5e" containerName="registry"
Apr 22 19:09:40.111619 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:40.111579 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="732f281f-7aeb-4e6e-b46f-a1b52deacf5e" containerName="registry"
Apr 22 19:09:40.111655 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:40.111644 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="732f281f-7aeb-4e6e-b46f-a1b52deacf5e" containerName="registry"
Apr 22 19:09:40.114744 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:40.114725 2569 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-xjm45" Apr 22 19:09:40.117473 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:40.117451 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 19:09:40.117549 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:40.117531 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 19:09:40.117638 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:40.117621 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-wq8nj\"" Apr 22 19:09:40.124492 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:40.124466 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-xjm45"] Apr 22 19:09:40.157047 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:40.157023 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvlz2\" (UniqueName: \"kubernetes.io/projected/2639e77b-b608-4425-a733-a7915361daa5-kube-api-access-kvlz2\") pod \"downloads-6bcc868b7-xjm45\" (UID: \"2639e77b-b608-4425-a733-a7915361daa5\") " pod="openshift-console/downloads-6bcc868b7-xjm45" Apr 22 19:09:40.258268 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:40.258244 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kvlz2\" (UniqueName: \"kubernetes.io/projected/2639e77b-b608-4425-a733-a7915361daa5-kube-api-access-kvlz2\") pod \"downloads-6bcc868b7-xjm45\" (UID: \"2639e77b-b608-4425-a733-a7915361daa5\") " pod="openshift-console/downloads-6bcc868b7-xjm45" Apr 22 19:09:40.272103 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:40.272074 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvlz2\" (UniqueName: 
\"kubernetes.io/projected/2639e77b-b608-4425-a733-a7915361daa5-kube-api-access-kvlz2\") pod \"downloads-6bcc868b7-xjm45\" (UID: \"2639e77b-b608-4425-a733-a7915361daa5\") " pod="openshift-console/downloads-6bcc868b7-xjm45" Apr 22 19:09:40.423808 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:40.423781 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-xjm45" Apr 22 19:09:40.545699 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:40.545664 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-xjm45"] Apr 22 19:09:40.549769 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:09:40.549721 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2639e77b_b608_4425_a733_a7915361daa5.slice/crio-5ee78429b34e591cfac8911da99473dd3a22d206cbdfe83eb094682ff89c942a WatchSource:0}: Error finding container 5ee78429b34e591cfac8911da99473dd3a22d206cbdfe83eb094682ff89c942a: Status 404 returned error can't find the container with id 5ee78429b34e591cfac8911da99473dd3a22d206cbdfe83eb094682ff89c942a Apr 22 19:09:40.785308 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:40.785223 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-xjm45" event={"ID":"2639e77b-b608-4425-a733-a7915361daa5","Type":"ContainerStarted","Data":"5ee78429b34e591cfac8911da99473dd3a22d206cbdfe83eb094682ff89c942a"} Apr 22 19:09:41.776891 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:41.776838 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-6897866f4b-mhjxd" Apr 22 19:09:45.575084 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:45.575051 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-58f8d5b7b4-9rfkp"] Apr 22 19:09:45.578717 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:45.578697 2569 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58f8d5b7b4-9rfkp" Apr 22 19:09:45.582264 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:45.582240 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 19:09:45.582638 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:45.582622 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 19:09:45.583559 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:45.583537 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-ft5w2\"" Apr 22 19:09:45.583828 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:45.583798 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 19:09:45.583828 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:45.583797 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 19:09:45.584012 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:45.583971 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 19:09:45.589532 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:45.589512 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-58f8d5b7b4-9rfkp"] Apr 22 19:09:45.704558 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:45.704519 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b2de1309-5e07-42e4-af50-bd20f6e1989f-service-ca\") pod \"console-58f8d5b7b4-9rfkp\" (UID: \"b2de1309-5e07-42e4-af50-bd20f6e1989f\") " pod="openshift-console/console-58f8d5b7b4-9rfkp" Apr 22 19:09:45.704558 ip-10-0-134-22 
kubenswrapper[2569]: I0422 19:09:45.704563 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m24l\" (UniqueName: \"kubernetes.io/projected/b2de1309-5e07-42e4-af50-bd20f6e1989f-kube-api-access-8m24l\") pod \"console-58f8d5b7b4-9rfkp\" (UID: \"b2de1309-5e07-42e4-af50-bd20f6e1989f\") " pod="openshift-console/console-58f8d5b7b4-9rfkp" Apr 22 19:09:45.704732 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:45.704638 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b2de1309-5e07-42e4-af50-bd20f6e1989f-console-config\") pod \"console-58f8d5b7b4-9rfkp\" (UID: \"b2de1309-5e07-42e4-af50-bd20f6e1989f\") " pod="openshift-console/console-58f8d5b7b4-9rfkp" Apr 22 19:09:45.704732 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:45.704700 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b2de1309-5e07-42e4-af50-bd20f6e1989f-oauth-serving-cert\") pod \"console-58f8d5b7b4-9rfkp\" (UID: \"b2de1309-5e07-42e4-af50-bd20f6e1989f\") " pod="openshift-console/console-58f8d5b7b4-9rfkp" Apr 22 19:09:45.704732 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:45.704726 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b2de1309-5e07-42e4-af50-bd20f6e1989f-console-serving-cert\") pod \"console-58f8d5b7b4-9rfkp\" (UID: \"b2de1309-5e07-42e4-af50-bd20f6e1989f\") " pod="openshift-console/console-58f8d5b7b4-9rfkp" Apr 22 19:09:45.704854 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:45.704793 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b2de1309-5e07-42e4-af50-bd20f6e1989f-console-oauth-config\") 
pod \"console-58f8d5b7b4-9rfkp\" (UID: \"b2de1309-5e07-42e4-af50-bd20f6e1989f\") " pod="openshift-console/console-58f8d5b7b4-9rfkp" Apr 22 19:09:45.805487 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:45.805459 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b2de1309-5e07-42e4-af50-bd20f6e1989f-service-ca\") pod \"console-58f8d5b7b4-9rfkp\" (UID: \"b2de1309-5e07-42e4-af50-bd20f6e1989f\") " pod="openshift-console/console-58f8d5b7b4-9rfkp" Apr 22 19:09:45.805487 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:45.805489 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8m24l\" (UniqueName: \"kubernetes.io/projected/b2de1309-5e07-42e4-af50-bd20f6e1989f-kube-api-access-8m24l\") pod \"console-58f8d5b7b4-9rfkp\" (UID: \"b2de1309-5e07-42e4-af50-bd20f6e1989f\") " pod="openshift-console/console-58f8d5b7b4-9rfkp" Apr 22 19:09:45.805676 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:45.805521 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b2de1309-5e07-42e4-af50-bd20f6e1989f-console-config\") pod \"console-58f8d5b7b4-9rfkp\" (UID: \"b2de1309-5e07-42e4-af50-bd20f6e1989f\") " pod="openshift-console/console-58f8d5b7b4-9rfkp" Apr 22 19:09:45.805676 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:45.805570 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b2de1309-5e07-42e4-af50-bd20f6e1989f-oauth-serving-cert\") pod \"console-58f8d5b7b4-9rfkp\" (UID: \"b2de1309-5e07-42e4-af50-bd20f6e1989f\") " pod="openshift-console/console-58f8d5b7b4-9rfkp" Apr 22 19:09:45.805676 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:45.805594 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b2de1309-5e07-42e4-af50-bd20f6e1989f-console-serving-cert\") pod \"console-58f8d5b7b4-9rfkp\" (UID: \"b2de1309-5e07-42e4-af50-bd20f6e1989f\") " pod="openshift-console/console-58f8d5b7b4-9rfkp" Apr 22 19:09:45.805676 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:45.805634 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b2de1309-5e07-42e4-af50-bd20f6e1989f-console-oauth-config\") pod \"console-58f8d5b7b4-9rfkp\" (UID: \"b2de1309-5e07-42e4-af50-bd20f6e1989f\") " pod="openshift-console/console-58f8d5b7b4-9rfkp" Apr 22 19:09:45.806327 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:45.806295 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b2de1309-5e07-42e4-af50-bd20f6e1989f-service-ca\") pod \"console-58f8d5b7b4-9rfkp\" (UID: \"b2de1309-5e07-42e4-af50-bd20f6e1989f\") " pod="openshift-console/console-58f8d5b7b4-9rfkp" Apr 22 19:09:45.806327 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:45.806322 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b2de1309-5e07-42e4-af50-bd20f6e1989f-oauth-serving-cert\") pod \"console-58f8d5b7b4-9rfkp\" (UID: \"b2de1309-5e07-42e4-af50-bd20f6e1989f\") " pod="openshift-console/console-58f8d5b7b4-9rfkp" Apr 22 19:09:45.806327 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:45.806306 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b2de1309-5e07-42e4-af50-bd20f6e1989f-console-config\") pod \"console-58f8d5b7b4-9rfkp\" (UID: \"b2de1309-5e07-42e4-af50-bd20f6e1989f\") " pod="openshift-console/console-58f8d5b7b4-9rfkp" Apr 22 19:09:45.808625 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:45.808604 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b2de1309-5e07-42e4-af50-bd20f6e1989f-console-oauth-config\") pod \"console-58f8d5b7b4-9rfkp\" (UID: \"b2de1309-5e07-42e4-af50-bd20f6e1989f\") " pod="openshift-console/console-58f8d5b7b4-9rfkp" Apr 22 19:09:45.808802 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:45.808777 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b2de1309-5e07-42e4-af50-bd20f6e1989f-console-serving-cert\") pod \"console-58f8d5b7b4-9rfkp\" (UID: \"b2de1309-5e07-42e4-af50-bd20f6e1989f\") " pod="openshift-console/console-58f8d5b7b4-9rfkp" Apr 22 19:09:45.815871 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:45.815848 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m24l\" (UniqueName: \"kubernetes.io/projected/b2de1309-5e07-42e4-af50-bd20f6e1989f-kube-api-access-8m24l\") pod \"console-58f8d5b7b4-9rfkp\" (UID: \"b2de1309-5e07-42e4-af50-bd20f6e1989f\") " pod="openshift-console/console-58f8d5b7b4-9rfkp" Apr 22 19:09:45.889520 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:45.889450 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-58f8d5b7b4-9rfkp" Apr 22 19:09:46.040968 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:46.040804 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-58f8d5b7b4-9rfkp"] Apr 22 19:09:46.043287 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:09:46.043254 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2de1309_5e07_42e4_af50_bd20f6e1989f.slice/crio-9abd51b4d15136ee9f73b6684b032097e6b8c89f63de127606ecf50e45597628 WatchSource:0}: Error finding container 9abd51b4d15136ee9f73b6684b032097e6b8c89f63de127606ecf50e45597628: Status 404 returned error can't find the container with id 9abd51b4d15136ee9f73b6684b032097e6b8c89f63de127606ecf50e45597628 Apr 22 19:09:46.801962 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:46.801922 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58f8d5b7b4-9rfkp" event={"ID":"b2de1309-5e07-42e4-af50-bd20f6e1989f","Type":"ContainerStarted","Data":"9abd51b4d15136ee9f73b6684b032097e6b8c89f63de127606ecf50e45597628"} Apr 22 19:09:49.812268 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:49.812184 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58f8d5b7b4-9rfkp" event={"ID":"b2de1309-5e07-42e4-af50-bd20f6e1989f","Type":"ContainerStarted","Data":"81b6f646025339e7edee30c6a3827db7c5817c1202d6a93e5272ca94f87afceb"} Apr 22 19:09:49.840068 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:49.840020 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-58f8d5b7b4-9rfkp" podStartSLOduration=1.39489594 podStartE2EDuration="4.840007454s" podCreationTimestamp="2026-04-22 19:09:45 +0000 UTC" firstStartedPulling="2026-04-22 19:09:46.045489707 +0000 UTC m=+194.410904521" lastFinishedPulling="2026-04-22 19:09:49.490601219 +0000 UTC m=+197.856016035" 
observedRunningTime="2026-04-22 19:09:49.839295785 +0000 UTC m=+198.204710615" watchObservedRunningTime="2026-04-22 19:09:49.840007454 +0000 UTC m=+198.205422289" Apr 22 19:09:55.216277 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:55.216244 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-64b5fdf68c-flr2n"] Apr 22 19:09:55.222109 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:55.222082 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64b5fdf68c-flr2n" Apr 22 19:09:55.252198 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:55.252176 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64b5fdf68c-flr2n"] Apr 22 19:09:55.275457 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:55.275424 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 19:09:55.396414 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:55.396373 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/289b4459-3cc0-4deb-bf37-4c251c4021d5-console-config\") pod \"console-64b5fdf68c-flr2n\" (UID: \"289b4459-3cc0-4deb-bf37-4c251c4021d5\") " pod="openshift-console/console-64b5fdf68c-flr2n" Apr 22 19:09:55.396588 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:55.396445 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/289b4459-3cc0-4deb-bf37-4c251c4021d5-service-ca\") pod \"console-64b5fdf68c-flr2n\" (UID: \"289b4459-3cc0-4deb-bf37-4c251c4021d5\") " pod="openshift-console/console-64b5fdf68c-flr2n" Apr 22 19:09:55.396588 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:55.396539 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/289b4459-3cc0-4deb-bf37-4c251c4021d5-console-serving-cert\") pod \"console-64b5fdf68c-flr2n\" (UID: \"289b4459-3cc0-4deb-bf37-4c251c4021d5\") " pod="openshift-console/console-64b5fdf68c-flr2n" Apr 22 19:09:55.396588 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:55.396569 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/289b4459-3cc0-4deb-bf37-4c251c4021d5-oauth-serving-cert\") pod \"console-64b5fdf68c-flr2n\" (UID: \"289b4459-3cc0-4deb-bf37-4c251c4021d5\") " pod="openshift-console/console-64b5fdf68c-flr2n" Apr 22 19:09:55.396749 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:55.396603 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/289b4459-3cc0-4deb-bf37-4c251c4021d5-trusted-ca-bundle\") pod \"console-64b5fdf68c-flr2n\" (UID: \"289b4459-3cc0-4deb-bf37-4c251c4021d5\") " pod="openshift-console/console-64b5fdf68c-flr2n" Apr 22 19:09:55.396749 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:55.396666 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/289b4459-3cc0-4deb-bf37-4c251c4021d5-console-oauth-config\") pod \"console-64b5fdf68c-flr2n\" (UID: \"289b4459-3cc0-4deb-bf37-4c251c4021d5\") " pod="openshift-console/console-64b5fdf68c-flr2n" Apr 22 19:09:55.396749 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:55.396762 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfrjn\" (UniqueName: \"kubernetes.io/projected/289b4459-3cc0-4deb-bf37-4c251c4021d5-kube-api-access-rfrjn\") pod \"console-64b5fdf68c-flr2n\" (UID: \"289b4459-3cc0-4deb-bf37-4c251c4021d5\") " pod="openshift-console/console-64b5fdf68c-flr2n" Apr 22 19:09:55.498077 
ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:55.497984 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/289b4459-3cc0-4deb-bf37-4c251c4021d5-console-serving-cert\") pod \"console-64b5fdf68c-flr2n\" (UID: \"289b4459-3cc0-4deb-bf37-4c251c4021d5\") " pod="openshift-console/console-64b5fdf68c-flr2n" Apr 22 19:09:55.498077 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:55.498050 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/289b4459-3cc0-4deb-bf37-4c251c4021d5-oauth-serving-cert\") pod \"console-64b5fdf68c-flr2n\" (UID: \"289b4459-3cc0-4deb-bf37-4c251c4021d5\") " pod="openshift-console/console-64b5fdf68c-flr2n" Apr 22 19:09:55.498296 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:55.498092 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/289b4459-3cc0-4deb-bf37-4c251c4021d5-trusted-ca-bundle\") pod \"console-64b5fdf68c-flr2n\" (UID: \"289b4459-3cc0-4deb-bf37-4c251c4021d5\") " pod="openshift-console/console-64b5fdf68c-flr2n" Apr 22 19:09:55.498296 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:55.498121 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/289b4459-3cc0-4deb-bf37-4c251c4021d5-console-oauth-config\") pod \"console-64b5fdf68c-flr2n\" (UID: \"289b4459-3cc0-4deb-bf37-4c251c4021d5\") " pod="openshift-console/console-64b5fdf68c-flr2n" Apr 22 19:09:55.498296 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:55.498188 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rfrjn\" (UniqueName: \"kubernetes.io/projected/289b4459-3cc0-4deb-bf37-4c251c4021d5-kube-api-access-rfrjn\") pod \"console-64b5fdf68c-flr2n\" (UID: 
\"289b4459-3cc0-4deb-bf37-4c251c4021d5\") " pod="openshift-console/console-64b5fdf68c-flr2n" Apr 22 19:09:55.498296 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:55.498227 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/289b4459-3cc0-4deb-bf37-4c251c4021d5-console-config\") pod \"console-64b5fdf68c-flr2n\" (UID: \"289b4459-3cc0-4deb-bf37-4c251c4021d5\") " pod="openshift-console/console-64b5fdf68c-flr2n" Apr 22 19:09:55.498296 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:55.498286 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/289b4459-3cc0-4deb-bf37-4c251c4021d5-service-ca\") pod \"console-64b5fdf68c-flr2n\" (UID: \"289b4459-3cc0-4deb-bf37-4c251c4021d5\") " pod="openshift-console/console-64b5fdf68c-flr2n" Apr 22 19:09:55.499028 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:55.498992 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/289b4459-3cc0-4deb-bf37-4c251c4021d5-oauth-serving-cert\") pod \"console-64b5fdf68c-flr2n\" (UID: \"289b4459-3cc0-4deb-bf37-4c251c4021d5\") " pod="openshift-console/console-64b5fdf68c-flr2n" Apr 22 19:09:55.499198 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:55.499028 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/289b4459-3cc0-4deb-bf37-4c251c4021d5-trusted-ca-bundle\") pod \"console-64b5fdf68c-flr2n\" (UID: \"289b4459-3cc0-4deb-bf37-4c251c4021d5\") " pod="openshift-console/console-64b5fdf68c-flr2n" Apr 22 19:09:55.499198 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:55.499110 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/289b4459-3cc0-4deb-bf37-4c251c4021d5-service-ca\") pod 
\"console-64b5fdf68c-flr2n\" (UID: \"289b4459-3cc0-4deb-bf37-4c251c4021d5\") " pod="openshift-console/console-64b5fdf68c-flr2n" Apr 22 19:09:55.499198 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:55.499151 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/289b4459-3cc0-4deb-bf37-4c251c4021d5-console-config\") pod \"console-64b5fdf68c-flr2n\" (UID: \"289b4459-3cc0-4deb-bf37-4c251c4021d5\") " pod="openshift-console/console-64b5fdf68c-flr2n" Apr 22 19:09:55.501210 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:55.501186 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/289b4459-3cc0-4deb-bf37-4c251c4021d5-console-serving-cert\") pod \"console-64b5fdf68c-flr2n\" (UID: \"289b4459-3cc0-4deb-bf37-4c251c4021d5\") " pod="openshift-console/console-64b5fdf68c-flr2n" Apr 22 19:09:55.501325 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:55.501247 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/289b4459-3cc0-4deb-bf37-4c251c4021d5-console-oauth-config\") pod \"console-64b5fdf68c-flr2n\" (UID: \"289b4459-3cc0-4deb-bf37-4c251c4021d5\") " pod="openshift-console/console-64b5fdf68c-flr2n" Apr 22 19:09:55.507680 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:55.507652 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfrjn\" (UniqueName: \"kubernetes.io/projected/289b4459-3cc0-4deb-bf37-4c251c4021d5-kube-api-access-rfrjn\") pod \"console-64b5fdf68c-flr2n\" (UID: \"289b4459-3cc0-4deb-bf37-4c251c4021d5\") " pod="openshift-console/console-64b5fdf68c-flr2n" Apr 22 19:09:55.533651 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:55.533618 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-64b5fdf68c-flr2n" Apr 22 19:09:55.890457 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:55.890374 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-58f8d5b7b4-9rfkp" Apr 22 19:09:55.890457 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:55.890419 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-58f8d5b7b4-9rfkp" Apr 22 19:09:55.895861 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:55.895838 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-58f8d5b7b4-9rfkp" Apr 22 19:09:56.838311 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:09:56.838283 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-58f8d5b7b4-9rfkp" Apr 22 19:10:02.216897 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:02.216870 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64b5fdf68c-flr2n"] Apr 22 19:10:02.220325 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:10:02.220285 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod289b4459_3cc0_4deb_bf37_4c251c4021d5.slice/crio-5f62da6544881e10bd1bfdae76c2d3e7269792918f7378056df7a746884c9da7 WatchSource:0}: Error finding container 5f62da6544881e10bd1bfdae76c2d3e7269792918f7378056df7a746884c9da7: Status 404 returned error can't find the container with id 5f62da6544881e10bd1bfdae76c2d3e7269792918f7378056df7a746884c9da7 Apr 22 19:10:02.853579 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:02.853542 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64b5fdf68c-flr2n" event={"ID":"289b4459-3cc0-4deb-bf37-4c251c4021d5","Type":"ContainerStarted","Data":"3946a2156ee67b9d8436a9da5edfb3ce1469d7e82f5481b5e69d76bfbad0523b"} Apr 22 19:10:02.853848 
ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:02.853804 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64b5fdf68c-flr2n" event={"ID":"289b4459-3cc0-4deb-bf37-4c251c4021d5","Type":"ContainerStarted","Data":"5f62da6544881e10bd1bfdae76c2d3e7269792918f7378056df7a746884c9da7"} Apr 22 19:10:02.855133 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:02.855103 2569 generic.go:358] "Generic (PLEG): container finished" podID="5676ee9e-cd52-496c-a3cc-32f120c108d4" containerID="02f9f0cfb358771fa22690438d84c815450a16728bf441b531f0b9b933107691" exitCode=0 Apr 22 19:10:02.855235 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:02.855188 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k6rkf" event={"ID":"5676ee9e-cd52-496c-a3cc-32f120c108d4","Type":"ContainerDied","Data":"02f9f0cfb358771fa22690438d84c815450a16728bf441b531f0b9b933107691"} Apr 22 19:10:02.855556 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:02.855536 2569 scope.go:117] "RemoveContainer" containerID="02f9f0cfb358771fa22690438d84c815450a16728bf441b531f0b9b933107691" Apr 22 19:10:02.856964 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:02.856637 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-xjm45" event={"ID":"2639e77b-b608-4425-a733-a7915361daa5","Type":"ContainerStarted","Data":"86c185d0405e755a9a6b6a172d25287c394b5577fd6ecb1a8cf0ef6c43e7e585"} Apr 22 19:10:02.857132 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:02.857089 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-xjm45" Apr 22 19:10:02.871246 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:02.871222 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-xjm45" Apr 22 19:10:02.878588 ip-10-0-134-22 kubenswrapper[2569]: I0422 
19:10:02.878531 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-64b5fdf68c-flr2n" podStartSLOduration=7.878517357 podStartE2EDuration="7.878517357s" podCreationTimestamp="2026-04-22 19:09:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:10:02.876870182 +0000 UTC m=+211.242285018" watchObservedRunningTime="2026-04-22 19:10:02.878517357 +0000 UTC m=+211.243932192" Apr 22 19:10:02.897422 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:02.897375 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-xjm45" podStartSLOduration=1.295379657 podStartE2EDuration="22.897359287s" podCreationTimestamp="2026-04-22 19:09:40 +0000 UTC" firstStartedPulling="2026-04-22 19:09:40.554143539 +0000 UTC m=+188.919558367" lastFinishedPulling="2026-04-22 19:10:02.156123174 +0000 UTC m=+210.521537997" observedRunningTime="2026-04-22 19:10:02.896110526 +0000 UTC m=+211.261525552" watchObservedRunningTime="2026-04-22 19:10:02.897359287 +0000 UTC m=+211.262774124" Apr 22 19:10:03.862220 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:03.862179 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-k6rkf" event={"ID":"5676ee9e-cd52-496c-a3cc-32f120c108d4","Type":"ContainerStarted","Data":"b85c03132361e75df25e85dd57656be8cb5035efde809ed64e4e14eba2c38c30"} Apr 22 19:10:05.534813 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:05.534766 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-64b5fdf68c-flr2n" Apr 22 19:10:05.534813 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:05.534817 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-64b5fdf68c-flr2n" Apr 22 
19:10:05.540349 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:05.540321 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-64b5fdf68c-flr2n" Apr 22 19:10:05.876559 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:05.876471 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-64b5fdf68c-flr2n" Apr 22 19:10:05.924227 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:05.924160 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-58f8d5b7b4-9rfkp"] Apr 22 19:10:11.891338 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:11.891258 2569 generic.go:358] "Generic (PLEG): container finished" podID="0886168b-fb42-4ca3-81f5-2dabb41537e9" containerID="5bcb243678b43e74160ccf21b44ab0c6b68a217558d88dd65380603a029cb31c" exitCode=0 Apr 22 19:10:11.891729 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:11.891338 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-w489z" event={"ID":"0886168b-fb42-4ca3-81f5-2dabb41537e9","Type":"ContainerDied","Data":"5bcb243678b43e74160ccf21b44ab0c6b68a217558d88dd65380603a029cb31c"} Apr 22 19:10:11.891837 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:11.891820 2569 scope.go:117] "RemoveContainer" containerID="5bcb243678b43e74160ccf21b44ab0c6b68a217558d88dd65380603a029cb31c" Apr 22 19:10:12.896546 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:12.896511 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-w489z" event={"ID":"0886168b-fb42-4ca3-81f5-2dabb41537e9","Type":"ContainerStarted","Data":"86fe38d6044c728d7a476d98c9d87a3e92277de4c6c3357ca10a296f7b00f8ce"} Apr 22 19:10:17.911488 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:17.911406 2569 generic.go:358] "Generic (PLEG): container finished" podID="64cf0725-ca9c-41d6-a4dd-8d9f1b74ed71" 
containerID="c5b5f5e696cd81b68240ce5b4d5b91884616a1226909ced520533e99358b6cbc" exitCode=0 Apr 22 19:10:17.911900 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:17.911481 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qhjdq" event={"ID":"64cf0725-ca9c-41d6-a4dd-8d9f1b74ed71","Type":"ContainerDied","Data":"c5b5f5e696cd81b68240ce5b4d5b91884616a1226909ced520533e99358b6cbc"} Apr 22 19:10:17.911900 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:17.911805 2569 scope.go:117] "RemoveContainer" containerID="c5b5f5e696cd81b68240ce5b4d5b91884616a1226909ced520533e99358b6cbc" Apr 22 19:10:18.916483 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:18.916452 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qhjdq" event={"ID":"64cf0725-ca9c-41d6-a4dd-8d9f1b74ed71","Type":"ContainerStarted","Data":"531b350d98d18952ac97c0754c0e35b7fe0e55c9459c39b4551daa91c4a2f295"} Apr 22 19:10:30.951652 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:30.951596 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-58f8d5b7b4-9rfkp" podUID="b2de1309-5e07-42e4-af50-bd20f6e1989f" containerName="console" containerID="cri-o://81b6f646025339e7edee30c6a3827db7c5817c1202d6a93e5272ca94f87afceb" gracePeriod=15 Apr 22 19:10:31.219784 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:31.219735 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-58f8d5b7b4-9rfkp_b2de1309-5e07-42e4-af50-bd20f6e1989f/console/0.log" Apr 22 19:10:31.219902 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:31.219802 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-58f8d5b7b4-9rfkp" Apr 22 19:10:31.322974 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:31.322941 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b2de1309-5e07-42e4-af50-bd20f6e1989f-console-oauth-config\") pod \"b2de1309-5e07-42e4-af50-bd20f6e1989f\" (UID: \"b2de1309-5e07-42e4-af50-bd20f6e1989f\") " Apr 22 19:10:31.323122 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:31.323023 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b2de1309-5e07-42e4-af50-bd20f6e1989f-oauth-serving-cert\") pod \"b2de1309-5e07-42e4-af50-bd20f6e1989f\" (UID: \"b2de1309-5e07-42e4-af50-bd20f6e1989f\") " Apr 22 19:10:31.323122 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:31.323056 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b2de1309-5e07-42e4-af50-bd20f6e1989f-service-ca\") pod \"b2de1309-5e07-42e4-af50-bd20f6e1989f\" (UID: \"b2de1309-5e07-42e4-af50-bd20f6e1989f\") " Apr 22 19:10:31.323122 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:31.323077 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b2de1309-5e07-42e4-af50-bd20f6e1989f-console-config\") pod \"b2de1309-5e07-42e4-af50-bd20f6e1989f\" (UID: \"b2de1309-5e07-42e4-af50-bd20f6e1989f\") " Apr 22 19:10:31.323122 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:31.323105 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b2de1309-5e07-42e4-af50-bd20f6e1989f-console-serving-cert\") pod \"b2de1309-5e07-42e4-af50-bd20f6e1989f\" (UID: \"b2de1309-5e07-42e4-af50-bd20f6e1989f\") " Apr 22 19:10:31.323274 ip-10-0-134-22 
kubenswrapper[2569]: I0422 19:10:31.323124 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8m24l\" (UniqueName: \"kubernetes.io/projected/b2de1309-5e07-42e4-af50-bd20f6e1989f-kube-api-access-8m24l\") pod \"b2de1309-5e07-42e4-af50-bd20f6e1989f\" (UID: \"b2de1309-5e07-42e4-af50-bd20f6e1989f\") " Apr 22 19:10:31.323465 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:31.323435 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2de1309-5e07-42e4-af50-bd20f6e1989f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b2de1309-5e07-42e4-af50-bd20f6e1989f" (UID: "b2de1309-5e07-42e4-af50-bd20f6e1989f"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:10:31.323589 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:31.323467 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2de1309-5e07-42e4-af50-bd20f6e1989f-service-ca" (OuterVolumeSpecName: "service-ca") pod "b2de1309-5e07-42e4-af50-bd20f6e1989f" (UID: "b2de1309-5e07-42e4-af50-bd20f6e1989f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:10:31.323589 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:31.323479 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2de1309-5e07-42e4-af50-bd20f6e1989f-console-config" (OuterVolumeSpecName: "console-config") pod "b2de1309-5e07-42e4-af50-bd20f6e1989f" (UID: "b2de1309-5e07-42e4-af50-bd20f6e1989f"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:10:31.325444 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:31.325416 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2de1309-5e07-42e4-af50-bd20f6e1989f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b2de1309-5e07-42e4-af50-bd20f6e1989f" (UID: "b2de1309-5e07-42e4-af50-bd20f6e1989f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:10:31.325523 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:31.325456 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2de1309-5e07-42e4-af50-bd20f6e1989f-kube-api-access-8m24l" (OuterVolumeSpecName: "kube-api-access-8m24l") pod "b2de1309-5e07-42e4-af50-bd20f6e1989f" (UID: "b2de1309-5e07-42e4-af50-bd20f6e1989f"). InnerVolumeSpecName "kube-api-access-8m24l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:10:31.325523 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:31.325472 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2de1309-5e07-42e4-af50-bd20f6e1989f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b2de1309-5e07-42e4-af50-bd20f6e1989f" (UID: "b2de1309-5e07-42e4-af50-bd20f6e1989f"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:10:31.424164 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:31.424131 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b2de1309-5e07-42e4-af50-bd20f6e1989f-service-ca\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:10:31.424164 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:31.424157 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b2de1309-5e07-42e4-af50-bd20f6e1989f-console-config\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:10:31.424164 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:31.424166 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b2de1309-5e07-42e4-af50-bd20f6e1989f-console-serving-cert\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:10:31.424384 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:31.424174 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8m24l\" (UniqueName: \"kubernetes.io/projected/b2de1309-5e07-42e4-af50-bd20f6e1989f-kube-api-access-8m24l\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:10:31.424384 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:31.424183 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b2de1309-5e07-42e4-af50-bd20f6e1989f-console-oauth-config\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:10:31.424384 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:31.424192 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b2de1309-5e07-42e4-af50-bd20f6e1989f-oauth-serving-cert\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:10:31.956915 ip-10-0-134-22 
kubenswrapper[2569]: I0422 19:10:31.956887 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-58f8d5b7b4-9rfkp_b2de1309-5e07-42e4-af50-bd20f6e1989f/console/0.log" Apr 22 19:10:31.957271 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:31.956925 2569 generic.go:358] "Generic (PLEG): container finished" podID="b2de1309-5e07-42e4-af50-bd20f6e1989f" containerID="81b6f646025339e7edee30c6a3827db7c5817c1202d6a93e5272ca94f87afceb" exitCode=2 Apr 22 19:10:31.957271 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:31.956988 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58f8d5b7b4-9rfkp" Apr 22 19:10:31.957271 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:31.957002 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58f8d5b7b4-9rfkp" event={"ID":"b2de1309-5e07-42e4-af50-bd20f6e1989f","Type":"ContainerDied","Data":"81b6f646025339e7edee30c6a3827db7c5817c1202d6a93e5272ca94f87afceb"} Apr 22 19:10:31.957271 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:31.957028 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58f8d5b7b4-9rfkp" event={"ID":"b2de1309-5e07-42e4-af50-bd20f6e1989f","Type":"ContainerDied","Data":"9abd51b4d15136ee9f73b6684b032097e6b8c89f63de127606ecf50e45597628"} Apr 22 19:10:31.957271 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:31.957053 2569 scope.go:117] "RemoveContainer" containerID="81b6f646025339e7edee30c6a3827db7c5817c1202d6a93e5272ca94f87afceb" Apr 22 19:10:31.965521 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:31.965506 2569 scope.go:117] "RemoveContainer" containerID="81b6f646025339e7edee30c6a3827db7c5817c1202d6a93e5272ca94f87afceb" Apr 22 19:10:31.965748 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:10:31.965728 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"81b6f646025339e7edee30c6a3827db7c5817c1202d6a93e5272ca94f87afceb\": container with ID starting with 81b6f646025339e7edee30c6a3827db7c5817c1202d6a93e5272ca94f87afceb not found: ID does not exist" containerID="81b6f646025339e7edee30c6a3827db7c5817c1202d6a93e5272ca94f87afceb" Apr 22 19:10:31.965818 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:31.965775 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81b6f646025339e7edee30c6a3827db7c5817c1202d6a93e5272ca94f87afceb"} err="failed to get container status \"81b6f646025339e7edee30c6a3827db7c5817c1202d6a93e5272ca94f87afceb\": rpc error: code = NotFound desc = could not find container \"81b6f646025339e7edee30c6a3827db7c5817c1202d6a93e5272ca94f87afceb\": container with ID starting with 81b6f646025339e7edee30c6a3827db7c5817c1202d6a93e5272ca94f87afceb not found: ID does not exist" Apr 22 19:10:31.978303 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:31.978277 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-58f8d5b7b4-9rfkp"] Apr 22 19:10:31.985866 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:31.983387 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-58f8d5b7b4-9rfkp"] Apr 22 19:10:32.222609 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:32.222549 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2de1309-5e07-42e4-af50-bd20f6e1989f" path="/var/lib/kubelet/pods/b2de1309-5e07-42e4-af50-bd20f6e1989f/volumes" Apr 22 19:10:43.927955 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:43.927912 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46a3468d-b017-471c-a0df-a07b1c183ff4-metrics-certs\") pod \"network-metrics-daemon-2r8qp\" (UID: \"46a3468d-b017-471c-a0df-a07b1c183ff4\") " pod="openshift-multus/network-metrics-daemon-2r8qp" Apr 22 19:10:43.930280 ip-10-0-134-22 kubenswrapper[2569]: 
I0422 19:10:43.930260 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46a3468d-b017-471c-a0df-a07b1c183ff4-metrics-certs\") pod \"network-metrics-daemon-2r8qp\" (UID: \"46a3468d-b017-471c-a0df-a07b1c183ff4\") " pod="openshift-multus/network-metrics-daemon-2r8qp" Apr 22 19:10:44.122465 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:44.122437 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zjzgr\"" Apr 22 19:10:44.130463 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:44.130439 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2r8qp" Apr 22 19:10:44.251591 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:44.251511 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2r8qp"] Apr 22 19:10:44.254267 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:10:44.254225 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46a3468d_b017_471c_a0df_a07b1c183ff4.slice/crio-7be9f1c1b6a8c2c275812bdfe0f1c558b24b9111dfb0b0c6a828f949f79716a1 WatchSource:0}: Error finding container 7be9f1c1b6a8c2c275812bdfe0f1c558b24b9111dfb0b0c6a828f949f79716a1: Status 404 returned error can't find the container with id 7be9f1c1b6a8c2c275812bdfe0f1c558b24b9111dfb0b0c6a828f949f79716a1 Apr 22 19:10:44.994563 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:44.994526 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2r8qp" event={"ID":"46a3468d-b017-471c-a0df-a07b1c183ff4","Type":"ContainerStarted","Data":"7be9f1c1b6a8c2c275812bdfe0f1c558b24b9111dfb0b0c6a828f949f79716a1"} Apr 22 19:10:45.999703 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:45.999668 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-2r8qp" event={"ID":"46a3468d-b017-471c-a0df-a07b1c183ff4","Type":"ContainerStarted","Data":"793c23927ac010c9dd5207db49634ad7713c86983d069b636bf68e512592d80d"} Apr 22 19:10:45.999703 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:45.999706 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2r8qp" event={"ID":"46a3468d-b017-471c-a0df-a07b1c183ff4","Type":"ContainerStarted","Data":"065a7870936a3d6cd061fe99a1612190e5cefb95423d4cc88943cf0aefec5cd2"} Apr 22 19:10:47.052398 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:47.052347 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-2r8qp" podStartSLOduration=253.657469688 podStartE2EDuration="4m15.052332935s" podCreationTimestamp="2026-04-22 19:06:32 +0000 UTC" firstStartedPulling="2026-04-22 19:10:44.256235757 +0000 UTC m=+252.621650572" lastFinishedPulling="2026-04-22 19:10:45.651099005 +0000 UTC m=+254.016513819" observedRunningTime="2026-04-22 19:10:47.049768483 +0000 UTC m=+255.415183310" watchObservedRunningTime="2026-04-22 19:10:47.052332935 +0000 UTC m=+255.417747770" Apr 22 19:10:49.014206 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.014170 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 19:10:49.014626 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.014581 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="f467273d-4319-4e2b-8284-913a4c5ddd7c" containerName="alertmanager" containerID="cri-o://f06356a56b6859b6621cab2ce7c030987dc57eabda027cca39ca8e5d6730162f" gracePeriod=120 Apr 22 19:10:49.014682 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.014641 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" 
podUID="f467273d-4319-4e2b-8284-913a4c5ddd7c" containerName="kube-rbac-proxy-metric" containerID="cri-o://18dfc85fb1063861979fe1008e5914c44da9f25aa8658b1652270cd5055bb61f" gracePeriod=120 Apr 22 19:10:49.014739 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.014671 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="f467273d-4319-4e2b-8284-913a4c5ddd7c" containerName="kube-rbac-proxy-web" containerID="cri-o://84f57462a9478a1643a764253aaf6dbe154596c3cf28d80d5cef47036d6493f4" gracePeriod=120 Apr 22 19:10:49.014739 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.014691 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="f467273d-4319-4e2b-8284-913a4c5ddd7c" containerName="prom-label-proxy" containerID="cri-o://ca64eeb9d70f52dcdd1fe7b27714a885fa860d789643a419b4d51192b1dbba99" gracePeriod=120 Apr 22 19:10:49.014871 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.014735 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="f467273d-4319-4e2b-8284-913a4c5ddd7c" containerName="kube-rbac-proxy" containerID="cri-o://11de86327313697b3e69ad892e8264c00cbbe33b683d76708c158eba4649c0b0" gracePeriod=120 Apr 22 19:10:49.014871 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.014775 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="f467273d-4319-4e2b-8284-913a4c5ddd7c" containerName="config-reloader" containerID="cri-o://753f8a2320fca29ce14bc4a0de8186c531389275c93f89ddc98884804a80bca6" gracePeriod=120 Apr 22 19:10:49.260381 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.260356 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:49.374818 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.374728 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwfqg\" (UniqueName: \"kubernetes.io/projected/f467273d-4319-4e2b-8284-913a4c5ddd7c-kube-api-access-dwfqg\") pod \"f467273d-4319-4e2b-8284-913a4c5ddd7c\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " Apr 22 19:10:49.374818 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.374805 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f467273d-4319-4e2b-8284-913a4c5ddd7c-alertmanager-main-db\") pod \"f467273d-4319-4e2b-8284-913a4c5ddd7c\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " Apr 22 19:10:49.375008 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.374833 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f467273d-4319-4e2b-8284-913a4c5ddd7c-tls-assets\") pod \"f467273d-4319-4e2b-8284-913a4c5ddd7c\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " Apr 22 19:10:49.375008 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.374931 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f467273d-4319-4e2b-8284-913a4c5ddd7c-metrics-client-ca\") pod \"f467273d-4319-4e2b-8284-913a4c5ddd7c\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " Apr 22 19:10:49.375008 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.374993 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-secret-alertmanager-kube-rbac-proxy\") pod \"f467273d-4319-4e2b-8284-913a4c5ddd7c\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " Apr 
22 19:10:49.375170 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.375028 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-web-config\") pod \"f467273d-4319-4e2b-8284-913a4c5ddd7c\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " Apr 22 19:10:49.375170 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.375082 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-secret-alertmanager-main-tls\") pod \"f467273d-4319-4e2b-8284-913a4c5ddd7c\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " Apr 22 19:10:49.375170 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.375110 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-cluster-tls-config\") pod \"f467273d-4319-4e2b-8284-913a4c5ddd7c\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " Apr 22 19:10:49.375170 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.375140 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f467273d-4319-4e2b-8284-913a4c5ddd7c-alertmanager-trusted-ca-bundle\") pod \"f467273d-4319-4e2b-8284-913a4c5ddd7c\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " Apr 22 19:10:49.375363 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.375194 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-config-volume\") pod \"f467273d-4319-4e2b-8284-913a4c5ddd7c\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " Apr 22 19:10:49.375363 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.375152 2569 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f467273d-4319-4e2b-8284-913a4c5ddd7c-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "f467273d-4319-4e2b-8284-913a4c5ddd7c" (UID: "f467273d-4319-4e2b-8284-913a4c5ddd7c"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:10:49.375363 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.375251 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-secret-alertmanager-kube-rbac-proxy-web\") pod \"f467273d-4319-4e2b-8284-913a4c5ddd7c\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " Apr 22 19:10:49.375363 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.375277 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"f467273d-4319-4e2b-8284-913a4c5ddd7c\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " Apr 22 19:10:49.375363 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.375306 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f467273d-4319-4e2b-8284-913a4c5ddd7c-config-out\") pod \"f467273d-4319-4e2b-8284-913a4c5ddd7c\" (UID: \"f467273d-4319-4e2b-8284-913a4c5ddd7c\") " Apr 22 19:10:49.375363 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.375336 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f467273d-4319-4e2b-8284-913a4c5ddd7c-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "f467273d-4319-4e2b-8284-913a4c5ddd7c" (UID: "f467273d-4319-4e2b-8284-913a4c5ddd7c"). 
InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:10:49.375662 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.375553 2569 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f467273d-4319-4e2b-8284-913a4c5ddd7c-alertmanager-main-db\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:10:49.375662 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.375573 2569 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f467273d-4319-4e2b-8284-913a4c5ddd7c-metrics-client-ca\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:10:49.376449 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.376419 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f467273d-4319-4e2b-8284-913a4c5ddd7c-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "f467273d-4319-4e2b-8284-913a4c5ddd7c" (UID: "f467273d-4319-4e2b-8284-913a4c5ddd7c"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:10:49.377925 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.377895 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f467273d-4319-4e2b-8284-913a4c5ddd7c-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "f467273d-4319-4e2b-8284-913a4c5ddd7c" (UID: "f467273d-4319-4e2b-8284-913a4c5ddd7c"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:10:49.378368 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.378335 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f467273d-4319-4e2b-8284-913a4c5ddd7c-kube-api-access-dwfqg" (OuterVolumeSpecName: "kube-api-access-dwfqg") pod "f467273d-4319-4e2b-8284-913a4c5ddd7c" (UID: "f467273d-4319-4e2b-8284-913a4c5ddd7c"). InnerVolumeSpecName "kube-api-access-dwfqg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:10:49.378687 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.378629 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-config-volume" (OuterVolumeSpecName: "config-volume") pod "f467273d-4319-4e2b-8284-913a4c5ddd7c" (UID: "f467273d-4319-4e2b-8284-913a4c5ddd7c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:10:49.378807 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.378779 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "f467273d-4319-4e2b-8284-913a4c5ddd7c" (UID: "f467273d-4319-4e2b-8284-913a4c5ddd7c"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:10:49.378971 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.378945 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "f467273d-4319-4e2b-8284-913a4c5ddd7c" (UID: "f467273d-4319-4e2b-8284-913a4c5ddd7c"). InnerVolumeSpecName "secret-alertmanager-main-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:10:49.379413 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.379394 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "f467273d-4319-4e2b-8284-913a4c5ddd7c" (UID: "f467273d-4319-4e2b-8284-913a4c5ddd7c"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:10:49.380029 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.380007 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f467273d-4319-4e2b-8284-913a4c5ddd7c-config-out" (OuterVolumeSpecName: "config-out") pod "f467273d-4319-4e2b-8284-913a4c5ddd7c" (UID: "f467273d-4319-4e2b-8284-913a4c5ddd7c"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:10:49.380108 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.380005 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "f467273d-4319-4e2b-8284-913a4c5ddd7c" (UID: "f467273d-4319-4e2b-8284-913a4c5ddd7c"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:10:49.383382 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.383232 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "f467273d-4319-4e2b-8284-913a4c5ddd7c" (UID: "f467273d-4319-4e2b-8284-913a4c5ddd7c"). InnerVolumeSpecName "cluster-tls-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:10:49.389129 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.389105 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-web-config" (OuterVolumeSpecName: "web-config") pod "f467273d-4319-4e2b-8284-913a4c5ddd7c" (UID: "f467273d-4319-4e2b-8284-913a4c5ddd7c"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:10:49.476942 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.476912 2569 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-cluster-tls-config\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:10:49.476942 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.476939 2569 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f467273d-4319-4e2b-8284-913a4c5ddd7c-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:10:49.477120 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.476951 2569 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-config-volume\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:10:49.477120 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.476962 2569 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:10:49.477120 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.476972 2569 reconciler_common.go:299] "Volume detached for volume 
\"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:10:49.477120 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.476981 2569 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f467273d-4319-4e2b-8284-913a4c5ddd7c-config-out\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:10:49.477120 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.476990 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dwfqg\" (UniqueName: \"kubernetes.io/projected/f467273d-4319-4e2b-8284-913a4c5ddd7c-kube-api-access-dwfqg\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:10:49.477120 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.476998 2569 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f467273d-4319-4e2b-8284-913a4c5ddd7c-tls-assets\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:10:49.477120 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.477006 2569 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:10:49.477120 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.477016 2569 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-web-config\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:10:49.477120 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:49.477024 2569 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/f467273d-4319-4e2b-8284-913a4c5ddd7c-secret-alertmanager-main-tls\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:10:50.013596 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.013564 2569 generic.go:358] "Generic (PLEG): container finished" podID="f467273d-4319-4e2b-8284-913a4c5ddd7c" containerID="ca64eeb9d70f52dcdd1fe7b27714a885fa860d789643a419b4d51192b1dbba99" exitCode=0 Apr 22 19:10:50.013596 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.013592 2569 generic.go:358] "Generic (PLEG): container finished" podID="f467273d-4319-4e2b-8284-913a4c5ddd7c" containerID="18dfc85fb1063861979fe1008e5914c44da9f25aa8658b1652270cd5055bb61f" exitCode=0 Apr 22 19:10:50.013596 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.013599 2569 generic.go:358] "Generic (PLEG): container finished" podID="f467273d-4319-4e2b-8284-913a4c5ddd7c" containerID="11de86327313697b3e69ad892e8264c00cbbe33b683d76708c158eba4649c0b0" exitCode=0 Apr 22 19:10:50.013596 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.013604 2569 generic.go:358] "Generic (PLEG): container finished" podID="f467273d-4319-4e2b-8284-913a4c5ddd7c" containerID="84f57462a9478a1643a764253aaf6dbe154596c3cf28d80d5cef47036d6493f4" exitCode=0 Apr 22 19:10:50.013890 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.013610 2569 generic.go:358] "Generic (PLEG): container finished" podID="f467273d-4319-4e2b-8284-913a4c5ddd7c" containerID="753f8a2320fca29ce14bc4a0de8186c531389275c93f89ddc98884804a80bca6" exitCode=0 Apr 22 19:10:50.013890 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.013615 2569 generic.go:358] "Generic (PLEG): container finished" podID="f467273d-4319-4e2b-8284-913a4c5ddd7c" containerID="f06356a56b6859b6621cab2ce7c030987dc57eabda027cca39ca8e5d6730162f" exitCode=0 Apr 22 19:10:50.013890 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.013648 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"f467273d-4319-4e2b-8284-913a4c5ddd7c","Type":"ContainerDied","Data":"ca64eeb9d70f52dcdd1fe7b27714a885fa860d789643a419b4d51192b1dbba99"} Apr 22 19:10:50.013890 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.013688 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f467273d-4319-4e2b-8284-913a4c5ddd7c","Type":"ContainerDied","Data":"18dfc85fb1063861979fe1008e5914c44da9f25aa8658b1652270cd5055bb61f"} Apr 22 19:10:50.013890 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.013691 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.013890 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.013699 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f467273d-4319-4e2b-8284-913a4c5ddd7c","Type":"ContainerDied","Data":"11de86327313697b3e69ad892e8264c00cbbe33b683d76708c158eba4649c0b0"} Apr 22 19:10:50.013890 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.013709 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f467273d-4319-4e2b-8284-913a4c5ddd7c","Type":"ContainerDied","Data":"84f57462a9478a1643a764253aaf6dbe154596c3cf28d80d5cef47036d6493f4"} Apr 22 19:10:50.013890 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.013718 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f467273d-4319-4e2b-8284-913a4c5ddd7c","Type":"ContainerDied","Data":"753f8a2320fca29ce14bc4a0de8186c531389275c93f89ddc98884804a80bca6"} Apr 22 19:10:50.013890 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.013727 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"f467273d-4319-4e2b-8284-913a4c5ddd7c","Type":"ContainerDied","Data":"f06356a56b6859b6621cab2ce7c030987dc57eabda027cca39ca8e5d6730162f"} Apr 22 19:10:50.013890 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.013736 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f467273d-4319-4e2b-8284-913a4c5ddd7c","Type":"ContainerDied","Data":"944bf9f318fc71a72b22f5e15108353bb1b43de82b4f2177ac7e26883ca804d6"} Apr 22 19:10:50.013890 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.013773 2569 scope.go:117] "RemoveContainer" containerID="ca64eeb9d70f52dcdd1fe7b27714a885fa860d789643a419b4d51192b1dbba99" Apr 22 19:10:50.024390 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.024216 2569 scope.go:117] "RemoveContainer" containerID="18dfc85fb1063861979fe1008e5914c44da9f25aa8658b1652270cd5055bb61f" Apr 22 19:10:50.034836 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.034821 2569 scope.go:117] "RemoveContainer" containerID="11de86327313697b3e69ad892e8264c00cbbe33b683d76708c158eba4649c0b0" Apr 22 19:10:50.041370 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.041345 2569 scope.go:117] "RemoveContainer" containerID="84f57462a9478a1643a764253aaf6dbe154596c3cf28d80d5cef47036d6493f4" Apr 22 19:10:50.047290 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.047268 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 19:10:50.048358 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.048333 2569 scope.go:117] "RemoveContainer" containerID="753f8a2320fca29ce14bc4a0de8186c531389275c93f89ddc98884804a80bca6" Apr 22 19:10:50.051361 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.051342 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 19:10:50.055606 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.055586 2569 scope.go:117] "RemoveContainer" 
containerID="f06356a56b6859b6621cab2ce7c030987dc57eabda027cca39ca8e5d6730162f" Apr 22 19:10:50.062249 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.062234 2569 scope.go:117] "RemoveContainer" containerID="71a7d8ee18d02f9784ab508fd75da2ad3e0c305be97b29e8741ee384e685d8b3" Apr 22 19:10:50.069358 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.069340 2569 scope.go:117] "RemoveContainer" containerID="ca64eeb9d70f52dcdd1fe7b27714a885fa860d789643a419b4d51192b1dbba99" Apr 22 19:10:50.069622 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:10:50.069603 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca64eeb9d70f52dcdd1fe7b27714a885fa860d789643a419b4d51192b1dbba99\": container with ID starting with ca64eeb9d70f52dcdd1fe7b27714a885fa860d789643a419b4d51192b1dbba99 not found: ID does not exist" containerID="ca64eeb9d70f52dcdd1fe7b27714a885fa860d789643a419b4d51192b1dbba99" Apr 22 19:10:50.069671 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.069633 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca64eeb9d70f52dcdd1fe7b27714a885fa860d789643a419b4d51192b1dbba99"} err="failed to get container status \"ca64eeb9d70f52dcdd1fe7b27714a885fa860d789643a419b4d51192b1dbba99\": rpc error: code = NotFound desc = could not find container \"ca64eeb9d70f52dcdd1fe7b27714a885fa860d789643a419b4d51192b1dbba99\": container with ID starting with ca64eeb9d70f52dcdd1fe7b27714a885fa860d789643a419b4d51192b1dbba99 not found: ID does not exist" Apr 22 19:10:50.069671 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.069652 2569 scope.go:117] "RemoveContainer" containerID="18dfc85fb1063861979fe1008e5914c44da9f25aa8658b1652270cd5055bb61f" Apr 22 19:10:50.069897 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:10:50.069879 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"18dfc85fb1063861979fe1008e5914c44da9f25aa8658b1652270cd5055bb61f\": container with ID starting with 18dfc85fb1063861979fe1008e5914c44da9f25aa8658b1652270cd5055bb61f not found: ID does not exist" containerID="18dfc85fb1063861979fe1008e5914c44da9f25aa8658b1652270cd5055bb61f" Apr 22 19:10:50.069966 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.069906 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18dfc85fb1063861979fe1008e5914c44da9f25aa8658b1652270cd5055bb61f"} err="failed to get container status \"18dfc85fb1063861979fe1008e5914c44da9f25aa8658b1652270cd5055bb61f\": rpc error: code = NotFound desc = could not find container \"18dfc85fb1063861979fe1008e5914c44da9f25aa8658b1652270cd5055bb61f\": container with ID starting with 18dfc85fb1063861979fe1008e5914c44da9f25aa8658b1652270cd5055bb61f not found: ID does not exist" Apr 22 19:10:50.069966 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.069930 2569 scope.go:117] "RemoveContainer" containerID="11de86327313697b3e69ad892e8264c00cbbe33b683d76708c158eba4649c0b0" Apr 22 19:10:50.070182 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:10:50.070165 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11de86327313697b3e69ad892e8264c00cbbe33b683d76708c158eba4649c0b0\": container with ID starting with 11de86327313697b3e69ad892e8264c00cbbe33b683d76708c158eba4649c0b0 not found: ID does not exist" containerID="11de86327313697b3e69ad892e8264c00cbbe33b683d76708c158eba4649c0b0" Apr 22 19:10:50.070222 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.070197 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11de86327313697b3e69ad892e8264c00cbbe33b683d76708c158eba4649c0b0"} err="failed to get container status \"11de86327313697b3e69ad892e8264c00cbbe33b683d76708c158eba4649c0b0\": rpc error: code = NotFound desc = could not find container 
\"11de86327313697b3e69ad892e8264c00cbbe33b683d76708c158eba4649c0b0\": container with ID starting with 11de86327313697b3e69ad892e8264c00cbbe33b683d76708c158eba4649c0b0 not found: ID does not exist" Apr 22 19:10:50.070222 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.070212 2569 scope.go:117] "RemoveContainer" containerID="84f57462a9478a1643a764253aaf6dbe154596c3cf28d80d5cef47036d6493f4" Apr 22 19:10:50.070405 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:10:50.070391 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84f57462a9478a1643a764253aaf6dbe154596c3cf28d80d5cef47036d6493f4\": container with ID starting with 84f57462a9478a1643a764253aaf6dbe154596c3cf28d80d5cef47036d6493f4 not found: ID does not exist" containerID="84f57462a9478a1643a764253aaf6dbe154596c3cf28d80d5cef47036d6493f4" Apr 22 19:10:50.070454 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.070408 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84f57462a9478a1643a764253aaf6dbe154596c3cf28d80d5cef47036d6493f4"} err="failed to get container status \"84f57462a9478a1643a764253aaf6dbe154596c3cf28d80d5cef47036d6493f4\": rpc error: code = NotFound desc = could not find container \"84f57462a9478a1643a764253aaf6dbe154596c3cf28d80d5cef47036d6493f4\": container with ID starting with 84f57462a9478a1643a764253aaf6dbe154596c3cf28d80d5cef47036d6493f4 not found: ID does not exist" Apr 22 19:10:50.070454 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.070420 2569 scope.go:117] "RemoveContainer" containerID="753f8a2320fca29ce14bc4a0de8186c531389275c93f89ddc98884804a80bca6" Apr 22 19:10:50.070625 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:10:50.070611 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"753f8a2320fca29ce14bc4a0de8186c531389275c93f89ddc98884804a80bca6\": container with ID starting with 
753f8a2320fca29ce14bc4a0de8186c531389275c93f89ddc98884804a80bca6 not found: ID does not exist" containerID="753f8a2320fca29ce14bc4a0de8186c531389275c93f89ddc98884804a80bca6" Apr 22 19:10:50.070662 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.070628 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"753f8a2320fca29ce14bc4a0de8186c531389275c93f89ddc98884804a80bca6"} err="failed to get container status \"753f8a2320fca29ce14bc4a0de8186c531389275c93f89ddc98884804a80bca6\": rpc error: code = NotFound desc = could not find container \"753f8a2320fca29ce14bc4a0de8186c531389275c93f89ddc98884804a80bca6\": container with ID starting with 753f8a2320fca29ce14bc4a0de8186c531389275c93f89ddc98884804a80bca6 not found: ID does not exist" Apr 22 19:10:50.070662 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.070641 2569 scope.go:117] "RemoveContainer" containerID="f06356a56b6859b6621cab2ce7c030987dc57eabda027cca39ca8e5d6730162f" Apr 22 19:10:50.070842 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:10:50.070828 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f06356a56b6859b6621cab2ce7c030987dc57eabda027cca39ca8e5d6730162f\": container with ID starting with f06356a56b6859b6621cab2ce7c030987dc57eabda027cca39ca8e5d6730162f not found: ID does not exist" containerID="f06356a56b6859b6621cab2ce7c030987dc57eabda027cca39ca8e5d6730162f" Apr 22 19:10:50.070891 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.070845 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f06356a56b6859b6621cab2ce7c030987dc57eabda027cca39ca8e5d6730162f"} err="failed to get container status \"f06356a56b6859b6621cab2ce7c030987dc57eabda027cca39ca8e5d6730162f\": rpc error: code = NotFound desc = could not find container \"f06356a56b6859b6621cab2ce7c030987dc57eabda027cca39ca8e5d6730162f\": container with ID starting with 
f06356a56b6859b6621cab2ce7c030987dc57eabda027cca39ca8e5d6730162f not found: ID does not exist" Apr 22 19:10:50.070891 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.070857 2569 scope.go:117] "RemoveContainer" containerID="71a7d8ee18d02f9784ab508fd75da2ad3e0c305be97b29e8741ee384e685d8b3" Apr 22 19:10:50.071050 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:10:50.071036 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71a7d8ee18d02f9784ab508fd75da2ad3e0c305be97b29e8741ee384e685d8b3\": container with ID starting with 71a7d8ee18d02f9784ab508fd75da2ad3e0c305be97b29e8741ee384e685d8b3 not found: ID does not exist" containerID="71a7d8ee18d02f9784ab508fd75da2ad3e0c305be97b29e8741ee384e685d8b3" Apr 22 19:10:50.071084 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.071053 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71a7d8ee18d02f9784ab508fd75da2ad3e0c305be97b29e8741ee384e685d8b3"} err="failed to get container status \"71a7d8ee18d02f9784ab508fd75da2ad3e0c305be97b29e8741ee384e685d8b3\": rpc error: code = NotFound desc = could not find container \"71a7d8ee18d02f9784ab508fd75da2ad3e0c305be97b29e8741ee384e685d8b3\": container with ID starting with 71a7d8ee18d02f9784ab508fd75da2ad3e0c305be97b29e8741ee384e685d8b3 not found: ID does not exist" Apr 22 19:10:50.071084 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.071064 2569 scope.go:117] "RemoveContainer" containerID="ca64eeb9d70f52dcdd1fe7b27714a885fa860d789643a419b4d51192b1dbba99" Apr 22 19:10:50.071244 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.071230 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca64eeb9d70f52dcdd1fe7b27714a885fa860d789643a419b4d51192b1dbba99"} err="failed to get container status \"ca64eeb9d70f52dcdd1fe7b27714a885fa860d789643a419b4d51192b1dbba99\": rpc error: code = NotFound desc = could not find container 
\"ca64eeb9d70f52dcdd1fe7b27714a885fa860d789643a419b4d51192b1dbba99\": container with ID starting with ca64eeb9d70f52dcdd1fe7b27714a885fa860d789643a419b4d51192b1dbba99 not found: ID does not exist" Apr 22 19:10:50.071287 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.071245 2569 scope.go:117] "RemoveContainer" containerID="18dfc85fb1063861979fe1008e5914c44da9f25aa8658b1652270cd5055bb61f" Apr 22 19:10:50.071431 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.071416 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18dfc85fb1063861979fe1008e5914c44da9f25aa8658b1652270cd5055bb61f"} err="failed to get container status \"18dfc85fb1063861979fe1008e5914c44da9f25aa8658b1652270cd5055bb61f\": rpc error: code = NotFound desc = could not find container \"18dfc85fb1063861979fe1008e5914c44da9f25aa8658b1652270cd5055bb61f\": container with ID starting with 18dfc85fb1063861979fe1008e5914c44da9f25aa8658b1652270cd5055bb61f not found: ID does not exist" Apr 22 19:10:50.071431 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.071430 2569 scope.go:117] "RemoveContainer" containerID="11de86327313697b3e69ad892e8264c00cbbe33b683d76708c158eba4649c0b0" Apr 22 19:10:50.071651 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.071631 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11de86327313697b3e69ad892e8264c00cbbe33b683d76708c158eba4649c0b0"} err="failed to get container status \"11de86327313697b3e69ad892e8264c00cbbe33b683d76708c158eba4649c0b0\": rpc error: code = NotFound desc = could not find container \"11de86327313697b3e69ad892e8264c00cbbe33b683d76708c158eba4649c0b0\": container with ID starting with 11de86327313697b3e69ad892e8264c00cbbe33b683d76708c158eba4649c0b0 not found: ID does not exist" Apr 22 19:10:50.071701 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.071655 2569 scope.go:117] "RemoveContainer" 
containerID="84f57462a9478a1643a764253aaf6dbe154596c3cf28d80d5cef47036d6493f4" Apr 22 19:10:50.071859 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.071837 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84f57462a9478a1643a764253aaf6dbe154596c3cf28d80d5cef47036d6493f4"} err="failed to get container status \"84f57462a9478a1643a764253aaf6dbe154596c3cf28d80d5cef47036d6493f4\": rpc error: code = NotFound desc = could not find container \"84f57462a9478a1643a764253aaf6dbe154596c3cf28d80d5cef47036d6493f4\": container with ID starting with 84f57462a9478a1643a764253aaf6dbe154596c3cf28d80d5cef47036d6493f4 not found: ID does not exist" Apr 22 19:10:50.071925 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.071861 2569 scope.go:117] "RemoveContainer" containerID="753f8a2320fca29ce14bc4a0de8186c531389275c93f89ddc98884804a80bca6" Apr 22 19:10:50.072040 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.072022 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"753f8a2320fca29ce14bc4a0de8186c531389275c93f89ddc98884804a80bca6"} err="failed to get container status \"753f8a2320fca29ce14bc4a0de8186c531389275c93f89ddc98884804a80bca6\": rpc error: code = NotFound desc = could not find container \"753f8a2320fca29ce14bc4a0de8186c531389275c93f89ddc98884804a80bca6\": container with ID starting with 753f8a2320fca29ce14bc4a0de8186c531389275c93f89ddc98884804a80bca6 not found: ID does not exist" Apr 22 19:10:50.072098 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.072041 2569 scope.go:117] "RemoveContainer" containerID="f06356a56b6859b6621cab2ce7c030987dc57eabda027cca39ca8e5d6730162f" Apr 22 19:10:50.072254 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.072235 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f06356a56b6859b6621cab2ce7c030987dc57eabda027cca39ca8e5d6730162f"} err="failed to get container status 
\"f06356a56b6859b6621cab2ce7c030987dc57eabda027cca39ca8e5d6730162f\": rpc error: code = NotFound desc = could not find container \"f06356a56b6859b6621cab2ce7c030987dc57eabda027cca39ca8e5d6730162f\": container with ID starting with f06356a56b6859b6621cab2ce7c030987dc57eabda027cca39ca8e5d6730162f not found: ID does not exist" Apr 22 19:10:50.072293 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.072256 2569 scope.go:117] "RemoveContainer" containerID="71a7d8ee18d02f9784ab508fd75da2ad3e0c305be97b29e8741ee384e685d8b3" Apr 22 19:10:50.072507 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.072489 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71a7d8ee18d02f9784ab508fd75da2ad3e0c305be97b29e8741ee384e685d8b3"} err="failed to get container status \"71a7d8ee18d02f9784ab508fd75da2ad3e0c305be97b29e8741ee384e685d8b3\": rpc error: code = NotFound desc = could not find container \"71a7d8ee18d02f9784ab508fd75da2ad3e0c305be97b29e8741ee384e685d8b3\": container with ID starting with 71a7d8ee18d02f9784ab508fd75da2ad3e0c305be97b29e8741ee384e685d8b3 not found: ID does not exist" Apr 22 19:10:50.072507 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.072506 2569 scope.go:117] "RemoveContainer" containerID="ca64eeb9d70f52dcdd1fe7b27714a885fa860d789643a419b4d51192b1dbba99" Apr 22 19:10:50.072693 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.072676 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca64eeb9d70f52dcdd1fe7b27714a885fa860d789643a419b4d51192b1dbba99"} err="failed to get container status \"ca64eeb9d70f52dcdd1fe7b27714a885fa860d789643a419b4d51192b1dbba99\": rpc error: code = NotFound desc = could not find container \"ca64eeb9d70f52dcdd1fe7b27714a885fa860d789643a419b4d51192b1dbba99\": container with ID starting with ca64eeb9d70f52dcdd1fe7b27714a885fa860d789643a419b4d51192b1dbba99 not found: ID does not exist" Apr 22 19:10:50.072730 ip-10-0-134-22 
kubenswrapper[2569]: I0422 19:10:50.072695 2569 scope.go:117] "RemoveContainer" containerID="18dfc85fb1063861979fe1008e5914c44da9f25aa8658b1652270cd5055bb61f" Apr 22 19:10:50.072913 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.072898 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18dfc85fb1063861979fe1008e5914c44da9f25aa8658b1652270cd5055bb61f"} err="failed to get container status \"18dfc85fb1063861979fe1008e5914c44da9f25aa8658b1652270cd5055bb61f\": rpc error: code = NotFound desc = could not find container \"18dfc85fb1063861979fe1008e5914c44da9f25aa8658b1652270cd5055bb61f\": container with ID starting with 18dfc85fb1063861979fe1008e5914c44da9f25aa8658b1652270cd5055bb61f not found: ID does not exist" Apr 22 19:10:50.072958 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.072913 2569 scope.go:117] "RemoveContainer" containerID="11de86327313697b3e69ad892e8264c00cbbe33b683d76708c158eba4649c0b0" Apr 22 19:10:50.073083 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.073068 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11de86327313697b3e69ad892e8264c00cbbe33b683d76708c158eba4649c0b0"} err="failed to get container status \"11de86327313697b3e69ad892e8264c00cbbe33b683d76708c158eba4649c0b0\": rpc error: code = NotFound desc = could not find container \"11de86327313697b3e69ad892e8264c00cbbe33b683d76708c158eba4649c0b0\": container with ID starting with 11de86327313697b3e69ad892e8264c00cbbe33b683d76708c158eba4649c0b0 not found: ID does not exist" Apr 22 19:10:50.073126 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.073083 2569 scope.go:117] "RemoveContainer" containerID="84f57462a9478a1643a764253aaf6dbe154596c3cf28d80d5cef47036d6493f4" Apr 22 19:10:50.073270 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.073256 2569 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"84f57462a9478a1643a764253aaf6dbe154596c3cf28d80d5cef47036d6493f4"} err="failed to get container status \"84f57462a9478a1643a764253aaf6dbe154596c3cf28d80d5cef47036d6493f4\": rpc error: code = NotFound desc = could not find container \"84f57462a9478a1643a764253aaf6dbe154596c3cf28d80d5cef47036d6493f4\": container with ID starting with 84f57462a9478a1643a764253aaf6dbe154596c3cf28d80d5cef47036d6493f4 not found: ID does not exist" Apr 22 19:10:50.073313 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.073271 2569 scope.go:117] "RemoveContainer" containerID="753f8a2320fca29ce14bc4a0de8186c531389275c93f89ddc98884804a80bca6" Apr 22 19:10:50.073468 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.073451 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"753f8a2320fca29ce14bc4a0de8186c531389275c93f89ddc98884804a80bca6"} err="failed to get container status \"753f8a2320fca29ce14bc4a0de8186c531389275c93f89ddc98884804a80bca6\": rpc error: code = NotFound desc = could not find container \"753f8a2320fca29ce14bc4a0de8186c531389275c93f89ddc98884804a80bca6\": container with ID starting with 753f8a2320fca29ce14bc4a0de8186c531389275c93f89ddc98884804a80bca6 not found: ID does not exist" Apr 22 19:10:50.073515 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.073469 2569 scope.go:117] "RemoveContainer" containerID="f06356a56b6859b6621cab2ce7c030987dc57eabda027cca39ca8e5d6730162f" Apr 22 19:10:50.073659 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.073642 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f06356a56b6859b6621cab2ce7c030987dc57eabda027cca39ca8e5d6730162f"} err="failed to get container status \"f06356a56b6859b6621cab2ce7c030987dc57eabda027cca39ca8e5d6730162f\": rpc error: code = NotFound desc = could not find container \"f06356a56b6859b6621cab2ce7c030987dc57eabda027cca39ca8e5d6730162f\": container with ID starting with 
f06356a56b6859b6621cab2ce7c030987dc57eabda027cca39ca8e5d6730162f not found: ID does not exist" Apr 22 19:10:50.073712 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.073660 2569 scope.go:117] "RemoveContainer" containerID="71a7d8ee18d02f9784ab508fd75da2ad3e0c305be97b29e8741ee384e685d8b3" Apr 22 19:10:50.073894 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.073878 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71a7d8ee18d02f9784ab508fd75da2ad3e0c305be97b29e8741ee384e685d8b3"} err="failed to get container status \"71a7d8ee18d02f9784ab508fd75da2ad3e0c305be97b29e8741ee384e685d8b3\": rpc error: code = NotFound desc = could not find container \"71a7d8ee18d02f9784ab508fd75da2ad3e0c305be97b29e8741ee384e685d8b3\": container with ID starting with 71a7d8ee18d02f9784ab508fd75da2ad3e0c305be97b29e8741ee384e685d8b3 not found: ID does not exist" Apr 22 19:10:50.073894 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.073894 2569 scope.go:117] "RemoveContainer" containerID="ca64eeb9d70f52dcdd1fe7b27714a885fa860d789643a419b4d51192b1dbba99" Apr 22 19:10:50.074112 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.074097 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca64eeb9d70f52dcdd1fe7b27714a885fa860d789643a419b4d51192b1dbba99"} err="failed to get container status \"ca64eeb9d70f52dcdd1fe7b27714a885fa860d789643a419b4d51192b1dbba99\": rpc error: code = NotFound desc = could not find container \"ca64eeb9d70f52dcdd1fe7b27714a885fa860d789643a419b4d51192b1dbba99\": container with ID starting with ca64eeb9d70f52dcdd1fe7b27714a885fa860d789643a419b4d51192b1dbba99 not found: ID does not exist" Apr 22 19:10:50.074112 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.074112 2569 scope.go:117] "RemoveContainer" containerID="18dfc85fb1063861979fe1008e5914c44da9f25aa8658b1652270cd5055bb61f" Apr 22 19:10:50.074306 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.074290 2569 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18dfc85fb1063861979fe1008e5914c44da9f25aa8658b1652270cd5055bb61f"} err="failed to get container status \"18dfc85fb1063861979fe1008e5914c44da9f25aa8658b1652270cd5055bb61f\": rpc error: code = NotFound desc = could not find container \"18dfc85fb1063861979fe1008e5914c44da9f25aa8658b1652270cd5055bb61f\": container with ID starting with 18dfc85fb1063861979fe1008e5914c44da9f25aa8658b1652270cd5055bb61f not found: ID does not exist" Apr 22 19:10:50.074344 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.074307 2569 scope.go:117] "RemoveContainer" containerID="11de86327313697b3e69ad892e8264c00cbbe33b683d76708c158eba4649c0b0" Apr 22 19:10:50.074529 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.074493 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11de86327313697b3e69ad892e8264c00cbbe33b683d76708c158eba4649c0b0"} err="failed to get container status \"11de86327313697b3e69ad892e8264c00cbbe33b683d76708c158eba4649c0b0\": rpc error: code = NotFound desc = could not find container \"11de86327313697b3e69ad892e8264c00cbbe33b683d76708c158eba4649c0b0\": container with ID starting with 11de86327313697b3e69ad892e8264c00cbbe33b683d76708c158eba4649c0b0 not found: ID does not exist" Apr 22 19:10:50.074529 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.074509 2569 scope.go:117] "RemoveContainer" containerID="84f57462a9478a1643a764253aaf6dbe154596c3cf28d80d5cef47036d6493f4" Apr 22 19:10:50.074742 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.074704 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84f57462a9478a1643a764253aaf6dbe154596c3cf28d80d5cef47036d6493f4"} err="failed to get container status \"84f57462a9478a1643a764253aaf6dbe154596c3cf28d80d5cef47036d6493f4\": rpc error: code = NotFound desc = could not find container 
\"84f57462a9478a1643a764253aaf6dbe154596c3cf28d80d5cef47036d6493f4\": container with ID starting with 84f57462a9478a1643a764253aaf6dbe154596c3cf28d80d5cef47036d6493f4 not found: ID does not exist" Apr 22 19:10:50.074742 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.074725 2569 scope.go:117] "RemoveContainer" containerID="753f8a2320fca29ce14bc4a0de8186c531389275c93f89ddc98884804a80bca6" Apr 22 19:10:50.075060 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.075037 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"753f8a2320fca29ce14bc4a0de8186c531389275c93f89ddc98884804a80bca6"} err="failed to get container status \"753f8a2320fca29ce14bc4a0de8186c531389275c93f89ddc98884804a80bca6\": rpc error: code = NotFound desc = could not find container \"753f8a2320fca29ce14bc4a0de8186c531389275c93f89ddc98884804a80bca6\": container with ID starting with 753f8a2320fca29ce14bc4a0de8186c531389275c93f89ddc98884804a80bca6 not found: ID does not exist" Apr 22 19:10:50.075060 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.075060 2569 scope.go:117] "RemoveContainer" containerID="f06356a56b6859b6621cab2ce7c030987dc57eabda027cca39ca8e5d6730162f" Apr 22 19:10:50.075504 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.075463 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f06356a56b6859b6621cab2ce7c030987dc57eabda027cca39ca8e5d6730162f"} err="failed to get container status \"f06356a56b6859b6621cab2ce7c030987dc57eabda027cca39ca8e5d6730162f\": rpc error: code = NotFound desc = could not find container \"f06356a56b6859b6621cab2ce7c030987dc57eabda027cca39ca8e5d6730162f\": container with ID starting with f06356a56b6859b6621cab2ce7c030987dc57eabda027cca39ca8e5d6730162f not found: ID does not exist" Apr 22 19:10:50.075601 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.075505 2569 scope.go:117] "RemoveContainer" 
containerID="71a7d8ee18d02f9784ab508fd75da2ad3e0c305be97b29e8741ee384e685d8b3" Apr 22 19:10:50.075785 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.075747 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71a7d8ee18d02f9784ab508fd75da2ad3e0c305be97b29e8741ee384e685d8b3"} err="failed to get container status \"71a7d8ee18d02f9784ab508fd75da2ad3e0c305be97b29e8741ee384e685d8b3\": rpc error: code = NotFound desc = could not find container \"71a7d8ee18d02f9784ab508fd75da2ad3e0c305be97b29e8741ee384e685d8b3\": container with ID starting with 71a7d8ee18d02f9784ab508fd75da2ad3e0c305be97b29e8741ee384e685d8b3 not found: ID does not exist" Apr 22 19:10:50.075853 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.075786 2569 scope.go:117] "RemoveContainer" containerID="ca64eeb9d70f52dcdd1fe7b27714a885fa860d789643a419b4d51192b1dbba99" Apr 22 19:10:50.076032 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.076011 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca64eeb9d70f52dcdd1fe7b27714a885fa860d789643a419b4d51192b1dbba99"} err="failed to get container status \"ca64eeb9d70f52dcdd1fe7b27714a885fa860d789643a419b4d51192b1dbba99\": rpc error: code = NotFound desc = could not find container \"ca64eeb9d70f52dcdd1fe7b27714a885fa860d789643a419b4d51192b1dbba99\": container with ID starting with ca64eeb9d70f52dcdd1fe7b27714a885fa860d789643a419b4d51192b1dbba99 not found: ID does not exist" Apr 22 19:10:50.076032 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.076031 2569 scope.go:117] "RemoveContainer" containerID="18dfc85fb1063861979fe1008e5914c44da9f25aa8658b1652270cd5055bb61f" Apr 22 19:10:50.076312 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.076295 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18dfc85fb1063861979fe1008e5914c44da9f25aa8658b1652270cd5055bb61f"} err="failed to get container status 
\"18dfc85fb1063861979fe1008e5914c44da9f25aa8658b1652270cd5055bb61f\": rpc error: code = NotFound desc = could not find container \"18dfc85fb1063861979fe1008e5914c44da9f25aa8658b1652270cd5055bb61f\": container with ID starting with 18dfc85fb1063861979fe1008e5914c44da9f25aa8658b1652270cd5055bb61f not found: ID does not exist" Apr 22 19:10:50.076375 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.076313 2569 scope.go:117] "RemoveContainer" containerID="11de86327313697b3e69ad892e8264c00cbbe33b683d76708c158eba4649c0b0" Apr 22 19:10:50.076532 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.076513 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11de86327313697b3e69ad892e8264c00cbbe33b683d76708c158eba4649c0b0"} err="failed to get container status \"11de86327313697b3e69ad892e8264c00cbbe33b683d76708c158eba4649c0b0\": rpc error: code = NotFound desc = could not find container \"11de86327313697b3e69ad892e8264c00cbbe33b683d76708c158eba4649c0b0\": container with ID starting with 11de86327313697b3e69ad892e8264c00cbbe33b683d76708c158eba4649c0b0 not found: ID does not exist" Apr 22 19:10:50.076587 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.076534 2569 scope.go:117] "RemoveContainer" containerID="84f57462a9478a1643a764253aaf6dbe154596c3cf28d80d5cef47036d6493f4" Apr 22 19:10:50.076699 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.076678 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 19:10:50.076809 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.076789 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84f57462a9478a1643a764253aaf6dbe154596c3cf28d80d5cef47036d6493f4"} err="failed to get container status \"84f57462a9478a1643a764253aaf6dbe154596c3cf28d80d5cef47036d6493f4\": rpc error: code = NotFound desc = could not find container \"84f57462a9478a1643a764253aaf6dbe154596c3cf28d80d5cef47036d6493f4\": 
container with ID starting with 84f57462a9478a1643a764253aaf6dbe154596c3cf28d80d5cef47036d6493f4 not found: ID does not exist" Apr 22 19:10:50.076879 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.076809 2569 scope.go:117] "RemoveContainer" containerID="753f8a2320fca29ce14bc4a0de8186c531389275c93f89ddc98884804a80bca6" Apr 22 19:10:50.077047 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.077033 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f467273d-4319-4e2b-8284-913a4c5ddd7c" containerName="kube-rbac-proxy" Apr 22 19:10:50.077088 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.077050 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f467273d-4319-4e2b-8284-913a4c5ddd7c" containerName="kube-rbac-proxy" Apr 22 19:10:50.077088 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.077018 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"753f8a2320fca29ce14bc4a0de8186c531389275c93f89ddc98884804a80bca6"} err="failed to get container status \"753f8a2320fca29ce14bc4a0de8186c531389275c93f89ddc98884804a80bca6\": rpc error: code = NotFound desc = could not find container \"753f8a2320fca29ce14bc4a0de8186c531389275c93f89ddc98884804a80bca6\": container with ID starting with 753f8a2320fca29ce14bc4a0de8186c531389275c93f89ddc98884804a80bca6 not found: ID does not exist" Apr 22 19:10:50.077088 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.077077 2569 scope.go:117] "RemoveContainer" containerID="f06356a56b6859b6621cab2ce7c030987dc57eabda027cca39ca8e5d6730162f" Apr 22 19:10:50.077210 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.077058 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f467273d-4319-4e2b-8284-913a4c5ddd7c" containerName="prom-label-proxy" Apr 22 19:10:50.077210 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.077150 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f467273d-4319-4e2b-8284-913a4c5ddd7c" 
containerName="prom-label-proxy" Apr 22 19:10:50.077210 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.077174 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f467273d-4319-4e2b-8284-913a4c5ddd7c" containerName="kube-rbac-proxy-web" Apr 22 19:10:50.077210 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.077184 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f467273d-4319-4e2b-8284-913a4c5ddd7c" containerName="kube-rbac-proxy-web" Apr 22 19:10:50.077210 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.077207 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f467273d-4319-4e2b-8284-913a4c5ddd7c" containerName="kube-rbac-proxy-metric" Apr 22 19:10:50.077379 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.077215 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f467273d-4319-4e2b-8284-913a4c5ddd7c" containerName="kube-rbac-proxy-metric" Apr 22 19:10:50.077379 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.077238 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f467273d-4319-4e2b-8284-913a4c5ddd7c" containerName="config-reloader" Apr 22 19:10:50.077379 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.077246 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f467273d-4319-4e2b-8284-913a4c5ddd7c" containerName="config-reloader" Apr 22 19:10:50.077379 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.077259 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b2de1309-5e07-42e4-af50-bd20f6e1989f" containerName="console" Apr 22 19:10:50.077379 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.077267 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2de1309-5e07-42e4-af50-bd20f6e1989f" containerName="console" Apr 22 19:10:50.077379 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.077279 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="f467273d-4319-4e2b-8284-913a4c5ddd7c" containerName="alertmanager" Apr 22 19:10:50.077379 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.077286 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f467273d-4319-4e2b-8284-913a4c5ddd7c" containerName="alertmanager" Apr 22 19:10:50.077379 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.077299 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f467273d-4319-4e2b-8284-913a4c5ddd7c" containerName="init-config-reloader" Apr 22 19:10:50.077379 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.077307 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f467273d-4319-4e2b-8284-913a4c5ddd7c" containerName="init-config-reloader" Apr 22 19:10:50.077379 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.077329 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f06356a56b6859b6621cab2ce7c030987dc57eabda027cca39ca8e5d6730162f"} err="failed to get container status \"f06356a56b6859b6621cab2ce7c030987dc57eabda027cca39ca8e5d6730162f\": rpc error: code = NotFound desc = could not find container \"f06356a56b6859b6621cab2ce7c030987dc57eabda027cca39ca8e5d6730162f\": container with ID starting with f06356a56b6859b6621cab2ce7c030987dc57eabda027cca39ca8e5d6730162f not found: ID does not exist" Apr 22 19:10:50.077379 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.077363 2569 scope.go:117] "RemoveContainer" containerID="71a7d8ee18d02f9784ab508fd75da2ad3e0c305be97b29e8741ee384e685d8b3" Apr 22 19:10:50.077805 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.077386 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b2de1309-5e07-42e4-af50-bd20f6e1989f" containerName="console" Apr 22 19:10:50.077805 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.077396 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="f467273d-4319-4e2b-8284-913a4c5ddd7c" containerName="kube-rbac-proxy-web" Apr 22 
19:10:50.077805 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.077406 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="f467273d-4319-4e2b-8284-913a4c5ddd7c" containerName="kube-rbac-proxy-metric" Apr 22 19:10:50.077805 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.077416 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="f467273d-4319-4e2b-8284-913a4c5ddd7c" containerName="prom-label-proxy" Apr 22 19:10:50.077805 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.077427 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="f467273d-4319-4e2b-8284-913a4c5ddd7c" containerName="kube-rbac-proxy" Apr 22 19:10:50.077805 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.077435 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="f467273d-4319-4e2b-8284-913a4c5ddd7c" containerName="alertmanager" Apr 22 19:10:50.077805 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.077440 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="f467273d-4319-4e2b-8284-913a4c5ddd7c" containerName="config-reloader" Apr 22 19:10:50.077805 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.077566 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71a7d8ee18d02f9784ab508fd75da2ad3e0c305be97b29e8741ee384e685d8b3"} err="failed to get container status \"71a7d8ee18d02f9784ab508fd75da2ad3e0c305be97b29e8741ee384e685d8b3\": rpc error: code = NotFound desc = could not find container \"71a7d8ee18d02f9784ab508fd75da2ad3e0c305be97b29e8741ee384e685d8b3\": container with ID starting with 71a7d8ee18d02f9784ab508fd75da2ad3e0c305be97b29e8741ee384e685d8b3 not found: ID does not exist" Apr 22 19:10:50.077805 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.077579 2569 scope.go:117] "RemoveContainer" containerID="ca64eeb9d70f52dcdd1fe7b27714a885fa860d789643a419b4d51192b1dbba99" Apr 22 19:10:50.078130 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.077829 2569 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca64eeb9d70f52dcdd1fe7b27714a885fa860d789643a419b4d51192b1dbba99"} err="failed to get container status \"ca64eeb9d70f52dcdd1fe7b27714a885fa860d789643a419b4d51192b1dbba99\": rpc error: code = NotFound desc = could not find container \"ca64eeb9d70f52dcdd1fe7b27714a885fa860d789643a419b4d51192b1dbba99\": container with ID starting with ca64eeb9d70f52dcdd1fe7b27714a885fa860d789643a419b4d51192b1dbba99 not found: ID does not exist" Apr 22 19:10:50.078130 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.077845 2569 scope.go:117] "RemoveContainer" containerID="18dfc85fb1063861979fe1008e5914c44da9f25aa8658b1652270cd5055bb61f" Apr 22 19:10:50.078130 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.078030 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18dfc85fb1063861979fe1008e5914c44da9f25aa8658b1652270cd5055bb61f"} err="failed to get container status \"18dfc85fb1063861979fe1008e5914c44da9f25aa8658b1652270cd5055bb61f\": rpc error: code = NotFound desc = could not find container \"18dfc85fb1063861979fe1008e5914c44da9f25aa8658b1652270cd5055bb61f\": container with ID starting with 18dfc85fb1063861979fe1008e5914c44da9f25aa8658b1652270cd5055bb61f not found: ID does not exist" Apr 22 19:10:50.078130 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.078045 2569 scope.go:117] "RemoveContainer" containerID="11de86327313697b3e69ad892e8264c00cbbe33b683d76708c158eba4649c0b0" Apr 22 19:10:50.078280 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.078256 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11de86327313697b3e69ad892e8264c00cbbe33b683d76708c158eba4649c0b0"} err="failed to get container status \"11de86327313697b3e69ad892e8264c00cbbe33b683d76708c158eba4649c0b0\": rpc error: code = NotFound desc = could not find container 
\"11de86327313697b3e69ad892e8264c00cbbe33b683d76708c158eba4649c0b0\": container with ID starting with 11de86327313697b3e69ad892e8264c00cbbe33b683d76708c158eba4649c0b0 not found: ID does not exist" Apr 22 19:10:50.078316 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.078280 2569 scope.go:117] "RemoveContainer" containerID="84f57462a9478a1643a764253aaf6dbe154596c3cf28d80d5cef47036d6493f4" Apr 22 19:10:50.078475 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.078454 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84f57462a9478a1643a764253aaf6dbe154596c3cf28d80d5cef47036d6493f4"} err="failed to get container status \"84f57462a9478a1643a764253aaf6dbe154596c3cf28d80d5cef47036d6493f4\": rpc error: code = NotFound desc = could not find container \"84f57462a9478a1643a764253aaf6dbe154596c3cf28d80d5cef47036d6493f4\": container with ID starting with 84f57462a9478a1643a764253aaf6dbe154596c3cf28d80d5cef47036d6493f4 not found: ID does not exist" Apr 22 19:10:50.078561 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.078477 2569 scope.go:117] "RemoveContainer" containerID="753f8a2320fca29ce14bc4a0de8186c531389275c93f89ddc98884804a80bca6" Apr 22 19:10:50.078739 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.078714 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"753f8a2320fca29ce14bc4a0de8186c531389275c93f89ddc98884804a80bca6"} err="failed to get container status \"753f8a2320fca29ce14bc4a0de8186c531389275c93f89ddc98884804a80bca6\": rpc error: code = NotFound desc = could not find container \"753f8a2320fca29ce14bc4a0de8186c531389275c93f89ddc98884804a80bca6\": container with ID starting with 753f8a2320fca29ce14bc4a0de8186c531389275c93f89ddc98884804a80bca6 not found: ID does not exist" Apr 22 19:10:50.078830 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.078740 2569 scope.go:117] "RemoveContainer" 
containerID="f06356a56b6859b6621cab2ce7c030987dc57eabda027cca39ca8e5d6730162f" Apr 22 19:10:50.078999 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.078978 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f06356a56b6859b6621cab2ce7c030987dc57eabda027cca39ca8e5d6730162f"} err="failed to get container status \"f06356a56b6859b6621cab2ce7c030987dc57eabda027cca39ca8e5d6730162f\": rpc error: code = NotFound desc = could not find container \"f06356a56b6859b6621cab2ce7c030987dc57eabda027cca39ca8e5d6730162f\": container with ID starting with f06356a56b6859b6621cab2ce7c030987dc57eabda027cca39ca8e5d6730162f not found: ID does not exist" Apr 22 19:10:50.079113 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.078999 2569 scope.go:117] "RemoveContainer" containerID="71a7d8ee18d02f9784ab508fd75da2ad3e0c305be97b29e8741ee384e685d8b3" Apr 22 19:10:50.079232 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.079214 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71a7d8ee18d02f9784ab508fd75da2ad3e0c305be97b29e8741ee384e685d8b3"} err="failed to get container status \"71a7d8ee18d02f9784ab508fd75da2ad3e0c305be97b29e8741ee384e685d8b3\": rpc error: code = NotFound desc = could not find container \"71a7d8ee18d02f9784ab508fd75da2ad3e0c305be97b29e8741ee384e685d8b3\": container with ID starting with 71a7d8ee18d02f9784ab508fd75da2ad3e0c305be97b29e8741ee384e685d8b3 not found: ID does not exist" Apr 22 19:10:50.083283 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.083266 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.086010 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.085993 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 19:10:50.086102 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.085991 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 22 19:10:50.086102 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.086049 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 22 19:10:50.086102 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.086067 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 22 19:10:50.086365 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.086351 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 22 19:10:50.086445 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.086430 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-xmvg7\"" Apr 22 19:10:50.086542 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.086525 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 22 19:10:50.087174 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.087158 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 22 19:10:50.087243 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.087195 2569 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 19:10:50.091446 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.091426 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 19:10:50.094573 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.094556 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 19:10:50.183417 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.183391 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.183544 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.183425 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.183544 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.183446 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.183544 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.183478 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.183544 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.183497 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.183544 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.183513 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.183544 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.183533 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-web-config\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.183726 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.183589 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-config-volume\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.183726 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.183628 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-config-out\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.183726 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.183651 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.183726 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.183676 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fph67\" (UniqueName: \"kubernetes.io/projected/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-kube-api-access-fph67\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.183872 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.183780 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.183872 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.183804 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.222185 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.222159 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f467273d-4319-4e2b-8284-913a4c5ddd7c" path="/var/lib/kubelet/pods/f467273d-4319-4e2b-8284-913a4c5ddd7c/volumes" Apr 22 19:10:50.284910 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.284844 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.284910 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.284879 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.285069 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.284908 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.285069 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.284932 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.285069 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.284964 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.285069 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.284988 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.285069 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.285016 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-web-config\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.285069 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.285050 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-config-volume\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.285361 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.285074 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-config-out\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.285361 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.285102 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.285361 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.285139 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fph67\" (UniqueName: \"kubernetes.io/projected/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-kube-api-access-fph67\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.285361 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.285210 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.285361 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.285243 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.285603 ip-10-0-134-22 kubenswrapper[2569]: I0422 
19:10:50.285559 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.285655 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.285639 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.286569 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.286157 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.288440 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.288417 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.288538 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.288479 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 
19:10:50.288687 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.288666 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-config-out\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.288773 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.288728 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-config-volume\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.288884 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.288867 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.288982 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.288962 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.289069 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.289049 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.289409 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.289392 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-web-config\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.290200 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.290182 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.295498 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.295474 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fph67\" (UniqueName: \"kubernetes.io/projected/e2e95f49-0d7b-48c1-97ea-b0a519b5248c-kube-api-access-fph67\") pod \"alertmanager-main-0\" (UID: \"e2e95f49-0d7b-48c1-97ea-b0a519b5248c\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.393159 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.393127 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 19:10:50.525471 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:50.525440 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 19:10:50.526701 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:10:50.526676 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2e95f49_0d7b_48c1_97ea_b0a519b5248c.slice/crio-b42330fd1a325f9b8d5bd2ac22b602ebaa35168ec1281205faa950451ec45ff5 WatchSource:0}: Error finding container b42330fd1a325f9b8d5bd2ac22b602ebaa35168ec1281205faa950451ec45ff5: Status 404 returned error can't find the container with id b42330fd1a325f9b8d5bd2ac22b602ebaa35168ec1281205faa950451ec45ff5 Apr 22 19:10:51.018897 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:51.018861 2569 generic.go:358] "Generic (PLEG): container finished" podID="e2e95f49-0d7b-48c1-97ea-b0a519b5248c" containerID="0e0cf4be67b900e0c1fd651010b5d18e68f29af7eacb0a9e2c8d2dbee3547fbf" exitCode=0 Apr 22 19:10:51.019266 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:51.018961 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e2e95f49-0d7b-48c1-97ea-b0a519b5248c","Type":"ContainerDied","Data":"0e0cf4be67b900e0c1fd651010b5d18e68f29af7eacb0a9e2c8d2dbee3547fbf"} Apr 22 19:10:51.019266 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:51.019000 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e2e95f49-0d7b-48c1-97ea-b0a519b5248c","Type":"ContainerStarted","Data":"b42330fd1a325f9b8d5bd2ac22b602ebaa35168ec1281205faa950451ec45ff5"} Apr 22 19:10:51.484881 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:51.484810 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-74677c5cb-2hlqs"] Apr 22 19:10:51.488324 ip-10-0-134-22 kubenswrapper[2569]: 
I0422 19:10:51.488297 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74677c5cb-2hlqs" Apr 22 19:10:51.505009 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:51.504981 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74677c5cb-2hlqs"] Apr 22 19:10:51.597257 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:51.597170 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95b7m\" (UniqueName: \"kubernetes.io/projected/f2d2c795-3318-4a43-9535-643bee61b20e-kube-api-access-95b7m\") pod \"console-74677c5cb-2hlqs\" (UID: \"f2d2c795-3318-4a43-9535-643bee61b20e\") " pod="openshift-console/console-74677c5cb-2hlqs" Apr 22 19:10:51.597257 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:51.597215 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f2d2c795-3318-4a43-9535-643bee61b20e-console-oauth-config\") pod \"console-74677c5cb-2hlqs\" (UID: \"f2d2c795-3318-4a43-9535-643bee61b20e\") " pod="openshift-console/console-74677c5cb-2hlqs" Apr 22 19:10:51.597257 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:51.597241 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f2d2c795-3318-4a43-9535-643bee61b20e-console-config\") pod \"console-74677c5cb-2hlqs\" (UID: \"f2d2c795-3318-4a43-9535-643bee61b20e\") " pod="openshift-console/console-74677c5cb-2hlqs" Apr 22 19:10:51.597439 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:51.597308 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f2d2c795-3318-4a43-9535-643bee61b20e-service-ca\") pod \"console-74677c5cb-2hlqs\" (UID: \"f2d2c795-3318-4a43-9535-643bee61b20e\") " 
pod="openshift-console/console-74677c5cb-2hlqs" Apr 22 19:10:51.597439 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:51.597359 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f2d2c795-3318-4a43-9535-643bee61b20e-oauth-serving-cert\") pod \"console-74677c5cb-2hlqs\" (UID: \"f2d2c795-3318-4a43-9535-643bee61b20e\") " pod="openshift-console/console-74677c5cb-2hlqs" Apr 22 19:10:51.597439 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:51.597429 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2d2c795-3318-4a43-9535-643bee61b20e-trusted-ca-bundle\") pod \"console-74677c5cb-2hlqs\" (UID: \"f2d2c795-3318-4a43-9535-643bee61b20e\") " pod="openshift-console/console-74677c5cb-2hlqs" Apr 22 19:10:51.597579 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:51.597473 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2d2c795-3318-4a43-9535-643bee61b20e-console-serving-cert\") pod \"console-74677c5cb-2hlqs\" (UID: \"f2d2c795-3318-4a43-9535-643bee61b20e\") " pod="openshift-console/console-74677c5cb-2hlqs" Apr 22 19:10:51.698811 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:51.698779 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2d2c795-3318-4a43-9535-643bee61b20e-trusted-ca-bundle\") pod \"console-74677c5cb-2hlqs\" (UID: \"f2d2c795-3318-4a43-9535-643bee61b20e\") " pod="openshift-console/console-74677c5cb-2hlqs" Apr 22 19:10:51.698946 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:51.698818 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f2d2c795-3318-4a43-9535-643bee61b20e-console-serving-cert\") pod \"console-74677c5cb-2hlqs\" (UID: \"f2d2c795-3318-4a43-9535-643bee61b20e\") " pod="openshift-console/console-74677c5cb-2hlqs" Apr 22 19:10:51.698946 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:51.698932 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-95b7m\" (UniqueName: \"kubernetes.io/projected/f2d2c795-3318-4a43-9535-643bee61b20e-kube-api-access-95b7m\") pod \"console-74677c5cb-2hlqs\" (UID: \"f2d2c795-3318-4a43-9535-643bee61b20e\") " pod="openshift-console/console-74677c5cb-2hlqs" Apr 22 19:10:51.699052 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:51.698975 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f2d2c795-3318-4a43-9535-643bee61b20e-console-oauth-config\") pod \"console-74677c5cb-2hlqs\" (UID: \"f2d2c795-3318-4a43-9535-643bee61b20e\") " pod="openshift-console/console-74677c5cb-2hlqs" Apr 22 19:10:51.699052 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:51.698992 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f2d2c795-3318-4a43-9535-643bee61b20e-console-config\") pod \"console-74677c5cb-2hlqs\" (UID: \"f2d2c795-3318-4a43-9535-643bee61b20e\") " pod="openshift-console/console-74677c5cb-2hlqs" Apr 22 19:10:51.699052 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:51.699014 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f2d2c795-3318-4a43-9535-643bee61b20e-service-ca\") pod \"console-74677c5cb-2hlqs\" (UID: \"f2d2c795-3318-4a43-9535-643bee61b20e\") " pod="openshift-console/console-74677c5cb-2hlqs" Apr 22 19:10:51.699190 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:51.699065 2569 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f2d2c795-3318-4a43-9535-643bee61b20e-oauth-serving-cert\") pod \"console-74677c5cb-2hlqs\" (UID: \"f2d2c795-3318-4a43-9535-643bee61b20e\") " pod="openshift-console/console-74677c5cb-2hlqs" Apr 22 19:10:51.699736 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:51.699707 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2d2c795-3318-4a43-9535-643bee61b20e-trusted-ca-bundle\") pod \"console-74677c5cb-2hlqs\" (UID: \"f2d2c795-3318-4a43-9535-643bee61b20e\") " pod="openshift-console/console-74677c5cb-2hlqs" Apr 22 19:10:51.699736 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:51.699725 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f2d2c795-3318-4a43-9535-643bee61b20e-oauth-serving-cert\") pod \"console-74677c5cb-2hlqs\" (UID: \"f2d2c795-3318-4a43-9535-643bee61b20e\") " pod="openshift-console/console-74677c5cb-2hlqs" Apr 22 19:10:51.699959 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:51.699707 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f2d2c795-3318-4a43-9535-643bee61b20e-console-config\") pod \"console-74677c5cb-2hlqs\" (UID: \"f2d2c795-3318-4a43-9535-643bee61b20e\") " pod="openshift-console/console-74677c5cb-2hlqs" Apr 22 19:10:51.699959 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:51.699711 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f2d2c795-3318-4a43-9535-643bee61b20e-service-ca\") pod \"console-74677c5cb-2hlqs\" (UID: \"f2d2c795-3318-4a43-9535-643bee61b20e\") " pod="openshift-console/console-74677c5cb-2hlqs" Apr 22 19:10:51.701407 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:51.701386 2569 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2d2c795-3318-4a43-9535-643bee61b20e-console-serving-cert\") pod \"console-74677c5cb-2hlqs\" (UID: \"f2d2c795-3318-4a43-9535-643bee61b20e\") " pod="openshift-console/console-74677c5cb-2hlqs" Apr 22 19:10:51.701511 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:51.701423 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f2d2c795-3318-4a43-9535-643bee61b20e-console-oauth-config\") pod \"console-74677c5cb-2hlqs\" (UID: \"f2d2c795-3318-4a43-9535-643bee61b20e\") " pod="openshift-console/console-74677c5cb-2hlqs" Apr 22 19:10:51.708286 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:51.708267 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-95b7m\" (UniqueName: \"kubernetes.io/projected/f2d2c795-3318-4a43-9535-643bee61b20e-kube-api-access-95b7m\") pod \"console-74677c5cb-2hlqs\" (UID: \"f2d2c795-3318-4a43-9535-643bee61b20e\") " pod="openshift-console/console-74677c5cb-2hlqs" Apr 22 19:10:51.800569 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:51.800543 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-74677c5cb-2hlqs" Apr 22 19:10:51.918460 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:51.918419 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74677c5cb-2hlqs"] Apr 22 19:10:51.924774 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:10:51.924730 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2d2c795_3318_4a43_9535_643bee61b20e.slice/crio-2862cf6bf8727d4a1d5ce28e8ab121216bed0f1b98bf5dba8f98910c3d775223 WatchSource:0}: Error finding container 2862cf6bf8727d4a1d5ce28e8ab121216bed0f1b98bf5dba8f98910c3d775223: Status 404 returned error can't find the container with id 2862cf6bf8727d4a1d5ce28e8ab121216bed0f1b98bf5dba8f98910c3d775223 Apr 22 19:10:52.026587 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:52.026546 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e2e95f49-0d7b-48c1-97ea-b0a519b5248c","Type":"ContainerStarted","Data":"e7fc5975e234d863cb33d9aa1aebbb7c3c83922eb589cefa9d48b64460d95aed"} Apr 22 19:10:52.026587 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:52.026585 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e2e95f49-0d7b-48c1-97ea-b0a519b5248c","Type":"ContainerStarted","Data":"82f75fa54b16a3b70342b4a961c2789810925adf98907f0abf3955b505623bc6"} Apr 22 19:10:52.026587 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:52.026594 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e2e95f49-0d7b-48c1-97ea-b0a519b5248c","Type":"ContainerStarted","Data":"c04dc85c7b539320247c8b0f5d16ea2c6ac8841d3d996fd269079fc7bc4f375b"} Apr 22 19:10:52.027061 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:52.026603 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e2e95f49-0d7b-48c1-97ea-b0a519b5248c","Type":"ContainerStarted","Data":"cfbf5ac883163daac5ce047c69b63c514a63013867ebd661d4cb913615cb18ea"} Apr 22 19:10:52.027061 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:52.026612 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e2e95f49-0d7b-48c1-97ea-b0a519b5248c","Type":"ContainerStarted","Data":"e4a87ce08ec8afa58df47210a002cf73cbd946a29531e53148fd3dbd03c9f5c4"} Apr 22 19:10:52.027061 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:52.026620 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e2e95f49-0d7b-48c1-97ea-b0a519b5248c","Type":"ContainerStarted","Data":"d5e1da9d25b5184915190ff1e7e90ff82ff5895225cbcff325d00edbc730a420"} Apr 22 19:10:52.028414 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:52.028391 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74677c5cb-2hlqs" event={"ID":"f2d2c795-3318-4a43-9535-643bee61b20e","Type":"ContainerStarted","Data":"6a4f91b4a86792ebcceafe1406b98afa2e24a4cedbfa7a423f56000c018b98db"} Apr 22 19:10:52.028566 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:52.028420 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74677c5cb-2hlqs" event={"ID":"f2d2c795-3318-4a43-9535-643bee61b20e","Type":"ContainerStarted","Data":"2862cf6bf8727d4a1d5ce28e8ab121216bed0f1b98bf5dba8f98910c3d775223"} Apr 22 19:10:52.055778 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:52.055705 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.055691339 podStartE2EDuration="2.055691339s" podCreationTimestamp="2026-04-22 19:10:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 
19:10:52.054842184 +0000 UTC m=+260.420257020" watchObservedRunningTime="2026-04-22 19:10:52.055691339 +0000 UTC m=+260.421106183" Apr 22 19:10:52.076000 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:10:52.075683 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-74677c5cb-2hlqs" podStartSLOduration=1.075663383 podStartE2EDuration="1.075663383s" podCreationTimestamp="2026-04-22 19:10:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:10:52.075522701 +0000 UTC m=+260.440937538" watchObservedRunningTime="2026-04-22 19:10:52.075663383 +0000 UTC m=+260.441078221" Apr 22 19:11:01.801540 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:01.801502 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-74677c5cb-2hlqs" Apr 22 19:11:01.802021 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:01.801666 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-74677c5cb-2hlqs" Apr 22 19:11:01.806337 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:01.806316 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-74677c5cb-2hlqs" Apr 22 19:11:02.068282 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:02.068201 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-74677c5cb-2hlqs" Apr 22 19:11:02.124659 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:02.124631 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-64b5fdf68c-flr2n"] Apr 22 19:11:27.150487 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:27.150428 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-64b5fdf68c-flr2n" podUID="289b4459-3cc0-4deb-bf37-4c251c4021d5" containerName="console" 
containerID="cri-o://3946a2156ee67b9d8436a9da5edfb3ce1469d7e82f5481b5e69d76bfbad0523b" gracePeriod=15 Apr 22 19:11:27.391123 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:27.391093 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64b5fdf68c-flr2n_289b4459-3cc0-4deb-bf37-4c251c4021d5/console/0.log" Apr 22 19:11:27.391319 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:27.391171 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64b5fdf68c-flr2n" Apr 22 19:11:27.513663 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:27.513631 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/289b4459-3cc0-4deb-bf37-4c251c4021d5-oauth-serving-cert\") pod \"289b4459-3cc0-4deb-bf37-4c251c4021d5\" (UID: \"289b4459-3cc0-4deb-bf37-4c251c4021d5\") " Apr 22 19:11:27.513885 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:27.513684 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/289b4459-3cc0-4deb-bf37-4c251c4021d5-console-serving-cert\") pod \"289b4459-3cc0-4deb-bf37-4c251c4021d5\" (UID: \"289b4459-3cc0-4deb-bf37-4c251c4021d5\") " Apr 22 19:11:27.513885 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:27.513707 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/289b4459-3cc0-4deb-bf37-4c251c4021d5-console-oauth-config\") pod \"289b4459-3cc0-4deb-bf37-4c251c4021d5\" (UID: \"289b4459-3cc0-4deb-bf37-4c251c4021d5\") " Apr 22 19:11:27.513885 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:27.513771 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/289b4459-3cc0-4deb-bf37-4c251c4021d5-console-config\") pod 
\"289b4459-3cc0-4deb-bf37-4c251c4021d5\" (UID: \"289b4459-3cc0-4deb-bf37-4c251c4021d5\") " Apr 22 19:11:27.513885 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:27.513798 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/289b4459-3cc0-4deb-bf37-4c251c4021d5-service-ca\") pod \"289b4459-3cc0-4deb-bf37-4c251c4021d5\" (UID: \"289b4459-3cc0-4deb-bf37-4c251c4021d5\") " Apr 22 19:11:27.513885 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:27.513829 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/289b4459-3cc0-4deb-bf37-4c251c4021d5-trusted-ca-bundle\") pod \"289b4459-3cc0-4deb-bf37-4c251c4021d5\" (UID: \"289b4459-3cc0-4deb-bf37-4c251c4021d5\") " Apr 22 19:11:27.513885 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:27.513873 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfrjn\" (UniqueName: \"kubernetes.io/projected/289b4459-3cc0-4deb-bf37-4c251c4021d5-kube-api-access-rfrjn\") pod \"289b4459-3cc0-4deb-bf37-4c251c4021d5\" (UID: \"289b4459-3cc0-4deb-bf37-4c251c4021d5\") " Apr 22 19:11:27.514397 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:27.514298 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/289b4459-3cc0-4deb-bf37-4c251c4021d5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "289b4459-3cc0-4deb-bf37-4c251c4021d5" (UID: "289b4459-3cc0-4deb-bf37-4c251c4021d5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:11:27.514397 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:27.514317 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/289b4459-3cc0-4deb-bf37-4c251c4021d5-console-config" (OuterVolumeSpecName: "console-config") pod "289b4459-3cc0-4deb-bf37-4c251c4021d5" (UID: "289b4459-3cc0-4deb-bf37-4c251c4021d5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:11:27.514397 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:27.514369 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/289b4459-3cc0-4deb-bf37-4c251c4021d5-service-ca" (OuterVolumeSpecName: "service-ca") pod "289b4459-3cc0-4deb-bf37-4c251c4021d5" (UID: "289b4459-3cc0-4deb-bf37-4c251c4021d5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:11:27.514580 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:27.514396 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/289b4459-3cc0-4deb-bf37-4c251c4021d5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "289b4459-3cc0-4deb-bf37-4c251c4021d5" (UID: "289b4459-3cc0-4deb-bf37-4c251c4021d5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:11:27.516034 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:27.516010 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/289b4459-3cc0-4deb-bf37-4c251c4021d5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "289b4459-3cc0-4deb-bf37-4c251c4021d5" (UID: "289b4459-3cc0-4deb-bf37-4c251c4021d5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:11:27.516254 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:27.516222 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/289b4459-3cc0-4deb-bf37-4c251c4021d5-kube-api-access-rfrjn" (OuterVolumeSpecName: "kube-api-access-rfrjn") pod "289b4459-3cc0-4deb-bf37-4c251c4021d5" (UID: "289b4459-3cc0-4deb-bf37-4c251c4021d5"). InnerVolumeSpecName "kube-api-access-rfrjn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:11:27.516340 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:27.516252 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/289b4459-3cc0-4deb-bf37-4c251c4021d5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "289b4459-3cc0-4deb-bf37-4c251c4021d5" (UID: "289b4459-3cc0-4deb-bf37-4c251c4021d5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:11:27.615388 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:27.615350 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/289b4459-3cc0-4deb-bf37-4c251c4021d5-console-config\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:11:27.615388 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:27.615379 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/289b4459-3cc0-4deb-bf37-4c251c4021d5-service-ca\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:11:27.615388 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:27.615387 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/289b4459-3cc0-4deb-bf37-4c251c4021d5-trusted-ca-bundle\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:11:27.615388 ip-10-0-134-22 kubenswrapper[2569]: I0422 
19:11:27.615398 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rfrjn\" (UniqueName: \"kubernetes.io/projected/289b4459-3cc0-4deb-bf37-4c251c4021d5-kube-api-access-rfrjn\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:11:27.615655 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:27.615408 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/289b4459-3cc0-4deb-bf37-4c251c4021d5-oauth-serving-cert\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:11:27.615655 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:27.615417 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/289b4459-3cc0-4deb-bf37-4c251c4021d5-console-serving-cert\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:11:27.615655 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:27.615426 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/289b4459-3cc0-4deb-bf37-4c251c4021d5-console-oauth-config\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:11:28.145109 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:28.145082 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64b5fdf68c-flr2n_289b4459-3cc0-4deb-bf37-4c251c4021d5/console/0.log" Apr 22 19:11:28.145274 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:28.145122 2569 generic.go:358] "Generic (PLEG): container finished" podID="289b4459-3cc0-4deb-bf37-4c251c4021d5" containerID="3946a2156ee67b9d8436a9da5edfb3ce1469d7e82f5481b5e69d76bfbad0523b" exitCode=2 Apr 22 19:11:28.145274 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:28.145154 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64b5fdf68c-flr2n" 
event={"ID":"289b4459-3cc0-4deb-bf37-4c251c4021d5","Type":"ContainerDied","Data":"3946a2156ee67b9d8436a9da5edfb3ce1469d7e82f5481b5e69d76bfbad0523b"} Apr 22 19:11:28.145274 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:28.145183 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64b5fdf68c-flr2n" Apr 22 19:11:28.145274 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:28.145192 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64b5fdf68c-flr2n" event={"ID":"289b4459-3cc0-4deb-bf37-4c251c4021d5","Type":"ContainerDied","Data":"5f62da6544881e10bd1bfdae76c2d3e7269792918f7378056df7a746884c9da7"} Apr 22 19:11:28.145274 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:28.145210 2569 scope.go:117] "RemoveContainer" containerID="3946a2156ee67b9d8436a9da5edfb3ce1469d7e82f5481b5e69d76bfbad0523b" Apr 22 19:11:28.154026 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:28.153912 2569 scope.go:117] "RemoveContainer" containerID="3946a2156ee67b9d8436a9da5edfb3ce1469d7e82f5481b5e69d76bfbad0523b" Apr 22 19:11:28.154217 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:11:28.154174 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3946a2156ee67b9d8436a9da5edfb3ce1469d7e82f5481b5e69d76bfbad0523b\": container with ID starting with 3946a2156ee67b9d8436a9da5edfb3ce1469d7e82f5481b5e69d76bfbad0523b not found: ID does not exist" containerID="3946a2156ee67b9d8436a9da5edfb3ce1469d7e82f5481b5e69d76bfbad0523b" Apr 22 19:11:28.154217 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:28.154200 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3946a2156ee67b9d8436a9da5edfb3ce1469d7e82f5481b5e69d76bfbad0523b"} err="failed to get container status \"3946a2156ee67b9d8436a9da5edfb3ce1469d7e82f5481b5e69d76bfbad0523b\": rpc error: code = NotFound desc = could not find container 
\"3946a2156ee67b9d8436a9da5edfb3ce1469d7e82f5481b5e69d76bfbad0523b\": container with ID starting with 3946a2156ee67b9d8436a9da5edfb3ce1469d7e82f5481b5e69d76bfbad0523b not found: ID does not exist" Apr 22 19:11:28.165275 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:28.165243 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-64b5fdf68c-flr2n"] Apr 22 19:11:28.168514 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:28.168494 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-64b5fdf68c-flr2n"] Apr 22 19:11:28.222827 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:28.222801 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="289b4459-3cc0-4deb-bf37-4c251c4021d5" path="/var/lib/kubelet/pods/289b4459-3cc0-4deb-bf37-4c251c4021d5/volumes" Apr 22 19:11:32.100801 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:32.100772 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fl4n2_c9d5594b-5dc2-461d-bd58-496386ced33b/console-operator/2.log" Apr 22 19:11:32.101280 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:32.100867 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fl4n2_c9d5594b-5dc2-461d-bd58-496386ced33b/console-operator/2.log" Apr 22 19:11:32.108038 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:32.107732 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfk46_61b5731d-8883-44c4-a6de-2a90288f2d58/ovn-acl-logging/0.log" Apr 22 19:11:32.108038 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:32.107731 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfk46_61b5731d-8883-44c4-a6de-2a90288f2d58/ovn-acl-logging/0.log" Apr 22 19:11:32.113449 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:11:32.113429 2569 kubelet.go:1628] "Image garbage collection 
succeeded" Apr 22 19:14:49.505950 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:49.505911 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6cf87df8dd-wwg7f"] Apr 22 19:14:49.506629 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:49.506307 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="289b4459-3cc0-4deb-bf37-4c251c4021d5" containerName="console" Apr 22 19:14:49.506629 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:49.506323 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="289b4459-3cc0-4deb-bf37-4c251c4021d5" containerName="console" Apr 22 19:14:49.506629 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:49.506402 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="289b4459-3cc0-4deb-bf37-4c251c4021d5" containerName="console" Apr 22 19:14:49.509204 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:49.509189 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cf87df8dd-wwg7f" Apr 22 19:14:49.525933 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:49.525904 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bc45760b-8746-44c5-8eed-405319afcc44-console-config\") pod \"console-6cf87df8dd-wwg7f\" (UID: \"bc45760b-8746-44c5-8eed-405319afcc44\") " pod="openshift-console/console-6cf87df8dd-wwg7f" Apr 22 19:14:49.526045 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:49.525963 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bc45760b-8746-44c5-8eed-405319afcc44-service-ca\") pod \"console-6cf87df8dd-wwg7f\" (UID: \"bc45760b-8746-44c5-8eed-405319afcc44\") " pod="openshift-console/console-6cf87df8dd-wwg7f" Apr 22 19:14:49.526045 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:49.526009 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bc45760b-8746-44c5-8eed-405319afcc44-console-oauth-config\") pod \"console-6cf87df8dd-wwg7f\" (UID: \"bc45760b-8746-44c5-8eed-405319afcc44\") " pod="openshift-console/console-6cf87df8dd-wwg7f" Apr 22 19:14:49.526045 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:49.526037 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc45760b-8746-44c5-8eed-405319afcc44-console-serving-cert\") pod \"console-6cf87df8dd-wwg7f\" (UID: \"bc45760b-8746-44c5-8eed-405319afcc44\") " pod="openshift-console/console-6cf87df8dd-wwg7f" Apr 22 19:14:49.526143 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:49.526051 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc45760b-8746-44c5-8eed-405319afcc44-trusted-ca-bundle\") pod \"console-6cf87df8dd-wwg7f\" (UID: \"bc45760b-8746-44c5-8eed-405319afcc44\") " pod="openshift-console/console-6cf87df8dd-wwg7f" Apr 22 19:14:49.526143 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:49.526070 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bc45760b-8746-44c5-8eed-405319afcc44-oauth-serving-cert\") pod \"console-6cf87df8dd-wwg7f\" (UID: \"bc45760b-8746-44c5-8eed-405319afcc44\") " pod="openshift-console/console-6cf87df8dd-wwg7f" Apr 22 19:14:49.526143 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:49.526085 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qplg6\" (UniqueName: \"kubernetes.io/projected/bc45760b-8746-44c5-8eed-405319afcc44-kube-api-access-qplg6\") pod \"console-6cf87df8dd-wwg7f\" (UID: 
\"bc45760b-8746-44c5-8eed-405319afcc44\") " pod="openshift-console/console-6cf87df8dd-wwg7f" Apr 22 19:14:49.531472 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:49.531449 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6cf87df8dd-wwg7f"] Apr 22 19:14:49.627071 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:49.627034 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bc45760b-8746-44c5-8eed-405319afcc44-console-oauth-config\") pod \"console-6cf87df8dd-wwg7f\" (UID: \"bc45760b-8746-44c5-8eed-405319afcc44\") " pod="openshift-console/console-6cf87df8dd-wwg7f" Apr 22 19:14:49.627259 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:49.627083 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc45760b-8746-44c5-8eed-405319afcc44-console-serving-cert\") pod \"console-6cf87df8dd-wwg7f\" (UID: \"bc45760b-8746-44c5-8eed-405319afcc44\") " pod="openshift-console/console-6cf87df8dd-wwg7f" Apr 22 19:14:49.627259 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:49.627125 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc45760b-8746-44c5-8eed-405319afcc44-trusted-ca-bundle\") pod \"console-6cf87df8dd-wwg7f\" (UID: \"bc45760b-8746-44c5-8eed-405319afcc44\") " pod="openshift-console/console-6cf87df8dd-wwg7f" Apr 22 19:14:49.627259 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:49.627149 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bc45760b-8746-44c5-8eed-405319afcc44-oauth-serving-cert\") pod \"console-6cf87df8dd-wwg7f\" (UID: \"bc45760b-8746-44c5-8eed-405319afcc44\") " pod="openshift-console/console-6cf87df8dd-wwg7f" Apr 22 19:14:49.627259 ip-10-0-134-22 kubenswrapper[2569]: 
I0422 19:14:49.627166 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qplg6\" (UniqueName: \"kubernetes.io/projected/bc45760b-8746-44c5-8eed-405319afcc44-kube-api-access-qplg6\") pod \"console-6cf87df8dd-wwg7f\" (UID: \"bc45760b-8746-44c5-8eed-405319afcc44\") " pod="openshift-console/console-6cf87df8dd-wwg7f" Apr 22 19:14:49.627259 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:49.627222 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bc45760b-8746-44c5-8eed-405319afcc44-console-config\") pod \"console-6cf87df8dd-wwg7f\" (UID: \"bc45760b-8746-44c5-8eed-405319afcc44\") " pod="openshift-console/console-6cf87df8dd-wwg7f" Apr 22 19:14:49.627513 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:49.627308 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bc45760b-8746-44c5-8eed-405319afcc44-service-ca\") pod \"console-6cf87df8dd-wwg7f\" (UID: \"bc45760b-8746-44c5-8eed-405319afcc44\") " pod="openshift-console/console-6cf87df8dd-wwg7f" Apr 22 19:14:49.627951 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:49.627913 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bc45760b-8746-44c5-8eed-405319afcc44-oauth-serving-cert\") pod \"console-6cf87df8dd-wwg7f\" (UID: \"bc45760b-8746-44c5-8eed-405319afcc44\") " pod="openshift-console/console-6cf87df8dd-wwg7f" Apr 22 19:14:49.628083 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:49.628064 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bc45760b-8746-44c5-8eed-405319afcc44-service-ca\") pod \"console-6cf87df8dd-wwg7f\" (UID: \"bc45760b-8746-44c5-8eed-405319afcc44\") " pod="openshift-console/console-6cf87df8dd-wwg7f" Apr 22 19:14:49.628153 
ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:49.628086 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc45760b-8746-44c5-8eed-405319afcc44-trusted-ca-bundle\") pod \"console-6cf87df8dd-wwg7f\" (UID: \"bc45760b-8746-44c5-8eed-405319afcc44\") " pod="openshift-console/console-6cf87df8dd-wwg7f" Apr 22 19:14:49.628153 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:49.628089 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bc45760b-8746-44c5-8eed-405319afcc44-console-config\") pod \"console-6cf87df8dd-wwg7f\" (UID: \"bc45760b-8746-44c5-8eed-405319afcc44\") " pod="openshift-console/console-6cf87df8dd-wwg7f" Apr 22 19:14:49.629708 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:49.629679 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bc45760b-8746-44c5-8eed-405319afcc44-console-oauth-config\") pod \"console-6cf87df8dd-wwg7f\" (UID: \"bc45760b-8746-44c5-8eed-405319afcc44\") " pod="openshift-console/console-6cf87df8dd-wwg7f" Apr 22 19:14:49.629898 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:49.629880 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bc45760b-8746-44c5-8eed-405319afcc44-console-serving-cert\") pod \"console-6cf87df8dd-wwg7f\" (UID: \"bc45760b-8746-44c5-8eed-405319afcc44\") " pod="openshift-console/console-6cf87df8dd-wwg7f" Apr 22 19:14:49.639042 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:49.639023 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qplg6\" (UniqueName: \"kubernetes.io/projected/bc45760b-8746-44c5-8eed-405319afcc44-kube-api-access-qplg6\") pod \"console-6cf87df8dd-wwg7f\" (UID: \"bc45760b-8746-44c5-8eed-405319afcc44\") " 
pod="openshift-console/console-6cf87df8dd-wwg7f" Apr 22 19:14:49.818279 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:49.818178 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cf87df8dd-wwg7f" Apr 22 19:14:49.942022 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:49.942000 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6cf87df8dd-wwg7f"] Apr 22 19:14:49.944645 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:14:49.944610 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc45760b_8746_44c5_8eed_405319afcc44.slice/crio-46bcd3d53d60e1547c5b07fc1b12c7cbcf08293d9e09fe59b086ac991f03158f WatchSource:0}: Error finding container 46bcd3d53d60e1547c5b07fc1b12c7cbcf08293d9e09fe59b086ac991f03158f: Status 404 returned error can't find the container with id 46bcd3d53d60e1547c5b07fc1b12c7cbcf08293d9e09fe59b086ac991f03158f Apr 22 19:14:49.946342 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:49.946324 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:14:50.706578 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:50.706545 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cf87df8dd-wwg7f" event={"ID":"bc45760b-8746-44c5-8eed-405319afcc44","Type":"ContainerStarted","Data":"eb0dcdc70afe9bd9e6997302a9c6f3019c59cbccb5151cd51e76a59f64dd1412"} Apr 22 19:14:50.706977 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:50.706583 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cf87df8dd-wwg7f" event={"ID":"bc45760b-8746-44c5-8eed-405319afcc44","Type":"ContainerStarted","Data":"46bcd3d53d60e1547c5b07fc1b12c7cbcf08293d9e09fe59b086ac991f03158f"} Apr 22 19:14:50.726363 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:50.726319 2569 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-console/console-6cf87df8dd-wwg7f" podStartSLOduration=1.726304343 podStartE2EDuration="1.726304343s" podCreationTimestamp="2026-04-22 19:14:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:14:50.72447899 +0000 UTC m=+499.089893827" watchObservedRunningTime="2026-04-22 19:14:50.726304343 +0000 UTC m=+499.091719178" Apr 22 19:14:56.514578 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:56.514547 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-7r27j"] Apr 22 19:14:56.517823 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:56.517807 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7r27j" Apr 22 19:14:56.520496 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:56.520475 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 19:14:56.525442 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:56.525419 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7r27j"] Apr 22 19:14:56.581620 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:56.581597 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ed711052-aeb3-4d51-9f5f-54731300179f-kubelet-config\") pod \"global-pull-secret-syncer-7r27j\" (UID: \"ed711052-aeb3-4d51-9f5f-54731300179f\") " pod="kube-system/global-pull-secret-syncer-7r27j" Apr 22 19:14:56.581748 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:56.581642 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ed711052-aeb3-4d51-9f5f-54731300179f-original-pull-secret\") pod 
\"global-pull-secret-syncer-7r27j\" (UID: \"ed711052-aeb3-4d51-9f5f-54731300179f\") " pod="kube-system/global-pull-secret-syncer-7r27j" Apr 22 19:14:56.581748 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:56.581669 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ed711052-aeb3-4d51-9f5f-54731300179f-dbus\") pod \"global-pull-secret-syncer-7r27j\" (UID: \"ed711052-aeb3-4d51-9f5f-54731300179f\") " pod="kube-system/global-pull-secret-syncer-7r27j" Apr 22 19:14:56.682867 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:56.682835 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ed711052-aeb3-4d51-9f5f-54731300179f-dbus\") pod \"global-pull-secret-syncer-7r27j\" (UID: \"ed711052-aeb3-4d51-9f5f-54731300179f\") " pod="kube-system/global-pull-secret-syncer-7r27j" Apr 22 19:14:56.682979 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:56.682907 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ed711052-aeb3-4d51-9f5f-54731300179f-kubelet-config\") pod \"global-pull-secret-syncer-7r27j\" (UID: \"ed711052-aeb3-4d51-9f5f-54731300179f\") " pod="kube-system/global-pull-secret-syncer-7r27j" Apr 22 19:14:56.682979 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:56.682936 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ed711052-aeb3-4d51-9f5f-54731300179f-original-pull-secret\") pod \"global-pull-secret-syncer-7r27j\" (UID: \"ed711052-aeb3-4d51-9f5f-54731300179f\") " pod="kube-system/global-pull-secret-syncer-7r27j" Apr 22 19:14:56.683050 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:56.683009 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/ed711052-aeb3-4d51-9f5f-54731300179f-dbus\") pod \"global-pull-secret-syncer-7r27j\" (UID: \"ed711052-aeb3-4d51-9f5f-54731300179f\") " pod="kube-system/global-pull-secret-syncer-7r27j" Apr 22 19:14:56.683050 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:56.683015 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ed711052-aeb3-4d51-9f5f-54731300179f-kubelet-config\") pod \"global-pull-secret-syncer-7r27j\" (UID: \"ed711052-aeb3-4d51-9f5f-54731300179f\") " pod="kube-system/global-pull-secret-syncer-7r27j" Apr 22 19:14:56.685271 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:56.685247 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ed711052-aeb3-4d51-9f5f-54731300179f-original-pull-secret\") pod \"global-pull-secret-syncer-7r27j\" (UID: \"ed711052-aeb3-4d51-9f5f-54731300179f\") " pod="kube-system/global-pull-secret-syncer-7r27j" Apr 22 19:14:56.828173 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:56.828103 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-7r27j" Apr 22 19:14:56.947434 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:56.947401 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7r27j"] Apr 22 19:14:56.950327 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:14:56.950284 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded711052_aeb3_4d51_9f5f_54731300179f.slice/crio-1a1f7214a6f560675d364d33f2712aff93b2c74934a95283f4868f6b57882d98 WatchSource:0}: Error finding container 1a1f7214a6f560675d364d33f2712aff93b2c74934a95283f4868f6b57882d98: Status 404 returned error can't find the container with id 1a1f7214a6f560675d364d33f2712aff93b2c74934a95283f4868f6b57882d98 Apr 22 19:14:57.728435 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:57.728391 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7r27j" event={"ID":"ed711052-aeb3-4d51-9f5f-54731300179f","Type":"ContainerStarted","Data":"1a1f7214a6f560675d364d33f2712aff93b2c74934a95283f4868f6b57882d98"} Apr 22 19:14:59.819309 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:59.819214 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6cf87df8dd-wwg7f" Apr 22 19:14:59.819309 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:59.819254 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6cf87df8dd-wwg7f" Apr 22 19:14:59.824823 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:14:59.824801 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6cf87df8dd-wwg7f" Apr 22 19:15:00.744396 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:00.744364 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6cf87df8dd-wwg7f" Apr 22 
19:15:00.794499 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:00.794441 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74677c5cb-2hlqs"] Apr 22 19:15:01.744244 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:01.744163 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7r27j" event={"ID":"ed711052-aeb3-4d51-9f5f-54731300179f","Type":"ContainerStarted","Data":"54ead7d0cdeef5673a364ea9a09cc0b8a5de76fbe8467d1f85130cafcd4d0478"} Apr 22 19:15:01.760513 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:01.760461 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-7r27j" podStartSLOduration=1.285256409 podStartE2EDuration="5.760447821s" podCreationTimestamp="2026-04-22 19:14:56 +0000 UTC" firstStartedPulling="2026-04-22 19:14:56.952064499 +0000 UTC m=+505.317479327" lastFinishedPulling="2026-04-22 19:15:01.427255925 +0000 UTC m=+509.792670739" observedRunningTime="2026-04-22 19:15:01.760307754 +0000 UTC m=+510.125722601" watchObservedRunningTime="2026-04-22 19:15:01.760447821 +0000 UTC m=+510.125862647" Apr 22 19:15:25.819386 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:25.819284 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-74677c5cb-2hlqs" podUID="f2d2c795-3318-4a43-9535-643bee61b20e" containerName="console" containerID="cri-o://6a4f91b4a86792ebcceafe1406b98afa2e24a4cedbfa7a423f56000c018b98db" gracePeriod=15 Apr 22 19:15:26.059068 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:26.059042 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74677c5cb-2hlqs_f2d2c795-3318-4a43-9535-643bee61b20e/console/0.log" Apr 22 19:15:26.059191 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:26.059103 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-74677c5cb-2hlqs" Apr 22 19:15:26.244891 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:26.244856 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f2d2c795-3318-4a43-9535-643bee61b20e-oauth-serving-cert\") pod \"f2d2c795-3318-4a43-9535-643bee61b20e\" (UID: \"f2d2c795-3318-4a43-9535-643bee61b20e\") " Apr 22 19:15:26.245090 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:26.244925 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f2d2c795-3318-4a43-9535-643bee61b20e-console-config\") pod \"f2d2c795-3318-4a43-9535-643bee61b20e\" (UID: \"f2d2c795-3318-4a43-9535-643bee61b20e\") " Apr 22 19:15:26.245090 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:26.244950 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2d2c795-3318-4a43-9535-643bee61b20e-trusted-ca-bundle\") pod \"f2d2c795-3318-4a43-9535-643bee61b20e\" (UID: \"f2d2c795-3318-4a43-9535-643bee61b20e\") " Apr 22 19:15:26.245090 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:26.244994 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f2d2c795-3318-4a43-9535-643bee61b20e-service-ca\") pod \"f2d2c795-3318-4a43-9535-643bee61b20e\" (UID: \"f2d2c795-3318-4a43-9535-643bee61b20e\") " Apr 22 19:15:26.245090 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:26.245048 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95b7m\" (UniqueName: \"kubernetes.io/projected/f2d2c795-3318-4a43-9535-643bee61b20e-kube-api-access-95b7m\") pod \"f2d2c795-3318-4a43-9535-643bee61b20e\" (UID: \"f2d2c795-3318-4a43-9535-643bee61b20e\") " Apr 22 19:15:26.245289 ip-10-0-134-22 
kubenswrapper[2569]: I0422 19:15:26.245092 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2d2c795-3318-4a43-9535-643bee61b20e-console-serving-cert\") pod \"f2d2c795-3318-4a43-9535-643bee61b20e\" (UID: \"f2d2c795-3318-4a43-9535-643bee61b20e\") " Apr 22 19:15:26.245289 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:26.245120 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f2d2c795-3318-4a43-9535-643bee61b20e-console-oauth-config\") pod \"f2d2c795-3318-4a43-9535-643bee61b20e\" (UID: \"f2d2c795-3318-4a43-9535-643bee61b20e\") " Apr 22 19:15:26.245391 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:26.245330 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2d2c795-3318-4a43-9535-643bee61b20e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f2d2c795-3318-4a43-9535-643bee61b20e" (UID: "f2d2c795-3318-4a43-9535-643bee61b20e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:15:26.245391 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:26.245379 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2d2c795-3318-4a43-9535-643bee61b20e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f2d2c795-3318-4a43-9535-643bee61b20e" (UID: "f2d2c795-3318-4a43-9535-643bee61b20e"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:15:26.245391 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:26.245372 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2d2c795-3318-4a43-9535-643bee61b20e-service-ca" (OuterVolumeSpecName: "service-ca") pod "f2d2c795-3318-4a43-9535-643bee61b20e" (UID: "f2d2c795-3318-4a43-9535-643bee61b20e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:15:26.245583 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:26.245557 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2d2c795-3318-4a43-9535-643bee61b20e-console-config" (OuterVolumeSpecName: "console-config") pod "f2d2c795-3318-4a43-9535-643bee61b20e" (UID: "f2d2c795-3318-4a43-9535-643bee61b20e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:15:26.247394 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:26.247371 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2d2c795-3318-4a43-9535-643bee61b20e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f2d2c795-3318-4a43-9535-643bee61b20e" (UID: "f2d2c795-3318-4a43-9535-643bee61b20e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:15:26.247490 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:26.247459 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2d2c795-3318-4a43-9535-643bee61b20e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f2d2c795-3318-4a43-9535-643bee61b20e" (UID: "f2d2c795-3318-4a43-9535-643bee61b20e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:15:26.247530 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:26.247507 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2d2c795-3318-4a43-9535-643bee61b20e-kube-api-access-95b7m" (OuterVolumeSpecName: "kube-api-access-95b7m") pod "f2d2c795-3318-4a43-9535-643bee61b20e" (UID: "f2d2c795-3318-4a43-9535-643bee61b20e"). InnerVolumeSpecName "kube-api-access-95b7m". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:15:26.346700 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:26.346663 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-95b7m\" (UniqueName: \"kubernetes.io/projected/f2d2c795-3318-4a43-9535-643bee61b20e-kube-api-access-95b7m\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\""
Apr 22 19:15:26.346700 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:26.346694 2569 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2d2c795-3318-4a43-9535-643bee61b20e-console-serving-cert\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\""
Apr 22 19:15:26.346700 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:26.346708 2569 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f2d2c795-3318-4a43-9535-643bee61b20e-console-oauth-config\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\""
Apr 22 19:15:26.346951 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:26.346722 2569 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f2d2c795-3318-4a43-9535-643bee61b20e-oauth-serving-cert\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\""
Apr 22 19:15:26.346951 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:26.346732 2569 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f2d2c795-3318-4a43-9535-643bee61b20e-console-config\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\""
Apr 22 19:15:26.346951 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:26.346740 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2d2c795-3318-4a43-9535-643bee61b20e-trusted-ca-bundle\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\""
Apr 22 19:15:26.346951 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:26.346769 2569 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f2d2c795-3318-4a43-9535-643bee61b20e-service-ca\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\""
Apr 22 19:15:26.821431 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:26.821406 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-74677c5cb-2hlqs_f2d2c795-3318-4a43-9535-643bee61b20e/console/0.log"
Apr 22 19:15:26.821871 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:26.821447 2569 generic.go:358] "Generic (PLEG): container finished" podID="f2d2c795-3318-4a43-9535-643bee61b20e" containerID="6a4f91b4a86792ebcceafe1406b98afa2e24a4cedbfa7a423f56000c018b98db" exitCode=2
Apr 22 19:15:26.821871 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:26.821484 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74677c5cb-2hlqs" event={"ID":"f2d2c795-3318-4a43-9535-643bee61b20e","Type":"ContainerDied","Data":"6a4f91b4a86792ebcceafe1406b98afa2e24a4cedbfa7a423f56000c018b98db"}
Apr 22 19:15:26.821871 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:26.821516 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74677c5cb-2hlqs"
Apr 22 19:15:26.821871 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:26.821523 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74677c5cb-2hlqs" event={"ID":"f2d2c795-3318-4a43-9535-643bee61b20e","Type":"ContainerDied","Data":"2862cf6bf8727d4a1d5ce28e8ab121216bed0f1b98bf5dba8f98910c3d775223"}
Apr 22 19:15:26.821871 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:26.821540 2569 scope.go:117] "RemoveContainer" containerID="6a4f91b4a86792ebcceafe1406b98afa2e24a4cedbfa7a423f56000c018b98db"
Apr 22 19:15:26.830656 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:26.830639 2569 scope.go:117] "RemoveContainer" containerID="6a4f91b4a86792ebcceafe1406b98afa2e24a4cedbfa7a423f56000c018b98db"
Apr 22 19:15:26.830935 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:15:26.830915 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a4f91b4a86792ebcceafe1406b98afa2e24a4cedbfa7a423f56000c018b98db\": container with ID starting with 6a4f91b4a86792ebcceafe1406b98afa2e24a4cedbfa7a423f56000c018b98db not found: ID does not exist" containerID="6a4f91b4a86792ebcceafe1406b98afa2e24a4cedbfa7a423f56000c018b98db"
Apr 22 19:15:26.830993 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:26.830949 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a4f91b4a86792ebcceafe1406b98afa2e24a4cedbfa7a423f56000c018b98db"} err="failed to get container status \"6a4f91b4a86792ebcceafe1406b98afa2e24a4cedbfa7a423f56000c018b98db\": rpc error: code = NotFound desc = could not find container \"6a4f91b4a86792ebcceafe1406b98afa2e24a4cedbfa7a423f56000c018b98db\": container with ID starting with 6a4f91b4a86792ebcceafe1406b98afa2e24a4cedbfa7a423f56000c018b98db not found: ID does not exist"
Apr 22 19:15:26.842245 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:26.842219 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-74677c5cb-2hlqs"]
Apr 22 19:15:26.844163 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:26.844142 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-74677c5cb-2hlqs"]
Apr 22 19:15:28.223518 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:28.223479 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2d2c795-3318-4a43-9535-643bee61b20e" path="/var/lib/kubelet/pods/f2d2c795-3318-4a43-9535-643bee61b20e/volumes"
Apr 22 19:15:36.443855 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:36.443822 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-np7hw"]
Apr 22 19:15:36.444237 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:36.444116 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f2d2c795-3318-4a43-9535-643bee61b20e" containerName="console"
Apr 22 19:15:36.444237 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:36.444127 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2d2c795-3318-4a43-9535-643bee61b20e" containerName="console"
Apr 22 19:15:36.444237 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:36.444191 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="f2d2c795-3318-4a43-9535-643bee61b20e" containerName="console"
Apr 22 19:15:36.449289 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:36.449273 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-np7hw"
Apr 22 19:15:36.451846 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:36.451823 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-ghb47\""
Apr 22 19:15:36.451973 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:36.451870 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 22 19:15:36.452912 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:36.452895 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 22 19:15:36.457505 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:36.457485 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-np7hw"]
Apr 22 19:15:36.528640 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:36.528615 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2970749a-9e51-4a47-8c73-e3fb046e2bca-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-np7hw\" (UID: \"2970749a-9e51-4a47-8c73-e3fb046e2bca\") " pod="cert-manager/cert-manager-webhook-587ccfb98-np7hw"
Apr 22 19:15:36.528815 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:36.528709 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xq48\" (UniqueName: \"kubernetes.io/projected/2970749a-9e51-4a47-8c73-e3fb046e2bca-kube-api-access-2xq48\") pod \"cert-manager-webhook-587ccfb98-np7hw\" (UID: \"2970749a-9e51-4a47-8c73-e3fb046e2bca\") " pod="cert-manager/cert-manager-webhook-587ccfb98-np7hw"
Apr 22 19:15:36.629108 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:36.629079 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2xq48\" (UniqueName: \"kubernetes.io/projected/2970749a-9e51-4a47-8c73-e3fb046e2bca-kube-api-access-2xq48\") pod \"cert-manager-webhook-587ccfb98-np7hw\" (UID: \"2970749a-9e51-4a47-8c73-e3fb046e2bca\") " pod="cert-manager/cert-manager-webhook-587ccfb98-np7hw"
Apr 22 19:15:36.629231 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:36.629119 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2970749a-9e51-4a47-8c73-e3fb046e2bca-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-np7hw\" (UID: \"2970749a-9e51-4a47-8c73-e3fb046e2bca\") " pod="cert-manager/cert-manager-webhook-587ccfb98-np7hw"
Apr 22 19:15:36.637700 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:36.637669 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2970749a-9e51-4a47-8c73-e3fb046e2bca-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-np7hw\" (UID: \"2970749a-9e51-4a47-8c73-e3fb046e2bca\") " pod="cert-manager/cert-manager-webhook-587ccfb98-np7hw"
Apr 22 19:15:36.637818 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:36.637724 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xq48\" (UniqueName: \"kubernetes.io/projected/2970749a-9e51-4a47-8c73-e3fb046e2bca-kube-api-access-2xq48\") pod \"cert-manager-webhook-587ccfb98-np7hw\" (UID: \"2970749a-9e51-4a47-8c73-e3fb046e2bca\") " pod="cert-manager/cert-manager-webhook-587ccfb98-np7hw"
Apr 22 19:15:36.758788 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:36.758696 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-np7hw"
Apr 22 19:15:36.882358 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:36.882335 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-np7hw"]
Apr 22 19:15:36.884921 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:15:36.884895 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2970749a_9e51_4a47_8c73_e3fb046e2bca.slice/crio-f0d3ee9b8c7ce4309718890218f9544bdb30a7583321564da8afc4f1b9ce9ba2 WatchSource:0}: Error finding container f0d3ee9b8c7ce4309718890218f9544bdb30a7583321564da8afc4f1b9ce9ba2: Status 404 returned error can't find the container with id f0d3ee9b8c7ce4309718890218f9544bdb30a7583321564da8afc4f1b9ce9ba2
Apr 22 19:15:37.856642 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:37.856570 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-np7hw" event={"ID":"2970749a-9e51-4a47-8c73-e3fb046e2bca","Type":"ContainerStarted","Data":"f0d3ee9b8c7ce4309718890218f9544bdb30a7583321564da8afc4f1b9ce9ba2"}
Apr 22 19:15:38.695429 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:38.695395 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-cgqvt"]
Apr 22 19:15:38.700003 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:38.699985 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-cgqvt"
Apr 22 19:15:38.702873 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:38.702849 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-wgtvk\""
Apr 22 19:15:38.709875 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:38.709853 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-cgqvt"]
Apr 22 19:15:38.748128 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:38.748096 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdhm7\" (UniqueName: \"kubernetes.io/projected/efd48ad1-daef-4219-a16e-32f82f71dcc1-kube-api-access-vdhm7\") pod \"cert-manager-cainjector-68b757865b-cgqvt\" (UID: \"efd48ad1-daef-4219-a16e-32f82f71dcc1\") " pod="cert-manager/cert-manager-cainjector-68b757865b-cgqvt"
Apr 22 19:15:38.748288 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:38.748169 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/efd48ad1-daef-4219-a16e-32f82f71dcc1-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-cgqvt\" (UID: \"efd48ad1-daef-4219-a16e-32f82f71dcc1\") " pod="cert-manager/cert-manager-cainjector-68b757865b-cgqvt"
Apr 22 19:15:38.849514 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:38.849478 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/efd48ad1-daef-4219-a16e-32f82f71dcc1-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-cgqvt\" (UID: \"efd48ad1-daef-4219-a16e-32f82f71dcc1\") " pod="cert-manager/cert-manager-cainjector-68b757865b-cgqvt"
Apr 22 19:15:38.849682 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:38.849552 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vdhm7\" (UniqueName: \"kubernetes.io/projected/efd48ad1-daef-4219-a16e-32f82f71dcc1-kube-api-access-vdhm7\") pod \"cert-manager-cainjector-68b757865b-cgqvt\" (UID: \"efd48ad1-daef-4219-a16e-32f82f71dcc1\") " pod="cert-manager/cert-manager-cainjector-68b757865b-cgqvt"
Apr 22 19:15:38.862127 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:38.862100 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/efd48ad1-daef-4219-a16e-32f82f71dcc1-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-cgqvt\" (UID: \"efd48ad1-daef-4219-a16e-32f82f71dcc1\") " pod="cert-manager/cert-manager-cainjector-68b757865b-cgqvt"
Apr 22 19:15:38.862482 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:38.862263 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdhm7\" (UniqueName: \"kubernetes.io/projected/efd48ad1-daef-4219-a16e-32f82f71dcc1-kube-api-access-vdhm7\") pod \"cert-manager-cainjector-68b757865b-cgqvt\" (UID: \"efd48ad1-daef-4219-a16e-32f82f71dcc1\") " pod="cert-manager/cert-manager-cainjector-68b757865b-cgqvt"
Apr 22 19:15:39.011869 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:39.011781 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-cgqvt"
Apr 22 19:15:39.864690 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:39.864661 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-cgqvt"]
Apr 22 19:15:39.867694 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:15:39.867661 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefd48ad1_daef_4219_a16e_32f82f71dcc1.slice/crio-2ac3f03ffec940fd5c9d52cb6c2f0d74d3483db34441f236987f002d80e6bc36 WatchSource:0}: Error finding container 2ac3f03ffec940fd5c9d52cb6c2f0d74d3483db34441f236987f002d80e6bc36: Status 404 returned error can't find the container with id 2ac3f03ffec940fd5c9d52cb6c2f0d74d3483db34441f236987f002d80e6bc36
Apr 22 19:15:40.866569 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:40.866535 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-cgqvt" event={"ID":"efd48ad1-daef-4219-a16e-32f82f71dcc1","Type":"ContainerStarted","Data":"da5753b10dffd5939bef39a09fd39d96f173673c0de171feb1e696352dd6d3d1"}
Apr 22 19:15:40.866569 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:40.866570 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-cgqvt" event={"ID":"efd48ad1-daef-4219-a16e-32f82f71dcc1","Type":"ContainerStarted","Data":"2ac3f03ffec940fd5c9d52cb6c2f0d74d3483db34441f236987f002d80e6bc36"}
Apr 22 19:15:40.867914 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:40.867884 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-np7hw" event={"ID":"2970749a-9e51-4a47-8c73-e3fb046e2bca","Type":"ContainerStarted","Data":"50213301a34e093230ed660d7f3261b919122ca29f26bab9b0f357c53efad942"}
Apr 22 19:15:40.868010 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:40.867996 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-np7hw"
Apr 22 19:15:40.906488 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:40.903308 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-cgqvt" podStartSLOduration=2.9032893939999997 podStartE2EDuration="2.903289394s" podCreationTimestamp="2026-04-22 19:15:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:15:40.900014052 +0000 UTC m=+549.265428888" watchObservedRunningTime="2026-04-22 19:15:40.903289394 +0000 UTC m=+549.268704234"
Apr 22 19:15:40.925562 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:40.925501 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-np7hw" podStartSLOduration=2.005106172 podStartE2EDuration="4.925484174s" podCreationTimestamp="2026-04-22 19:15:36 +0000 UTC" firstStartedPulling="2026-04-22 19:15:36.887203551 +0000 UTC m=+545.252618380" lastFinishedPulling="2026-04-22 19:15:39.807581554 +0000 UTC m=+548.172996382" observedRunningTime="2026-04-22 19:15:40.923630919 +0000 UTC m=+549.289045756" watchObservedRunningTime="2026-04-22 19:15:40.925484174 +0000 UTC m=+549.290899011"
Apr 22 19:15:46.873010 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:15:46.872980 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-np7hw"
Apr 22 19:16:06.808208 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:06.808173 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-zbbkh"]
Apr 22 19:16:06.811442 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:06.811426 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-zbbkh"
Apr 22 19:16:06.821676 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:06.821658 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 22 19:16:06.824077 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:06.824058 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 22 19:16:06.824077 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:06.824070 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 22 19:16:06.824250 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:06.824070 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 22 19:16:06.824380 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:06.824366 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-dvr7t\""
Apr 22 19:16:06.849110 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:06.849082 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-zbbkh"]
Apr 22 19:16:06.863808 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:06.863785 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d246c5f2-f55d-478c-84f5-ee7646639611-webhook-cert\") pod \"opendatahub-operator-controller-manager-6c9fd8c974-zbbkh\" (UID: \"d246c5f2-f55d-478c-84f5-ee7646639611\") " pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-zbbkh"
Apr 22 19:16:06.863941 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:06.863834 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d246c5f2-f55d-478c-84f5-ee7646639611-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6c9fd8c974-zbbkh\" (UID: \"d246c5f2-f55d-478c-84f5-ee7646639611\") " pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-zbbkh"
Apr 22 19:16:06.863941 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:06.863864 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggpvg\" (UniqueName: \"kubernetes.io/projected/d246c5f2-f55d-478c-84f5-ee7646639611-kube-api-access-ggpvg\") pod \"opendatahub-operator-controller-manager-6c9fd8c974-zbbkh\" (UID: \"d246c5f2-f55d-478c-84f5-ee7646639611\") " pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-zbbkh"
Apr 22 19:16:06.964493 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:06.964464 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d246c5f2-f55d-478c-84f5-ee7646639611-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6c9fd8c974-zbbkh\" (UID: \"d246c5f2-f55d-478c-84f5-ee7646639611\") " pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-zbbkh"
Apr 22 19:16:06.964663 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:06.964517 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggpvg\" (UniqueName: \"kubernetes.io/projected/d246c5f2-f55d-478c-84f5-ee7646639611-kube-api-access-ggpvg\") pod \"opendatahub-operator-controller-manager-6c9fd8c974-zbbkh\" (UID: \"d246c5f2-f55d-478c-84f5-ee7646639611\") " pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-zbbkh"
Apr 22 19:16:06.964663 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:06.964581 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d246c5f2-f55d-478c-84f5-ee7646639611-webhook-cert\") pod \"opendatahub-operator-controller-manager-6c9fd8c974-zbbkh\" (UID: \"d246c5f2-f55d-478c-84f5-ee7646639611\") " pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-zbbkh"
Apr 22 19:16:06.967094 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:06.967071 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d246c5f2-f55d-478c-84f5-ee7646639611-webhook-cert\") pod \"opendatahub-operator-controller-manager-6c9fd8c974-zbbkh\" (UID: \"d246c5f2-f55d-478c-84f5-ee7646639611\") " pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-zbbkh"
Apr 22 19:16:06.967203 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:06.967117 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d246c5f2-f55d-478c-84f5-ee7646639611-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6c9fd8c974-zbbkh\" (UID: \"d246c5f2-f55d-478c-84f5-ee7646639611\") " pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-zbbkh"
Apr 22 19:16:06.978990 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:06.978964 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggpvg\" (UniqueName: \"kubernetes.io/projected/d246c5f2-f55d-478c-84f5-ee7646639611-kube-api-access-ggpvg\") pod \"opendatahub-operator-controller-manager-6c9fd8c974-zbbkh\" (UID: \"d246c5f2-f55d-478c-84f5-ee7646639611\") " pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-zbbkh"
Apr 22 19:16:07.121566 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:07.121470 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-zbbkh"
Apr 22 19:16:07.255705 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:07.255676 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-zbbkh"]
Apr 22 19:16:07.257726 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:16:07.257699 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd246c5f2_f55d_478c_84f5_ee7646639611.slice/crio-cc41cb907080df85b1b4ff5d8705e11ba13f6d962cb8c98cb26e03038783407f WatchSource:0}: Error finding container cc41cb907080df85b1b4ff5d8705e11ba13f6d962cb8c98cb26e03038783407f: Status 404 returned error can't find the container with id cc41cb907080df85b1b4ff5d8705e11ba13f6d962cb8c98cb26e03038783407f
Apr 22 19:16:07.954464 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:07.954423 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-zbbkh" event={"ID":"d246c5f2-f55d-478c-84f5-ee7646639611","Type":"ContainerStarted","Data":"cc41cb907080df85b1b4ff5d8705e11ba13f6d962cb8c98cb26e03038783407f"}
Apr 22 19:16:10.967365 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:10.967329 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-zbbkh" event={"ID":"d246c5f2-f55d-478c-84f5-ee7646639611","Type":"ContainerStarted","Data":"c829d246999ad3db75116ecbe8328ed6bd954704058b5b0ecdbdb325df5fec4e"}
Apr 22 19:16:10.967730 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:10.967473 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-zbbkh"
Apr 22 19:16:10.993655 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:10.993604 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-zbbkh" podStartSLOduration=2.155591517 podStartE2EDuration="4.993590985s" podCreationTimestamp="2026-04-22 19:16:06 +0000 UTC" firstStartedPulling="2026-04-22 19:16:07.259435579 +0000 UTC m=+575.624850393" lastFinishedPulling="2026-04-22 19:16:10.097435028 +0000 UTC m=+578.462849861" observedRunningTime="2026-04-22 19:16:10.991956466 +0000 UTC m=+579.357371302" watchObservedRunningTime="2026-04-22 19:16:10.993590985 +0000 UTC m=+579.359005822"
Apr 22 19:16:21.972769 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:21.972718 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6c9fd8c974-zbbkh"
Apr 22 19:16:25.397966 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:25.397932 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-77597c7855-76svf"]
Apr 22 19:16:25.401149 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:25.401133 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-77597c7855-76svf"
Apr 22 19:16:25.403668 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:25.403650 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-9nv99\""
Apr 22 19:16:25.404646 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:25.404619 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 22 19:16:25.404733 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:25.404629 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 22 19:16:25.404733 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:25.404688 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 22 19:16:25.404733 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:25.404629 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 22 19:16:25.411964 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:25.411945 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-77597c7855-76svf"]
Apr 22 19:16:25.511234 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:25.511201 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btjhz\" (UniqueName: \"kubernetes.io/projected/1d8494de-a95f-42d3-9a4d-dac04e26d2b9-kube-api-access-btjhz\") pod \"kube-auth-proxy-77597c7855-76svf\" (UID: \"1d8494de-a95f-42d3-9a4d-dac04e26d2b9\") " pod="openshift-ingress/kube-auth-proxy-77597c7855-76svf"
Apr 22 19:16:25.511234 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:25.511236 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1d8494de-a95f-42d3-9a4d-dac04e26d2b9-tls-certs\") pod \"kube-auth-proxy-77597c7855-76svf\" (UID: \"1d8494de-a95f-42d3-9a4d-dac04e26d2b9\") " pod="openshift-ingress/kube-auth-proxy-77597c7855-76svf"
Apr 22 19:16:25.511418 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:25.511327 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1d8494de-a95f-42d3-9a4d-dac04e26d2b9-tmp\") pod \"kube-auth-proxy-77597c7855-76svf\" (UID: \"1d8494de-a95f-42d3-9a4d-dac04e26d2b9\") " pod="openshift-ingress/kube-auth-proxy-77597c7855-76svf"
Apr 22 19:16:25.612538 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:25.612494 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1d8494de-a95f-42d3-9a4d-dac04e26d2b9-tmp\") pod \"kube-auth-proxy-77597c7855-76svf\" (UID: \"1d8494de-a95f-42d3-9a4d-dac04e26d2b9\") " pod="openshift-ingress/kube-auth-proxy-77597c7855-76svf"
Apr 22 19:16:25.612701 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:25.612564 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-btjhz\" (UniqueName: \"kubernetes.io/projected/1d8494de-a95f-42d3-9a4d-dac04e26d2b9-kube-api-access-btjhz\") pod \"kube-auth-proxy-77597c7855-76svf\" (UID: \"1d8494de-a95f-42d3-9a4d-dac04e26d2b9\") " pod="openshift-ingress/kube-auth-proxy-77597c7855-76svf"
Apr 22 19:16:25.612701 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:25.612599 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1d8494de-a95f-42d3-9a4d-dac04e26d2b9-tls-certs\") pod \"kube-auth-proxy-77597c7855-76svf\" (UID: \"1d8494de-a95f-42d3-9a4d-dac04e26d2b9\") " pod="openshift-ingress/kube-auth-proxy-77597c7855-76svf"
Apr 22 19:16:25.615082 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:25.615059 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1d8494de-a95f-42d3-9a4d-dac04e26d2b9-tmp\") pod \"kube-auth-proxy-77597c7855-76svf\" (UID: \"1d8494de-a95f-42d3-9a4d-dac04e26d2b9\") " pod="openshift-ingress/kube-auth-proxy-77597c7855-76svf"
Apr 22 19:16:25.615228 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:25.615211 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1d8494de-a95f-42d3-9a4d-dac04e26d2b9-tls-certs\") pod \"kube-auth-proxy-77597c7855-76svf\" (UID: \"1d8494de-a95f-42d3-9a4d-dac04e26d2b9\") " pod="openshift-ingress/kube-auth-proxy-77597c7855-76svf"
Apr 22 19:16:25.620936 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:25.620915 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-btjhz\" (UniqueName: \"kubernetes.io/projected/1d8494de-a95f-42d3-9a4d-dac04e26d2b9-kube-api-access-btjhz\") pod \"kube-auth-proxy-77597c7855-76svf\" (UID: \"1d8494de-a95f-42d3-9a4d-dac04e26d2b9\") " pod="openshift-ingress/kube-auth-proxy-77597c7855-76svf"
Apr 22 19:16:25.711662 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:25.711636 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-77597c7855-76svf"
Apr 22 19:16:25.831823 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:25.831790 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-77597c7855-76svf"]
Apr 22 19:16:25.834587 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:16:25.834558 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d8494de_a95f_42d3_9a4d_dac04e26d2b9.slice/crio-c1cc64726e5d2debc639d2811e8a68450b2052ac7b37947d18488dbfdb21fa50 WatchSource:0}: Error finding container c1cc64726e5d2debc639d2811e8a68450b2052ac7b37947d18488dbfdb21fa50: Status 404 returned error can't find the container with id c1cc64726e5d2debc639d2811e8a68450b2052ac7b37947d18488dbfdb21fa50
Apr 22 19:16:26.017587 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:26.017512 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-77597c7855-76svf" event={"ID":"1d8494de-a95f-42d3-9a4d-dac04e26d2b9","Type":"ContainerStarted","Data":"c1cc64726e5d2debc639d2811e8a68450b2052ac7b37947d18488dbfdb21fa50"}
Apr 22 19:16:28.406747 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:28.406717 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-nlv9w"]
Apr 22 19:16:28.409916 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:28.409900 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-nlv9w"
Apr 22 19:16:28.412430 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:28.412410 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\""
Apr 22 19:16:28.412430 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:28.412427 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-fbl99\""
Apr 22 19:16:28.417558 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:28.417120 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-nlv9w"]
Apr 22 19:16:28.536974 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:28.536936 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ab59fdb-1873-4f91-8414-d788d9fec057-cert\") pod \"odh-model-controller-858dbf95b8-nlv9w\" (UID: \"5ab59fdb-1873-4f91-8414-d788d9fec057\") " pod="opendatahub/odh-model-controller-858dbf95b8-nlv9w"
Apr 22 19:16:28.537132 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:28.536998 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-httrg\" (UniqueName: \"kubernetes.io/projected/5ab59fdb-1873-4f91-8414-d788d9fec057-kube-api-access-httrg\") pod \"odh-model-controller-858dbf95b8-nlv9w\" (UID: \"5ab59fdb-1873-4f91-8414-d788d9fec057\") " pod="opendatahub/odh-model-controller-858dbf95b8-nlv9w"
Apr 22 19:16:28.638221 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:28.638179 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-httrg\" (UniqueName: \"kubernetes.io/projected/5ab59fdb-1873-4f91-8414-d788d9fec057-kube-api-access-httrg\") pod \"odh-model-controller-858dbf95b8-nlv9w\" (UID: \"5ab59fdb-1873-4f91-8414-d788d9fec057\") " pod="opendatahub/odh-model-controller-858dbf95b8-nlv9w"
Apr 22 19:16:28.638415 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:28.638302 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ab59fdb-1873-4f91-8414-d788d9fec057-cert\") pod \"odh-model-controller-858dbf95b8-nlv9w\" (UID: \"5ab59fdb-1873-4f91-8414-d788d9fec057\") " pod="opendatahub/odh-model-controller-858dbf95b8-nlv9w"
Apr 22 19:16:28.638415 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:16:28.638411 2569 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 22 19:16:28.638522 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:16:28.638475 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ab59fdb-1873-4f91-8414-d788d9fec057-cert podName:5ab59fdb-1873-4f91-8414-d788d9fec057 nodeName:}" failed. No retries permitted until 2026-04-22 19:16:29.138451387 +0000 UTC m=+597.503866221 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ab59fdb-1873-4f91-8414-d788d9fec057-cert") pod "odh-model-controller-858dbf95b8-nlv9w" (UID: "5ab59fdb-1873-4f91-8414-d788d9fec057") : secret "odh-model-controller-webhook-cert" not found Apr 22 19:16:28.657411 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:28.657342 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-httrg\" (UniqueName: \"kubernetes.io/projected/5ab59fdb-1873-4f91-8414-d788d9fec057-kube-api-access-httrg\") pod \"odh-model-controller-858dbf95b8-nlv9w\" (UID: \"5ab59fdb-1873-4f91-8414-d788d9fec057\") " pod="opendatahub/odh-model-controller-858dbf95b8-nlv9w" Apr 22 19:16:29.143719 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:29.143683 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ab59fdb-1873-4f91-8414-d788d9fec057-cert\") pod \"odh-model-controller-858dbf95b8-nlv9w\" (UID: \"5ab59fdb-1873-4f91-8414-d788d9fec057\") " pod="opendatahub/odh-model-controller-858dbf95b8-nlv9w" Apr 22 19:16:29.146657 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:29.146626 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ab59fdb-1873-4f91-8414-d788d9fec057-cert\") pod \"odh-model-controller-858dbf95b8-nlv9w\" (UID: \"5ab59fdb-1873-4f91-8414-d788d9fec057\") " pod="opendatahub/odh-model-controller-858dbf95b8-nlv9w" Apr 22 19:16:29.321135 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:29.321094 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-nlv9w" Apr 22 19:16:29.842391 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:29.842355 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-nlv9w"] Apr 22 19:16:29.846027 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:16:29.846002 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ab59fdb_1873_4f91_8414_d788d9fec057.slice/crio-3c4cf85733c42380a173a9b372dae60f130ea63f78e15210466eb0b3972886ab WatchSource:0}: Error finding container 3c4cf85733c42380a173a9b372dae60f130ea63f78e15210466eb0b3972886ab: Status 404 returned error can't find the container with id 3c4cf85733c42380a173a9b372dae60f130ea63f78e15210466eb0b3972886ab Apr 22 19:16:30.033127 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:30.033020 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-nlv9w" event={"ID":"5ab59fdb-1873-4f91-8414-d788d9fec057","Type":"ContainerStarted","Data":"3c4cf85733c42380a173a9b372dae60f130ea63f78e15210466eb0b3972886ab"} Apr 22 19:16:30.034228 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:30.034203 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-77597c7855-76svf" event={"ID":"1d8494de-a95f-42d3-9a4d-dac04e26d2b9","Type":"ContainerStarted","Data":"1e7c7d95c4cbc5b545efbf4702cdc715b6c0ca261b5224639f80bc7746bae316"} Apr 22 19:16:30.051979 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:30.051928 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-77597c7855-76svf" podStartSLOduration=1.12107599 podStartE2EDuration="5.051916106s" podCreationTimestamp="2026-04-22 19:16:25 +0000 UTC" firstStartedPulling="2026-04-22 19:16:25.836502247 +0000 UTC m=+594.201917060" lastFinishedPulling="2026-04-22 19:16:29.767342347 +0000 UTC 
m=+598.132757176" observedRunningTime="2026-04-22 19:16:30.049546845 +0000 UTC m=+598.414961676" watchObservedRunningTime="2026-04-22 19:16:30.051916106 +0000 UTC m=+598.417330920" Apr 22 19:16:32.126665 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:32.126640 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fl4n2_c9d5594b-5dc2-461d-bd58-496386ced33b/console-operator/2.log" Apr 22 19:16:32.127225 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:32.127208 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fl4n2_c9d5594b-5dc2-461d-bd58-496386ced33b/console-operator/2.log" Apr 22 19:16:32.132441 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:32.132424 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfk46_61b5731d-8883-44c4-a6de-2a90288f2d58/ovn-acl-logging/0.log" Apr 22 19:16:32.133181 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:32.133161 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfk46_61b5731d-8883-44c4-a6de-2a90288f2d58/ovn-acl-logging/0.log" Apr 22 19:16:34.050164 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:34.050130 2569 generic.go:358] "Generic (PLEG): container finished" podID="5ab59fdb-1873-4f91-8414-d788d9fec057" containerID="b5d7cd0743bb054e160c317d0a22c1a9c8271fa417c8e2405ea87fde88462772" exitCode=1 Apr 22 19:16:34.050541 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:34.050181 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-nlv9w" event={"ID":"5ab59fdb-1873-4f91-8414-d788d9fec057","Type":"ContainerDied","Data":"b5d7cd0743bb054e160c317d0a22c1a9c8271fa417c8e2405ea87fde88462772"} Apr 22 19:16:34.050541 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:34.050437 2569 scope.go:117] "RemoveContainer" 
containerID="b5d7cd0743bb054e160c317d0a22c1a9c8271fa417c8e2405ea87fde88462772" Apr 22 19:16:34.326792 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:34.326705 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-7x9f9"] Apr 22 19:16:34.330003 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:34.329987 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-7x9f9" Apr 22 19:16:34.332624 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:34.332594 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 22 19:16:34.332727 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:34.332601 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-vvg8p\"" Apr 22 19:16:34.341744 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:34.341718 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-7x9f9"] Apr 22 19:16:34.493655 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:34.493625 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/63e9696d-09aa-4774-b2f1-5b73cf9ab409-cert\") pod \"kserve-controller-manager-856948b99f-7x9f9\" (UID: \"63e9696d-09aa-4774-b2f1-5b73cf9ab409\") " pod="opendatahub/kserve-controller-manager-856948b99f-7x9f9" Apr 22 19:16:34.493849 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:34.493696 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqmh4\" (UniqueName: \"kubernetes.io/projected/63e9696d-09aa-4774-b2f1-5b73cf9ab409-kube-api-access-kqmh4\") pod \"kserve-controller-manager-856948b99f-7x9f9\" (UID: \"63e9696d-09aa-4774-b2f1-5b73cf9ab409\") " 
pod="opendatahub/kserve-controller-manager-856948b99f-7x9f9" Apr 22 19:16:34.594939 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:34.594868 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/63e9696d-09aa-4774-b2f1-5b73cf9ab409-cert\") pod \"kserve-controller-manager-856948b99f-7x9f9\" (UID: \"63e9696d-09aa-4774-b2f1-5b73cf9ab409\") " pod="opendatahub/kserve-controller-manager-856948b99f-7x9f9" Apr 22 19:16:34.595064 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:34.594953 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kqmh4\" (UniqueName: \"kubernetes.io/projected/63e9696d-09aa-4774-b2f1-5b73cf9ab409-kube-api-access-kqmh4\") pod \"kserve-controller-manager-856948b99f-7x9f9\" (UID: \"63e9696d-09aa-4774-b2f1-5b73cf9ab409\") " pod="opendatahub/kserve-controller-manager-856948b99f-7x9f9" Apr 22 19:16:34.595064 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:16:34.595027 2569 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 22 19:16:34.595133 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:16:34.595098 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63e9696d-09aa-4774-b2f1-5b73cf9ab409-cert podName:63e9696d-09aa-4774-b2f1-5b73cf9ab409 nodeName:}" failed. No retries permitted until 2026-04-22 19:16:35.095078142 +0000 UTC m=+603.460492957 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/63e9696d-09aa-4774-b2f1-5b73cf9ab409-cert") pod "kserve-controller-manager-856948b99f-7x9f9" (UID: "63e9696d-09aa-4774-b2f1-5b73cf9ab409") : secret "kserve-webhook-server-cert" not found Apr 22 19:16:34.610053 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:34.610023 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqmh4\" (UniqueName: \"kubernetes.io/projected/63e9696d-09aa-4774-b2f1-5b73cf9ab409-kube-api-access-kqmh4\") pod \"kserve-controller-manager-856948b99f-7x9f9\" (UID: \"63e9696d-09aa-4774-b2f1-5b73cf9ab409\") " pod="opendatahub/kserve-controller-manager-856948b99f-7x9f9" Apr 22 19:16:35.054676 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:35.054648 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-nlv9w" event={"ID":"5ab59fdb-1873-4f91-8414-d788d9fec057","Type":"ContainerStarted","Data":"1fc86c34b1d4350448401258e5d7fd5c0a68f04e9023b3c3f8a2e8190a49dd2f"} Apr 22 19:16:35.055113 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:35.054816 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-nlv9w" Apr 22 19:16:35.078906 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:35.078855 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-nlv9w" podStartSLOduration=2.5476094419999997 podStartE2EDuration="7.078835519s" podCreationTimestamp="2026-04-22 19:16:28 +0000 UTC" firstStartedPulling="2026-04-22 19:16:29.84747168 +0000 UTC m=+598.212886511" lastFinishedPulling="2026-04-22 19:16:34.378697773 +0000 UTC m=+602.744112588" observedRunningTime="2026-04-22 19:16:35.077798847 +0000 UTC m=+603.443213683" watchObservedRunningTime="2026-04-22 19:16:35.078835519 +0000 UTC m=+603.444250354" Apr 22 19:16:35.098015 ip-10-0-134-22 kubenswrapper[2569]: I0422 
19:16:35.097987 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/63e9696d-09aa-4774-b2f1-5b73cf9ab409-cert\") pod \"kserve-controller-manager-856948b99f-7x9f9\" (UID: \"63e9696d-09aa-4774-b2f1-5b73cf9ab409\") " pod="opendatahub/kserve-controller-manager-856948b99f-7x9f9" Apr 22 19:16:35.100471 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:35.100453 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/63e9696d-09aa-4774-b2f1-5b73cf9ab409-cert\") pod \"kserve-controller-manager-856948b99f-7x9f9\" (UID: \"63e9696d-09aa-4774-b2f1-5b73cf9ab409\") " pod="opendatahub/kserve-controller-manager-856948b99f-7x9f9" Apr 22 19:16:35.241633 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:35.241602 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-7x9f9" Apr 22 19:16:35.369485 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:35.369404 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-7x9f9"] Apr 22 19:16:35.372326 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:16:35.372294 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63e9696d_09aa_4774_b2f1_5b73cf9ab409.slice/crio-65d9639bc806ff0eb290f9af994c745f72b7c44a127ef18f87a9fcc12ba16830 WatchSource:0}: Error finding container 65d9639bc806ff0eb290f9af994c745f72b7c44a127ef18f87a9fcc12ba16830: Status 404 returned error can't find the container with id 65d9639bc806ff0eb290f9af994c745f72b7c44a127ef18f87a9fcc12ba16830 Apr 22 19:16:36.059394 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:36.059355 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-7x9f9" 
event={"ID":"63e9696d-09aa-4774-b2f1-5b73cf9ab409","Type":"ContainerStarted","Data":"65d9639bc806ff0eb290f9af994c745f72b7c44a127ef18f87a9fcc12ba16830"} Apr 22 19:16:39.071333 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:39.071293 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-7x9f9" event={"ID":"63e9696d-09aa-4774-b2f1-5b73cf9ab409","Type":"ContainerStarted","Data":"3e92cdf2e25b1206f22a724fc2c9b35c777629b533182a1d916b9dd0f32e228e"} Apr 22 19:16:39.071716 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:39.071466 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-7x9f9" Apr 22 19:16:39.090582 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:39.090346 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-7x9f9" podStartSLOduration=2.258271209 podStartE2EDuration="5.090333561s" podCreationTimestamp="2026-04-22 19:16:34 +0000 UTC" firstStartedPulling="2026-04-22 19:16:35.373591707 +0000 UTC m=+603.739006521" lastFinishedPulling="2026-04-22 19:16:38.205654058 +0000 UTC m=+606.571068873" observedRunningTime="2026-04-22 19:16:39.090205292 +0000 UTC m=+607.455620128" watchObservedRunningTime="2026-04-22 19:16:39.090333561 +0000 UTC m=+607.455748397" Apr 22 19:16:44.449586 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:44.449552 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-qglff"] Apr 22 19:16:44.458214 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:44.458193 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-qglff" Apr 22 19:16:44.461103 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:44.461079 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 22 19:16:44.461229 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:44.461118 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 22 19:16:44.461298 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:44.461241 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-vjshc\"" Apr 22 19:16:44.466952 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:44.466706 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-qglff"] Apr 22 19:16:44.472207 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:44.472182 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/c482bea5-1b2b-4ad6-97c6-bdbcce31f396-operator-config\") pod \"servicemesh-operator3-55f49c5f94-qglff\" (UID: \"c482bea5-1b2b-4ad6-97c6-bdbcce31f396\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-qglff" Apr 22 19:16:44.472297 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:44.472282 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szt86\" (UniqueName: \"kubernetes.io/projected/c482bea5-1b2b-4ad6-97c6-bdbcce31f396-kube-api-access-szt86\") pod \"servicemesh-operator3-55f49c5f94-qglff\" (UID: \"c482bea5-1b2b-4ad6-97c6-bdbcce31f396\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-qglff" Apr 22 19:16:44.573535 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:44.573504 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-szt86\" (UniqueName: \"kubernetes.io/projected/c482bea5-1b2b-4ad6-97c6-bdbcce31f396-kube-api-access-szt86\") pod \"servicemesh-operator3-55f49c5f94-qglff\" (UID: \"c482bea5-1b2b-4ad6-97c6-bdbcce31f396\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-qglff" Apr 22 19:16:44.573666 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:44.573569 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/c482bea5-1b2b-4ad6-97c6-bdbcce31f396-operator-config\") pod \"servicemesh-operator3-55f49c5f94-qglff\" (UID: \"c482bea5-1b2b-4ad6-97c6-bdbcce31f396\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-qglff" Apr 22 19:16:44.576259 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:44.576236 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/c482bea5-1b2b-4ad6-97c6-bdbcce31f396-operator-config\") pod \"servicemesh-operator3-55f49c5f94-qglff\" (UID: \"c482bea5-1b2b-4ad6-97c6-bdbcce31f396\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-qglff" Apr 22 19:16:44.585335 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:44.585314 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-szt86\" (UniqueName: \"kubernetes.io/projected/c482bea5-1b2b-4ad6-97c6-bdbcce31f396-kube-api-access-szt86\") pod \"servicemesh-operator3-55f49c5f94-qglff\" (UID: \"c482bea5-1b2b-4ad6-97c6-bdbcce31f396\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-qglff" Apr 22 19:16:44.770162 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:44.770084 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-qglff" Apr 22 19:16:44.921977 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:44.919782 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-qglff"] Apr 22 19:16:44.925883 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:16:44.925847 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc482bea5_1b2b_4ad6_97c6_bdbcce31f396.slice/crio-1f213fa4d2602997018d9b8399612a8f15eddca7b1c6e384657b1883658de5e6 WatchSource:0}: Error finding container 1f213fa4d2602997018d9b8399612a8f15eddca7b1c6e384657b1883658de5e6: Status 404 returned error can't find the container with id 1f213fa4d2602997018d9b8399612a8f15eddca7b1c6e384657b1883658de5e6 Apr 22 19:16:45.093882 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:45.093796 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-qglff" event={"ID":"c482bea5-1b2b-4ad6-97c6-bdbcce31f396","Type":"ContainerStarted","Data":"1f213fa4d2602997018d9b8399612a8f15eddca7b1c6e384657b1883658de5e6"} Apr 22 19:16:46.062648 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:46.062616 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-nlv9w" Apr 22 19:16:48.106690 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:48.106654 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-qglff" event={"ID":"c482bea5-1b2b-4ad6-97c6-bdbcce31f396","Type":"ContainerStarted","Data":"221ce875138c6733ba0a4e9b13fa2f32fc2f3271e551a4240e166a36caee00d0"} Apr 22 19:16:48.107174 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:48.106714 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-qglff" Apr 22 
19:16:48.134162 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:48.134108 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-qglff" podStartSLOduration=1.410778319 podStartE2EDuration="4.13408983s" podCreationTimestamp="2026-04-22 19:16:44 +0000 UTC" firstStartedPulling="2026-04-22 19:16:44.928349924 +0000 UTC m=+613.293764738" lastFinishedPulling="2026-04-22 19:16:47.651661422 +0000 UTC m=+616.017076249" observedRunningTime="2026-04-22 19:16:48.131804562 +0000 UTC m=+616.497219398" watchObservedRunningTime="2026-04-22 19:16:48.13408983 +0000 UTC m=+616.499504669" Apr 22 19:16:48.827491 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:48.827461 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-m8qrx"] Apr 22 19:16:48.830913 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:48.830893 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m8qrx" Apr 22 19:16:48.833538 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:48.833498 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 22 19:16:48.833538 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:48.833508 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 22 19:16:48.833877 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:48.833859 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 22 19:16:48.834115 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:48.834095 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 22 19:16:48.834188 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:48.834101 2569 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-9g9mz\"" Apr 22 19:16:48.844500 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:48.844479 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-m8qrx"] Apr 22 19:16:48.904990 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:48.904951 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/cd90129c-8b33-489e-b329-a40428948fc3-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-m8qrx\" (UID: \"cd90129c-8b33-489e-b329-a40428948fc3\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m8qrx" Apr 22 19:16:48.905125 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:48.905007 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/cd90129c-8b33-489e-b329-a40428948fc3-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-m8qrx\" (UID: \"cd90129c-8b33-489e-b329-a40428948fc3\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m8qrx" Apr 22 19:16:48.905125 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:48.905055 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/cd90129c-8b33-489e-b329-a40428948fc3-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-m8qrx\" (UID: \"cd90129c-8b33-489e-b329-a40428948fc3\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m8qrx" Apr 22 19:16:48.905253 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:48.905137 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7bsf\" (UniqueName: 
\"kubernetes.io/projected/cd90129c-8b33-489e-b329-a40428948fc3-kube-api-access-q7bsf\") pod \"istiod-openshift-gateway-55ff986f96-m8qrx\" (UID: \"cd90129c-8b33-489e-b329-a40428948fc3\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m8qrx" Apr 22 19:16:48.905307 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:48.905259 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/cd90129c-8b33-489e-b329-a40428948fc3-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-m8qrx\" (UID: \"cd90129c-8b33-489e-b329-a40428948fc3\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m8qrx" Apr 22 19:16:48.905360 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:48.905320 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/cd90129c-8b33-489e-b329-a40428948fc3-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-m8qrx\" (UID: \"cd90129c-8b33-489e-b329-a40428948fc3\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m8qrx" Apr 22 19:16:48.905360 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:48.905340 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/cd90129c-8b33-489e-b329-a40428948fc3-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-m8qrx\" (UID: \"cd90129c-8b33-489e-b329-a40428948fc3\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m8qrx" Apr 22 19:16:49.006200 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:49.006160 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/cd90129c-8b33-489e-b329-a40428948fc3-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-m8qrx\" (UID: \"cd90129c-8b33-489e-b329-a40428948fc3\") " 
pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m8qrx" Apr 22 19:16:49.006200 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:49.006198 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/cd90129c-8b33-489e-b329-a40428948fc3-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-m8qrx\" (UID: \"cd90129c-8b33-489e-b329-a40428948fc3\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m8qrx" Apr 22 19:16:49.006431 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:49.006261 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/cd90129c-8b33-489e-b329-a40428948fc3-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-m8qrx\" (UID: \"cd90129c-8b33-489e-b329-a40428948fc3\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m8qrx" Apr 22 19:16:49.006431 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:49.006291 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/cd90129c-8b33-489e-b329-a40428948fc3-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-m8qrx\" (UID: \"cd90129c-8b33-489e-b329-a40428948fc3\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m8qrx" Apr 22 19:16:49.006431 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:49.006344 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/cd90129c-8b33-489e-b329-a40428948fc3-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-m8qrx\" (UID: \"cd90129c-8b33-489e-b329-a40428948fc3\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m8qrx" Apr 22 19:16:49.006431 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:49.006388 2569 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kube-api-access-q7bsf\" (UniqueName: \"kubernetes.io/projected/cd90129c-8b33-489e-b329-a40428948fc3-kube-api-access-q7bsf\") pod \"istiod-openshift-gateway-55ff986f96-m8qrx\" (UID: \"cd90129c-8b33-489e-b329-a40428948fc3\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m8qrx" Apr 22 19:16:49.006651 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:49.006465 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/cd90129c-8b33-489e-b329-a40428948fc3-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-m8qrx\" (UID: \"cd90129c-8b33-489e-b329-a40428948fc3\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m8qrx" Apr 22 19:16:49.007003 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:49.006972 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/cd90129c-8b33-489e-b329-a40428948fc3-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-m8qrx\" (UID: \"cd90129c-8b33-489e-b329-a40428948fc3\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m8qrx" Apr 22 19:16:49.009016 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:49.008984 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/cd90129c-8b33-489e-b329-a40428948fc3-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-m8qrx\" (UID: \"cd90129c-8b33-489e-b329-a40428948fc3\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m8qrx" Apr 22 19:16:49.009120 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:49.009051 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/cd90129c-8b33-489e-b329-a40428948fc3-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-m8qrx\" (UID: \"cd90129c-8b33-489e-b329-a40428948fc3\") " 
pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m8qrx" Apr 22 19:16:49.009202 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:49.009182 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/cd90129c-8b33-489e-b329-a40428948fc3-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-m8qrx\" (UID: \"cd90129c-8b33-489e-b329-a40428948fc3\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m8qrx" Apr 22 19:16:49.009276 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:49.009261 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/cd90129c-8b33-489e-b329-a40428948fc3-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-m8qrx\" (UID: \"cd90129c-8b33-489e-b329-a40428948fc3\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m8qrx" Apr 22 19:16:49.015491 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:49.015466 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/cd90129c-8b33-489e-b329-a40428948fc3-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-m8qrx\" (UID: \"cd90129c-8b33-489e-b329-a40428948fc3\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m8qrx" Apr 22 19:16:49.015935 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:49.015911 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7bsf\" (UniqueName: \"kubernetes.io/projected/cd90129c-8b33-489e-b329-a40428948fc3-kube-api-access-q7bsf\") pod \"istiod-openshift-gateway-55ff986f96-m8qrx\" (UID: \"cd90129c-8b33-489e-b329-a40428948fc3\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m8qrx" Apr 22 19:16:49.140957 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:49.140870 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m8qrx" Apr 22 19:16:49.277139 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:49.277109 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-m8qrx"] Apr 22 19:16:49.279668 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:16:49.279623 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd90129c_8b33_489e_b329_a40428948fc3.slice/crio-66854082db26c9930624e6412d5e5eaa8b3b8984d9ebe0394677d0fd3afbe4cd WatchSource:0}: Error finding container 66854082db26c9930624e6412d5e5eaa8b3b8984d9ebe0394677d0fd3afbe4cd: Status 404 returned error can't find the container with id 66854082db26c9930624e6412d5e5eaa8b3b8984d9ebe0394677d0fd3afbe4cd Apr 22 19:16:50.115882 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:50.115838 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m8qrx" event={"ID":"cd90129c-8b33-489e-b329-a40428948fc3","Type":"ContainerStarted","Data":"66854082db26c9930624e6412d5e5eaa8b3b8984d9ebe0394677d0fd3afbe4cd"} Apr 22 19:16:51.866858 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:51.866801 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 22 19:16:51.867235 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:51.866901 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 22 19:16:52.127008 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:52.126912 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m8qrx" 
event={"ID":"cd90129c-8b33-489e-b329-a40428948fc3","Type":"ContainerStarted","Data":"687b7aa5b095c43d64e74dd1dce1f1c574679390e6258754e3994fb0ffb3ab57"} Apr 22 19:16:52.127189 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:52.127060 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m8qrx" Apr 22 19:16:52.128928 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:52.128898 2569 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-m8qrx container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 22 19:16:52.129051 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:52.128953 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m8qrx" podUID="cd90129c-8b33-489e-b329-a40428948fc3" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:16:52.150613 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:52.150570 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m8qrx" podStartSLOduration=1.566345541 podStartE2EDuration="4.150557835s" podCreationTimestamp="2026-04-22 19:16:48 +0000 UTC" firstStartedPulling="2026-04-22 19:16:49.282324371 +0000 UTC m=+617.647739199" lastFinishedPulling="2026-04-22 19:16:51.866536678 +0000 UTC m=+620.231951493" observedRunningTime="2026-04-22 19:16:52.149773698 +0000 UTC m=+620.515188533" watchObservedRunningTime="2026-04-22 19:16:52.150557835 +0000 UTC m=+620.515972673" Apr 22 19:16:53.132114 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:53.132079 2569 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-m8qrx container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= 
Apr 22 19:16:53.132578 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:53.132137 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m8qrx" podUID="cd90129c-8b33-489e-b329-a40428948fc3" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:16:56.132446 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:56.132413 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-m8qrx" Apr 22 19:16:59.113495 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:16:59.113463 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-qglff" Apr 22 19:17:10.080113 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:17:10.080081 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-7x9f9" Apr 22 19:18:01.349303 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:01.349271 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-27nqd"] Apr 22 19:18:01.351577 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:01.351548 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-27nqd" Apr 22 19:18:01.357741 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:01.357705 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 22 19:18:01.357902 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:01.357745 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 22 19:18:01.357902 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:01.357769 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-5z9m2\"" Apr 22 19:18:01.369893 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:01.369867 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-27nqd"] Apr 22 19:18:01.396958 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:01.396925 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbj85\" (UniqueName: \"kubernetes.io/projected/844b3f8b-b997-41a9-b21d-8925bbf96fd3-kube-api-access-pbj85\") pod \"authorino-operator-657f44b778-27nqd\" (UID: \"844b3f8b-b997-41a9-b21d-8925bbf96fd3\") " pod="kuadrant-system/authorino-operator-657f44b778-27nqd" Apr 22 19:18:01.498077 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:01.498042 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pbj85\" (UniqueName: \"kubernetes.io/projected/844b3f8b-b997-41a9-b21d-8925bbf96fd3-kube-api-access-pbj85\") pod \"authorino-operator-657f44b778-27nqd\" (UID: \"844b3f8b-b997-41a9-b21d-8925bbf96fd3\") " pod="kuadrant-system/authorino-operator-657f44b778-27nqd" Apr 22 19:18:01.520382 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:01.520348 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbj85\" 
(UniqueName: \"kubernetes.io/projected/844b3f8b-b997-41a9-b21d-8925bbf96fd3-kube-api-access-pbj85\") pod \"authorino-operator-657f44b778-27nqd\" (UID: \"844b3f8b-b997-41a9-b21d-8925bbf96fd3\") " pod="kuadrant-system/authorino-operator-657f44b778-27nqd" Apr 22 19:18:01.661831 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:01.661718 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-27nqd" Apr 22 19:18:01.797101 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:01.797072 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-27nqd"] Apr 22 19:18:01.798693 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:18:01.798662 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod844b3f8b_b997_41a9_b21d_8925bbf96fd3.slice/crio-db818419ee24e621567217835ea93a040c4671b07bd0a5e0db3305193651656a WatchSource:0}: Error finding container db818419ee24e621567217835ea93a040c4671b07bd0a5e0db3305193651656a: Status 404 returned error can't find the container with id db818419ee24e621567217835ea93a040c4671b07bd0a5e0db3305193651656a Apr 22 19:18:02.369205 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:02.369170 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-27nqd" event={"ID":"844b3f8b-b997-41a9-b21d-8925bbf96fd3","Type":"ContainerStarted","Data":"db818419ee24e621567217835ea93a040c4671b07bd0a5e0db3305193651656a"} Apr 22 19:18:04.377051 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:04.377008 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-27nqd" event={"ID":"844b3f8b-b997-41a9-b21d-8925bbf96fd3","Type":"ContainerStarted","Data":"e402d15cbbd2b2d6073659b131334328bce164ff5622ce9bb79b0f7c19cadf6d"} Apr 22 19:18:04.377455 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:04.377126 2569 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-27nqd" Apr 22 19:18:04.407877 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:04.407833 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-27nqd" podStartSLOduration=1.844151202 podStartE2EDuration="3.407820094s" podCreationTimestamp="2026-04-22 19:18:01 +0000 UTC" firstStartedPulling="2026-04-22 19:18:01.80069981 +0000 UTC m=+690.166114625" lastFinishedPulling="2026-04-22 19:18:03.3643687 +0000 UTC m=+691.729783517" observedRunningTime="2026-04-22 19:18:04.405530523 +0000 UTC m=+692.770945359" watchObservedRunningTime="2026-04-22 19:18:04.407820094 +0000 UTC m=+692.773234929" Apr 22 19:18:15.383131 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:15.383101 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-27nqd" Apr 22 19:18:29.425019 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:29.424923 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sp6b9"] Apr 22 19:18:29.428553 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:29.428534 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sp6b9" Apr 22 19:18:29.431624 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:29.431603 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-2f9z4\"" Apr 22 19:18:29.442382 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:29.442358 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sp6b9"] Apr 22 19:18:29.549464 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:29.549429 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n9nl\" (UniqueName: \"kubernetes.io/projected/9f8fc39f-89d5-4f9a-b55e-53d4e8f19eea-kube-api-access-9n9nl\") pod \"kuadrant-operator-controller-manager-84b657d985-sp6b9\" (UID: \"9f8fc39f-89d5-4f9a-b55e-53d4e8f19eea\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sp6b9" Apr 22 19:18:29.549464 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:29.549465 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9f8fc39f-89d5-4f9a-b55e-53d4e8f19eea-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-sp6b9\" (UID: \"9f8fc39f-89d5-4f9a-b55e-53d4e8f19eea\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sp6b9" Apr 22 19:18:29.563923 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:29.563882 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sp6b9"] Apr 22 19:18:29.564155 ip-10-0-134-22 kubenswrapper[2569]: E0422 19:18:29.564136 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[extensions-socket-volume kube-api-access-9n9nl], unattached volumes=[], 
failed to process volumes=[]: context canceled" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sp6b9" podUID="9f8fc39f-89d5-4f9a-b55e-53d4e8f19eea" Apr 22 19:18:29.650113 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:29.650078 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9f8fc39f-89d5-4f9a-b55e-53d4e8f19eea-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-sp6b9\" (UID: \"9f8fc39f-89d5-4f9a-b55e-53d4e8f19eea\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sp6b9" Apr 22 19:18:29.650282 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:29.650190 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9n9nl\" (UniqueName: \"kubernetes.io/projected/9f8fc39f-89d5-4f9a-b55e-53d4e8f19eea-kube-api-access-9n9nl\") pod \"kuadrant-operator-controller-manager-84b657d985-sp6b9\" (UID: \"9f8fc39f-89d5-4f9a-b55e-53d4e8f19eea\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sp6b9" Apr 22 19:18:29.650466 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:29.650445 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9f8fc39f-89d5-4f9a-b55e-53d4e8f19eea-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-sp6b9\" (UID: \"9f8fc39f-89d5-4f9a-b55e-53d4e8f19eea\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sp6b9" Apr 22 19:18:29.669793 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:29.669747 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n9nl\" (UniqueName: \"kubernetes.io/projected/9f8fc39f-89d5-4f9a-b55e-53d4e8f19eea-kube-api-access-9n9nl\") pod \"kuadrant-operator-controller-manager-84b657d985-sp6b9\" (UID: \"9f8fc39f-89d5-4f9a-b55e-53d4e8f19eea\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sp6b9" Apr 22 19:18:29.694531 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:29.694445 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sp6b9"] Apr 22 19:18:29.705716 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:29.705689 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sp6b9"] Apr 22 19:18:29.753475 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:29.753436 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5wbpf"] Apr 22 19:18:29.757064 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:29.757043 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5wbpf" Apr 22 19:18:29.760539 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:29.760516 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-wsn9b\"" Apr 22 19:18:29.799710 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:29.799676 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5wbpf"] Apr 22 19:18:29.852293 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:29.852259 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84f2d\" (UniqueName: \"kubernetes.io/projected/ed77e126-6727-4b0b-9ab8-502105848c89-kube-api-access-84f2d\") pod \"limitador-operator-controller-manager-85c4996f8c-5wbpf\" (UID: \"ed77e126-6727-4b0b-9ab8-502105848c89\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5wbpf" Apr 22 19:18:29.953378 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:29.953286 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84f2d\" (UniqueName: \"kubernetes.io/projected/ed77e126-6727-4b0b-9ab8-502105848c89-kube-api-access-84f2d\") pod \"limitador-operator-controller-manager-85c4996f8c-5wbpf\" (UID: \"ed77e126-6727-4b0b-9ab8-502105848c89\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5wbpf" Apr 22 19:18:29.969921 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:29.969884 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-84f2d\" (UniqueName: \"kubernetes.io/projected/ed77e126-6727-4b0b-9ab8-502105848c89-kube-api-access-84f2d\") pod \"limitador-operator-controller-manager-85c4996f8c-5wbpf\" (UID: \"ed77e126-6727-4b0b-9ab8-502105848c89\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5wbpf" Apr 22 19:18:30.068042 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:30.068002 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5wbpf" Apr 22 19:18:30.200219 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:30.200184 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5wbpf"] Apr 22 19:18:30.203587 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:18:30.203541 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded77e126_6727_4b0b_9ab8_502105848c89.slice/crio-a9c6e79abe76c9826a828a7b11f948e6b4c14a1227ceee4188b051488784718f WatchSource:0}: Error finding container a9c6e79abe76c9826a828a7b11f948e6b4c14a1227ceee4188b051488784718f: Status 404 returned error can't find the container with id a9c6e79abe76c9826a828a7b11f948e6b4c14a1227ceee4188b051488784718f Apr 22 19:18:30.477434 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:30.477354 2569 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sp6b9" Apr 22 19:18:30.477892 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:30.477350 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5wbpf" event={"ID":"ed77e126-6727-4b0b-9ab8-502105848c89","Type":"ContainerStarted","Data":"a9c6e79abe76c9826a828a7b11f948e6b4c14a1227ceee4188b051488784718f"} Apr 22 19:18:30.482136 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:30.482117 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sp6b9" Apr 22 19:18:30.485606 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:30.485581 2569 status_manager.go:895] "Failed to get status for pod" podUID="9f8fc39f-89d5-4f9a-b55e-53d4e8f19eea" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sp6b9" err="pods \"kuadrant-operator-controller-manager-84b657d985-sp6b9\" is forbidden: User \"system:node:ip-10-0-134-22.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-22.ec2.internal' and this object" Apr 22 19:18:30.559240 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:30.559208 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9f8fc39f-89d5-4f9a-b55e-53d4e8f19eea-extensions-socket-volume\") pod \"9f8fc39f-89d5-4f9a-b55e-53d4e8f19eea\" (UID: \"9f8fc39f-89d5-4f9a-b55e-53d4e8f19eea\") " Apr 22 19:18:30.559410 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:30.559277 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n9nl\" (UniqueName: \"kubernetes.io/projected/9f8fc39f-89d5-4f9a-b55e-53d4e8f19eea-kube-api-access-9n9nl\") pod \"9f8fc39f-89d5-4f9a-b55e-53d4e8f19eea\" (UID: 
\"9f8fc39f-89d5-4f9a-b55e-53d4e8f19eea\") " Apr 22 19:18:30.559540 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:30.559507 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f8fc39f-89d5-4f9a-b55e-53d4e8f19eea-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "9f8fc39f-89d5-4f9a-b55e-53d4e8f19eea" (UID: "9f8fc39f-89d5-4f9a-b55e-53d4e8f19eea"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:18:30.561430 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:30.561407 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f8fc39f-89d5-4f9a-b55e-53d4e8f19eea-kube-api-access-9n9nl" (OuterVolumeSpecName: "kube-api-access-9n9nl") pod "9f8fc39f-89d5-4f9a-b55e-53d4e8f19eea" (UID: "9f8fc39f-89d5-4f9a-b55e-53d4e8f19eea"). InnerVolumeSpecName "kube-api-access-9n9nl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:18:30.660590 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:30.660556 2569 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/9f8fc39f-89d5-4f9a-b55e-53d4e8f19eea-extensions-socket-volume\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:18:30.660590 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:30.660591 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9n9nl\" (UniqueName: \"kubernetes.io/projected/9f8fc39f-89d5-4f9a-b55e-53d4e8f19eea-kube-api-access-9n9nl\") on node \"ip-10-0-134-22.ec2.internal\" DevicePath \"\"" Apr 22 19:18:31.481409 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:31.481379 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sp6b9" Apr 22 19:18:31.484441 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:31.484401 2569 status_manager.go:895] "Failed to get status for pod" podUID="9f8fc39f-89d5-4f9a-b55e-53d4e8f19eea" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sp6b9" err="pods \"kuadrant-operator-controller-manager-84b657d985-sp6b9\" is forbidden: User \"system:node:ip-10-0-134-22.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-22.ec2.internal' and this object" Apr 22 19:18:31.491929 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:31.491893 2569 status_manager.go:895] "Failed to get status for pod" podUID="9f8fc39f-89d5-4f9a-b55e-53d4e8f19eea" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sp6b9" err="pods \"kuadrant-operator-controller-manager-84b657d985-sp6b9\" is forbidden: User \"system:node:ip-10-0-134-22.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-134-22.ec2.internal' and this object" Apr 22 19:18:32.223959 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:32.223928 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f8fc39f-89d5-4f9a-b55e-53d4e8f19eea" path="/var/lib/kubelet/pods/9f8fc39f-89d5-4f9a-b55e-53d4e8f19eea/volumes" Apr 22 19:18:32.271295 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:32.271246 2569 status_manager.go:895] "Failed to get status for pod" podUID="9f8fc39f-89d5-4f9a-b55e-53d4e8f19eea" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sp6b9" err="pods \"kuadrant-operator-controller-manager-84b657d985-sp6b9\" is forbidden: User \"system:node:ip-10-0-134-22.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between 
node 'ip-10-0-134-22.ec2.internal' and this object" Apr 22 19:18:32.489457 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:32.489375 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5wbpf" event={"ID":"ed77e126-6727-4b0b-9ab8-502105848c89","Type":"ContainerStarted","Data":"c21cce3cd7032efce1f4f356fb10f60d109e4f4302e16339358a7e968814b920"} Apr 22 19:18:32.489841 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:32.489503 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5wbpf" Apr 22 19:18:32.509114 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:32.509066 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5wbpf" podStartSLOduration=1.8688770780000001 podStartE2EDuration="3.509050385s" podCreationTimestamp="2026-04-22 19:18:29 +0000 UTC" firstStartedPulling="2026-04-22 19:18:30.205916957 +0000 UTC m=+718.571331771" lastFinishedPulling="2026-04-22 19:18:31.846090264 +0000 UTC m=+720.211505078" observedRunningTime="2026-04-22 19:18:32.508008841 +0000 UTC m=+720.873423676" watchObservedRunningTime="2026-04-22 19:18:32.509050385 +0000 UTC m=+720.874465220" Apr 22 19:18:43.495538 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:18:43.495509 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-5wbpf" Apr 22 19:19:18.927155 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:19:18.927119 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-mmxb8"] Apr 22 19:19:18.930809 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:19:18.930785 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-mmxb8" Apr 22 19:19:18.933343 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:19:18.933322 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 22 19:19:18.933542 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:19:18.933530 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-2ghc7\"" Apr 22 19:19:18.939444 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:19:18.939424 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-mmxb8"] Apr 22 19:19:19.078251 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:19:19.078220 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/62925aeb-9232-46c4-a230-26c3a1b8304b-data\") pod \"postgres-868db5846d-mmxb8\" (UID: \"62925aeb-9232-46c4-a230-26c3a1b8304b\") " pod="opendatahub/postgres-868db5846d-mmxb8" Apr 22 19:19:19.078394 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:19:19.078257 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8ffq\" (UniqueName: \"kubernetes.io/projected/62925aeb-9232-46c4-a230-26c3a1b8304b-kube-api-access-v8ffq\") pod \"postgres-868db5846d-mmxb8\" (UID: \"62925aeb-9232-46c4-a230-26c3a1b8304b\") " pod="opendatahub/postgres-868db5846d-mmxb8" Apr 22 19:19:19.179569 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:19:19.179502 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/62925aeb-9232-46c4-a230-26c3a1b8304b-data\") pod \"postgres-868db5846d-mmxb8\" (UID: \"62925aeb-9232-46c4-a230-26c3a1b8304b\") " pod="opendatahub/postgres-868db5846d-mmxb8" Apr 22 19:19:19.179569 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:19:19.179536 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-v8ffq\" (UniqueName: \"kubernetes.io/projected/62925aeb-9232-46c4-a230-26c3a1b8304b-kube-api-access-v8ffq\") pod \"postgres-868db5846d-mmxb8\" (UID: \"62925aeb-9232-46c4-a230-26c3a1b8304b\") " pod="opendatahub/postgres-868db5846d-mmxb8" Apr 22 19:19:19.179915 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:19:19.179888 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/62925aeb-9232-46c4-a230-26c3a1b8304b-data\") pod \"postgres-868db5846d-mmxb8\" (UID: \"62925aeb-9232-46c4-a230-26c3a1b8304b\") " pod="opendatahub/postgres-868db5846d-mmxb8" Apr 22 19:19:19.188986 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:19:19.188961 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8ffq\" (UniqueName: \"kubernetes.io/projected/62925aeb-9232-46c4-a230-26c3a1b8304b-kube-api-access-v8ffq\") pod \"postgres-868db5846d-mmxb8\" (UID: \"62925aeb-9232-46c4-a230-26c3a1b8304b\") " pod="opendatahub/postgres-868db5846d-mmxb8" Apr 22 19:19:19.244295 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:19:19.244276 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-mmxb8" Apr 22 19:19:19.389090 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:19:19.389063 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-mmxb8"] Apr 22 19:19:19.390671 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:19:19.390646 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62925aeb_9232_46c4_a230_26c3a1b8304b.slice/crio-7a8404757813aef7f68469439343b779a0b0f2c9c76d689c0a1343fab22d6109 WatchSource:0}: Error finding container 7a8404757813aef7f68469439343b779a0b0f2c9c76d689c0a1343fab22d6109: Status 404 returned error can't find the container with id 7a8404757813aef7f68469439343b779a0b0f2c9c76d689c0a1343fab22d6109 Apr 22 19:19:19.655297 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:19:19.655217 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-mmxb8" event={"ID":"62925aeb-9232-46c4-a230-26c3a1b8304b","Type":"ContainerStarted","Data":"7a8404757813aef7f68469439343b779a0b0f2c9c76d689c0a1343fab22d6109"} Apr 22 19:19:24.678286 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:19:24.678252 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-mmxb8" event={"ID":"62925aeb-9232-46c4-a230-26c3a1b8304b","Type":"ContainerStarted","Data":"dbb57fccd1e879690df78563c4ed6e617a7db1a749e9dc38313b83b3e397c9f2"} Apr 22 19:19:24.678660 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:19:24.678433 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-mmxb8" Apr 22 19:19:24.695840 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:19:24.695795 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-mmxb8" podStartSLOduration=1.626028367 podStartE2EDuration="6.695781765s" podCreationTimestamp="2026-04-22 19:19:18 +0000 UTC" 
firstStartedPulling="2026-04-22 19:19:19.391974666 +0000 UTC m=+767.757389479" lastFinishedPulling="2026-04-22 19:19:24.461728063 +0000 UTC m=+772.827142877" observedRunningTime="2026-04-22 19:19:24.693968708 +0000 UTC m=+773.059383545" watchObservedRunningTime="2026-04-22 19:19:24.695781765 +0000 UTC m=+773.061196598" Apr 22 19:19:30.710713 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:19:30.710687 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-mmxb8" Apr 22 19:20:17.429408 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:17.429373 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-w5w6q"] Apr 22 19:20:17.433050 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:17.433033 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w5w6q" Apr 22 19:20:17.438102 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:17.438081 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 22 19:20:17.439280 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:17.439257 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 22 19:20:17.439280 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:17.439267 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 22 19:20:17.439444 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:17.439269 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-rhw56\"" Apr 22 19:20:17.462692 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:17.462658 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-w5w6q"] Apr 22 19:20:17.551719 ip-10-0-134-22 kubenswrapper[2569]: I0422 
19:20:17.551690 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0dce8e2c-e83e-4215-b299-e6d542772166-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-w5w6q\" (UID: \"0dce8e2c-e83e-4215-b299-e6d542772166\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w5w6q" Apr 22 19:20:17.551879 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:17.551733 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0dce8e2c-e83e-4215-b299-e6d542772166-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-w5w6q\" (UID: \"0dce8e2c-e83e-4215-b299-e6d542772166\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w5w6q" Apr 22 19:20:17.551879 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:17.551779 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0dce8e2c-e83e-4215-b299-e6d542772166-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-w5w6q\" (UID: \"0dce8e2c-e83e-4215-b299-e6d542772166\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w5w6q" Apr 22 19:20:17.551879 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:17.551843 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0dce8e2c-e83e-4215-b299-e6d542772166-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-w5w6q\" (UID: \"0dce8e2c-e83e-4215-b299-e6d542772166\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w5w6q" Apr 22 19:20:17.552034 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:17.551903 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/0dce8e2c-e83e-4215-b299-e6d542772166-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-w5w6q\" (UID: \"0dce8e2c-e83e-4215-b299-e6d542772166\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w5w6q" Apr 22 19:20:17.552034 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:17.551926 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6ft4\" (UniqueName: \"kubernetes.io/projected/0dce8e2c-e83e-4215-b299-e6d542772166-kube-api-access-t6ft4\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-w5w6q\" (UID: \"0dce8e2c-e83e-4215-b299-e6d542772166\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w5w6q" Apr 22 19:20:17.653004 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:17.652974 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0dce8e2c-e83e-4215-b299-e6d542772166-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-w5w6q\" (UID: \"0dce8e2c-e83e-4215-b299-e6d542772166\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w5w6q" Apr 22 19:20:17.653131 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:17.653020 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0dce8e2c-e83e-4215-b299-e6d542772166-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-w5w6q\" (UID: \"0dce8e2c-e83e-4215-b299-e6d542772166\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w5w6q" Apr 22 19:20:17.653219 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:17.653146 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0dce8e2c-e83e-4215-b299-e6d542772166-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-w5w6q\" (UID: \"0dce8e2c-e83e-4215-b299-e6d542772166\") " 
pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w5w6q" Apr 22 19:20:17.653219 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:17.653177 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t6ft4\" (UniqueName: \"kubernetes.io/projected/0dce8e2c-e83e-4215-b299-e6d542772166-kube-api-access-t6ft4\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-w5w6q\" (UID: \"0dce8e2c-e83e-4215-b299-e6d542772166\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w5w6q" Apr 22 19:20:17.653297 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:17.653226 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0dce8e2c-e83e-4215-b299-e6d542772166-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-w5w6q\" (UID: \"0dce8e2c-e83e-4215-b299-e6d542772166\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w5w6q" Apr 22 19:20:17.653297 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:17.653274 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0dce8e2c-e83e-4215-b299-e6d542772166-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-w5w6q\" (UID: \"0dce8e2c-e83e-4215-b299-e6d542772166\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w5w6q" Apr 22 19:20:17.653394 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:17.653325 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0dce8e2c-e83e-4215-b299-e6d542772166-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-w5w6q\" (UID: \"0dce8e2c-e83e-4215-b299-e6d542772166\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w5w6q" Apr 22 19:20:17.653505 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:17.653486 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/0dce8e2c-e83e-4215-b299-e6d542772166-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-w5w6q\" (UID: \"0dce8e2c-e83e-4215-b299-e6d542772166\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w5w6q" Apr 22 19:20:17.653578 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:17.653556 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0dce8e2c-e83e-4215-b299-e6d542772166-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-w5w6q\" (UID: \"0dce8e2c-e83e-4215-b299-e6d542772166\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w5w6q" Apr 22 19:20:17.655601 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:17.655579 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0dce8e2c-e83e-4215-b299-e6d542772166-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-w5w6q\" (UID: \"0dce8e2c-e83e-4215-b299-e6d542772166\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w5w6q" Apr 22 19:20:17.656035 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:17.656015 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0dce8e2c-e83e-4215-b299-e6d542772166-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-w5w6q\" (UID: \"0dce8e2c-e83e-4215-b299-e6d542772166\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w5w6q" Apr 22 19:20:17.663665 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:17.663637 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6ft4\" (UniqueName: \"kubernetes.io/projected/0dce8e2c-e83e-4215-b299-e6d542772166-kube-api-access-t6ft4\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-w5w6q\" (UID: \"0dce8e2c-e83e-4215-b299-e6d542772166\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w5w6q" Apr 22 19:20:17.743723 ip-10-0-134-22 kubenswrapper[2569]: I0422 
19:20:17.743662 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w5w6q" Apr 22 19:20:17.883257 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:17.883222 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-w5w6q"] Apr 22 19:20:17.885605 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:20:17.885580 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0dce8e2c_e83e_4215_b299_e6d542772166.slice/crio-7a3ac228b32bba5edf2d7c5edc29fae6259cefc063adefd06260dc4e339191aa WatchSource:0}: Error finding container 7a3ac228b32bba5edf2d7c5edc29fae6259cefc063adefd06260dc4e339191aa: Status 404 returned error can't find the container with id 7a3ac228b32bba5edf2d7c5edc29fae6259cefc063adefd06260dc4e339191aa Apr 22 19:20:17.887318 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:17.887299 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:20:18.871773 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:18.871724 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w5w6q" event={"ID":"0dce8e2c-e83e-4215-b299-e6d542772166","Type":"ContainerStarted","Data":"7a3ac228b32bba5edf2d7c5edc29fae6259cefc063adefd06260dc4e339191aa"} Apr 22 19:20:23.891913 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:23.891870 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w5w6q" event={"ID":"0dce8e2c-e83e-4215-b299-e6d542772166","Type":"ContainerStarted","Data":"9583d77a4ed38b859d74ef5f6604435adaca556a8a4294402d91e57e704aa090"} Apr 22 19:20:28.912224 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:28.912192 2569 generic.go:358] "Generic (PLEG): container finished" podID="0dce8e2c-e83e-4215-b299-e6d542772166" 
containerID="9583d77a4ed38b859d74ef5f6604435adaca556a8a4294402d91e57e704aa090" exitCode=0 Apr 22 19:20:28.912676 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:28.912269 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w5w6q" event={"ID":"0dce8e2c-e83e-4215-b299-e6d542772166","Type":"ContainerDied","Data":"9583d77a4ed38b859d74ef5f6604435adaca556a8a4294402d91e57e704aa090"} Apr 22 19:20:30.920703 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:30.920671 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w5w6q" event={"ID":"0dce8e2c-e83e-4215-b299-e6d542772166","Type":"ContainerStarted","Data":"0309a094fc15a544b13a6c82350565e2acc83a4134efed5ba4fa4f2014d522ce"} Apr 22 19:20:30.921166 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:30.920901 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w5w6q" Apr 22 19:20:39.311241 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:39.311184 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w5w6q" podStartSLOduration=10.20570614 podStartE2EDuration="22.311168022s" podCreationTimestamp="2026-04-22 19:20:17 +0000 UTC" firstStartedPulling="2026-04-22 19:20:17.887426738 +0000 UTC m=+826.252841553" lastFinishedPulling="2026-04-22 19:20:29.992888608 +0000 UTC m=+838.358303435" observedRunningTime="2026-04-22 19:20:30.95398658 +0000 UTC m=+839.319401415" watchObservedRunningTime="2026-04-22 19:20:39.311168022 +0000 UTC m=+847.676582856" Apr 22 19:20:39.311943 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:39.311918 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-fs85z"] Apr 22 19:20:39.415940 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:39.415906 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-fs85z"] Apr 22 19:20:39.416084 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:39.416032 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fs85z" Apr 22 19:20:39.419086 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:39.419061 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 22 19:20:39.464266 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:39.464133 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0bad5ce6-1f7f-4494-a155-1f9841577bd8-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-fs85z\" (UID: \"0bad5ce6-1f7f-4494-a155-1f9841577bd8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fs85z" Apr 22 19:20:39.464266 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:39.464176 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckmxx\" (UniqueName: \"kubernetes.io/projected/0bad5ce6-1f7f-4494-a155-1f9841577bd8-kube-api-access-ckmxx\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-fs85z\" (UID: \"0bad5ce6-1f7f-4494-a155-1f9841577bd8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fs85z" Apr 22 19:20:39.464827 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:39.464524 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0bad5ce6-1f7f-4494-a155-1f9841577bd8-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-fs85z\" (UID: \"0bad5ce6-1f7f-4494-a155-1f9841577bd8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fs85z" Apr 22 19:20:39.464827 
ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:39.464603 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0bad5ce6-1f7f-4494-a155-1f9841577bd8-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-fs85z\" (UID: \"0bad5ce6-1f7f-4494-a155-1f9841577bd8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fs85z" Apr 22 19:20:39.464827 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:39.464657 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0bad5ce6-1f7f-4494-a155-1f9841577bd8-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-fs85z\" (UID: \"0bad5ce6-1f7f-4494-a155-1f9841577bd8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fs85z" Apr 22 19:20:39.464827 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:39.464686 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0bad5ce6-1f7f-4494-a155-1f9841577bd8-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-fs85z\" (UID: \"0bad5ce6-1f7f-4494-a155-1f9841577bd8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fs85z" Apr 22 19:20:39.565961 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:39.565887 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0bad5ce6-1f7f-4494-a155-1f9841577bd8-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-fs85z\" (UID: \"0bad5ce6-1f7f-4494-a155-1f9841577bd8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fs85z" Apr 22 19:20:39.565961 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:39.565948 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0bad5ce6-1f7f-4494-a155-1f9841577bd8-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-fs85z\" (UID: \"0bad5ce6-1f7f-4494-a155-1f9841577bd8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fs85z" Apr 22 19:20:39.566156 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:39.565982 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0bad5ce6-1f7f-4494-a155-1f9841577bd8-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-fs85z\" (UID: \"0bad5ce6-1f7f-4494-a155-1f9841577bd8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fs85z" Apr 22 19:20:39.566156 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:39.566023 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0bad5ce6-1f7f-4494-a155-1f9841577bd8-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-fs85z\" (UID: \"0bad5ce6-1f7f-4494-a155-1f9841577bd8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fs85z" Apr 22 19:20:39.566156 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:39.566047 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ckmxx\" (UniqueName: \"kubernetes.io/projected/0bad5ce6-1f7f-4494-a155-1f9841577bd8-kube-api-access-ckmxx\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-fs85z\" (UID: \"0bad5ce6-1f7f-4494-a155-1f9841577bd8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fs85z" Apr 22 19:20:39.566156 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:39.566117 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/0bad5ce6-1f7f-4494-a155-1f9841577bd8-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-fs85z\" (UID: \"0bad5ce6-1f7f-4494-a155-1f9841577bd8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fs85z" Apr 22 19:20:39.566345 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:39.566310 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/0bad5ce6-1f7f-4494-a155-1f9841577bd8-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-fs85z\" (UID: \"0bad5ce6-1f7f-4494-a155-1f9841577bd8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fs85z" Apr 22 19:20:39.566345 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:39.566339 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0bad5ce6-1f7f-4494-a155-1f9841577bd8-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-fs85z\" (UID: \"0bad5ce6-1f7f-4494-a155-1f9841577bd8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fs85z" Apr 22 19:20:39.566482 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:39.566461 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/0bad5ce6-1f7f-4494-a155-1f9841577bd8-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-fs85z\" (UID: \"0bad5ce6-1f7f-4494-a155-1f9841577bd8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fs85z" Apr 22 19:20:39.568403 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:39.568380 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/0bad5ce6-1f7f-4494-a155-1f9841577bd8-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-fs85z\" (UID: \"0bad5ce6-1f7f-4494-a155-1f9841577bd8\") " 
pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fs85z" Apr 22 19:20:39.568529 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:39.568513 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0bad5ce6-1f7f-4494-a155-1f9841577bd8-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-fs85z\" (UID: \"0bad5ce6-1f7f-4494-a155-1f9841577bd8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fs85z" Apr 22 19:20:39.583060 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:39.583034 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckmxx\" (UniqueName: \"kubernetes.io/projected/0bad5ce6-1f7f-4494-a155-1f9841577bd8-kube-api-access-ckmxx\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-fs85z\" (UID: \"0bad5ce6-1f7f-4494-a155-1f9841577bd8\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fs85z" Apr 22 19:20:39.725952 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:39.725922 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fs85z" Apr 22 19:20:39.856774 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:39.856636 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-fs85z"] Apr 22 19:20:39.862392 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:20:39.862350 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bad5ce6_1f7f_4494_a155_1f9841577bd8.slice/crio-0cda7b521c21719ccdaa5fa4832080a084577b99196e0c2bc251035ff35c7477 WatchSource:0}: Error finding container 0cda7b521c21719ccdaa5fa4832080a084577b99196e0c2bc251035ff35c7477: Status 404 returned error can't find the container with id 0cda7b521c21719ccdaa5fa4832080a084577b99196e0c2bc251035ff35c7477 Apr 22 19:20:39.951138 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:39.951108 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fs85z" event={"ID":"0bad5ce6-1f7f-4494-a155-1f9841577bd8","Type":"ContainerStarted","Data":"0cda7b521c21719ccdaa5fa4832080a084577b99196e0c2bc251035ff35c7477"} Apr 22 19:20:40.956296 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:40.956257 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fs85z" event={"ID":"0bad5ce6-1f7f-4494-a155-1f9841577bd8","Type":"ContainerStarted","Data":"24ddbf72aaebd55e3195f7c75ffa6a4fd617a19deec7ce226e6a925720a609ae"} Apr 22 19:20:41.937285 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:41.937255 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-w5w6q" Apr 22 19:20:48.991725 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:48.991689 2569 generic.go:358] "Generic (PLEG): container finished" podID="0bad5ce6-1f7f-4494-a155-1f9841577bd8" 
containerID="24ddbf72aaebd55e3195f7c75ffa6a4fd617a19deec7ce226e6a925720a609ae" exitCode=0 Apr 22 19:20:48.992195 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:48.991746 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fs85z" event={"ID":"0bad5ce6-1f7f-4494-a155-1f9841577bd8","Type":"ContainerDied","Data":"24ddbf72aaebd55e3195f7c75ffa6a4fd617a19deec7ce226e6a925720a609ae"} Apr 22 19:20:49.996486 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:49.996449 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fs85z" event={"ID":"0bad5ce6-1f7f-4494-a155-1f9841577bd8","Type":"ContainerStarted","Data":"2b9384d893b4d559f5f2db0caaafb9de27130aa11994ab920bf3736c8398bd90"} Apr 22 19:20:49.996926 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:49.996678 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fs85z" Apr 22 19:20:50.025616 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:20:50.025563 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fs85z" podStartSLOduration=10.686863515 podStartE2EDuration="11.02554891s" podCreationTimestamp="2026-04-22 19:20:39 +0000 UTC" firstStartedPulling="2026-04-22 19:20:48.992454671 +0000 UTC m=+857.357869484" lastFinishedPulling="2026-04-22 19:20:49.331140056 +0000 UTC m=+857.696554879" observedRunningTime="2026-04-22 19:20:50.0233823 +0000 UTC m=+858.388797135" watchObservedRunningTime="2026-04-22 19:20:50.02554891 +0000 UTC m=+858.390963744" Apr 22 19:21:01.014453 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:21:01.014422 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-fs85z" Apr 22 19:21:32.159416 ip-10-0-134-22 kubenswrapper[2569]: I0422 
19:21:32.159339 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fl4n2_c9d5594b-5dc2-461d-bd58-496386ced33b/console-operator/2.log" Apr 22 19:21:32.159416 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:21:32.159379 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fl4n2_c9d5594b-5dc2-461d-bd58-496386ced33b/console-operator/2.log" Apr 22 19:21:32.166277 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:21:32.166256 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfk46_61b5731d-8883-44c4-a6de-2a90288f2d58/ovn-acl-logging/0.log" Apr 22 19:21:32.166412 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:21:32.166289 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfk46_61b5731d-8883-44c4-a6de-2a90288f2d58/ovn-acl-logging/0.log" Apr 22 19:23:46.655101 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:23:46.655045 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-7x9f9_63e9696d-09aa-4774-b2f1-5b73cf9ab409/manager/0.log" Apr 22 19:23:46.982126 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:23:46.982099 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-nlv9w_5ab59fdb-1873-4f91-8414-d788d9fec057/manager/1.log" Apr 22 19:23:47.333847 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:23:47.333776 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6c9fd8c974-zbbkh_d246c5f2-f55d-478c-84f5-ee7646639611/manager/0.log" Apr 22 19:23:47.435552 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:23:47.435523 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-mmxb8_62925aeb-9232-46c4-a230-26c3a1b8304b/postgres/0.log" Apr 22 19:23:48.757696 
ip-10-0-134-22 kubenswrapper[2569]: I0422 19:23:48.757665 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-27nqd_844b3f8b-b997-41a9-b21d-8925bbf96fd3/manager/0.log"
Apr 22 19:23:49.392593 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:23:49.392556 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-5wbpf_ed77e126-6727-4b0b-9ab8-502105848c89/manager/0.log"
Apr 22 19:23:49.845416 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:23:49.845385 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-m8qrx_cd90129c-8b33-489e-b329-a40428948fc3/discovery/0.log"
Apr 22 19:23:49.952478 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:23:49.952451 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-77597c7855-76svf_1d8494de-a95f-42d3-9a4d-dac04e26d2b9/kube-auth-proxy/0.log"
Apr 22 19:23:50.796711 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:23:50.796684 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-w5w6q_0dce8e2c-e83e-4215-b299-e6d542772166/storage-initializer/0.log"
Apr 22 19:23:50.803115 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:23:50.803094 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-w5w6q_0dce8e2c-e83e-4215-b299-e6d542772166/main/0.log"
Apr 22 19:23:51.125022 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:23:51.124955 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-fs85z_0bad5ce6-1f7f-4494-a155-1f9841577bd8/storage-initializer/0.log"
Apr 22 19:23:51.131600 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:23:51.131575 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-fs85z_0bad5ce6-1f7f-4494-a155-1f9841577bd8/main/0.log"
Apr 22 19:23:58.211285 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:23:58.211255 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-7r27j_ed711052-aeb3-4d51-9f5f-54731300179f/global-pull-secret-syncer/0.log"
Apr 22 19:23:58.362548 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:23:58.362519 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-xjfbh_ccd0ced6-c30a-4fe6-8fc3-356740ce7c61/konnectivity-agent/0.log"
Apr 22 19:23:58.431447 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:23:58.431418 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-22.ec2.internal_1b90ee820fd4186f1e6cd40d24ef3276/haproxy/0.log"
Apr 22 19:24:02.709764 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:02.709720 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-27nqd_844b3f8b-b997-41a9-b21d-8925bbf96fd3/manager/0.log"
Apr 22 19:24:02.895471 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:02.895440 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-5wbpf_ed77e126-6727-4b0b-9ab8-502105848c89/manager/0.log"
Apr 22 19:24:04.244792 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:04.244740 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e2e95f49-0d7b-48c1-97ea-b0a519b5248c/alertmanager/0.log"
Apr 22 19:24:04.268404 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:04.268378 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e2e95f49-0d7b-48c1-97ea-b0a519b5248c/config-reloader/0.log"
Apr 22 19:24:04.293344 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:04.293320 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e2e95f49-0d7b-48c1-97ea-b0a519b5248c/kube-rbac-proxy-web/0.log"
Apr 22 19:24:04.319870 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:04.319845 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e2e95f49-0d7b-48c1-97ea-b0a519b5248c/kube-rbac-proxy/0.log"
Apr 22 19:24:04.346398 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:04.346373 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e2e95f49-0d7b-48c1-97ea-b0a519b5248c/kube-rbac-proxy-metric/0.log"
Apr 22 19:24:04.372977 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:04.372958 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e2e95f49-0d7b-48c1-97ea-b0a519b5248c/prom-label-proxy/0.log"
Apr 22 19:24:04.404486 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:04.404462 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e2e95f49-0d7b-48c1-97ea-b0a519b5248c/init-config-reloader/0.log"
Apr 22 19:24:04.897409 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:04.897379 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zr94k_dd8a63da-f501-4aa6-b5a6-6fa86c970f57/node-exporter/0.log"
Apr 22 19:24:04.918671 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:04.918648 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zr94k_dd8a63da-f501-4aa6-b5a6-6fa86c970f57/kube-rbac-proxy/0.log"
Apr 22 19:24:04.947126 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:04.947104 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zr94k_dd8a63da-f501-4aa6-b5a6-6fa86c970f57/init-textfile/0.log"
Apr 22 19:24:05.230031 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:05.230002 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-84qx5_139b6a93-7c35-47b0-a302-c247308fb15a/prometheus-operator/0.log"
Apr 22 19:24:05.250240 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:05.250220 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-84qx5_139b6a93-7c35-47b0-a302-c247308fb15a/kube-rbac-proxy/0.log"
Apr 22 19:24:05.390391 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:05.390361 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6897866f4b-mhjxd_900eddfd-5ec6-4ec0-93ed-4fd24fe6326c/thanos-query/0.log"
Apr 22 19:24:05.413115 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:05.413089 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6897866f4b-mhjxd_900eddfd-5ec6-4ec0-93ed-4fd24fe6326c/kube-rbac-proxy-web/0.log"
Apr 22 19:24:05.439488 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:05.439464 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6897866f4b-mhjxd_900eddfd-5ec6-4ec0-93ed-4fd24fe6326c/kube-rbac-proxy/0.log"
Apr 22 19:24:05.459588 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:05.459565 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6897866f4b-mhjxd_900eddfd-5ec6-4ec0-93ed-4fd24fe6326c/prom-label-proxy/0.log"
Apr 22 19:24:05.481232 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:05.481174 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6897866f4b-mhjxd_900eddfd-5ec6-4ec0-93ed-4fd24fe6326c/kube-rbac-proxy-rules/0.log"
Apr 22 19:24:05.502364 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:05.502340 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6897866f4b-mhjxd_900eddfd-5ec6-4ec0-93ed-4fd24fe6326c/kube-rbac-proxy-metrics/0.log"
Apr 22 19:24:06.449843 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:06.449808 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tdl99/perf-node-gather-daemonset-dgbcj"]
Apr 22 19:24:06.453173 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:06.453157 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-dgbcj"
Apr 22 19:24:06.455523 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:06.455501 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tdl99\"/\"kube-root-ca.crt\""
Apr 22 19:24:06.456445 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:06.456423 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tdl99\"/\"openshift-service-ca.crt\""
Apr 22 19:24:06.456527 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:06.456435 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-tdl99\"/\"default-dockercfg-ncqxp\""
Apr 22 19:24:06.460098 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:06.459909 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tdl99/perf-node-gather-daemonset-dgbcj"]
Apr 22 19:24:06.539342 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:06.539315 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0e770361-a306-4390-b806-e5eb678e153c-lib-modules\") pod \"perf-node-gather-daemonset-dgbcj\" (UID: \"0e770361-a306-4390-b806-e5eb678e153c\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-dgbcj"
Apr 22 19:24:06.539495 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:06.539355 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5f6p\" (UniqueName: \"kubernetes.io/projected/0e770361-a306-4390-b806-e5eb678e153c-kube-api-access-m5f6p\") pod \"perf-node-gather-daemonset-dgbcj\" (UID: \"0e770361-a306-4390-b806-e5eb678e153c\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-dgbcj"
Apr 22 19:24:06.539495 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:06.539426 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0e770361-a306-4390-b806-e5eb678e153c-sys\") pod \"perf-node-gather-daemonset-dgbcj\" (UID: \"0e770361-a306-4390-b806-e5eb678e153c\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-dgbcj"
Apr 22 19:24:06.539495 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:06.539464 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0e770361-a306-4390-b806-e5eb678e153c-podres\") pod \"perf-node-gather-daemonset-dgbcj\" (UID: \"0e770361-a306-4390-b806-e5eb678e153c\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-dgbcj"
Apr 22 19:24:06.539495 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:06.539489 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0e770361-a306-4390-b806-e5eb678e153c-proc\") pod \"perf-node-gather-daemonset-dgbcj\" (UID: \"0e770361-a306-4390-b806-e5eb678e153c\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-dgbcj"
Apr 22 19:24:06.640150 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:06.640113 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0e770361-a306-4390-b806-e5eb678e153c-lib-modules\") pod \"perf-node-gather-daemonset-dgbcj\" (UID: \"0e770361-a306-4390-b806-e5eb678e153c\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-dgbcj"
Apr 22 19:24:06.640295 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:06.640168 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5f6p\" (UniqueName: \"kubernetes.io/projected/0e770361-a306-4390-b806-e5eb678e153c-kube-api-access-m5f6p\") pod \"perf-node-gather-daemonset-dgbcj\" (UID: \"0e770361-a306-4390-b806-e5eb678e153c\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-dgbcj"
Apr 22 19:24:06.640295 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:06.640198 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0e770361-a306-4390-b806-e5eb678e153c-sys\") pod \"perf-node-gather-daemonset-dgbcj\" (UID: \"0e770361-a306-4390-b806-e5eb678e153c\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-dgbcj"
Apr 22 19:24:06.640295 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:06.640215 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0e770361-a306-4390-b806-e5eb678e153c-podres\") pod \"perf-node-gather-daemonset-dgbcj\" (UID: \"0e770361-a306-4390-b806-e5eb678e153c\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-dgbcj"
Apr 22 19:24:06.640295 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:06.640232 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0e770361-a306-4390-b806-e5eb678e153c-proc\") pod \"perf-node-gather-daemonset-dgbcj\" (UID: \"0e770361-a306-4390-b806-e5eb678e153c\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-dgbcj"
Apr 22 19:24:06.640447 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:06.640299 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0e770361-a306-4390-b806-e5eb678e153c-sys\") pod \"perf-node-gather-daemonset-dgbcj\" (UID: \"0e770361-a306-4390-b806-e5eb678e153c\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-dgbcj"
Apr 22 19:24:06.640447 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:06.640300 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0e770361-a306-4390-b806-e5eb678e153c-lib-modules\") pod \"perf-node-gather-daemonset-dgbcj\" (UID: \"0e770361-a306-4390-b806-e5eb678e153c\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-dgbcj"
Apr 22 19:24:06.640447 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:06.640323 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0e770361-a306-4390-b806-e5eb678e153c-proc\") pod \"perf-node-gather-daemonset-dgbcj\" (UID: \"0e770361-a306-4390-b806-e5eb678e153c\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-dgbcj"
Apr 22 19:24:06.640447 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:06.640363 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0e770361-a306-4390-b806-e5eb678e153c-podres\") pod \"perf-node-gather-daemonset-dgbcj\" (UID: \"0e770361-a306-4390-b806-e5eb678e153c\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-dgbcj"
Apr 22 19:24:06.647882 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:06.647863 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5f6p\" (UniqueName: \"kubernetes.io/projected/0e770361-a306-4390-b806-e5eb678e153c-kube-api-access-m5f6p\") pod \"perf-node-gather-daemonset-dgbcj\" (UID: \"0e770361-a306-4390-b806-e5eb678e153c\") " pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-dgbcj"
Apr 22 19:24:06.763216 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:06.763160 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-dgbcj"
Apr 22 19:24:06.896258 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:06.896228 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tdl99/perf-node-gather-daemonset-dgbcj"]
Apr 22 19:24:06.897910 ip-10-0-134-22 kubenswrapper[2569]: W0422 19:24:06.897888 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0e770361_a306_4390_b806_e5eb678e153c.slice/crio-e6c90f02cb9facf67e077bfe54384b3d695476bf142931eea8252e69ec2056ba WatchSource:0}: Error finding container e6c90f02cb9facf67e077bfe54384b3d695476bf142931eea8252e69ec2056ba: Status 404 returned error can't find the container with id e6c90f02cb9facf67e077bfe54384b3d695476bf142931eea8252e69ec2056ba
Apr 22 19:24:07.061455 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:07.061402 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fl4n2_c9d5594b-5dc2-461d-bd58-496386ced33b/console-operator/2.log"
Apr 22 19:24:07.064990 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:07.064958 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fl4n2_c9d5594b-5dc2-461d-bd58-496386ced33b/console-operator/3.log"
Apr 22 19:24:07.487990 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:07.487960 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6cf87df8dd-wwg7f_bc45760b-8746-44c5-8eed-405319afcc44/console/0.log"
Apr 22 19:24:07.526396 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:07.526372 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-xjm45_2639e77b-b608-4425-a733-a7915361daa5/download-server/0.log"
Apr 22 19:24:07.706113 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:07.706085 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-dgbcj" event={"ID":"0e770361-a306-4390-b806-e5eb678e153c","Type":"ContainerStarted","Data":"0154d9f628fce194539c388ff672d7e6c75fe10074153dd2b8d19fbe5c30e759"}
Apr 22 19:24:07.706113 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:07.706118 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-dgbcj" event={"ID":"0e770361-a306-4390-b806-e5eb678e153c","Type":"ContainerStarted","Data":"e6c90f02cb9facf67e077bfe54384b3d695476bf142931eea8252e69ec2056ba"}
Apr 22 19:24:07.706300 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:07.706218 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-dgbcj"
Apr 22 19:24:07.722364 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:07.722321 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-dgbcj" podStartSLOduration=1.722309855 podStartE2EDuration="1.722309855s" podCreationTimestamp="2026-04-22 19:24:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:24:07.721599054 +0000 UTC m=+1056.087013894" watchObservedRunningTime="2026-04-22 19:24:07.722309855 +0000 UTC m=+1056.087724689"
Apr 22 19:24:08.849209 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:08.849181 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lw7xs_86f8ea02-993b-4c12-b611-355d4b6cd91c/dns/0.log"
Apr 22 19:24:08.869818 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:08.869785 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lw7xs_86f8ea02-993b-4c12-b611-355d4b6cd91c/kube-rbac-proxy/0.log"
Apr 22 19:24:08.932720 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:08.932696 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-2wjct_2f15bc14-85c1-4370-8e8c-dfc474a5636b/dns-node-resolver/0.log"
Apr 22 19:24:09.411841 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:09.411812 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6d6f7d67b8-qg2xj_a53b0aba-7b03-4959-8f7f-085132bd83fa/registry/0.log"
Apr 22 19:24:09.433791 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:09.433771 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-hdwcg_8ab2b075-d3d5-4d3a-848e-89344c4f11b6/node-ca/0.log"
Apr 22 19:24:10.322231 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:10.322198 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-m8qrx_cd90129c-8b33-489e-b329-a40428948fc3/discovery/0.log"
Apr 22 19:24:10.343914 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:10.343874 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-77597c7855-76svf_1d8494de-a95f-42d3-9a4d-dac04e26d2b9/kube-auth-proxy/0.log"
Apr 22 19:24:10.989043 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:10.989018 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-b5ggr_21e125c2-4036-4304-91d4-c0370710d4af/serve-healthcheck-canary/0.log"
Apr 22 19:24:11.420977 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:11.420954 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-w489z_0886168b-fb42-4ca3-81f5-2dabb41537e9/insights-operator/1.log"
Apr 22 19:24:11.426349 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:11.426330 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-w489z_0886168b-fb42-4ca3-81f5-2dabb41537e9/insights-operator/0.log"
Apr 22 19:24:11.520196 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:11.520169 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-mj44z_926eba52-2ec6-43ad-9b90-958efbe70d95/kube-rbac-proxy/0.log"
Apr 22 19:24:11.550327 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:11.550299 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-mj44z_926eba52-2ec6-43ad-9b90-958efbe70d95/exporter/0.log"
Apr 22 19:24:11.590610 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:11.590591 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-mj44z_926eba52-2ec6-43ad-9b90-958efbe70d95/extractor/0.log"
Apr 22 19:24:13.488300 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:13.488257 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-7x9f9_63e9696d-09aa-4774-b2f1-5b73cf9ab409/manager/0.log"
Apr 22 19:24:13.560097 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:13.560053 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-nlv9w_5ab59fdb-1873-4f91-8414-d788d9fec057/manager/0.log"
Apr 22 19:24:13.569367 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:13.569332 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-nlv9w_5ab59fdb-1873-4f91-8414-d788d9fec057/manager/1.log"
Apr 22 19:24:13.655313 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:13.655285 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6c9fd8c974-zbbkh_d246c5f2-f55d-478c-84f5-ee7646639611/manager/0.log"
Apr 22 19:24:13.680911 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:13.680876 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-mmxb8_62925aeb-9232-46c4-a230-26c3a1b8304b/postgres/0.log"
Apr 22 19:24:13.719539 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:13.719519 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-tdl99/perf-node-gather-daemonset-dgbcj"
Apr 22 19:24:19.510312 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:19.510280 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-k6rkf_5676ee9e-cd52-496c-a3cc-32f120c108d4/kube-storage-version-migrator-operator/1.log"
Apr 22 19:24:19.511498 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:19.511469 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-k6rkf_5676ee9e-cd52-496c-a3cc-32f120c108d4/kube-storage-version-migrator-operator/0.log"
Apr 22 19:24:20.785719 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:20.785647 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x7gv6_1e9b0a71-0187-42db-855a-762dfaa227aa/kube-multus-additional-cni-plugins/0.log"
Apr 22 19:24:20.806426 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:20.806400 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x7gv6_1e9b0a71-0187-42db-855a-762dfaa227aa/egress-router-binary-copy/0.log"
Apr 22 19:24:20.826143 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:20.826122 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x7gv6_1e9b0a71-0187-42db-855a-762dfaa227aa/cni-plugins/0.log"
Apr 22 19:24:20.846322 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:20.846302 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x7gv6_1e9b0a71-0187-42db-855a-762dfaa227aa/bond-cni-plugin/0.log"
Apr 22 19:24:20.870862 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:20.870832 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x7gv6_1e9b0a71-0187-42db-855a-762dfaa227aa/routeoverride-cni/0.log"
Apr 22 19:24:20.892176 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:20.892153 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x7gv6_1e9b0a71-0187-42db-855a-762dfaa227aa/whereabouts-cni-bincopy/0.log"
Apr 22 19:24:20.913320 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:20.913299 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-x7gv6_1e9b0a71-0187-42db-855a-762dfaa227aa/whereabouts-cni/0.log"
Apr 22 19:24:20.998326 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:20.998302 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zgnnh_8dd6df0f-e645-41a7-b974-0454616bb56e/kube-multus/0.log"
Apr 22 19:24:21.017489 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:21.017467 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2r8qp_46a3468d-b017-471c-a0df-a07b1c183ff4/network-metrics-daemon/0.log"
Apr 22 19:24:21.037164 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:21.037115 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2r8qp_46a3468d-b017-471c-a0df-a07b1c183ff4/kube-rbac-proxy/0.log"
Apr 22 19:24:22.227962 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:22.227929 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfk46_61b5731d-8883-44c4-a6de-2a90288f2d58/ovn-controller/0.log"
Apr 22 19:24:22.245318 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:22.245293 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfk46_61b5731d-8883-44c4-a6de-2a90288f2d58/ovn-acl-logging/0.log"
Apr 22 19:24:22.249455 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:22.249438 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfk46_61b5731d-8883-44c4-a6de-2a90288f2d58/ovn-acl-logging/1.log"
Apr 22 19:24:22.269517 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:22.269495 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfk46_61b5731d-8883-44c4-a6de-2a90288f2d58/kube-rbac-proxy-node/0.log"
Apr 22 19:24:22.290499 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:22.290479 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfk46_61b5731d-8883-44c4-a6de-2a90288f2d58/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 19:24:22.313787 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:22.313749 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfk46_61b5731d-8883-44c4-a6de-2a90288f2d58/northd/0.log"
Apr 22 19:24:22.337498 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:22.337469 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfk46_61b5731d-8883-44c4-a6de-2a90288f2d58/nbdb/0.log"
Apr 22 19:24:22.364089 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:22.364067 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfk46_61b5731d-8883-44c4-a6de-2a90288f2d58/sbdb/0.log"
Apr 22 19:24:22.453635 ip-10-0-134-22 kubenswrapper[2569]: I0422 19:24:22.453614 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tfk46_61b5731d-8883-44c4-a6de-2a90288f2d58/ovnkube-controller/0.log"