Apr 16 20:35:20.948177 ip-10-0-134-79 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 20:35:20.948189 ip-10-0-134-79 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 20:35:20.948199 ip-10-0-134-79 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 20:35:20.948505 ip-10-0-134-79 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 20:35:31.163493 ip-10-0-134-79 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 20:35:31.163510 ip-10-0-134-79 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot d0174668f6b440fab19ee4551953ab15 --
Apr 16 20:37:58.724971 ip-10-0-134-79 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 20:37:59.171469 ip-10-0-134-79 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:37:59.171469 ip-10-0-134-79 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 20:37:59.171469 ip-10-0-134-79 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:37:59.171469 ip-10-0-134-79 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 20:37:59.171469 ip-10-0-134-79 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:37:59.172658 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.172249 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 20:37:59.178885 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.178865 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:37:59.178885 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.178882 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:37:59.178885 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.178886 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:37:59.178885 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.178889 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:37:59.178885 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.178893 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:37:59.179085 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.178897 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:37:59.179085 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.178900 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:37:59.179085 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.178903 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:37:59.179085 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.178908 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:37:59.179085 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.178910 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:37:59.179085 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.178913 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:37:59.179085 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.178916 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:37:59.179085 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.178919 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:37:59.179085 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.178922 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:37:59.179085 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.178924 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:37:59.179085 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.178927 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:37:59.179085 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.178931 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:37:59.179085 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.178935 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:37:59.179085 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.178938 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:37:59.179085 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.178941 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:37:59.179085 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.178944 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:37:59.179085 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.178954 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:37:59.179085 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.178957 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:37:59.179085 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.178960 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:37:59.179537 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.178962 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:37:59.179537 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.178965 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:37:59.179537 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.178967 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:37:59.179537 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.178970 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:37:59.179537 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.178972 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:37:59.179537 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.178975 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:37:59.179537 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.178978 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:37:59.179537 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.178980 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:37:59.179537 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.178983 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:37:59.179537 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.178986 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:37:59.179537 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.178988 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:37:59.179537 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.178991 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:37:59.179537 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.178994 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:37:59.179537 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.178996 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:37:59.179537 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179001 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:37:59.179537 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179005 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:37:59.179537 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179008 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:37:59.179537 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179011 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:37:59.179537 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179013 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:37:59.180024 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179016 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:37:59.180024 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179019 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:37:59.180024 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179021 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:37:59.180024 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179024 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:37:59.180024 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179027 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:37:59.180024 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179029 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:37:59.180024 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179040 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:37:59.180024 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179045 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:37:59.180024 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179048 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:37:59.180024 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179051 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:37:59.180024 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179053 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:37:59.180024 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179056 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:37:59.180024 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179058 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:37:59.180024 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179061 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:37:59.180024 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179064 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:37:59.180024 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179066 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:37:59.180024 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179069 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:37:59.180024 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179071 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:37:59.180024 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179074 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:37:59.180024 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179077 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:37:59.180584 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179079 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:37:59.180584 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179082 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:37:59.180584 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179085 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:37:59.180584 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179088 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:37:59.180584 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179090 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:37:59.180584 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179093 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:37:59.180584 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179096 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:37:59.180584 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179100 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:37:59.180584 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179103 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:37:59.180584 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179105 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:37:59.180584 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179108 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:37:59.180584 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179110 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:37:59.180584 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179113 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:37:59.180584 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179115 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:37:59.180584 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179118 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:37:59.180584 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179121 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:37:59.180584 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179124 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:37:59.180584 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179127 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:37:59.180584 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179129 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:37:59.181067 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179132 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:37:59.181067 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179135 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:37:59.181067 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179137 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:37:59.181067 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179143 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:37:59.181067 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179514 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:37:59.181067 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179518 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:37:59.181067 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179521 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:37:59.181067 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179524 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:37:59.181067 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179526 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:37:59.181067 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179529 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:37:59.181067 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179531 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:37:59.181067 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179534 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:37:59.181067 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179536 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:37:59.181067 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179539 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:37:59.181067 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179542 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:37:59.181067 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179545 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:37:59.181067 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179548 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:37:59.181067 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179550 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:37:59.181067 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179553 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:37:59.181067 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179556 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:37:59.181554 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179560 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:37:59.181554 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179564 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:37:59.181554 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179568 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:37:59.181554 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179571 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:37:59.181554 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179574 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:37:59.181554 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179577 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:37:59.181554 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179580 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:37:59.181554 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179583 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:37:59.181554 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179586 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:37:59.181554 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179589 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:37:59.181554 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179592 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:37:59.181554 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179594 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:37:59.181554 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179597 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:37:59.181554 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179599 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:37:59.181554 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179602 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:37:59.181554 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179604 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:37:59.181554 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179607 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:37:59.181554 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179609 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:37:59.181554 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179629 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:37:59.182041 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179632 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:37:59.182041 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179635 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:37:59.182041 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179637 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:37:59.182041 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179640 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:37:59.182041 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179642 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:37:59.182041 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179645 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:37:59.182041 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179648 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:37:59.182041 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179650 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:37:59.182041 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179653 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:37:59.182041 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179655 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:37:59.182041 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179658 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:37:59.182041 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179661 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:37:59.182041 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179664 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:37:59.182041 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179668 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:37:59.182041 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179672 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:37:59.182041 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179674 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:37:59.182041 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179677 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:37:59.182041 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179680 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:37:59.182041 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179682 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:37:59.182041 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179685 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:37:59.182529 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179688 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:37:59.182529 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179690 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:37:59.182529 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179693 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:37:59.182529 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179696 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:37:59.182529 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179698 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:37:59.182529 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179701 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:37:59.182529 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179703 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:37:59.182529 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179706 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:37:59.182529 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179709 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:37:59.182529 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179711 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:37:59.182529 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179714 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:37:59.182529 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179716 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:37:59.182529 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179719 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:37:59.182529 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179721 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:37:59.182529 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179724 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:37:59.182529 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179726 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:37:59.182529 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179729 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:37:59.182529 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179731 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:37:59.182529 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179734 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:37:59.182529 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179738 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:37:59.183064 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179740 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:37:59.183064 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179742 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:37:59.183064 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179745 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:37:59.183064 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179748 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:37:59.183064 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179751 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:37:59.183064 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179754 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:37:59.183064 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179757 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:37:59.183064 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179759 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:37:59.183064 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179762 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:37:59.183064 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179764 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:37:59.183064 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.179767 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:37:59.183064 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181559 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 20:37:59.183064 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181569 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 20:37:59.183064 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181576 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 20:37:59.183064 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181580 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 20:37:59.183064 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181586 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 20:37:59.183064 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181590 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 20:37:59.183064 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181594 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 20:37:59.183064 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181598 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 20:37:59.183064 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181602 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 20:37:59.183064 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181605 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 20:37:59.183585 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181609 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 20:37:59.183585 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181628 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 20:37:59.183585 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181633 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 20:37:59.183585 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181638 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 16 20:37:59.183585 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181642 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 20:37:59.183585 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181645 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 16 20:37:59.183585 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181648 2575 flags.go:64] FLAG: --cloud-config=""
Apr 16 20:37:59.183585 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181650 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 20:37:59.183585 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181653 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 20:37:59.183585 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181659 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 16 20:37:59.183585 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181662 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 20:37:59.183585 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181665 2575 flags.go:64] FLAG: --config-dir=""
Apr 16 20:37:59.183585 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181668 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 20:37:59.183585 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181672 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 20:37:59.183585 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181681 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 20:37:59.183585 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181684 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 20:37:59.183585 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181687 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 20:37:59.183585 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181690 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 20:37:59.183585 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181694 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 20:37:59.183585 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181697 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 20:37:59.183585 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181699 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 20:37:59.183585 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181703 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 20:37:59.183585 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181706 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 20:37:59.183585 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181710 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 20:37:59.183585 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181713 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 20:37:59.184235 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181716 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 20:37:59.184235 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181719 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 20:37:59.184235 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181722 2575 flags.go:64] FLAG: --enable-server="true"
Apr 16 20:37:59.184235 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181725 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 20:37:59.184235 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181730 2575 flags.go:64] FLAG: --event-burst="100"
Apr 16 20:37:59.184235 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181733 2575 flags.go:64] FLAG: --event-qps="50"
Apr 16 20:37:59.184235 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181736 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 20:37:59.184235 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181739 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 20:37:59.184235 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181742 2575 flags.go:64] FLAG: --eviction-hard=""
Apr 16 20:37:59.184235 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181746
2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 16 20:37:59.184235 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181748 2575 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 16 20:37:59.184235 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181751 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 16 20:37:59.184235 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181754 2575 flags.go:64] FLAG: --eviction-soft="" Apr 16 20:37:59.184235 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181757 2575 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 16 20:37:59.184235 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181760 2575 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 16 20:37:59.184235 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181763 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 20:37:59.184235 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181766 2575 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 20:37:59.184235 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181769 2575 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 20:37:59.184235 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181771 2575 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 20:37:59.184235 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181774 2575 flags.go:64] FLAG: --feature-gates="" Apr 16 20:37:59.184235 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181779 2575 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 20:37:59.184235 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181782 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 20:37:59.184235 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181785 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 20:37:59.184235 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181788 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 20:37:59.184235 
ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181791 2575 flags.go:64] FLAG: --healthz-port="10248" Apr 16 20:37:59.184235 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181794 2575 flags.go:64] FLAG: --help="false" Apr 16 20:37:59.184867 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181797 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-134-79.ec2.internal" Apr 16 20:37:59.184867 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181800 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 20:37:59.184867 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181803 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 20:37:59.184867 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181806 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 20:37:59.184867 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181809 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 20:37:59.184867 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181813 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 20:37:59.184867 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181816 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 20:37:59.184867 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181819 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 20:37:59.184867 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181822 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 20:37:59.184867 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181825 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 20:37:59.184867 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181828 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 20:37:59.184867 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181831 2575 flags.go:64] FLAG: 
--kube-api-qps="50" Apr 16 20:37:59.184867 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181834 2575 flags.go:64] FLAG: --kube-reserved="" Apr 16 20:37:59.184867 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181837 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 20:37:59.184867 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181840 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 20:37:59.184867 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181843 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 20:37:59.184867 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181845 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 20:37:59.184867 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181848 2575 flags.go:64] FLAG: --lock-file="" Apr 16 20:37:59.184867 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181851 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 20:37:59.184867 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181854 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 20:37:59.184867 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181857 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 20:37:59.184867 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181862 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 20:37:59.184867 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181865 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 20:37:59.185430 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181868 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 20:37:59.185430 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181870 2575 flags.go:64] FLAG: --logging-format="text" Apr 16 20:37:59.185430 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181873 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 20:37:59.185430 ip-10-0-134-79 kubenswrapper[2575]: I0416 
20:37:59.181878 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 20:37:59.185430 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181881 2575 flags.go:64] FLAG: --manifest-url="" Apr 16 20:37:59.185430 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181884 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 16 20:37:59.185430 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181888 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 20:37:59.185430 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181891 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 20:37:59.185430 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181895 2575 flags.go:64] FLAG: --max-pods="110" Apr 16 20:37:59.185430 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181898 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 20:37:59.185430 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181901 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 20:37:59.185430 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181904 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 20:37:59.185430 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181907 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 20:37:59.185430 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181910 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 20:37:59.185430 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181913 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 20:37:59.185430 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181916 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 20:37:59.185430 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181923 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 20:37:59.185430 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181926 2575 flags.go:64] FLAG: 
--node-status-update-frequency="10s" Apr 16 20:37:59.185430 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181929 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 20:37:59.185430 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181933 2575 flags.go:64] FLAG: --pod-cidr="" Apr 16 20:37:59.185430 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181936 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 20:37:59.185430 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181941 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 20:37:59.185430 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181944 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 20:37:59.185430 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181947 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 16 20:37:59.186059 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181950 2575 flags.go:64] FLAG: --port="10250" Apr 16 20:37:59.186059 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181953 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 20:37:59.186059 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181956 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-01a46c8532e836748" Apr 16 20:37:59.186059 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181959 2575 flags.go:64] FLAG: --qos-reserved="" Apr 16 20:37:59.186059 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181962 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 16 20:37:59.186059 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181965 2575 flags.go:64] FLAG: --register-node="true" Apr 16 20:37:59.186059 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181968 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 16 20:37:59.186059 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181971 2575 flags.go:64] FLAG: --register-with-taints="" Apr 16 
20:37:59.186059 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181975 2575 flags.go:64] FLAG: --registry-burst="10" Apr 16 20:37:59.186059 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181978 2575 flags.go:64] FLAG: --registry-qps="5" Apr 16 20:37:59.186059 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181980 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 16 20:37:59.186059 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181983 2575 flags.go:64] FLAG: --reserved-memory="" Apr 16 20:37:59.186059 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181987 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 20:37:59.186059 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181990 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 20:37:59.186059 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181993 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 20:37:59.186059 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181995 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 20:37:59.186059 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.181998 2575 flags.go:64] FLAG: --runonce="false" Apr 16 20:37:59.186059 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.182001 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 20:37:59.186059 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.182004 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 20:37:59.186059 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.182007 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 16 20:37:59.186059 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.182010 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 20:37:59.186059 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.182012 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 20:37:59.186059 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.182016 2575 flags.go:64] FLAG: 
--storage-driver-db="cadvisor" Apr 16 20:37:59.186059 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.182019 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 20:37:59.186059 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.182022 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 20:37:59.186059 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.182025 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 20:37:59.186703 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.182027 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 20:37:59.186703 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.182030 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 20:37:59.186703 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.182034 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 20:37:59.186703 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.182037 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 20:37:59.186703 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.182040 2575 flags.go:64] FLAG: --system-cgroups="" Apr 16 20:37:59.186703 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.182043 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 20:37:59.186703 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.182048 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 20:37:59.186703 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.182051 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 16 20:37:59.186703 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.182054 2575 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 20:37:59.186703 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.182058 2575 flags.go:64] FLAG: --tls-min-version="" Apr 16 20:37:59.186703 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.182061 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 20:37:59.186703 ip-10-0-134-79 kubenswrapper[2575]: I0416 
20:37:59.182064 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 20:37:59.186703 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.182067 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 20:37:59.186703 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.182070 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 20:37:59.186703 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.182073 2575 flags.go:64] FLAG: --v="2" Apr 16 20:37:59.186703 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.182078 2575 flags.go:64] FLAG: --version="false" Apr 16 20:37:59.186703 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.182082 2575 flags.go:64] FLAG: --vmodule="" Apr 16 20:37:59.186703 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.182086 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 20:37:59.186703 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.182089 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 20:37:59.186703 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182182 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 20:37:59.186703 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182185 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 20:37:59.186703 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182188 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 20:37:59.186703 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182191 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 20:37:59.186703 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182193 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 20:37:59.187278 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182196 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 20:37:59.187278 
ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182199 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 20:37:59.187278 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182201 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 20:37:59.187278 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182204 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 20:37:59.187278 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182209 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 20:37:59.187278 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182211 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 20:37:59.187278 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182214 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 20:37:59.187278 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182216 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 20:37:59.187278 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182219 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 20:37:59.187278 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182222 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 20:37:59.187278 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182224 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 20:37:59.187278 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182233 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 20:37:59.187278 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182236 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 20:37:59.187278 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182239 2575 feature_gate.go:328] 
unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 20:37:59.187278 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182241 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 20:37:59.187278 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182243 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 20:37:59.187278 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182246 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 20:37:59.187278 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182249 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 20:37:59.187278 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182251 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 20:37:59.187808 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182254 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 20:37:59.187808 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182257 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 20:37:59.187808 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182259 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 20:37:59.187808 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182262 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 20:37:59.187808 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182265 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 20:37:59.187808 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182267 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 20:37:59.187808 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182270 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 20:37:59.187808 ip-10-0-134-79 
kubenswrapper[2575]: W0416 20:37:59.182272 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 20:37:59.187808 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182275 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 20:37:59.187808 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182277 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 20:37:59.187808 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182280 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 20:37:59.187808 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182282 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 20:37:59.187808 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182285 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 20:37:59.187808 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182288 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 20:37:59.187808 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182290 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 20:37:59.187808 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182293 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 20:37:59.187808 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182295 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 20:37:59.187808 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182299 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 20:37:59.187808 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182302 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 20:37:59.187808 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182305 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 
20:37:59.188288 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182309 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 20:37:59.188288 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182313 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 20:37:59.188288 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182315 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 20:37:59.188288 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182318 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 20:37:59.188288 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182320 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 20:37:59.188288 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182324 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 20:37:59.188288 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182327 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 20:37:59.188288 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182329 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 20:37:59.188288 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182332 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 20:37:59.188288 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182335 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 20:37:59.188288 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182337 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 20:37:59.188288 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182340 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 20:37:59.188288 ip-10-0-134-79 kubenswrapper[2575]: W0416 
20:37:59.182343 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 20:37:59.188288 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182345 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 20:37:59.188288 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182348 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 20:37:59.188288 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182350 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 20:37:59.188288 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182352 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 20:37:59.188288 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182355 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 20:37:59.188288 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182358 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 16 20:37:59.188288 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182361 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 20:37:59.188796 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182363 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 20:37:59.188796 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182366 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 20:37:59.188796 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182368 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 20:37:59.188796 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182371 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 20:37:59.188796 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182374 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 20:37:59.188796 ip-10-0-134-79 kubenswrapper[2575]: W0416 
20:37:59.182376 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 20:37:59.188796 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182379 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 20:37:59.188796 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182381 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 20:37:59.188796 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182384 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 20:37:59.188796 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182388 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 20:37:59.188796 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182391 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 20:37:59.188796 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182393 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 20:37:59.188796 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182396 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 20:37:59.188796 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182399 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 20:37:59.188796 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182402 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 20:37:59.188796 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182404 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 20:37:59.188796 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182407 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 20:37:59.188796 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182409 2575 feature_gate.go:328] unrecognized feature 
gate: VSphereMixedNodeEnv Apr 16 20:37:59.188796 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182412 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 20:37:59.188796 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182415 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 20:37:59.189316 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182417 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 20:37:59.189316 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.182421 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 20:37:59.189316 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.182427 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 20:37:59.189316 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.188723 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 20:37:59.189316 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.188737 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 20:37:59.189316 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188789 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 20:37:59.189316 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188793 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 20:37:59.189316 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188797 2575 feature_gate.go:328] 
unrecognized feature gate: AzureDedicatedHosts Apr 16 20:37:59.189316 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188801 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 20:37:59.189316 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188805 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 20:37:59.189316 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188810 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 20:37:59.189316 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188813 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 20:37:59.189316 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188815 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 20:37:59.189316 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188818 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 20:37:59.189316 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188821 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 20:37:59.189709 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188824 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 16 20:37:59.189709 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188826 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 20:37:59.189709 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188829 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 20:37:59.189709 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188832 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 20:37:59.189709 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188834 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 
20:37:59.189709 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188837 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 20:37:59.189709 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188840 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 20:37:59.189709 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188842 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 20:37:59.189709 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188845 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 20:37:59.189709 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188847 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 20:37:59.189709 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188850 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 20:37:59.189709 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188853 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 20:37:59.189709 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188855 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 20:37:59.189709 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188858 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 20:37:59.189709 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188860 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 20:37:59.189709 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188863 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 20:37:59.189709 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188865 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 20:37:59.189709 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188868 2575 feature_gate.go:328] 
unrecognized feature gate: RouteAdvertisements Apr 16 20:37:59.189709 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188871 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 20:37:59.189709 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188875 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 20:37:59.190188 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188878 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 20:37:59.190188 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188881 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 20:37:59.190188 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188883 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 20:37:59.190188 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188886 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 20:37:59.190188 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188889 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 20:37:59.190188 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188891 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 20:37:59.190188 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188893 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 20:37:59.190188 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188896 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 20:37:59.190188 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188899 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 20:37:59.190188 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188901 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 20:37:59.190188 ip-10-0-134-79 
kubenswrapper[2575]: W0416 20:37:59.188904 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 20:37:59.190188 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188906 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 20:37:59.190188 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188909 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 20:37:59.190188 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188911 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 20:37:59.190188 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188914 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 20:37:59.190188 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188916 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 20:37:59.190188 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188918 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 20:37:59.190188 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188922 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 20:37:59.190188 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188926 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 20:37:59.190664 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188929 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 20:37:59.190664 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188931 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 20:37:59.190664 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188934 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 20:37:59.190664 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188937 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 20:37:59.190664 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188940 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 20:37:59.190664 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188943 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 20:37:59.190664 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188946 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 20:37:59.190664 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188948 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 20:37:59.190664 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188951 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 20:37:59.190664 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188954 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 20:37:59.190664 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188957 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 20:37:59.190664 ip-10-0-134-79 
kubenswrapper[2575]: W0416 20:37:59.188960 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 20:37:59.190664 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188962 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 20:37:59.190664 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188965 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 20:37:59.190664 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188967 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 20:37:59.190664 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188970 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 20:37:59.190664 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188973 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 20:37:59.190664 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188975 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 20:37:59.190664 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188977 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 20:37:59.191132 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188980 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 20:37:59.191132 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188982 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 20:37:59.191132 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188985 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 20:37:59.191132 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188987 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 20:37:59.191132 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188990 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration 
Apr 16 20:37:59.191132 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188992 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 20:37:59.191132 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188995 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 20:37:59.191132 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.188997 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 20:37:59.191132 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189000 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 20:37:59.191132 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189002 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 20:37:59.191132 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189004 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 20:37:59.191132 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189007 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 20:37:59.191132 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189009 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 20:37:59.191132 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189012 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 20:37:59.191132 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189014 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 20:37:59.191132 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189018 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 20:37:59.191132 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189020 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 20:37:59.191132 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189022 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 
20:37:59.191577 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.189028 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 20:37:59.191577 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189136 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 20:37:59.191577 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189142 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 20:37:59.191577 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189145 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 20:37:59.191577 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189147 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 20:37:59.191577 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189150 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 20:37:59.191577 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189153 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 20:37:59.191577 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189156 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 20:37:59.191577 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189159 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 20:37:59.191577 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189161 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 20:37:59.191577 ip-10-0-134-79 
kubenswrapper[2575]: W0416 20:37:59.189164 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 20:37:59.191577 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189167 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 20:37:59.191577 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189170 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 20:37:59.191577 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189172 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 20:37:59.191577 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189175 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 20:37:59.191577 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189177 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 20:37:59.191983 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189180 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 20:37:59.191983 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189182 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 20:37:59.191983 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189185 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 20:37:59.191983 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189187 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 20:37:59.191983 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189190 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 20:37:59.191983 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189192 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 20:37:59.191983 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189195 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 20:37:59.191983 
ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189197 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 20:37:59.191983 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189200 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 20:37:59.191983 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189203 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 20:37:59.191983 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189205 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 20:37:59.191983 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189208 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 20:37:59.191983 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189210 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 20:37:59.191983 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189214 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 20:37:59.191983 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189217 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 20:37:59.191983 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189220 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 20:37:59.191983 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189222 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 20:37:59.191983 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189225 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 20:37:59.191983 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189227 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 20:37:59.192450 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189230 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 20:37:59.192450 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189232 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 20:37:59.192450 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189235 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 20:37:59.192450 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189238 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 20:37:59.192450 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189240 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 20:37:59.192450 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189243 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 20:37:59.192450 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189245 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 20:37:59.192450 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189248 2575 
feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 20:37:59.192450 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189252 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 20:37:59.192450 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189256 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 20:37:59.192450 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189259 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 20:37:59.192450 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189262 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 20:37:59.192450 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189265 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 20:37:59.192450 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189268 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 20:37:59.192450 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189271 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 20:37:59.192450 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189273 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 20:37:59.192450 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189276 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 20:37:59.192450 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189278 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 20:37:59.192450 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189281 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 20:37:59.192909 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189283 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 
20:37:59.192909 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189285 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 20:37:59.192909 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189288 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 20:37:59.192909 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189290 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 20:37:59.192909 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189293 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 20:37:59.192909 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189295 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 20:37:59.192909 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189298 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 20:37:59.192909 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189300 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 20:37:59.192909 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189303 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 20:37:59.192909 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189305 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 20:37:59.192909 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189308 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 20:37:59.192909 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189310 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 20:37:59.192909 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189313 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 20:37:59.192909 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189315 2575 
feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 20:37:59.192909 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189318 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 20:37:59.192909 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189320 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 20:37:59.192909 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189323 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 20:37:59.192909 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189326 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 20:37:59.192909 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189329 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 20:37:59.192909 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189331 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 16 20:37:59.193392 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189334 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 20:37:59.193392 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189336 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 20:37:59.193392 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189338 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 20:37:59.193392 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189341 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 20:37:59.193392 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189346 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 20:37:59.193392 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189349 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 20:37:59.193392 ip-10-0-134-79 kubenswrapper[2575]: 
W0416 20:37:59.189351 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 20:37:59.193392 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189354 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 20:37:59.193392 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189356 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 20:37:59.193392 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189359 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 20:37:59.193392 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189361 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 20:37:59.193392 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189364 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 20:37:59.193392 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:37:59.189366 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 20:37:59.193392 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.189371 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 20:37:59.193392 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.190247 2575 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 20:37:59.194534 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.194519 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the 
certificate dir" Apr 16 20:37:59.195557 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.195546 2575 server.go:1019] "Starting client certificate rotation" Apr 16 20:37:59.195657 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.195639 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 20:37:59.195702 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.195681 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 20:37:59.220964 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.220945 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 20:37:59.225900 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.225886 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 20:37:59.241834 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.241815 2575 log.go:25] "Validated CRI v1 runtime API" Apr 16 20:37:59.247686 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.247670 2575 log.go:25] "Validated CRI v1 image API" Apr 16 20:37:59.249300 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.249273 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 20:37:59.249769 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.249754 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 20:37:59.254198 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.254175 2575 fs.go:135] Filesystem UUIDs: map[0425dc0a-38b9-4892-92e3-cadb01e07ea6:/dev/nvme0n1p4 3ed3f5aa-6ff3-4a5e-82da-9faae95a4c79:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2] Apr 16 20:37:59.254242 ip-10-0-134-79 kubenswrapper[2575]: I0416 
20:37:59.254198 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 20:37:59.259898 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.259787 2575 manager.go:217] Machine: {Timestamp:2026-04-16 20:37:59.257930018 +0000 UTC m=+0.410026062 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098415 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec28f5deadc0a12cc37e4aa83dbbdca3 SystemUUID:ec28f5de-adc0-a12c-c37e-4aa83dbbdca3 BootID:d0174668-f6b4-40fa-b19e-e4551953ab15 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:40:85:6f:24:7b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:40:85:6f:24:7b Speed:0 Mtu:9001} {Name:ovs-system 
MacAddress:7a:eb:fc:34:b2:d5 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 20:37:59.259898 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.259893 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 16 20:37:59.259999 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.259973 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 20:37:59.261110 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.261085 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 20:37:59.261240 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.261112 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-79.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 20:37:59.261286 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.261249 2575 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 20:37:59.261286 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.261256 2575 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 20:37:59.261286 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.261270 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 20:37:59.262209 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.262199 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 20:37:59.263099 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.263087 2575 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 20:37:59.263207 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.263198 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 20:37:59.265634 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.265624 2575 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 20:37:59.265692 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.265650 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 20:37:59.265692 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.265673 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 20:37:59.265692 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.265684 2575 kubelet.go:397] "Adding apiserver pod source"
Apr 16 20:37:59.265822 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.265696 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 20:37:59.266819 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.266804 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 20:37:59.266855 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.266833 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 20:37:59.268213 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.268193 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wrl47"
Apr 16 20:37:59.269951 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.269926 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 20:37:59.271343 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.271327 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 20:37:59.273151 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.273136 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 20:37:59.273220 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.273160 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 20:37:59.273220 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.273169 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 20:37:59.273220 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.273178 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 20:37:59.273220 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.273186 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 20:37:59.273220 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.273196 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 20:37:59.273220 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.273204 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 20:37:59.273220 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.273215 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 20:37:59.273451 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.273229 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 20:37:59.273451 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.273238 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 20:37:59.273451 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.273256 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 20:37:59.273451 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.273270 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 20:37:59.273451 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.273280 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wrl47"
Apr 16 20:37:59.274235 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.274223 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 20:37:59.274292 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.274263 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 20:37:59.278187 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.278158 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 20:37:59.278276 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.278209 2575 server.go:1295] "Started kubelet"
Apr 16 20:37:59.278332 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.278311 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 20:37:59.278742 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.278706 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 20:37:59.278830 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.278770 2575 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 20:37:59.279197 ip-10-0-134-79 systemd[1]: Started Kubernetes Kubelet.
Apr 16 20:37:59.280788 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.280773 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 20:37:59.281758 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.281744 2575 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 20:37:59.283162 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.283145 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 20:37:59.285048 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.285030 2575 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-134-79.ec2.internal" not found
Apr 16 20:37:59.285136 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.285106 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 20:37:59.287032 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.287013 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 20:37:59.287125 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.287020 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 20:37:59.287699 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.287675 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 20:37:59.287699 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.287678 2575 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 20:37:59.287856 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.287711 2575 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 20:37:59.287898 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.287855 2575 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 20:37:59.287898 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.287867 2575 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 20:37:59.287898 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:37:59.287853 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-79.ec2.internal\" not found"
Apr 16 20:37:59.289064 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.289048 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 20:37:59.290090 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.290074 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 20:37:59.290182 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.290173 2575 factory.go:55] Registering systemd factory
Apr 16 20:37:59.290257 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.290249 2575 factory.go:223] Registration of the systemd container factory successfully
Apr 16 20:37:59.290938 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.290910 2575 factory.go:153] Registering CRI-O factory
Apr 16 20:37:59.290938 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.290926 2575 factory.go:223] Registration of the crio container factory successfully
Apr 16 20:37:59.291079 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.290955 2575 factory.go:103] Registering Raw factory
Apr 16 20:37:59.291079 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.290968 2575 manager.go:1196] Started watching for new ooms in manager
Apr 16 20:37:59.291406 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.291386 2575 manager.go:319] Starting recovery of all containers
Apr 16 20:37:59.291546 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:37:59.291521 2575 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 20:37:59.293584 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:37:59.293417 2575 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-134-79.ec2.internal\" not found" node="ip-10-0-134-79.ec2.internal"
Apr 16 20:37:59.300629 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.300603 2575 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-134-79.ec2.internal" not found
Apr 16 20:37:59.301654 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.301641 2575 manager.go:324] Recovery completed
Apr 16 20:37:59.305852 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.305840 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:37:59.307731 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.307720 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-79.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:37:59.307817 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.307743 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-79.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:37:59.307817 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.307753 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-79.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:37:59.308209 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.308196 2575 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 20:37:59.308209 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.308207 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 20:37:59.308290 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.308223 2575 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 20:37:59.310755 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.310743 2575 policy_none.go:49] "None policy: Start"
Apr 16 20:37:59.310794 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.310760 2575 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 20:37:59.310794 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.310770 2575 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 20:37:59.351230 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.351214 2575 manager.go:341] "Starting Device Plugin manager"
Apr 16 20:37:59.360251 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:37:59.351245 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 20:37:59.360251 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.351257 2575 server.go:85] "Starting device plugin registration server"
Apr 16 20:37:59.360251 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.351495 2575 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 20:37:59.360251 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.351509 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 20:37:59.360251 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.351595 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 20:37:59.360251 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.351688 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 20:37:59.360251 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.351697 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 20:37:59.360251 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:37:59.352188 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 20:37:59.360251 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:37:59.352227 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-79.ec2.internal\" not found"
Apr 16 20:37:59.360251 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.358474 2575 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-134-79.ec2.internal" not found
Apr 16 20:37:59.419719 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.419686 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 20:37:59.420868 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.420849 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 20:37:59.420868 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.420871 2575 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 20:37:59.420984 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.420890 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 20:37:59.420984 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.420897 2575 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 20:37:59.420984 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:37:59.420927 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 20:37:59.424247 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.424205 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 20:37:59.452061 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.452033 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:37:59.453868 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.453853 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-79.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:37:59.453934 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.453883 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-79.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:37:59.453934 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.453896 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-79.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:37:59.453934 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.453918 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-79.ec2.internal"
Apr 16 20:37:59.464166 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.464145 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-79.ec2.internal"
Apr 16 20:37:59.520999 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.520960 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-134-79.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-79.ec2.internal"]
Apr 16 20:37:59.523237 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.523221 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-79.ec2.internal"
Apr 16 20:37:59.523320 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.523222 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-79.ec2.internal"
Apr 16 20:37:59.544107 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.544087 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-79.ec2.internal"
Apr 16 20:37:59.547959 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.547945 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-79.ec2.internal"
Apr 16 20:37:59.557852 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.557831 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 20:37:59.557934 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.557836 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 20:37:59.589392 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.589367 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2b4e123e8fb6c71c1bd1455d7a290beb-config\") pod \"kube-apiserver-proxy-ip-10-0-134-79.ec2.internal\" (UID: \"2b4e123e8fb6c71c1bd1455d7a290beb\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-79.ec2.internal"
Apr 16 20:37:59.589474 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.589399 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a7e735f8895e509e062711b5aa81a42b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-79.ec2.internal\" (UID: \"a7e735f8895e509e062711b5aa81a42b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-79.ec2.internal"
Apr 16 20:37:59.589474 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.589425 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a7e735f8895e509e062711b5aa81a42b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-79.ec2.internal\" (UID: \"a7e735f8895e509e062711b5aa81a42b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-79.ec2.internal"
Apr 16 20:37:59.690367 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.690312 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2b4e123e8fb6c71c1bd1455d7a290beb-config\") pod \"kube-apiserver-proxy-ip-10-0-134-79.ec2.internal\" (UID: \"2b4e123e8fb6c71c1bd1455d7a290beb\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-79.ec2.internal"
Apr 16 20:37:59.690367 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.690336 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2b4e123e8fb6c71c1bd1455d7a290beb-config\") pod \"kube-apiserver-proxy-ip-10-0-134-79.ec2.internal\" (UID: \"2b4e123e8fb6c71c1bd1455d7a290beb\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-79.ec2.internal"
Apr 16 20:37:59.690477 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.690374 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a7e735f8895e509e062711b5aa81a42b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-79.ec2.internal\" (UID: \"a7e735f8895e509e062711b5aa81a42b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-79.ec2.internal"
Apr 16 20:37:59.690477 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.690400 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a7e735f8895e509e062711b5aa81a42b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-79.ec2.internal\" (UID: \"a7e735f8895e509e062711b5aa81a42b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-79.ec2.internal"
Apr 16 20:37:59.690477 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.690426 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a7e735f8895e509e062711b5aa81a42b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-79.ec2.internal\" (UID: \"a7e735f8895e509e062711b5aa81a42b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-79.ec2.internal"
Apr 16 20:37:59.690477 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.690469 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a7e735f8895e509e062711b5aa81a42b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-79.ec2.internal\" (UID: \"a7e735f8895e509e062711b5aa81a42b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-79.ec2.internal"
Apr 16 20:37:59.860585 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.860560 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-79.ec2.internal"
Apr 16 20:37:59.860710 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:37:59.860637 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-79.ec2.internal"
Apr 16 20:38:00.195345 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.195322 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 20:38:00.196130 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.195433 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 20:38:00.196130 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.195481 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 20:38:00.196130 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.195479 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 20:38:00.266421 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.266396 2575 apiserver.go:52] "Watching apiserver"
Apr 16 20:38:00.273143 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.273119 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 20:38:00.274463 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.274440 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-134-79.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpmlt","openshift-cluster-node-tuning-operator/tuned-dhpfd","openshift-dns/node-resolver-jq26v","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-79.ec2.internal","openshift-multus/multus-additional-cni-plugins-jltbp","openshift-multus/multus-ghlvc","openshift-multus/network-metrics-daemon-5jhhm","kube-system/konnectivity-agent-mhc66","openshift-image-registry/node-ca-5z2mk","openshift-network-diagnostics/network-check-target-84xkv","openshift-network-operator/iptables-alerter-fgp5z","openshift-ovn-kubernetes/ovnkube-node-sj6sh"]
Apr 16 20:38:00.276272 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.274828 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 20:32:59 +0000 UTC" deadline="2027-11-07 13:24:33.160722586 +0000 UTC"
Apr 16 20:38:00.276272 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.274892 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13672h46m32.885835698s"
Apr 16 20:38:00.278200 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.278180 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpmlt"
Apr 16 20:38:00.280407 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.280387 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-dhpfd"
Apr 16 20:38:00.280506 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.280494 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-72qsr\""
Apr 16 20:38:00.280661 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.280641 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 20:38:00.280775 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.280761 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 20:38:00.280814 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.280789 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 20:38:00.282676 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.282659 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-j7x4g\""
Apr 16 20:38:00.282781 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.282663 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 20:38:00.282781 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.282701 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 20:38:00.283039 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.283018 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-jq26v"
Apr 16 20:38:00.284924 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.284900 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 20:38:00.285143 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.284912 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 20:38:00.285143 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.285004 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-nn288\""
Apr 16 20:38:00.286038 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.286020 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jltbp"
Apr 16 20:38:00.287646 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.287629 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 20:38:00.288076 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.288059 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 20:38:00.288160 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.288085 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-5jhxj\""
Apr 16 20:38:00.288218 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.288196 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 20:38:00.288336 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.288319 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 20:38:00.288386 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.288375 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 20:38:00.288422 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.288322 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 20:38:00.290334 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.290318 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ghlvc"
Apr 16 20:38:00.290434 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.290415 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jhhm"
Apr 16 20:38:00.290512 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:00.290490 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5jhhm" podUID="422c9f50-4f45-46bc-9e9d-5c4f1c20c115"
Apr 16 20:38:00.292334 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.292316 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 20:38:00.292412 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.292387 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-d2jlp\""
Apr 16 20:38:00.292893 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.292877 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-mhc66"
Apr 16 20:38:00.293840 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.293824 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bce2cf43-152e-43e1-b1ff-a36bd77270cc-etc-systemd\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd"
Apr 16 20:38:00.293916 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.293848 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bce2cf43-152e-43e1-b1ff-a36bd77270cc-lib-modules\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd"
Apr 16 20:38:00.293916 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.293866 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bce2cf43-152e-43e1-b1ff-a36bd77270cc-etc-tuned\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd"
Apr 16 20:38:00.293916 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.293887 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-host-run-netns\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc"
Apr 16 20:38:00.293916 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.293906 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-host-var-lib-cni-multus\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc"
Apr 16 20:38:00.294107 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.293929 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-hostroot\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc"
Apr 16 20:38:00.294107 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.293944 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bce2cf43-152e-43e1-b1ff-a36bd77270cc-etc-kubernetes\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd"
Apr 16 20:38:00.294107 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.293976 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-system-cni-dir\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc"
Apr 16 20:38:00.294107 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.294047 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-multus-conf-dir\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc"
Apr 16 20:38:00.294107 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.294091 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/64e562da-5321-4973-a365-e8c0d198b8cc-multus-daemon-config\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc" Apr 16 20:38:00.294331 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.294115 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-etc-kubernetes\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc" Apr 16 20:38:00.294331 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.294131 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/838cdbbd-45af-4493-a167-65bd220c03c8-tmp-dir\") pod \"node-resolver-jq26v\" (UID: \"838cdbbd-45af-4493-a167-65bd220c03c8\") " pod="openshift-dns/node-resolver-jq26v" Apr 16 20:38:00.294331 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.294149 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/68a487f8-1b29-4712-a91a-ce41362dce50-socket-dir\") pod \"aws-ebs-csi-driver-node-cpmlt\" (UID: \"68a487f8-1b29-4712-a91a-ce41362dce50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpmlt" Apr 16 20:38:00.294331 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.294202 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bce2cf43-152e-43e1-b1ff-a36bd77270cc-var-lib-kubelet\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd" Apr 16 20:38:00.294331 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.294234 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/838cdbbd-45af-4493-a167-65bd220c03c8-hosts-file\") pod \"node-resolver-jq26v\" (UID: \"838cdbbd-45af-4493-a167-65bd220c03c8\") " pod="openshift-dns/node-resolver-jq26v" Apr 16 20:38:00.294331 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.294263 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/03f24485-95a9-4251-9d14-8bcb63f82514-cnibin\") pod \"multus-additional-cni-plugins-jltbp\" (UID: \"03f24485-95a9-4251-9d14-8bcb63f82514\") " pod="openshift-multus/multus-additional-cni-plugins-jltbp" Apr 16 20:38:00.294331 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.294286 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/68a487f8-1b29-4712-a91a-ce41362dce50-sys-fs\") pod \"aws-ebs-csi-driver-node-cpmlt\" (UID: \"68a487f8-1b29-4712-a91a-ce41362dce50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpmlt" Apr 16 20:38:00.294331 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.294311 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bce2cf43-152e-43e1-b1ff-a36bd77270cc-host\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd" Apr 16 20:38:00.294695 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.294335 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b225\" (UniqueName: \"kubernetes.io/projected/838cdbbd-45af-4493-a167-65bd220c03c8-kube-api-access-2b225\") pod \"node-resolver-jq26v\" (UID: \"838cdbbd-45af-4493-a167-65bd220c03c8\") " 
pod="openshift-dns/node-resolver-jq26v" Apr 16 20:38:00.294695 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.294359 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/03f24485-95a9-4251-9d14-8bcb63f82514-os-release\") pod \"multus-additional-cni-plugins-jltbp\" (UID: \"03f24485-95a9-4251-9d14-8bcb63f82514\") " pod="openshift-multus/multus-additional-cni-plugins-jltbp" Apr 16 20:38:00.294695 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.294402 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ztmp\" (UniqueName: \"kubernetes.io/projected/03f24485-95a9-4251-9d14-8bcb63f82514-kube-api-access-8ztmp\") pod \"multus-additional-cni-plugins-jltbp\" (UID: \"03f24485-95a9-4251-9d14-8bcb63f82514\") " pod="openshift-multus/multus-additional-cni-plugins-jltbp" Apr 16 20:38:00.294695 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.294434 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-os-release\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc" Apr 16 20:38:00.294695 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.294477 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-host-run-k8s-cni-cncf-io\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc" Apr 16 20:38:00.294695 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.294517 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/68a487f8-1b29-4712-a91a-ce41362dce50-registration-dir\") pod \"aws-ebs-csi-driver-node-cpmlt\" (UID: \"68a487f8-1b29-4712-a91a-ce41362dce50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpmlt" Apr 16 20:38:00.294695 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.294539 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/68a487f8-1b29-4712-a91a-ce41362dce50-etc-selinux\") pod \"aws-ebs-csi-driver-node-cpmlt\" (UID: \"68a487f8-1b29-4712-a91a-ce41362dce50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpmlt" Apr 16 20:38:00.294695 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.294581 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bce2cf43-152e-43e1-b1ff-a36bd77270cc-run\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd" Apr 16 20:38:00.294695 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.294609 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gp9x\" (UniqueName: \"kubernetes.io/projected/bce2cf43-152e-43e1-b1ff-a36bd77270cc-kube-api-access-9gp9x\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd" Apr 16 20:38:00.294695 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.294647 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-cnibin\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc" Apr 16 20:38:00.294695 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.294683 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/64e562da-5321-4973-a365-e8c0d198b8cc-cni-binary-copy\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc" Apr 16 20:38:00.295187 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.294707 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-host-var-lib-cni-bin\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc" Apr 16 20:38:00.295187 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.294748 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/03f24485-95a9-4251-9d14-8bcb63f82514-cni-binary-copy\") pod \"multus-additional-cni-plugins-jltbp\" (UID: \"03f24485-95a9-4251-9d14-8bcb63f82514\") " pod="openshift-multus/multus-additional-cni-plugins-jltbp" Apr 16 20:38:00.295187 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.294768 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/03f24485-95a9-4251-9d14-8bcb63f82514-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jltbp\" (UID: \"03f24485-95a9-4251-9d14-8bcb63f82514\") " pod="openshift-multus/multus-additional-cni-plugins-jltbp" Apr 16 20:38:00.295187 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.294795 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-host-run-multus-certs\") pod \"multus-ghlvc\" (UID: 
\"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc" Apr 16 20:38:00.295187 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.294818 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bce2cf43-152e-43e1-b1ff-a36bd77270cc-etc-sysconfig\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd" Apr 16 20:38:00.295187 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.294849 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bce2cf43-152e-43e1-b1ff-a36bd77270cc-etc-sysctl-d\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd" Apr 16 20:38:00.295187 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.294870 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/68a487f8-1b29-4712-a91a-ce41362dce50-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cpmlt\" (UID: \"68a487f8-1b29-4712-a91a-ce41362dce50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpmlt" Apr 16 20:38:00.295187 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.294909 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/68a487f8-1b29-4712-a91a-ce41362dce50-device-dir\") pod \"aws-ebs-csi-driver-node-cpmlt\" (UID: \"68a487f8-1b29-4712-a91a-ce41362dce50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpmlt" Apr 16 20:38:00.295187 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.294940 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" 
(UniqueName: \"kubernetes.io/host-path/bce2cf43-152e-43e1-b1ff-a36bd77270cc-etc-modprobe-d\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd" Apr 16 20:38:00.295187 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.294963 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/03f24485-95a9-4251-9d14-8bcb63f82514-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jltbp\" (UID: \"03f24485-95a9-4251-9d14-8bcb63f82514\") " pod="openshift-multus/multus-additional-cni-plugins-jltbp" Apr 16 20:38:00.295187 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.294988 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-multus-cni-dir\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc" Apr 16 20:38:00.295187 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.295017 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-host-var-lib-kubelet\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc" Apr 16 20:38:00.295187 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.295060 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bce2cf43-152e-43e1-b1ff-a36bd77270cc-sys\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd" Apr 16 20:38:00.295187 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.295095 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bce2cf43-152e-43e1-b1ff-a36bd77270cc-tmp\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd" Apr 16 20:38:00.295187 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.295117 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/03f24485-95a9-4251-9d14-8bcb63f82514-system-cni-dir\") pod \"multus-additional-cni-plugins-jltbp\" (UID: \"03f24485-95a9-4251-9d14-8bcb63f82514\") " pod="openshift-multus/multus-additional-cni-plugins-jltbp" Apr 16 20:38:00.295187 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.295145 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 20:38:00.295187 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.295160 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/03f24485-95a9-4251-9d14-8bcb63f82514-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jltbp\" (UID: \"03f24485-95a9-4251-9d14-8bcb63f82514\") " pod="openshift-multus/multus-additional-cni-plugins-jltbp" Apr 16 20:38:00.295882 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.295194 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-multus-socket-dir-parent\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc" Apr 16 20:38:00.295882 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.295220 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 20:38:00.295882 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.295225 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5pqg\" (UniqueName: \"kubernetes.io/projected/64e562da-5321-4973-a365-e8c0d198b8cc-kube-api-access-m5pqg\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc" Apr 16 20:38:00.295882 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.295251 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb2mt\" (UniqueName: \"kubernetes.io/projected/68a487f8-1b29-4712-a91a-ce41362dce50-kube-api-access-cb2mt\") pod \"aws-ebs-csi-driver-node-cpmlt\" (UID: \"68a487f8-1b29-4712-a91a-ce41362dce50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpmlt" Apr 16 20:38:00.295882 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.295275 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bce2cf43-152e-43e1-b1ff-a36bd77270cc-etc-sysctl-conf\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd" Apr 16 20:38:00.295882 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.295196 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-7wz76\"" Apr 16 20:38:00.295882 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.295870 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-5z2mk" Apr 16 20:38:00.297032 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.297012 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 20:38:00.298096 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.298031 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-48c8z\"" Apr 16 20:38:00.298224 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.298202 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 20:38:00.298422 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.298397 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 20:38:00.298583 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.298412 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 20:38:00.301136 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.301117 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-84xkv" Apr 16 20:38:00.301240 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:00.301182 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-84xkv" podUID="6492104e-0c2b-4f5b-bd8f-98d40e48a78e" Apr 16 20:38:00.301240 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.301228 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-fgp5z" Apr 16 20:38:00.303788 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.303764 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-kqz7s\"" Apr 16 20:38:00.304035 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.304016 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 20:38:00.304214 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.304031 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:38:00.304296 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.304034 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 20:38:00.304910 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.304893 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh" Apr 16 20:38:00.307282 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.307259 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 20:38:00.307363 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.307294 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-bdfcd\"" Apr 16 20:38:00.307363 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.307301 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 20:38:00.307479 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.307263 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 20:38:00.307978 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.307949 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 20:38:00.307978 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.307961 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 20:38:00.308098 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.307985 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 20:38:00.318016 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.317998 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-mwfw6" Apr 16 20:38:00.325442 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.325420 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-mwfw6" 
Apr 16 20:38:00.389041 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.389023 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 20:38:00.395753 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.395733 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/98723067-9cd3-42a6-a577-2ecd3fc29ae9-ovnkube-script-lib\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh" Apr 16 20:38:00.395856 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.395764 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/68a487f8-1b29-4712-a91a-ce41362dce50-etc-selinux\") pod \"aws-ebs-csi-driver-node-cpmlt\" (UID: \"68a487f8-1b29-4712-a91a-ce41362dce50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpmlt" Apr 16 20:38:00.395856 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.395780 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-host-var-lib-cni-bin\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc" Apr 16 20:38:00.395856 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.395797 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-var-lib-openvswitch\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh" Apr 16 20:38:00.395856 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.395839 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-run-systemd\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh" Apr 16 20:38:00.396026 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.395856 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-host-var-lib-cni-bin\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc" Apr 16 20:38:00.396026 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.395881 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/68a487f8-1b29-4712-a91a-ce41362dce50-etc-selinux\") pod \"aws-ebs-csi-driver-node-cpmlt\" (UID: \"68a487f8-1b29-4712-a91a-ce41362dce50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpmlt" Apr 16 20:38:00.396026 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.395913 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bce2cf43-152e-43e1-b1ff-a36bd77270cc-etc-sysctl-d\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd" Apr 16 20:38:00.396026 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.395947 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/68a487f8-1b29-4712-a91a-ce41362dce50-device-dir\") pod \"aws-ebs-csi-driver-node-cpmlt\" (UID: \"68a487f8-1b29-4712-a91a-ce41362dce50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpmlt" Apr 16 20:38:00.396026 ip-10-0-134-79 kubenswrapper[2575]: I0416 
20:38:00.395978 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/68a487f8-1b29-4712-a91a-ce41362dce50-device-dir\") pod \"aws-ebs-csi-driver-node-cpmlt\" (UID: \"68a487f8-1b29-4712-a91a-ce41362dce50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpmlt" Apr 16 20:38:00.396026 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.395979 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bce2cf43-152e-43e1-b1ff-a36bd77270cc-etc-modprobe-d\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd" Apr 16 20:38:00.396026 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.396004 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/03f24485-95a9-4251-9d14-8bcb63f82514-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jltbp\" (UID: \"03f24485-95a9-4251-9d14-8bcb63f82514\") " pod="openshift-multus/multus-additional-cni-plugins-jltbp" Apr 16 20:38:00.396361 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.396030 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5pqg\" (UniqueName: \"kubernetes.io/projected/64e562da-5321-4973-a365-e8c0d198b8cc-kube-api-access-m5pqg\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc" Apr 16 20:38:00.396361 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.396035 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bce2cf43-152e-43e1-b1ff-a36bd77270cc-etc-sysctl-d\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd" Apr 16 20:38:00.396361 
ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.396059 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwsz5\" (UniqueName: \"kubernetes.io/projected/6492104e-0c2b-4f5b-bd8f-98d40e48a78e-kube-api-access-nwsz5\") pod \"network-check-target-84xkv\" (UID: \"6492104e-0c2b-4f5b-bd8f-98d40e48a78e\") " pod="openshift-network-diagnostics/network-check-target-84xkv" Apr 16 20:38:00.396361 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.396087 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bce2cf43-152e-43e1-b1ff-a36bd77270cc-tmp\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd" Apr 16 20:38:00.396361 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.396111 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/03f24485-95a9-4251-9d14-8bcb63f82514-system-cni-dir\") pod \"multus-additional-cni-plugins-jltbp\" (UID: \"03f24485-95a9-4251-9d14-8bcb63f82514\") " pod="openshift-multus/multus-additional-cni-plugins-jltbp" Apr 16 20:38:00.396361 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.396124 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bce2cf43-152e-43e1-b1ff-a36bd77270cc-etc-modprobe-d\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd" Apr 16 20:38:00.396361 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.396136 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/03f24485-95a9-4251-9d14-8bcb63f82514-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jltbp\" (UID: 
\"03f24485-95a9-4251-9d14-8bcb63f82514\") " pod="openshift-multus/multus-additional-cni-plugins-jltbp" Apr 16 20:38:00.396361 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.396152 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/03f24485-95a9-4251-9d14-8bcb63f82514-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jltbp\" (UID: \"03f24485-95a9-4251-9d14-8bcb63f82514\") " pod="openshift-multus/multus-additional-cni-plugins-jltbp" Apr 16 20:38:00.396361 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.396161 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9b8ecd1b-cb7e-45f3-922a-88c539f3f5b1-host-slash\") pod \"iptables-alerter-fgp5z\" (UID: \"9b8ecd1b-cb7e-45f3-922a-88c539f3f5b1\") " pod="openshift-network-operator/iptables-alerter-fgp5z" Apr 16 20:38:00.396361 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.396266 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-host-run-netns\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh" Apr 16 20:38:00.396361 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.396296 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-node-log\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh" Apr 16 20:38:00.396361 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.396321 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/bce2cf43-152e-43e1-b1ff-a36bd77270cc-etc-systemd\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd" Apr 16 20:38:00.396880 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.396371 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bce2cf43-152e-43e1-b1ff-a36bd77270cc-lib-modules\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd" Apr 16 20:38:00.396880 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.396382 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bce2cf43-152e-43e1-b1ff-a36bd77270cc-etc-systemd\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd" Apr 16 20:38:00.396880 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.396399 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-hostroot\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc" Apr 16 20:38:00.396880 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.396427 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-system-cni-dir\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc" Apr 16 20:38:00.396880 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.396446 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 20:38:00.396880 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.396453 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-etc-kubernetes\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc" Apr 16 20:38:00.396880 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.396493 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bce2cf43-152e-43e1-b1ff-a36bd77270cc-lib-modules\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd" Apr 16 20:38:00.396880 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.396501 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-run-openvswitch\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh" Apr 16 20:38:00.396880 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.396527 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/68a487f8-1b29-4712-a91a-ce41362dce50-socket-dir\") pod \"aws-ebs-csi-driver-node-cpmlt\" (UID: \"68a487f8-1b29-4712-a91a-ce41362dce50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpmlt" Apr 16 20:38:00.396880 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.396541 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-hostroot\") pod 
\"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc" Apr 16 20:38:00.396880 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.396552 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/838cdbbd-45af-4493-a167-65bd220c03c8-hosts-file\") pod \"node-resolver-jq26v\" (UID: \"838cdbbd-45af-4493-a167-65bd220c03c8\") " pod="openshift-dns/node-resolver-jq26v" Apr 16 20:38:00.396880 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.396587 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-host-run-ovn-kubernetes\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh" Apr 16 20:38:00.396880 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.396607 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-system-cni-dir\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc" Apr 16 20:38:00.396880 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.396653 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-etc-kubernetes\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc" Apr 16 20:38:00.396880 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.396724 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/68a487f8-1b29-4712-a91a-ce41362dce50-socket-dir\") pod \"aws-ebs-csi-driver-node-cpmlt\" (UID: 
\"68a487f8-1b29-4712-a91a-ce41362dce50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpmlt" Apr 16 20:38:00.396880 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.396739 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/838cdbbd-45af-4493-a167-65bd220c03c8-hosts-file\") pod \"node-resolver-jq26v\" (UID: \"838cdbbd-45af-4493-a167-65bd220c03c8\") " pod="openshift-dns/node-resolver-jq26v" Apr 16 20:38:00.396880 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.396765 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bce2cf43-152e-43e1-b1ff-a36bd77270cc-host\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd" Apr 16 20:38:00.396880 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.396810 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bce2cf43-152e-43e1-b1ff-a36bd77270cc-host\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd" Apr 16 20:38:00.397632 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.396816 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2b225\" (UniqueName: \"kubernetes.io/projected/838cdbbd-45af-4493-a167-65bd220c03c8-kube-api-access-2b225\") pod \"node-resolver-jq26v\" (UID: \"838cdbbd-45af-4493-a167-65bd220c03c8\") " pod="openshift-dns/node-resolver-jq26v" Apr 16 20:38:00.397632 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.397048 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/03f24485-95a9-4251-9d14-8bcb63f82514-os-release\") pod \"multus-additional-cni-plugins-jltbp\" (UID: 
\"03f24485-95a9-4251-9d14-8bcb63f82514\") " pod="openshift-multus/multus-additional-cni-plugins-jltbp" Apr 16 20:38:00.397632 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.397080 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8ztmp\" (UniqueName: \"kubernetes.io/projected/03f24485-95a9-4251-9d14-8bcb63f82514-kube-api-access-8ztmp\") pod \"multus-additional-cni-plugins-jltbp\" (UID: \"03f24485-95a9-4251-9d14-8bcb63f82514\") " pod="openshift-multus/multus-additional-cni-plugins-jltbp" Apr 16 20:38:00.397632 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.397089 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/03f24485-95a9-4251-9d14-8bcb63f82514-system-cni-dir\") pod \"multus-additional-cni-plugins-jltbp\" (UID: \"03f24485-95a9-4251-9d14-8bcb63f82514\") " pod="openshift-multus/multus-additional-cni-plugins-jltbp" Apr 16 20:38:00.397632 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.397126 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-log-socket\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh" Apr 16 20:38:00.397632 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.397154 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/98723067-9cd3-42a6-a577-2ecd3fc29ae9-env-overrides\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh" Apr 16 20:38:00.397632 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.397167 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/03f24485-95a9-4251-9d14-8bcb63f82514-os-release\") pod \"multus-additional-cni-plugins-jltbp\" (UID: \"03f24485-95a9-4251-9d14-8bcb63f82514\") " pod="openshift-multus/multus-additional-cni-plugins-jltbp" Apr 16 20:38:00.397632 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.397178 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/68a487f8-1b29-4712-a91a-ce41362dce50-registration-dir\") pod \"aws-ebs-csi-driver-node-cpmlt\" (UID: \"68a487f8-1b29-4712-a91a-ce41362dce50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpmlt" Apr 16 20:38:00.397632 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.397213 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bce2cf43-152e-43e1-b1ff-a36bd77270cc-run\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd" Apr 16 20:38:00.397632 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.397221 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/68a487f8-1b29-4712-a91a-ce41362dce50-registration-dir\") pod \"aws-ebs-csi-driver-node-cpmlt\" (UID: \"68a487f8-1b29-4712-a91a-ce41362dce50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpmlt" Apr 16 20:38:00.397632 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.397239 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9gp9x\" (UniqueName: \"kubernetes.io/projected/bce2cf43-152e-43e1-b1ff-a36bd77270cc-kube-api-access-9gp9x\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd" Apr 16 20:38:00.397632 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.397264 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-cnibin\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc" Apr 16 20:38:00.397632 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.397273 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bce2cf43-152e-43e1-b1ff-a36bd77270cc-run\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd" Apr 16 20:38:00.397632 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.397288 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/64e562da-5321-4973-a365-e8c0d198b8cc-cni-binary-copy\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc" Apr 16 20:38:00.397632 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.397313 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-host-run-multus-certs\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc" Apr 16 20:38:00.397632 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.397355 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-host-run-multus-certs\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc" Apr 16 20:38:00.397632 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.397359 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-cnibin\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc" Apr 16 20:38:00.398415 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.397407 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qkwg\" (UniqueName: \"kubernetes.io/projected/c00ff0b8-9f9c-418d-854c-b22bc6be761f-kube-api-access-5qkwg\") pod \"node-ca-5z2mk\" (UID: \"c00ff0b8-9f9c-418d-854c-b22bc6be761f\") " pod="openshift-image-registry/node-ca-5z2mk" Apr 16 20:38:00.398415 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.397447 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9b8ecd1b-cb7e-45f3-922a-88c539f3f5b1-iptables-alerter-script\") pod \"iptables-alerter-fgp5z\" (UID: \"9b8ecd1b-cb7e-45f3-922a-88c539f3f5b1\") " pod="openshift-network-operator/iptables-alerter-fgp5z" Apr 16 20:38:00.398415 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.397478 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/03f24485-95a9-4251-9d14-8bcb63f82514-cni-binary-copy\") pod \"multus-additional-cni-plugins-jltbp\" (UID: \"03f24485-95a9-4251-9d14-8bcb63f82514\") " pod="openshift-multus/multus-additional-cni-plugins-jltbp" Apr 16 20:38:00.398415 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.397528 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/03f24485-95a9-4251-9d14-8bcb63f82514-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jltbp\" (UID: \"03f24485-95a9-4251-9d14-8bcb63f82514\") " pod="openshift-multus/multus-additional-cni-plugins-jltbp" Apr 16 20:38:00.398415 ip-10-0-134-79 
kubenswrapper[2575]: I0416 20:38:00.397670 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-etc-openvswitch\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh" Apr 16 20:38:00.398415 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.397720 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g72f4\" (UniqueName: \"kubernetes.io/projected/98723067-9cd3-42a6-a577-2ecd3fc29ae9-kube-api-access-g72f4\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh" Apr 16 20:38:00.398415 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.397751 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bce2cf43-152e-43e1-b1ff-a36bd77270cc-etc-sysconfig\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd" Apr 16 20:38:00.398415 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.397778 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/68a487f8-1b29-4712-a91a-ce41362dce50-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cpmlt\" (UID: \"68a487f8-1b29-4712-a91a-ce41362dce50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpmlt" Apr 16 20:38:00.398415 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.397820 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-multus-cni-dir\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " 
pod="openshift-multus/multus-ghlvc" Apr 16 20:38:00.398415 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.397843 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-host-var-lib-kubelet\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc" Apr 16 20:38:00.398415 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.397883 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/64e562da-5321-4973-a365-e8c0d198b8cc-cni-binary-copy\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc" Apr 16 20:38:00.398415 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.397903 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/03f24485-95a9-4251-9d14-8bcb63f82514-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jltbp\" (UID: \"03f24485-95a9-4251-9d14-8bcb63f82514\") " pod="openshift-multus/multus-additional-cni-plugins-jltbp" Apr 16 20:38:00.398415 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.397961 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/03f24485-95a9-4251-9d14-8bcb63f82514-cni-binary-copy\") pod \"multus-additional-cni-plugins-jltbp\" (UID: \"03f24485-95a9-4251-9d14-8bcb63f82514\") " pod="openshift-multus/multus-additional-cni-plugins-jltbp" Apr 16 20:38:00.398415 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.397969 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bce2cf43-152e-43e1-b1ff-a36bd77270cc-sys\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " 
pod="openshift-cluster-node-tuning-operator/tuned-dhpfd" Apr 16 20:38:00.398415 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.398010 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-host-var-lib-kubelet\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc" Apr 16 20:38:00.398415 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.398024 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-multus-socket-dir-parent\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc" Apr 16 20:38:00.398415 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.398053 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wgqw\" (UniqueName: \"kubernetes.io/projected/9b8ecd1b-cb7e-45f3-922a-88c539f3f5b1-kube-api-access-5wgqw\") pod \"iptables-alerter-fgp5z\" (UID: \"9b8ecd1b-cb7e-45f3-922a-88c539f3f5b1\") " pod="openshift-network-operator/iptables-alerter-fgp5z" Apr 16 20:38:00.399161 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.398071 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/68a487f8-1b29-4712-a91a-ce41362dce50-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cpmlt\" (UID: \"68a487f8-1b29-4712-a91a-ce41362dce50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpmlt" Apr 16 20:38:00.399161 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.398078 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-host-kubelet\") pod 
\"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh" Apr 16 20:38:00.399161 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.398074 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-multus-cni-dir\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc" Apr 16 20:38:00.399161 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.398013 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bce2cf43-152e-43e1-b1ff-a36bd77270cc-etc-sysconfig\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd" Apr 16 20:38:00.399161 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.398104 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh" Apr 16 20:38:00.399161 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.398151 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-multus-socket-dir-parent\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc" Apr 16 20:38:00.399161 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.398163 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cb2mt\" (UniqueName: 
\"kubernetes.io/projected/68a487f8-1b29-4712-a91a-ce41362dce50-kube-api-access-cb2mt\") pod \"aws-ebs-csi-driver-node-cpmlt\" (UID: \"68a487f8-1b29-4712-a91a-ce41362dce50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpmlt" Apr 16 20:38:00.399161 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.398174 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bce2cf43-152e-43e1-b1ff-a36bd77270cc-sys\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd" Apr 16 20:38:00.399161 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.398214 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bce2cf43-152e-43e1-b1ff-a36bd77270cc-etc-sysctl-conf\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd" Apr 16 20:38:00.399161 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.398276 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bce2cf43-152e-43e1-b1ff-a36bd77270cc-etc-tuned\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd" Apr 16 20:38:00.399161 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.398749 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-host-run-netns\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc" Apr 16 20:38:00.399161 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.398792 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-host-var-lib-cni-multus\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc"
Apr 16 20:38:00.399161 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.398821 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-multus-conf-dir\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc"
Apr 16 20:38:00.399161 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.398993 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bce2cf43-152e-43e1-b1ff-a36bd77270cc-etc-sysctl-conf\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd"
Apr 16 20:38:00.399161 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.399143 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/03f24485-95a9-4251-9d14-8bcb63f82514-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jltbp\" (UID: \"03f24485-95a9-4251-9d14-8bcb63f82514\") " pod="openshift-multus/multus-additional-cni-plugins-jltbp"
Apr 16 20:38:00.399910 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.399238 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-host-run-netns\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc"
Apr 16 20:38:00.399910 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.399291 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-host-var-lib-cni-multus\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc"
Apr 16 20:38:00.399910 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.399308 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-multus-conf-dir\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc"
Apr 16 20:38:00.399910 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.399433 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/64e562da-5321-4973-a365-e8c0d198b8cc-multus-daemon-config\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc"
Apr 16 20:38:00.399910 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.399494 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c00ff0b8-9f9c-418d-854c-b22bc6be761f-host\") pod \"node-ca-5z2mk\" (UID: \"c00ff0b8-9f9c-418d-854c-b22bc6be761f\") " pod="openshift-image-registry/node-ca-5z2mk"
Apr 16 20:38:00.399910 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.399532 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bce2cf43-152e-43e1-b1ff-a36bd77270cc-etc-kubernetes\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd"
Apr 16 20:38:00.399910 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.399573 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/422c9f50-4f45-46bc-9e9d-5c4f1c20c115-metrics-certs\") pod \"network-metrics-daemon-5jhhm\" (UID: \"422c9f50-4f45-46bc-9e9d-5c4f1c20c115\") " pod="openshift-multus/network-metrics-daemon-5jhhm"
Apr 16 20:38:00.399910 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.399605 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c00ff0b8-9f9c-418d-854c-b22bc6be761f-serviceca\") pod \"node-ca-5z2mk\" (UID: \"c00ff0b8-9f9c-418d-854c-b22bc6be761f\") " pod="openshift-image-registry/node-ca-5z2mk"
Apr 16 20:38:00.399910 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.399652 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8b75823b-b08a-433e-8dc2-46c97484a213-agent-certs\") pod \"konnectivity-agent-mhc66\" (UID: \"8b75823b-b08a-433e-8dc2-46c97484a213\") " pod="kube-system/konnectivity-agent-mhc66"
Apr 16 20:38:00.399910 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.399649 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bce2cf43-152e-43e1-b1ff-a36bd77270cc-tmp\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd"
Apr 16 20:38:00.399910 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.399685 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-host-cni-bin\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.399910 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.399715 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-host-cni-netd\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.399910 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.399747 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/98723067-9cd3-42a6-a577-2ecd3fc29ae9-ovn-node-metrics-cert\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.399910 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.399760 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bce2cf43-152e-43e1-b1ff-a36bd77270cc-etc-kubernetes\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd"
Apr 16 20:38:00.399910 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.399776 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/64e562da-5321-4973-a365-e8c0d198b8cc-multus-daemon-config\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc"
Apr 16 20:38:00.399910 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.399808 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/838cdbbd-45af-4493-a167-65bd220c03c8-tmp-dir\") pod \"node-resolver-jq26v\" (UID: \"838cdbbd-45af-4493-a167-65bd220c03c8\") " pod="openshift-dns/node-resolver-jq26v"
Apr 16 20:38:00.399910 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.399865 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bce2cf43-152e-43e1-b1ff-a36bd77270cc-var-lib-kubelet\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd"
Apr 16 20:38:00.400760 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.399894 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/03f24485-95a9-4251-9d14-8bcb63f82514-cnibin\") pod \"multus-additional-cni-plugins-jltbp\" (UID: \"03f24485-95a9-4251-9d14-8bcb63f82514\") " pod="openshift-multus/multus-additional-cni-plugins-jltbp"
Apr 16 20:38:00.400760 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.399936 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8b75823b-b08a-433e-8dc2-46c97484a213-konnectivity-ca\") pod \"konnectivity-agent-mhc66\" (UID: \"8b75823b-b08a-433e-8dc2-46c97484a213\") " pod="kube-system/konnectivity-agent-mhc66"
Apr 16 20:38:00.400760 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.399975 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-run-ovn\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.400760 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.399974 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bce2cf43-152e-43e1-b1ff-a36bd77270cc-var-lib-kubelet\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd"
Apr 16 20:38:00.400760 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.400023 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/98723067-9cd3-42a6-a577-2ecd3fc29ae9-ovnkube-config\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.400760 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.400067 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/838cdbbd-45af-4493-a167-65bd220c03c8-tmp-dir\") pod \"node-resolver-jq26v\" (UID: \"838cdbbd-45af-4493-a167-65bd220c03c8\") " pod="openshift-dns/node-resolver-jq26v"
Apr 16 20:38:00.400760 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.400131 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/03f24485-95a9-4251-9d14-8bcb63f82514-cnibin\") pod \"multus-additional-cni-plugins-jltbp\" (UID: \"03f24485-95a9-4251-9d14-8bcb63f82514\") " pod="openshift-multus/multus-additional-cni-plugins-jltbp"
Apr 16 20:38:00.400760 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.400170 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/68a487f8-1b29-4712-a91a-ce41362dce50-sys-fs\") pod \"aws-ebs-csi-driver-node-cpmlt\" (UID: \"68a487f8-1b29-4712-a91a-ce41362dce50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpmlt"
Apr 16 20:38:00.400760 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.400205 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-os-release\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc"
Apr 16 20:38:00.400760 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.400255 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/68a487f8-1b29-4712-a91a-ce41362dce50-sys-fs\") pod \"aws-ebs-csi-driver-node-cpmlt\" (UID: \"68a487f8-1b29-4712-a91a-ce41362dce50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpmlt"
Apr 16 20:38:00.400760 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.400423 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-os-release\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc"
Apr 16 20:38:00.400760 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.400508 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-host-run-k8s-cni-cncf-io\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc"
Apr 16 20:38:00.400760 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.400539 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtt5z\" (UniqueName: \"kubernetes.io/projected/422c9f50-4f45-46bc-9e9d-5c4f1c20c115-kube-api-access-mtt5z\") pod \"network-metrics-daemon-5jhhm\" (UID: \"422c9f50-4f45-46bc-9e9d-5c4f1c20c115\") " pod="openshift-multus/network-metrics-daemon-5jhhm"
Apr 16 20:38:00.400760 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.400571 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-systemd-units\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.400760 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.400573 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/64e562da-5321-4973-a365-e8c0d198b8cc-host-run-k8s-cni-cncf-io\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc"
Apr 16 20:38:00.400760 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.400603 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-host-slash\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.402362 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.402326 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bce2cf43-152e-43e1-b1ff-a36bd77270cc-etc-tuned\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd"
Apr 16 20:38:00.404529 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.404505 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ztmp\" (UniqueName: \"kubernetes.io/projected/03f24485-95a9-4251-9d14-8bcb63f82514-kube-api-access-8ztmp\") pod \"multus-additional-cni-plugins-jltbp\" (UID: \"03f24485-95a9-4251-9d14-8bcb63f82514\") " pod="openshift-multus/multus-additional-cni-plugins-jltbp"
Apr 16 20:38:00.404999 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.404983 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b225\" (UniqueName: \"kubernetes.io/projected/838cdbbd-45af-4493-a167-65bd220c03c8-kube-api-access-2b225\") pod \"node-resolver-jq26v\" (UID: \"838cdbbd-45af-4493-a167-65bd220c03c8\") " pod="openshift-dns/node-resolver-jq26v"
Apr 16 20:38:00.405493 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.405477 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5pqg\" (UniqueName: \"kubernetes.io/projected/64e562da-5321-4973-a365-e8c0d198b8cc-kube-api-access-m5pqg\") pod \"multus-ghlvc\" (UID: \"64e562da-5321-4973-a365-e8c0d198b8cc\") " pod="openshift-multus/multus-ghlvc"
Apr 16 20:38:00.407786 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.407685 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gp9x\" (UniqueName: \"kubernetes.io/projected/bce2cf43-152e-43e1-b1ff-a36bd77270cc-kube-api-access-9gp9x\") pod \"tuned-dhpfd\" (UID: \"bce2cf43-152e-43e1-b1ff-a36bd77270cc\") " pod="openshift-cluster-node-tuning-operator/tuned-dhpfd"
Apr 16 20:38:00.409013 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.408994 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb2mt\" (UniqueName: \"kubernetes.io/projected/68a487f8-1b29-4712-a91a-ce41362dce50-kube-api-access-cb2mt\") pod \"aws-ebs-csi-driver-node-cpmlt\" (UID: \"68a487f8-1b29-4712-a91a-ce41362dce50\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpmlt"
Apr 16 20:38:00.418529 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:38:00.418505 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7e735f8895e509e062711b5aa81a42b.slice/crio-e41fe0122ac16e6ae314198f885077b7d97027a64e39cf7a4384798f544b11bb WatchSource:0}: Error finding container e41fe0122ac16e6ae314198f885077b7d97027a64e39cf7a4384798f544b11bb: Status 404 returned error can't find the container with id e41fe0122ac16e6ae314198f885077b7d97027a64e39cf7a4384798f544b11bb
Apr 16 20:38:00.418940 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:38:00.418890 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b4e123e8fb6c71c1bd1455d7a290beb.slice/crio-ebbb93df0aede5f75a2bf1dbf1ba046df81d56b29ab77d53c18b38cc606149cf WatchSource:0}: Error finding container ebbb93df0aede5f75a2bf1dbf1ba046df81d56b29ab77d53c18b38cc606149cf: Status 404 returned error can't find the container with id ebbb93df0aede5f75a2bf1dbf1ba046df81d56b29ab77d53c18b38cc606149cf
Apr 16 20:38:00.423232 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.423217 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 20:38:00.423647 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.423582 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-79.ec2.internal" event={"ID":"a7e735f8895e509e062711b5aa81a42b","Type":"ContainerStarted","Data":"e41fe0122ac16e6ae314198f885077b7d97027a64e39cf7a4384798f544b11bb"}
Apr 16 20:38:00.424599 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.424577 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-79.ec2.internal" event={"ID":"2b4e123e8fb6c71c1bd1455d7a290beb","Type":"ContainerStarted","Data":"ebbb93df0aede5f75a2bf1dbf1ba046df81d56b29ab77d53c18b38cc606149cf"}
Apr 16 20:38:00.501281 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.501234 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-run-openvswitch\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.501281 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.501260 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-host-run-ovn-kubernetes\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.501281 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.501276 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-log-socket\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.501472 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.501292 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/98723067-9cd3-42a6-a577-2ecd3fc29ae9-env-overrides\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.501472 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.501311 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5qkwg\" (UniqueName: \"kubernetes.io/projected/c00ff0b8-9f9c-418d-854c-b22bc6be761f-kube-api-access-5qkwg\") pod \"node-ca-5z2mk\" (UID: \"c00ff0b8-9f9c-418d-854c-b22bc6be761f\") " pod="openshift-image-registry/node-ca-5z2mk"
Apr 16 20:38:00.501472 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.501327 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9b8ecd1b-cb7e-45f3-922a-88c539f3f5b1-iptables-alerter-script\") pod \"iptables-alerter-fgp5z\" (UID: \"9b8ecd1b-cb7e-45f3-922a-88c539f3f5b1\") " pod="openshift-network-operator/iptables-alerter-fgp5z"
Apr 16 20:38:00.501472 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.501326 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-run-openvswitch\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.501472 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.501352 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-host-run-ovn-kubernetes\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.501472 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.501357 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-log-socket\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.501472 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.501434 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-etc-openvswitch\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.501853 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.501477 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g72f4\" (UniqueName: \"kubernetes.io/projected/98723067-9cd3-42a6-a577-2ecd3fc29ae9-kube-api-access-g72f4\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.501853 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.501512 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5wgqw\" (UniqueName: \"kubernetes.io/projected/9b8ecd1b-cb7e-45f3-922a-88c539f3f5b1-kube-api-access-5wgqw\") pod \"iptables-alerter-fgp5z\" (UID: \"9b8ecd1b-cb7e-45f3-922a-88c539f3f5b1\") " pod="openshift-network-operator/iptables-alerter-fgp5z"
Apr 16 20:38:00.501853 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.501519 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-etc-openvswitch\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.501853 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.501539 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-host-kubelet\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.501853 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.501566 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.501853 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.501608 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-host-kubelet\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.501853 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.501639 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c00ff0b8-9f9c-418d-854c-b22bc6be761f-host\") pod \"node-ca-5z2mk\" (UID: \"c00ff0b8-9f9c-418d-854c-b22bc6be761f\") " pod="openshift-image-registry/node-ca-5z2mk"
Apr 16 20:38:00.501853 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.501668 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.501853 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.501667 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/422c9f50-4f45-46bc-9e9d-5c4f1c20c115-metrics-certs\") pod \"network-metrics-daemon-5jhhm\" (UID: \"422c9f50-4f45-46bc-9e9d-5c4f1c20c115\") " pod="openshift-multus/network-metrics-daemon-5jhhm"
Apr 16 20:38:00.501853 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.501723 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c00ff0b8-9f9c-418d-854c-b22bc6be761f-serviceca\") pod \"node-ca-5z2mk\" (UID: \"c00ff0b8-9f9c-418d-854c-b22bc6be761f\") " pod="openshift-image-registry/node-ca-5z2mk"
Apr 16 20:38:00.501853 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.501729 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c00ff0b8-9f9c-418d-854c-b22bc6be761f-host\") pod \"node-ca-5z2mk\" (UID: \"c00ff0b8-9f9c-418d-854c-b22bc6be761f\") " pod="openshift-image-registry/node-ca-5z2mk"
Apr 16 20:38:00.501853 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.501751 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8b75823b-b08a-433e-8dc2-46c97484a213-agent-certs\") pod \"konnectivity-agent-mhc66\" (UID: \"8b75823b-b08a-433e-8dc2-46c97484a213\") " pod="kube-system/konnectivity-agent-mhc66"
Apr 16 20:38:00.501853 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:00.501768 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:38:00.501853 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.501779 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-host-cni-bin\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.501853 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.501796 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/98723067-9cd3-42a6-a577-2ecd3fc29ae9-env-overrides\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.501853 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.501803 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-host-cni-netd\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.501853 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.501818 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/98723067-9cd3-42a6-a577-2ecd3fc29ae9-ovn-node-metrics-cert\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.501853 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.501842 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-host-cni-bin\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.502675 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:00.501866 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/422c9f50-4f45-46bc-9e9d-5c4f1c20c115-metrics-certs podName:422c9f50-4f45-46bc-9e9d-5c4f1c20c115 nodeName:}" failed. No retries permitted until 2026-04-16 20:38:01.001820852 +0000 UTC m=+2.153916884 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/422c9f50-4f45-46bc-9e9d-5c4f1c20c115-metrics-certs") pod "network-metrics-daemon-5jhhm" (UID: "422c9f50-4f45-46bc-9e9d-5c4f1c20c115") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 20:38:00.502675 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.501864 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-host-cni-netd\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.502675 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.501884 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9b8ecd1b-cb7e-45f3-922a-88c539f3f5b1-iptables-alerter-script\") pod \"iptables-alerter-fgp5z\" (UID: \"9b8ecd1b-cb7e-45f3-922a-88c539f3f5b1\") " pod="openshift-network-operator/iptables-alerter-fgp5z"
Apr 16 20:38:00.502675 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.501904 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8b75823b-b08a-433e-8dc2-46c97484a213-konnectivity-ca\") pod \"konnectivity-agent-mhc66\" (UID: \"8b75823b-b08a-433e-8dc2-46c97484a213\") " pod="kube-system/konnectivity-agent-mhc66"
Apr 16 20:38:00.502675 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.501932 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-run-ovn\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.502675 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.501958 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/98723067-9cd3-42a6-a577-2ecd3fc29ae9-ovnkube-config\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.502675 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.501984 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mtt5z\" (UniqueName: \"kubernetes.io/projected/422c9f50-4f45-46bc-9e9d-5c4f1c20c115-kube-api-access-mtt5z\") pod \"network-metrics-daemon-5jhhm\" (UID: \"422c9f50-4f45-46bc-9e9d-5c4f1c20c115\") " pod="openshift-multus/network-metrics-daemon-5jhhm"
Apr 16 20:38:00.502675 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.501991 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-run-ovn\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.502675 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.502009 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-systemd-units\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.502675 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.502050 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-host-slash\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.502675 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.502058 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-systemd-units\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.502675 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.502078 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/98723067-9cd3-42a6-a577-2ecd3fc29ae9-ovnkube-script-lib\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.502675 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.502105 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-var-lib-openvswitch\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.502675 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.502129 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-run-systemd\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.502675 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.502141 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c00ff0b8-9f9c-418d-854c-b22bc6be761f-serviceca\") pod \"node-ca-5z2mk\" (UID: \"c00ff0b8-9f9c-418d-854c-b22bc6be761f\") " pod="openshift-image-registry/node-ca-5z2mk"
Apr 16 20:38:00.502675 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.502161 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nwsz5\" (UniqueName: \"kubernetes.io/projected/6492104e-0c2b-4f5b-bd8f-98d40e48a78e-kube-api-access-nwsz5\") pod \"network-check-target-84xkv\" (UID: \"6492104e-0c2b-4f5b-bd8f-98d40e48a78e\") " pod="openshift-network-diagnostics/network-check-target-84xkv"
Apr 16 20:38:00.503121 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.502191 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9b8ecd1b-cb7e-45f3-922a-88c539f3f5b1-host-slash\") pod \"iptables-alerter-fgp5z\" (UID: \"9b8ecd1b-cb7e-45f3-922a-88c539f3f5b1\") " pod="openshift-network-operator/iptables-alerter-fgp5z"
Apr 16 20:38:00.503121 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.502204 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-host-slash\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.503121 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.502236 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-run-systemd\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.503121 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.502238 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9b8ecd1b-cb7e-45f3-922a-88c539f3f5b1-host-slash\") pod \"iptables-alerter-fgp5z\" (UID: \"9b8ecd1b-cb7e-45f3-922a-88c539f3f5b1\") " pod="openshift-network-operator/iptables-alerter-fgp5z"
Apr 16 20:38:00.503121 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.502277 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-var-lib-openvswitch\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.503121 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.502287 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-host-run-netns\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.503121 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.502314 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-node-log\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.503121 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.502370 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-node-log\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.503121 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.502408 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/98723067-9cd3-42a6-a577-2ecd3fc29ae9-host-run-netns\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.503121 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.502415 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8b75823b-b08a-433e-8dc2-46c97484a213-konnectivity-ca\") pod \"konnectivity-agent-mhc66\" (UID: \"8b75823b-b08a-433e-8dc2-46c97484a213\") " pod="kube-system/konnectivity-agent-mhc66"
Apr 16 20:38:00.503121 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.502505 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/98723067-9cd3-42a6-a577-2ecd3fc29ae9-ovnkube-config\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh"
Apr 16 20:38:00.503121 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.502661 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/98723067-9cd3-42a6-a577-2ecd3fc29ae9-ovnkube-script-lib\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh" Apr 16 20:38:00.504434 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.504414 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/98723067-9cd3-42a6-a577-2ecd3fc29ae9-ovn-node-metrics-cert\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh" Apr 16 20:38:00.504508 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.504492 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8b75823b-b08a-433e-8dc2-46c97484a213-agent-certs\") pod \"konnectivity-agent-mhc66\" (UID: \"8b75823b-b08a-433e-8dc2-46c97484a213\") " pod="kube-system/konnectivity-agent-mhc66" Apr 16 20:38:00.508782 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:00.508768 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:38:00.508782 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:00.508783 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:38:00.508889 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:00.508791 2575 projected.go:194] Error preparing data for projected volume kube-api-access-nwsz5 for pod openshift-network-diagnostics/network-check-target-84xkv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:38:00.508889 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:00.508852 2575 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6492104e-0c2b-4f5b-bd8f-98d40e48a78e-kube-api-access-nwsz5 podName:6492104e-0c2b-4f5b-bd8f-98d40e48a78e nodeName:}" failed. No retries permitted until 2026-04-16 20:38:01.008836333 +0000 UTC m=+2.160932366 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-nwsz5" (UniqueName: "kubernetes.io/projected/6492104e-0c2b-4f5b-bd8f-98d40e48a78e-kube-api-access-nwsz5") pod "network-check-target-84xkv" (UID: "6492104e-0c2b-4f5b-bd8f-98d40e48a78e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:38:00.510788 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.510770 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g72f4\" (UniqueName: \"kubernetes.io/projected/98723067-9cd3-42a6-a577-2ecd3fc29ae9-kube-api-access-g72f4\") pod \"ovnkube-node-sj6sh\" (UID: \"98723067-9cd3-42a6-a577-2ecd3fc29ae9\") " pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh" Apr 16 20:38:00.511010 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.510991 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wgqw\" (UniqueName: \"kubernetes.io/projected/9b8ecd1b-cb7e-45f3-922a-88c539f3f5b1-kube-api-access-5wgqw\") pod \"iptables-alerter-fgp5z\" (UID: \"9b8ecd1b-cb7e-45f3-922a-88c539f3f5b1\") " pod="openshift-network-operator/iptables-alerter-fgp5z" Apr 16 20:38:00.511476 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.511451 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtt5z\" (UniqueName: \"kubernetes.io/projected/422c9f50-4f45-46bc-9e9d-5c4f1c20c115-kube-api-access-mtt5z\") pod \"network-metrics-daemon-5jhhm\" (UID: \"422c9f50-4f45-46bc-9e9d-5c4f1c20c115\") " pod="openshift-multus/network-metrics-daemon-5jhhm" Apr 16 20:38:00.511554 
ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.511508 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qkwg\" (UniqueName: \"kubernetes.io/projected/c00ff0b8-9f9c-418d-854c-b22bc6be761f-kube-api-access-5qkwg\") pod \"node-ca-5z2mk\" (UID: \"c00ff0b8-9f9c-418d-854c-b22bc6be761f\") " pod="openshift-image-registry/node-ca-5z2mk" Apr 16 20:38:00.610859 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.610832 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpmlt" Apr 16 20:38:00.616566 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:38:00.616548 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68a487f8_1b29_4712_a91a_ce41362dce50.slice/crio-4f05935e8490938e7d270e2c0fd9439846e9db0bee795f9a34e4cf59b1fd8490 WatchSource:0}: Error finding container 4f05935e8490938e7d270e2c0fd9439846e9db0bee795f9a34e4cf59b1fd8490: Status 404 returned error can't find the container with id 4f05935e8490938e7d270e2c0fd9439846e9db0bee795f9a34e4cf59b1fd8490 Apr 16 20:38:00.632421 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.632404 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-dhpfd" Apr 16 20:38:00.638200 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:38:00.638178 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbce2cf43_152e_43e1_b1ff_a36bd77270cc.slice/crio-e04e15414fa3f977f604c703ebaf7822eb0c9c679b1244ad40467b2a79614171 WatchSource:0}: Error finding container e04e15414fa3f977f604c703ebaf7822eb0c9c679b1244ad40467b2a79614171: Status 404 returned error can't find the container with id e04e15414fa3f977f604c703ebaf7822eb0c9c679b1244ad40467b2a79614171 Apr 16 20:38:00.640918 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.640895 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-jq26v" Apr 16 20:38:00.646381 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:38:00.646360 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod838cdbbd_45af_4493_a167_65bd220c03c8.slice/crio-870dba2f4eb71a62b4ead87d97a02fb5291bd470cf58f778bd2b07100d124fea WatchSource:0}: Error finding container 870dba2f4eb71a62b4ead87d97a02fb5291bd470cf58f778bd2b07100d124fea: Status 404 returned error can't find the container with id 870dba2f4eb71a62b4ead87d97a02fb5291bd470cf58f778bd2b07100d124fea Apr 16 20:38:00.652679 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.652661 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jltbp" Apr 16 20:38:00.658840 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:38:00.658822 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03f24485_95a9_4251_9d14_8bcb63f82514.slice/crio-0c2a68f05bc41b3d116b4fcda20c19c1bdb99feb6e2e40d60ae6d9534eb96f8a WatchSource:0}: Error finding container 0c2a68f05bc41b3d116b4fcda20c19c1bdb99feb6e2e40d60ae6d9534eb96f8a: Status 404 returned error can't find the container with id 0c2a68f05bc41b3d116b4fcda20c19c1bdb99feb6e2e40d60ae6d9534eb96f8a Apr 16 20:38:00.664920 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.664904 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ghlvc" Apr 16 20:38:00.669865 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:38:00.669843 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64e562da_5321_4973_a365_e8c0d198b8cc.slice/crio-601cfed6d3498aa05448af2ed70ac967ff8557d3c6ea5eb348c057c4c875a69f WatchSource:0}: Error finding container 601cfed6d3498aa05448af2ed70ac967ff8557d3c6ea5eb348c057c4c875a69f: Status 404 returned error can't find the container with id 601cfed6d3498aa05448af2ed70ac967ff8557d3c6ea5eb348c057c4c875a69f Apr 16 20:38:00.669928 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.669881 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-mhc66" Apr 16 20:38:00.675626 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.675594 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-5z2mk" Apr 16 20:38:00.676562 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:38:00.676544 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b75823b_b08a_433e_8dc2_46c97484a213.slice/crio-2db29553e776df7f63db0995a4a77954c1df0365a7ca3c1e07da2947afe7e160 WatchSource:0}: Error finding container 2db29553e776df7f63db0995a4a77954c1df0365a7ca3c1e07da2947afe7e160: Status 404 returned error can't find the container with id 2db29553e776df7f63db0995a4a77954c1df0365a7ca3c1e07da2947afe7e160 Apr 16 20:38:00.682210 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:38:00.682054 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc00ff0b8_9f9c_418d_854c_b22bc6be761f.slice/crio-92049b0337e3a5e961b6a42b5f155d87049291eec83fe90b74bad3afb6c332d5 WatchSource:0}: Error finding container 92049b0337e3a5e961b6a42b5f155d87049291eec83fe90b74bad3afb6c332d5: Status 404 returned error can't find the container with id 92049b0337e3a5e961b6a42b5f155d87049291eec83fe90b74bad3afb6c332d5 Apr 16 20:38:00.711443 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.711420 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-fgp5z" Apr 16 20:38:00.715002 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:00.714984 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh" Apr 16 20:38:00.716926 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:38:00.716904 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b8ecd1b_cb7e_45f3_922a_88c539f3f5b1.slice/crio-c476fae93b464b02f816bb80c85c3ed535825d2edfc16f2de0ace799ed8f57fd WatchSource:0}: Error finding container c476fae93b464b02f816bb80c85c3ed535825d2edfc16f2de0ace799ed8f57fd: Status 404 returned error can't find the container with id c476fae93b464b02f816bb80c85c3ed535825d2edfc16f2de0ace799ed8f57fd Apr 16 20:38:00.722535 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:38:00.722516 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98723067_9cd3_42a6_a577_2ecd3fc29ae9.slice/crio-31918faac9aba8a820f578a32b351f5528da49cf298ea73632e5ce173c30b4a8 WatchSource:0}: Error finding container 31918faac9aba8a820f578a32b351f5528da49cf298ea73632e5ce173c30b4a8: Status 404 returned error can't find the container with id 31918faac9aba8a820f578a32b351f5528da49cf298ea73632e5ce173c30b4a8 Apr 16 20:38:01.002659 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:01.002610 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:38:01.004738 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:01.004712 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/422c9f50-4f45-46bc-9e9d-5c4f1c20c115-metrics-certs\") pod \"network-metrics-daemon-5jhhm\" (UID: \"422c9f50-4f45-46bc-9e9d-5c4f1c20c115\") " pod="openshift-multus/network-metrics-daemon-5jhhm" Apr 16 20:38:01.004880 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:01.004863 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:38:01.004937 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:01.004928 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/422c9f50-4f45-46bc-9e9d-5c4f1c20c115-metrics-certs podName:422c9f50-4f45-46bc-9e9d-5c4f1c20c115 nodeName:}" failed. No retries permitted until 2026-04-16 20:38:02.004910511 +0000 UTC m=+3.157006547 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/422c9f50-4f45-46bc-9e9d-5c4f1c20c115-metrics-certs") pod "network-metrics-daemon-5jhhm" (UID: "422c9f50-4f45-46bc-9e9d-5c4f1c20c115") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:38:01.105523 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:01.105485 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nwsz5\" (UniqueName: \"kubernetes.io/projected/6492104e-0c2b-4f5b-bd8f-98d40e48a78e-kube-api-access-nwsz5\") pod \"network-check-target-84xkv\" (UID: \"6492104e-0c2b-4f5b-bd8f-98d40e48a78e\") " pod="openshift-network-diagnostics/network-check-target-84xkv" Apr 16 20:38:01.105696 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:01.105662 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:38:01.105696 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:01.105682 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:38:01.105696 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:01.105694 2575 projected.go:194] Error preparing data for projected volume kube-api-access-nwsz5 for pod openshift-network-diagnostics/network-check-target-84xkv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:38:01.105863 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:01.105753 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6492104e-0c2b-4f5b-bd8f-98d40e48a78e-kube-api-access-nwsz5 podName:6492104e-0c2b-4f5b-bd8f-98d40e48a78e nodeName:}" failed. No retries permitted until 2026-04-16 20:38:02.105734108 +0000 UTC m=+3.257830154 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-nwsz5" (UniqueName: "kubernetes.io/projected/6492104e-0c2b-4f5b-bd8f-98d40e48a78e-kube-api-access-nwsz5") pod "network-check-target-84xkv" (UID: "6492104e-0c2b-4f5b-bd8f-98d40e48a78e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:38:01.234203 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:01.233901 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:38:01.327375 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:01.327259 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 20:33:00 +0000 UTC" deadline="2027-11-11 17:22:57.440219907 +0000 UTC" Apr 16 20:38:01.327375 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:01.327288 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13772h44m56.112935466s" Apr 16 20:38:01.411934 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:01.411745 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:38:01.447802 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:01.447767 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/iptables-alerter-fgp5z" event={"ID":"9b8ecd1b-cb7e-45f3-922a-88c539f3f5b1","Type":"ContainerStarted","Data":"c476fae93b464b02f816bb80c85c3ed535825d2edfc16f2de0ace799ed8f57fd"} Apr 16 20:38:01.459500 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:01.459471 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5z2mk" event={"ID":"c00ff0b8-9f9c-418d-854c-b22bc6be761f","Type":"ContainerStarted","Data":"92049b0337e3a5e961b6a42b5f155d87049291eec83fe90b74bad3afb6c332d5"} Apr 16 20:38:01.461531 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:01.461503 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-mhc66" event={"ID":"8b75823b-b08a-433e-8dc2-46c97484a213","Type":"ContainerStarted","Data":"2db29553e776df7f63db0995a4a77954c1df0365a7ca3c1e07da2947afe7e160"} Apr 16 20:38:01.472224 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:01.472198 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jq26v" event={"ID":"838cdbbd-45af-4493-a167-65bd220c03c8","Type":"ContainerStarted","Data":"870dba2f4eb71a62b4ead87d97a02fb5291bd470cf58f778bd2b07100d124fea"} Apr 16 20:38:01.488659 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:01.488632 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-dhpfd" event={"ID":"bce2cf43-152e-43e1-b1ff-a36bd77270cc","Type":"ContainerStarted","Data":"e04e15414fa3f977f604c703ebaf7822eb0c9c679b1244ad40467b2a79614171"} Apr 16 20:38:01.501154 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:01.501128 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpmlt" event={"ID":"68a487f8-1b29-4712-a91a-ce41362dce50","Type":"ContainerStarted","Data":"4f05935e8490938e7d270e2c0fd9439846e9db0bee795f9a34e4cf59b1fd8490"} Apr 16 20:38:01.518260 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:01.518086 2575 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh" event={"ID":"98723067-9cd3-42a6-a577-2ecd3fc29ae9","Type":"ContainerStarted","Data":"31918faac9aba8a820f578a32b351f5528da49cf298ea73632e5ce173c30b4a8"} Apr 16 20:38:01.534286 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:01.534262 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ghlvc" event={"ID":"64e562da-5321-4973-a365-e8c0d198b8cc","Type":"ContainerStarted","Data":"601cfed6d3498aa05448af2ed70ac967ff8557d3c6ea5eb348c057c4c875a69f"} Apr 16 20:38:01.544446 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:01.544422 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jltbp" event={"ID":"03f24485-95a9-4251-9d14-8bcb63f82514","Type":"ContainerStarted","Data":"0c2a68f05bc41b3d116b4fcda20c19c1bdb99feb6e2e40d60ae6d9534eb96f8a"} Apr 16 20:38:02.013073 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:02.013038 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/422c9f50-4f45-46bc-9e9d-5c4f1c20c115-metrics-certs\") pod \"network-metrics-daemon-5jhhm\" (UID: \"422c9f50-4f45-46bc-9e9d-5c4f1c20c115\") " pod="openshift-multus/network-metrics-daemon-5jhhm" Apr 16 20:38:02.013251 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:02.013189 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:38:02.013334 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:02.013251 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/422c9f50-4f45-46bc-9e9d-5c4f1c20c115-metrics-certs podName:422c9f50-4f45-46bc-9e9d-5c4f1c20c115 nodeName:}" failed. No retries permitted until 2026-04-16 20:38:04.013232412 +0000 UTC m=+5.165328444 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/422c9f50-4f45-46bc-9e9d-5c4f1c20c115-metrics-certs") pod "network-metrics-daemon-5jhhm" (UID: "422c9f50-4f45-46bc-9e9d-5c4f1c20c115") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:38:02.114031 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:02.113992 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nwsz5\" (UniqueName: \"kubernetes.io/projected/6492104e-0c2b-4f5b-bd8f-98d40e48a78e-kube-api-access-nwsz5\") pod \"network-check-target-84xkv\" (UID: \"6492104e-0c2b-4f5b-bd8f-98d40e48a78e\") " pod="openshift-network-diagnostics/network-check-target-84xkv" Apr 16 20:38:02.114206 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:02.114152 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:38:02.114206 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:02.114171 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:38:02.114206 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:02.114184 2575 projected.go:194] Error preparing data for projected volume kube-api-access-nwsz5 for pod openshift-network-diagnostics/network-check-target-84xkv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:38:02.114366 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:02.114241 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6492104e-0c2b-4f5b-bd8f-98d40e48a78e-kube-api-access-nwsz5 podName:6492104e-0c2b-4f5b-bd8f-98d40e48a78e nodeName:}" failed. 
No retries permitted until 2026-04-16 20:38:04.114221354 +0000 UTC m=+5.266317402 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-nwsz5" (UniqueName: "kubernetes.io/projected/6492104e-0c2b-4f5b-bd8f-98d40e48a78e-kube-api-access-nwsz5") pod "network-check-target-84xkv" (UID: "6492104e-0c2b-4f5b-bd8f-98d40e48a78e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:38:02.328407 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:02.328318 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 20:33:00 +0000 UTC" deadline="2027-10-03 22:14:36.317982146 +0000 UTC" Apr 16 20:38:02.328407 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:02.328356 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12841h36m33.989629545s" Apr 16 20:38:02.421536 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:02.421495 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jhhm" Apr 16 20:38:02.421732 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:02.421642 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5jhhm" podUID="422c9f50-4f45-46bc-9e9d-5c4f1c20c115" Apr 16 20:38:02.422065 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:02.422040 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-84xkv" Apr 16 20:38:02.422159 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:02.422142 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-84xkv" podUID="6492104e-0c2b-4f5b-bd8f-98d40e48a78e" Apr 16 20:38:04.028719 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:04.028664 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/422c9f50-4f45-46bc-9e9d-5c4f1c20c115-metrics-certs\") pod \"network-metrics-daemon-5jhhm\" (UID: \"422c9f50-4f45-46bc-9e9d-5c4f1c20c115\") " pod="openshift-multus/network-metrics-daemon-5jhhm" Apr 16 20:38:04.029179 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:04.028826 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:38:04.029179 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:04.028890 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/422c9f50-4f45-46bc-9e9d-5c4f1c20c115-metrics-certs podName:422c9f50-4f45-46bc-9e9d-5c4f1c20c115 nodeName:}" failed. No retries permitted until 2026-04-16 20:38:08.02886951 +0000 UTC m=+9.180965547 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/422c9f50-4f45-46bc-9e9d-5c4f1c20c115-metrics-certs") pod "network-metrics-daemon-5jhhm" (UID: "422c9f50-4f45-46bc-9e9d-5c4f1c20c115") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:38:04.129699 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:04.129594 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nwsz5\" (UniqueName: \"kubernetes.io/projected/6492104e-0c2b-4f5b-bd8f-98d40e48a78e-kube-api-access-nwsz5\") pod \"network-check-target-84xkv\" (UID: \"6492104e-0c2b-4f5b-bd8f-98d40e48a78e\") " pod="openshift-network-diagnostics/network-check-target-84xkv" Apr 16 20:38:04.129824 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:04.129776 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:38:04.129824 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:04.129801 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:38:04.129824 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:04.129815 2575 projected.go:194] Error preparing data for projected volume kube-api-access-nwsz5 for pod openshift-network-diagnostics/network-check-target-84xkv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:38:04.129977 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:04.129877 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6492104e-0c2b-4f5b-bd8f-98d40e48a78e-kube-api-access-nwsz5 podName:6492104e-0c2b-4f5b-bd8f-98d40e48a78e nodeName:}" failed. 
No retries permitted until 2026-04-16 20:38:08.12985836 +0000 UTC m=+9.281954395 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-nwsz5" (UniqueName: "kubernetes.io/projected/6492104e-0c2b-4f5b-bd8f-98d40e48a78e-kube-api-access-nwsz5") pod "network-check-target-84xkv" (UID: "6492104e-0c2b-4f5b-bd8f-98d40e48a78e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:38:04.422666 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:04.421974 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jhhm" Apr 16 20:38:04.422666 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:04.422121 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5jhhm" podUID="422c9f50-4f45-46bc-9e9d-5c4f1c20c115" Apr 16 20:38:04.422666 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:04.422499 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-84xkv" Apr 16 20:38:04.422666 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:04.422603 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-84xkv" podUID="6492104e-0c2b-4f5b-bd8f-98d40e48a78e" Apr 16 20:38:06.421800 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:06.421765 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-84xkv" Apr 16 20:38:06.422276 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:06.421898 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-84xkv" podUID="6492104e-0c2b-4f5b-bd8f-98d40e48a78e" Apr 16 20:38:06.422276 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:06.421765 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jhhm" Apr 16 20:38:06.422386 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:06.422365 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5jhhm" podUID="422c9f50-4f45-46bc-9e9d-5c4f1c20c115" Apr 16 20:38:08.062321 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:08.062281 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/422c9f50-4f45-46bc-9e9d-5c4f1c20c115-metrics-certs\") pod \"network-metrics-daemon-5jhhm\" (UID: \"422c9f50-4f45-46bc-9e9d-5c4f1c20c115\") " pod="openshift-multus/network-metrics-daemon-5jhhm" Apr 16 20:38:08.062781 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:08.062416 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:38:08.062781 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:08.062493 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/422c9f50-4f45-46bc-9e9d-5c4f1c20c115-metrics-certs podName:422c9f50-4f45-46bc-9e9d-5c4f1c20c115 nodeName:}" failed. No retries permitted until 2026-04-16 20:38:16.062472776 +0000 UTC m=+17.214568812 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/422c9f50-4f45-46bc-9e9d-5c4f1c20c115-metrics-certs") pod "network-metrics-daemon-5jhhm" (UID: "422c9f50-4f45-46bc-9e9d-5c4f1c20c115") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:38:08.163256 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:08.163216 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nwsz5\" (UniqueName: \"kubernetes.io/projected/6492104e-0c2b-4f5b-bd8f-98d40e48a78e-kube-api-access-nwsz5\") pod \"network-check-target-84xkv\" (UID: \"6492104e-0c2b-4f5b-bd8f-98d40e48a78e\") " pod="openshift-network-diagnostics/network-check-target-84xkv" Apr 16 20:38:08.163422 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:08.163400 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:38:08.163497 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:08.163430 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:38:08.163497 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:08.163443 2575 projected.go:194] Error preparing data for projected volume kube-api-access-nwsz5 for pod openshift-network-diagnostics/network-check-target-84xkv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:38:08.163603 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:08.163514 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6492104e-0c2b-4f5b-bd8f-98d40e48a78e-kube-api-access-nwsz5 podName:6492104e-0c2b-4f5b-bd8f-98d40e48a78e nodeName:}" failed. 
No retries permitted until 2026-04-16 20:38:16.163490268 +0000 UTC m=+17.315586324 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-nwsz5" (UniqueName: "kubernetes.io/projected/6492104e-0c2b-4f5b-bd8f-98d40e48a78e-kube-api-access-nwsz5") pod "network-check-target-84xkv" (UID: "6492104e-0c2b-4f5b-bd8f-98d40e48a78e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:38:08.421828 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:08.421758 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-84xkv" Apr 16 20:38:08.422021 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:08.421850 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jhhm" Apr 16 20:38:08.422021 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:08.421980 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5jhhm" podUID="422c9f50-4f45-46bc-9e9d-5c4f1c20c115" Apr 16 20:38:08.422302 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:08.422248 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-84xkv" podUID="6492104e-0c2b-4f5b-bd8f-98d40e48a78e" Apr 16 20:38:10.421437 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:10.421409 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jhhm" Apr 16 20:38:10.421860 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:10.421377 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-84xkv" Apr 16 20:38:10.421860 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:10.421579 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5jhhm" podUID="422c9f50-4f45-46bc-9e9d-5c4f1c20c115" Apr 16 20:38:10.421860 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:10.421661 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-84xkv" podUID="6492104e-0c2b-4f5b-bd8f-98d40e48a78e" Apr 16 20:38:12.421370 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:12.421336 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-84xkv" Apr 16 20:38:12.421814 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:12.421338 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jhhm" Apr 16 20:38:12.421814 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:12.421452 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-84xkv" podUID="6492104e-0c2b-4f5b-bd8f-98d40e48a78e" Apr 16 20:38:12.421814 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:12.421523 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5jhhm" podUID="422c9f50-4f45-46bc-9e9d-5c4f1c20c115" Apr 16 20:38:14.421585 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:14.421556 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-84xkv" Apr 16 20:38:14.422029 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:14.421557 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jhhm" Apr 16 20:38:14.422029 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:14.421663 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-84xkv" podUID="6492104e-0c2b-4f5b-bd8f-98d40e48a78e" Apr 16 20:38:14.422029 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:14.421769 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5jhhm" podUID="422c9f50-4f45-46bc-9e9d-5c4f1c20c115" Apr 16 20:38:16.126094 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:16.126058 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/422c9f50-4f45-46bc-9e9d-5c4f1c20c115-metrics-certs\") pod \"network-metrics-daemon-5jhhm\" (UID: \"422c9f50-4f45-46bc-9e9d-5c4f1c20c115\") " pod="openshift-multus/network-metrics-daemon-5jhhm" Apr 16 20:38:16.126572 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:16.126213 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:38:16.126572 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:16.126275 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/422c9f50-4f45-46bc-9e9d-5c4f1c20c115-metrics-certs podName:422c9f50-4f45-46bc-9e9d-5c4f1c20c115 nodeName:}" failed. No retries permitted until 2026-04-16 20:38:32.126259913 +0000 UTC m=+33.278355943 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/422c9f50-4f45-46bc-9e9d-5c4f1c20c115-metrics-certs") pod "network-metrics-daemon-5jhhm" (UID: "422c9f50-4f45-46bc-9e9d-5c4f1c20c115") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:38:16.226573 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:16.226535 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nwsz5\" (UniqueName: \"kubernetes.io/projected/6492104e-0c2b-4f5b-bd8f-98d40e48a78e-kube-api-access-nwsz5\") pod \"network-check-target-84xkv\" (UID: \"6492104e-0c2b-4f5b-bd8f-98d40e48a78e\") " pod="openshift-network-diagnostics/network-check-target-84xkv" Apr 16 20:38:16.226763 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:16.226727 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:38:16.226763 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:16.226752 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:38:16.226763 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:16.226765 2575 projected.go:194] Error preparing data for projected volume kube-api-access-nwsz5 for pod openshift-network-diagnostics/network-check-target-84xkv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:38:16.226917 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:16.226830 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6492104e-0c2b-4f5b-bd8f-98d40e48a78e-kube-api-access-nwsz5 podName:6492104e-0c2b-4f5b-bd8f-98d40e48a78e nodeName:}" failed. 
No retries permitted until 2026-04-16 20:38:32.226810794 +0000 UTC m=+33.378906843 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-nwsz5" (UniqueName: "kubernetes.io/projected/6492104e-0c2b-4f5b-bd8f-98d40e48a78e-kube-api-access-nwsz5") pod "network-check-target-84xkv" (UID: "6492104e-0c2b-4f5b-bd8f-98d40e48a78e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:38:16.421787 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:16.421714 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-84xkv" Apr 16 20:38:16.421940 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:16.421721 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jhhm" Apr 16 20:38:16.421940 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:16.421826 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-84xkv" podUID="6492104e-0c2b-4f5b-bd8f-98d40e48a78e" Apr 16 20:38:16.422033 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:16.421931 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5jhhm" podUID="422c9f50-4f45-46bc-9e9d-5c4f1c20c115" Apr 16 20:38:18.421581 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:18.421554 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jhhm" Apr 16 20:38:18.421931 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:18.421554 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-84xkv" Apr 16 20:38:18.421931 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:18.421686 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5jhhm" podUID="422c9f50-4f45-46bc-9e9d-5c4f1c20c115" Apr 16 20:38:18.421931 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:18.421808 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-84xkv" podUID="6492104e-0c2b-4f5b-bd8f-98d40e48a78e" Apr 16 20:38:19.578464 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:19.578111 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-79.ec2.internal" event={"ID":"2b4e123e8fb6c71c1bd1455d7a290beb","Type":"ContainerStarted","Data":"5caaea8597002a677c52e54af507984cda6167a5308c0030c5695735782a9440"} Apr 16 20:38:19.579421 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:19.579400 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5z2mk" event={"ID":"c00ff0b8-9f9c-418d-854c-b22bc6be761f","Type":"ContainerStarted","Data":"0233680e5a97e6682fab9b7cf77a36d55df1c797029ac37c0e3cc310ef3cf667"} Apr 16 20:38:19.580574 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:19.580549 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-mhc66" event={"ID":"8b75823b-b08a-433e-8dc2-46c97484a213","Type":"ContainerStarted","Data":"ef9a0fb641238b0b1f97e51711f33ca9eb77d38162fdd5dd82432e8a73368986"} Apr 16 20:38:19.581785 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:19.581764 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jq26v" event={"ID":"838cdbbd-45af-4493-a167-65bd220c03c8","Type":"ContainerStarted","Data":"70c5288c2920c57def453302354c05b0f53fbe5beea9ceff3177b311517d496e"} Apr 16 20:38:19.582883 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:19.582854 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-dhpfd" event={"ID":"bce2cf43-152e-43e1-b1ff-a36bd77270cc","Type":"ContainerStarted","Data":"b7f66cfcdf92ec49aa28ac15a3ea49a27655380a017123255a96239903eb5a2f"} Apr 16 20:38:19.584016 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:19.583993 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpmlt" 
event={"ID":"68a487f8-1b29-4712-a91a-ce41362dce50","Type":"ContainerStarted","Data":"a0a07abb81be38eab82a996d068d033e4c69f594dea4d8ed8aeabba3f3d6beaf"} Apr 16 20:38:19.586161 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:19.586144 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sj6sh_98723067-9cd3-42a6-a577-2ecd3fc29ae9/ovn-acl-logging/0.log" Apr 16 20:38:19.586416 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:19.586396 2575 generic.go:358] "Generic (PLEG): container finished" podID="98723067-9cd3-42a6-a577-2ecd3fc29ae9" containerID="1f095223251163652e5137df071adfd86f0cd2fade6e07b250cd225de2b0381d" exitCode=1 Apr 16 20:38:19.586469 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:19.586445 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh" event={"ID":"98723067-9cd3-42a6-a577-2ecd3fc29ae9","Type":"ContainerStarted","Data":"c921b8b242488bda553cc6c22d08d0faf68c3b4b3a314d00e05a8df0b25a99eb"} Apr 16 20:38:19.586469 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:19.586460 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh" event={"ID":"98723067-9cd3-42a6-a577-2ecd3fc29ae9","Type":"ContainerStarted","Data":"a13b8ed04759ffabcac700447184150e1316d6ce3d4c2791e4733a740adba70a"} Apr 16 20:38:19.586546 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:19.586470 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh" event={"ID":"98723067-9cd3-42a6-a577-2ecd3fc29ae9","Type":"ContainerStarted","Data":"2490064eead8db0c3d753e245b2019c3f0449a612dd8a2fd354f78c9ec60cf9e"} Apr 16 20:38:19.586546 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:19.586479 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh" 
event={"ID":"98723067-9cd3-42a6-a577-2ecd3fc29ae9","Type":"ContainerStarted","Data":"8cfe05c6f4c6ed30c8181decde6aae0b07d8a259482e956d2caf32de8ab87d97"} Apr 16 20:38:19.586546 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:19.586487 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh" event={"ID":"98723067-9cd3-42a6-a577-2ecd3fc29ae9","Type":"ContainerDied","Data":"1f095223251163652e5137df071adfd86f0cd2fade6e07b250cd225de2b0381d"} Apr 16 20:38:19.586546 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:19.586499 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh" event={"ID":"98723067-9cd3-42a6-a577-2ecd3fc29ae9","Type":"ContainerStarted","Data":"b60a79562d05623bb477d817d17d73181729cfd5ae53b56f9837266c0afca089"} Apr 16 20:38:19.587527 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:19.587510 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ghlvc" event={"ID":"64e562da-5321-4973-a365-e8c0d198b8cc","Type":"ContainerStarted","Data":"a6652216f8e519f0034c59c46fade539333d2a6a114e4b9dc389ae084badf677"} Apr 16 20:38:19.588762 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:19.588744 2575 generic.go:358] "Generic (PLEG): container finished" podID="03f24485-95a9-4251-9d14-8bcb63f82514" containerID="77389b7badfee9ef6763aef580eb2433e8238211143c72023cff4a821860b755" exitCode=0 Apr 16 20:38:19.588824 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:19.588803 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jltbp" event={"ID":"03f24485-95a9-4251-9d14-8bcb63f82514","Type":"ContainerDied","Data":"77389b7badfee9ef6763aef580eb2433e8238211143c72023cff4a821860b755"} Apr 16 20:38:19.589999 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:19.589978 2575 generic.go:358] "Generic (PLEG): container finished" podID="a7e735f8895e509e062711b5aa81a42b" 
containerID="6b5601cc9c05746ca868de439effd2462c14c1e3b58f1eaddbaa996c67636593" exitCode=0 Apr 16 20:38:19.590045 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:19.589999 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-79.ec2.internal" event={"ID":"a7e735f8895e509e062711b5aa81a42b","Type":"ContainerDied","Data":"6b5601cc9c05746ca868de439effd2462c14c1e3b58f1eaddbaa996c67636593"} Apr 16 20:38:19.592030 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:19.591985 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-79.ec2.internal" podStartSLOduration=20.591971211 podStartE2EDuration="20.591971211s" podCreationTimestamp="2026-04-16 20:37:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:38:19.591796282 +0000 UTC m=+20.743892358" watchObservedRunningTime="2026-04-16 20:38:19.591971211 +0000 UTC m=+20.744067268" Apr 16 20:38:19.643950 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:19.643887 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-jq26v" podStartSLOduration=2.9285154049999997 podStartE2EDuration="20.64387608s" podCreationTimestamp="2026-04-16 20:37:59 +0000 UTC" firstStartedPulling="2026-04-16 20:38:00.647808458 +0000 UTC m=+1.799904489" lastFinishedPulling="2026-04-16 20:38:18.36316913 +0000 UTC m=+19.515265164" observedRunningTime="2026-04-16 20:38:19.643842299 +0000 UTC m=+20.795938353" watchObservedRunningTime="2026-04-16 20:38:19.64387608 +0000 UTC m=+20.795972132" Apr 16 20:38:19.662411 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:19.662366 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-ghlvc" podStartSLOduration=2.585527619 podStartE2EDuration="20.662355272s" podCreationTimestamp="2026-04-16 
20:37:59 +0000 UTC" firstStartedPulling="2026-04-16 20:38:00.671269132 +0000 UTC m=+1.823365163" lastFinishedPulling="2026-04-16 20:38:18.748096775 +0000 UTC m=+19.900192816" observedRunningTime="2026-04-16 20:38:19.66194164 +0000 UTC m=+20.814037692" watchObservedRunningTime="2026-04-16 20:38:19.662355272 +0000 UTC m=+20.814451325" Apr 16 20:38:19.674791 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:19.674758 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-mhc66" podStartSLOduration=2.9806079199999997 podStartE2EDuration="20.674748735s" podCreationTimestamp="2026-04-16 20:37:59 +0000 UTC" firstStartedPulling="2026-04-16 20:38:00.678663241 +0000 UTC m=+1.830759272" lastFinishedPulling="2026-04-16 20:38:18.372804045 +0000 UTC m=+19.524900087" observedRunningTime="2026-04-16 20:38:19.674551992 +0000 UTC m=+20.826648044" watchObservedRunningTime="2026-04-16 20:38:19.674748735 +0000 UTC m=+20.826844787" Apr 16 20:38:19.687932 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:19.687904 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-5z2mk" podStartSLOduration=11.047554166 podStartE2EDuration="20.687895932s" podCreationTimestamp="2026-04-16 20:37:59 +0000 UTC" firstStartedPulling="2026-04-16 20:38:00.683811651 +0000 UTC m=+1.835907682" lastFinishedPulling="2026-04-16 20:38:10.324153401 +0000 UTC m=+11.476249448" observedRunningTime="2026-04-16 20:38:19.68757738 +0000 UTC m=+20.839673444" watchObservedRunningTime="2026-04-16 20:38:19.687895932 +0000 UTC m=+20.839991984" Apr 16 20:38:19.706026 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:19.705992 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-dhpfd" podStartSLOduration=2.970850212 podStartE2EDuration="20.705982321s" podCreationTimestamp="2026-04-16 20:37:59 +0000 UTC" firstStartedPulling="2026-04-16 20:38:00.639592345 
+0000 UTC m=+1.791688376" lastFinishedPulling="2026-04-16 20:38:18.374724443 +0000 UTC m=+19.526820485" observedRunningTime="2026-04-16 20:38:19.705708601 +0000 UTC m=+20.857804694" watchObservedRunningTime="2026-04-16 20:38:19.705982321 +0000 UTC m=+20.858078375" Apr 16 20:38:20.248234 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:20.248193 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 20:38:20.361893 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:20.361797 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T20:38:20.24821243Z","UUID":"a344498c-f37c-49a5-bd8b-813688bea303","Handler":null,"Name":"","Endpoint":""} Apr 16 20:38:20.363572 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:20.363545 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 20:38:20.363572 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:20.363575 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 20:38:20.421547 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:20.421488 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-84xkv" Apr 16 20:38:20.421685 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:20.421488 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jhhm" Apr 16 20:38:20.421685 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:20.421572 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-84xkv" podUID="6492104e-0c2b-4f5b-bd8f-98d40e48a78e" Apr 16 20:38:20.421685 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:20.421672 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5jhhm" podUID="422c9f50-4f45-46bc-9e9d-5c4f1c20c115" Apr 16 20:38:20.594107 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:20.594071 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpmlt" event={"ID":"68a487f8-1b29-4712-a91a-ce41362dce50","Type":"ContainerStarted","Data":"3f53e926ea38cbdbb5c9197a7aae874402586600df5f2a874d99d1326b30e8f1"} Apr 16 20:38:20.595901 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:20.595871 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-79.ec2.internal" event={"ID":"a7e735f8895e509e062711b5aa81a42b","Type":"ContainerStarted","Data":"22771f1ecc852472a7a184f0f1b520eeb4081eba18d2caf7ff085476a557b631"} Apr 16 20:38:20.597307 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:20.597277 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-fgp5z" 
event={"ID":"9b8ecd1b-cb7e-45f3-922a-88c539f3f5b1","Type":"ContainerStarted","Data":"6c00d70c1575b6358888dadd08eedb59e55ee46bbe19c1aabe8d8880aefbf7d1"} Apr 16 20:38:20.609626 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:20.609572 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-79.ec2.internal" podStartSLOduration=21.609560575 podStartE2EDuration="21.609560575s" podCreationTimestamp="2026-04-16 20:37:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:38:20.60939426 +0000 UTC m=+21.761490315" watchObservedRunningTime="2026-04-16 20:38:20.609560575 +0000 UTC m=+21.761656628" Apr 16 20:38:20.622024 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:20.621989 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-fgp5z" podStartSLOduration=3.968983298 podStartE2EDuration="21.62197939s" podCreationTimestamp="2026-04-16 20:37:59 +0000 UTC" firstStartedPulling="2026-04-16 20:38:00.719844283 +0000 UTC m=+1.871940329" lastFinishedPulling="2026-04-16 20:38:18.372840375 +0000 UTC m=+19.524936421" observedRunningTime="2026-04-16 20:38:20.621523717 +0000 UTC m=+21.773619773" watchObservedRunningTime="2026-04-16 20:38:20.62197939 +0000 UTC m=+21.774075445" Apr 16 20:38:21.600802 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:21.600576 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpmlt" event={"ID":"68a487f8-1b29-4712-a91a-ce41362dce50","Type":"ContainerStarted","Data":"cf8b481dc3a6a23a848c06b307edb873c8c78b36ee10950c907954abf95a8b70"} Apr 16 20:38:21.603402 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:21.603383 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sj6sh_98723067-9cd3-42a6-a577-2ecd3fc29ae9/ovn-acl-logging/0.log" Apr 16 20:38:21.603812 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:21.603786 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh" event={"ID":"98723067-9cd3-42a6-a577-2ecd3fc29ae9","Type":"ContainerStarted","Data":"49da737e2119c2712d0461fbedc3724815bbb9b56d62efe7f4a3dcf593ee7db6"} Apr 16 20:38:21.769857 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:21.769829 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-mhc66" Apr 16 20:38:21.770397 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:21.770375 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-mhc66" Apr 16 20:38:21.784548 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:21.784508 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpmlt" podStartSLOduration=2.35075512 podStartE2EDuration="22.784492867s" podCreationTimestamp="2026-04-16 20:37:59 +0000 UTC" firstStartedPulling="2026-04-16 20:38:00.618089063 +0000 UTC m=+1.770185093" lastFinishedPulling="2026-04-16 20:38:21.051826793 +0000 UTC m=+22.203922840" observedRunningTime="2026-04-16 20:38:21.6177415 +0000 UTC m=+22.769837554" watchObservedRunningTime="2026-04-16 20:38:21.784492867 +0000 UTC m=+22.936588922" Apr 16 20:38:22.421671 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:22.421636 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jhhm" Apr 16 20:38:22.421838 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:22.421636 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-84xkv" Apr 16 20:38:22.421838 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:22.421751 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5jhhm" podUID="422c9f50-4f45-46bc-9e9d-5c4f1c20c115" Apr 16 20:38:22.421955 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:22.421864 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-84xkv" podUID="6492104e-0c2b-4f5b-bd8f-98d40e48a78e" Apr 16 20:38:22.605858 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:22.605826 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-mhc66" Apr 16 20:38:22.606406 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:22.606374 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-mhc66" Apr 16 20:38:24.421469 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:24.421313 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-84xkv" Apr 16 20:38:24.421887 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:24.421326 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jhhm" Apr 16 20:38:24.421887 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:24.421541 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-84xkv" podUID="6492104e-0c2b-4f5b-bd8f-98d40e48a78e" Apr 16 20:38:24.421887 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:24.421630 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5jhhm" podUID="422c9f50-4f45-46bc-9e9d-5c4f1c20c115" Apr 16 20:38:24.611110 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:24.611088 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sj6sh_98723067-9cd3-42a6-a577-2ecd3fc29ae9/ovn-acl-logging/0.log" Apr 16 20:38:24.611395 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:24.611374 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh" event={"ID":"98723067-9cd3-42a6-a577-2ecd3fc29ae9","Type":"ContainerStarted","Data":"49f93be5b794abba130afe1678df8250191200e18f67155b3b41687ad815c991"} Apr 16 20:38:24.611652 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:24.611602 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh" Apr 16 20:38:24.611652 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:24.611652 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh" Apr 16 20:38:24.611882 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:24.611860 2575 scope.go:117] "RemoveContainer" containerID="1f095223251163652e5137df071adfd86f0cd2fade6e07b250cd225de2b0381d" Apr 16 20:38:24.612949 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:24.612922 2575 generic.go:358] "Generic (PLEG): container finished" podID="03f24485-95a9-4251-9d14-8bcb63f82514" containerID="034be42a210a11fc243d529c68ef4eda1e9bc418290edd4df36530988fdf0d33" exitCode=0 Apr 16 20:38:24.613024 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:24.612999 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jltbp" event={"ID":"03f24485-95a9-4251-9d14-8bcb63f82514","Type":"ContainerDied","Data":"034be42a210a11fc243d529c68ef4eda1e9bc418290edd4df36530988fdf0d33"} Apr 16 20:38:24.626958 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:24.626938 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh" Apr 16 20:38:25.618760 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:25.618738 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sj6sh_98723067-9cd3-42a6-a577-2ecd3fc29ae9/ovn-acl-logging/0.log" Apr 16 20:38:25.619107 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:25.619067 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh" event={"ID":"98723067-9cd3-42a6-a577-2ecd3fc29ae9","Type":"ContainerStarted","Data":"3fc1cf0173229e84a25fa623e75fce1b6f20ebddb2d638f28cd25722f1604cad"} Apr 16 20:38:25.619448 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:25.619429 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh" Apr 16 20:38:25.635753 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:25.635539 2575 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh" Apr 16 20:38:25.647078 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:25.647040 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh" podStartSLOduration=8.964304625 podStartE2EDuration="26.647023431s" podCreationTimestamp="2026-04-16 20:37:59 +0000 UTC" firstStartedPulling="2026-04-16 20:38:00.723800267 +0000 UTC m=+1.875896299" lastFinishedPulling="2026-04-16 20:38:18.406519067 +0000 UTC m=+19.558615105" observedRunningTime="2026-04-16 20:38:25.645582769 +0000 UTC m=+26.797678834" watchObservedRunningTime="2026-04-16 20:38:25.647023431 +0000 UTC m=+26.799119484" Apr 16 20:38:26.087354 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:26.087177 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5jhhm"] Apr 16 20:38:26.087537 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:26.087452 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jhhm" Apr 16 20:38:26.087583 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:26.087550 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5jhhm" podUID="422c9f50-4f45-46bc-9e9d-5c4f1c20c115" Apr 16 20:38:26.090210 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:26.090182 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-84xkv"] Apr 16 20:38:26.090283 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:26.090276 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-84xkv" Apr 16 20:38:26.090360 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:26.090344 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-84xkv" podUID="6492104e-0c2b-4f5b-bd8f-98d40e48a78e" Apr 16 20:38:26.622913 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:26.622884 2575 generic.go:358] "Generic (PLEG): container finished" podID="03f24485-95a9-4251-9d14-8bcb63f82514" containerID="7ed2d37857acb83b541ab8cf5d145f6befdc450cb9e4b653ecd132efc86d91b8" exitCode=0 Apr 16 20:38:26.623304 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:26.622974 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jltbp" event={"ID":"03f24485-95a9-4251-9d14-8bcb63f82514","Type":"ContainerDied","Data":"7ed2d37857acb83b541ab8cf5d145f6befdc450cb9e4b653ecd132efc86d91b8"} Apr 16 20:38:27.422061 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:27.422036 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-84xkv" Apr 16 20:38:27.422196 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:27.422035 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jhhm" Apr 16 20:38:27.422196 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:27.422136 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-84xkv" podUID="6492104e-0c2b-4f5b-bd8f-98d40e48a78e" Apr 16 20:38:27.422276 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:27.422232 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5jhhm" podUID="422c9f50-4f45-46bc-9e9d-5c4f1c20c115" Apr 16 20:38:27.630874 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:27.630844 2575 generic.go:358] "Generic (PLEG): container finished" podID="03f24485-95a9-4251-9d14-8bcb63f82514" containerID="1d3072b57c6014584c634f303cbb5ff931b444524ca6ea77171c64c3e7827995" exitCode=0 Apr 16 20:38:27.631631 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:27.630932 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jltbp" event={"ID":"03f24485-95a9-4251-9d14-8bcb63f82514","Type":"ContainerDied","Data":"1d3072b57c6014584c634f303cbb5ff931b444524ca6ea77171c64c3e7827995"} Apr 16 20:38:29.422874 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:29.422844 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-84xkv" Apr 16 20:38:29.423375 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:29.422937 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-84xkv" podUID="6492104e-0c2b-4f5b-bd8f-98d40e48a78e" Apr 16 20:38:29.423375 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:29.422995 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jhhm" Apr 16 20:38:29.423375 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:29.423129 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5jhhm" podUID="422c9f50-4f45-46bc-9e9d-5c4f1c20c115" Apr 16 20:38:31.421278 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:31.421245 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jhhm" Apr 16 20:38:31.421693 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:31.421390 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5jhhm" podUID="422c9f50-4f45-46bc-9e9d-5c4f1c20c115" Apr 16 20:38:31.421693 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:31.421454 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-84xkv" Apr 16 20:38:31.421693 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:31.421559 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-84xkv" podUID="6492104e-0c2b-4f5b-bd8f-98d40e48a78e" Apr 16 20:38:31.674359 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:31.674333 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-79.ec2.internal" event="NodeReady" Apr 16 20:38:31.674534 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:31.674470 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 20:38:31.718035 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:31.717854 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-7lm8k"] Apr 16 20:38:31.737055 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:31.737020 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-f9kkl"] Apr 16 20:38:31.737205 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:31.737093 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-7lm8k" Apr 16 20:38:31.739635 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:31.739481 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 20:38:31.739635 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:31.739489 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 20:38:31.739805 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:31.739794 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wwh4f\"" Apr 16 20:38:31.749218 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:31.749198 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7lm8k"] Apr 16 20:38:31.749303 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:31.749225 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-f9kkl"] Apr 16 20:38:31.749360 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:31.749324 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-f9kkl" Apr 16 20:38:31.751567 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:31.751543 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 20:38:31.751710 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:31.751594 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 20:38:31.751710 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:31.751604 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-7vq8v\"" Apr 16 20:38:31.751710 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:31.751600 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 20:38:31.840435 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:31.840392 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35692de4-3b87-4697-b519-4f55d1e81778-cert\") pod \"ingress-canary-f9kkl\" (UID: \"35692de4-3b87-4697-b519-4f55d1e81778\") " pod="openshift-ingress-canary/ingress-canary-f9kkl" Apr 16 20:38:31.840638 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:31.840463 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/aa099128-e7b3-453f-a700-69d4e48f8448-tmp-dir\") pod \"dns-default-7lm8k\" (UID: \"aa099128-e7b3-453f-a700-69d4e48f8448\") " pod="openshift-dns/dns-default-7lm8k" Apr 16 20:38:31.840638 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:31.840495 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djdkj\" (UniqueName: 
\"kubernetes.io/projected/35692de4-3b87-4697-b519-4f55d1e81778-kube-api-access-djdkj\") pod \"ingress-canary-f9kkl\" (UID: \"35692de4-3b87-4697-b519-4f55d1e81778\") " pod="openshift-ingress-canary/ingress-canary-f9kkl" Apr 16 20:38:31.840638 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:31.840519 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aa099128-e7b3-453f-a700-69d4e48f8448-metrics-tls\") pod \"dns-default-7lm8k\" (UID: \"aa099128-e7b3-453f-a700-69d4e48f8448\") " pod="openshift-dns/dns-default-7lm8k" Apr 16 20:38:31.840638 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:31.840550 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aa099128-e7b3-453f-a700-69d4e48f8448-config-volume\") pod \"dns-default-7lm8k\" (UID: \"aa099128-e7b3-453f-a700-69d4e48f8448\") " pod="openshift-dns/dns-default-7lm8k" Apr 16 20:38:31.840823 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:31.840697 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7d65\" (UniqueName: \"kubernetes.io/projected/aa099128-e7b3-453f-a700-69d4e48f8448-kube-api-access-t7d65\") pod \"dns-default-7lm8k\" (UID: \"aa099128-e7b3-453f-a700-69d4e48f8448\") " pod="openshift-dns/dns-default-7lm8k" Apr 16 20:38:31.941167 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:31.941074 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t7d65\" (UniqueName: \"kubernetes.io/projected/aa099128-e7b3-453f-a700-69d4e48f8448-kube-api-access-t7d65\") pod \"dns-default-7lm8k\" (UID: \"aa099128-e7b3-453f-a700-69d4e48f8448\") " pod="openshift-dns/dns-default-7lm8k" Apr 16 20:38:31.941167 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:31.941129 2575 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35692de4-3b87-4697-b519-4f55d1e81778-cert\") pod \"ingress-canary-f9kkl\" (UID: \"35692de4-3b87-4697-b519-4f55d1e81778\") " pod="openshift-ingress-canary/ingress-canary-f9kkl" Apr 16 20:38:31.941354 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:31.941175 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/aa099128-e7b3-453f-a700-69d4e48f8448-tmp-dir\") pod \"dns-default-7lm8k\" (UID: \"aa099128-e7b3-453f-a700-69d4e48f8448\") " pod="openshift-dns/dns-default-7lm8k" Apr 16 20:38:31.941354 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:31.941203 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-djdkj\" (UniqueName: \"kubernetes.io/projected/35692de4-3b87-4697-b519-4f55d1e81778-kube-api-access-djdkj\") pod \"ingress-canary-f9kkl\" (UID: \"35692de4-3b87-4697-b519-4f55d1e81778\") " pod="openshift-ingress-canary/ingress-canary-f9kkl" Apr 16 20:38:31.941354 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:31.941229 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aa099128-e7b3-453f-a700-69d4e48f8448-metrics-tls\") pod \"dns-default-7lm8k\" (UID: \"aa099128-e7b3-453f-a700-69d4e48f8448\") " pod="openshift-dns/dns-default-7lm8k" Apr 16 20:38:31.941354 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:31.941256 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aa099128-e7b3-453f-a700-69d4e48f8448-config-volume\") pod \"dns-default-7lm8k\" (UID: \"aa099128-e7b3-453f-a700-69d4e48f8448\") " pod="openshift-dns/dns-default-7lm8k" Apr 16 20:38:31.941354 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:31.941297 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found 
Apr 16 20:38:31.941568 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:31.941363 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35692de4-3b87-4697-b519-4f55d1e81778-cert podName:35692de4-3b87-4697-b519-4f55d1e81778 nodeName:}" failed. No retries permitted until 2026-04-16 20:38:32.441343873 +0000 UTC m=+33.593439917 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/35692de4-3b87-4697-b519-4f55d1e81778-cert") pod "ingress-canary-f9kkl" (UID: "35692de4-3b87-4697-b519-4f55d1e81778") : secret "canary-serving-cert" not found Apr 16 20:38:31.941568 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:31.941469 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:38:31.941568 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:31.941532 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa099128-e7b3-453f-a700-69d4e48f8448-metrics-tls podName:aa099128-e7b3-453f-a700-69d4e48f8448 nodeName:}" failed. No retries permitted until 2026-04-16 20:38:32.441514705 +0000 UTC m=+33.593610745 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/aa099128-e7b3-453f-a700-69d4e48f8448-metrics-tls") pod "dns-default-7lm8k" (UID: "aa099128-e7b3-453f-a700-69d4e48f8448") : secret "dns-default-metrics-tls" not found Apr 16 20:38:31.941742 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:31.941574 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/aa099128-e7b3-453f-a700-69d4e48f8448-tmp-dir\") pod \"dns-default-7lm8k\" (UID: \"aa099128-e7b3-453f-a700-69d4e48f8448\") " pod="openshift-dns/dns-default-7lm8k" Apr 16 20:38:31.941957 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:31.941933 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aa099128-e7b3-453f-a700-69d4e48f8448-config-volume\") pod \"dns-default-7lm8k\" (UID: \"aa099128-e7b3-453f-a700-69d4e48f8448\") " pod="openshift-dns/dns-default-7lm8k" Apr 16 20:38:31.952031 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:31.952007 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7d65\" (UniqueName: \"kubernetes.io/projected/aa099128-e7b3-453f-a700-69d4e48f8448-kube-api-access-t7d65\") pod \"dns-default-7lm8k\" (UID: \"aa099128-e7b3-453f-a700-69d4e48f8448\") " pod="openshift-dns/dns-default-7lm8k" Apr 16 20:38:31.952031 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:31.952020 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-djdkj\" (UniqueName: \"kubernetes.io/projected/35692de4-3b87-4697-b519-4f55d1e81778-kube-api-access-djdkj\") pod \"ingress-canary-f9kkl\" (UID: \"35692de4-3b87-4697-b519-4f55d1e81778\") " pod="openshift-ingress-canary/ingress-canary-f9kkl" Apr 16 20:38:32.142595 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:32.142556 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/422c9f50-4f45-46bc-9e9d-5c4f1c20c115-metrics-certs\") pod \"network-metrics-daemon-5jhhm\" (UID: \"422c9f50-4f45-46bc-9e9d-5c4f1c20c115\") " pod="openshift-multus/network-metrics-daemon-5jhhm" Apr 16 20:38:32.142797 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:32.142734 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:38:32.142864 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:32.142850 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/422c9f50-4f45-46bc-9e9d-5c4f1c20c115-metrics-certs podName:422c9f50-4f45-46bc-9e9d-5c4f1c20c115 nodeName:}" failed. No retries permitted until 2026-04-16 20:39:04.142828766 +0000 UTC m=+65.294924801 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/422c9f50-4f45-46bc-9e9d-5c4f1c20c115-metrics-certs") pod "network-metrics-daemon-5jhhm" (UID: "422c9f50-4f45-46bc-9e9d-5c4f1c20c115") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:38:32.243350 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:32.243238 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nwsz5\" (UniqueName: \"kubernetes.io/projected/6492104e-0c2b-4f5b-bd8f-98d40e48a78e-kube-api-access-nwsz5\") pod \"network-check-target-84xkv\" (UID: \"6492104e-0c2b-4f5b-bd8f-98d40e48a78e\") " pod="openshift-network-diagnostics/network-check-target-84xkv" Apr 16 20:38:32.243484 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:32.243399 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:38:32.243484 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:32.243426 2575 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:38:32.243484 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:32.243436 2575 projected.go:194] Error preparing data for projected volume kube-api-access-nwsz5 for pod openshift-network-diagnostics/network-check-target-84xkv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:38:32.243673 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:32.243487 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6492104e-0c2b-4f5b-bd8f-98d40e48a78e-kube-api-access-nwsz5 podName:6492104e-0c2b-4f5b-bd8f-98d40e48a78e nodeName:}" failed. No retries permitted until 2026-04-16 20:39:04.243471141 +0000 UTC m=+65.395567179 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-nwsz5" (UniqueName: "kubernetes.io/projected/6492104e-0c2b-4f5b-bd8f-98d40e48a78e-kube-api-access-nwsz5") pod "network-check-target-84xkv" (UID: "6492104e-0c2b-4f5b-bd8f-98d40e48a78e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:38:32.445494 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:32.445454 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aa099128-e7b3-453f-a700-69d4e48f8448-metrics-tls\") pod \"dns-default-7lm8k\" (UID: \"aa099128-e7b3-453f-a700-69d4e48f8448\") " pod="openshift-dns/dns-default-7lm8k" Apr 16 20:38:32.446039 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:32.445564 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35692de4-3b87-4697-b519-4f55d1e81778-cert\") pod \"ingress-canary-f9kkl\" 
(UID: \"35692de4-3b87-4697-b519-4f55d1e81778\") " pod="openshift-ingress-canary/ingress-canary-f9kkl" Apr 16 20:38:32.446039 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:32.445647 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:38:32.446039 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:32.445685 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:38:32.446039 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:32.445725 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa099128-e7b3-453f-a700-69d4e48f8448-metrics-tls podName:aa099128-e7b3-453f-a700-69d4e48f8448 nodeName:}" failed. No retries permitted until 2026-04-16 20:38:33.445704998 +0000 UTC m=+34.597801036 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/aa099128-e7b3-453f-a700-69d4e48f8448-metrics-tls") pod "dns-default-7lm8k" (UID: "aa099128-e7b3-453f-a700-69d4e48f8448") : secret "dns-default-metrics-tls" not found Apr 16 20:38:32.446039 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:32.445750 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35692de4-3b87-4697-b519-4f55d1e81778-cert podName:35692de4-3b87-4697-b519-4f55d1e81778 nodeName:}" failed. No retries permitted until 2026-04-16 20:38:33.445733781 +0000 UTC m=+34.597829813 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/35692de4-3b87-4697-b519-4f55d1e81778-cert") pod "ingress-canary-f9kkl" (UID: "35692de4-3b87-4697-b519-4f55d1e81778") : secret "canary-serving-cert" not found Apr 16 20:38:33.424626 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:33.424592 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-84xkv" Apr 16 20:38:33.424835 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:33.424592 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jhhm" Apr 16 20:38:33.427420 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:33.427397 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 20:38:33.427570 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:33.427430 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 20:38:33.427570 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:33.427461 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 20:38:33.428449 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:33.428427 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-484zf\"" Apr 16 20:38:33.428561 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:33.428467 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vdj2n\"" Apr 16 20:38:33.453843 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:33.453820 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35692de4-3b87-4697-b519-4f55d1e81778-cert\") pod \"ingress-canary-f9kkl\" (UID: \"35692de4-3b87-4697-b519-4f55d1e81778\") " pod="openshift-ingress-canary/ingress-canary-f9kkl" Apr 16 20:38:33.454104 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:33.453853 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/aa099128-e7b3-453f-a700-69d4e48f8448-metrics-tls\") pod \"dns-default-7lm8k\" (UID: \"aa099128-e7b3-453f-a700-69d4e48f8448\") " pod="openshift-dns/dns-default-7lm8k" Apr 16 20:38:33.454104 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:33.453938 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:38:33.454104 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:33.453943 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:38:33.454104 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:33.453989 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa099128-e7b3-453f-a700-69d4e48f8448-metrics-tls podName:aa099128-e7b3-453f-a700-69d4e48f8448 nodeName:}" failed. No retries permitted until 2026-04-16 20:38:35.453977209 +0000 UTC m=+36.606073240 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/aa099128-e7b3-453f-a700-69d4e48f8448-metrics-tls") pod "dns-default-7lm8k" (UID: "aa099128-e7b3-453f-a700-69d4e48f8448") : secret "dns-default-metrics-tls" not found Apr 16 20:38:33.454104 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:33.454002 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35692de4-3b87-4697-b519-4f55d1e81778-cert podName:35692de4-3b87-4697-b519-4f55d1e81778 nodeName:}" failed. No retries permitted until 2026-04-16 20:38:35.453996359 +0000 UTC m=+36.606092390 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/35692de4-3b87-4697-b519-4f55d1e81778-cert") pod "ingress-canary-f9kkl" (UID: "35692de4-3b87-4697-b519-4f55d1e81778") : secret "canary-serving-cert" not found Apr 16 20:38:34.645516 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:34.645482 2575 generic.go:358] "Generic (PLEG): container finished" podID="03f24485-95a9-4251-9d14-8bcb63f82514" containerID="16362562a38a86e8d2751230d4bd778faf9702e1958428291beedc0780a3ef9a" exitCode=0 Apr 16 20:38:34.645876 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:34.645528 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jltbp" event={"ID":"03f24485-95a9-4251-9d14-8bcb63f82514","Type":"ContainerDied","Data":"16362562a38a86e8d2751230d4bd778faf9702e1958428291beedc0780a3ef9a"} Apr 16 20:38:35.466057 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:35.466026 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aa099128-e7b3-453f-a700-69d4e48f8448-metrics-tls\") pod \"dns-default-7lm8k\" (UID: \"aa099128-e7b3-453f-a700-69d4e48f8448\") " pod="openshift-dns/dns-default-7lm8k" Apr 16 20:38:35.466190 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:35.466086 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35692de4-3b87-4697-b519-4f55d1e81778-cert\") pod \"ingress-canary-f9kkl\" (UID: \"35692de4-3b87-4697-b519-4f55d1e81778\") " pod="openshift-ingress-canary/ingress-canary-f9kkl" Apr 16 20:38:35.466190 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:35.466154 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:38:35.466190 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:35.466161 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret 
"dns-default-metrics-tls" not found Apr 16 20:38:35.466305 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:35.466200 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35692de4-3b87-4697-b519-4f55d1e81778-cert podName:35692de4-3b87-4697-b519-4f55d1e81778 nodeName:}" failed. No retries permitted until 2026-04-16 20:38:39.466187434 +0000 UTC m=+40.618283464 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/35692de4-3b87-4697-b519-4f55d1e81778-cert") pod "ingress-canary-f9kkl" (UID: "35692de4-3b87-4697-b519-4f55d1e81778") : secret "canary-serving-cert" not found Apr 16 20:38:35.466305 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:35.466212 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa099128-e7b3-453f-a700-69d4e48f8448-metrics-tls podName:aa099128-e7b3-453f-a700-69d4e48f8448 nodeName:}" failed. No retries permitted until 2026-04-16 20:38:39.466206175 +0000 UTC m=+40.618302206 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/aa099128-e7b3-453f-a700-69d4e48f8448-metrics-tls") pod "dns-default-7lm8k" (UID: "aa099128-e7b3-453f-a700-69d4e48f8448") : secret "dns-default-metrics-tls" not found Apr 16 20:38:35.649970 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:35.649944 2575 generic.go:358] "Generic (PLEG): container finished" podID="03f24485-95a9-4251-9d14-8bcb63f82514" containerID="35e9b79c861802f494e3fafc40e14a922619de2ae74e47d7f751340ff3364e81" exitCode=0 Apr 16 20:38:35.650280 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:35.649978 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jltbp" event={"ID":"03f24485-95a9-4251-9d14-8bcb63f82514","Type":"ContainerDied","Data":"35e9b79c861802f494e3fafc40e14a922619de2ae74e47d7f751340ff3364e81"} Apr 16 20:38:36.654141 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:36.654103 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jltbp" event={"ID":"03f24485-95a9-4251-9d14-8bcb63f82514","Type":"ContainerStarted","Data":"b1de287221f2d34a77b9c63493151ddbef16c6afc3a6bb889914852f8a62b5ee"} Apr 16 20:38:36.676151 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:36.676102 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-jltbp" podStartSLOduration=4.423350833 podStartE2EDuration="37.676090666s" podCreationTimestamp="2026-04-16 20:37:59 +0000 UTC" firstStartedPulling="2026-04-16 20:38:00.660532068 +0000 UTC m=+1.812628099" lastFinishedPulling="2026-04-16 20:38:33.913271896 +0000 UTC m=+35.065367932" observedRunningTime="2026-04-16 20:38:36.675884235 +0000 UTC m=+37.827980311" watchObservedRunningTime="2026-04-16 20:38:36.676090666 +0000 UTC m=+37.828186718" Apr 16 20:38:39.493610 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:39.493579 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aa099128-e7b3-453f-a700-69d4e48f8448-metrics-tls\") pod \"dns-default-7lm8k\" (UID: \"aa099128-e7b3-453f-a700-69d4e48f8448\") " pod="openshift-dns/dns-default-7lm8k" Apr 16 20:38:39.493971 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:39.493664 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35692de4-3b87-4697-b519-4f55d1e81778-cert\") pod \"ingress-canary-f9kkl\" (UID: \"35692de4-3b87-4697-b519-4f55d1e81778\") " pod="openshift-ingress-canary/ingress-canary-f9kkl" Apr 16 20:38:39.493971 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:39.493709 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:38:39.493971 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:39.493739 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:38:39.493971 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:39.493777 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa099128-e7b3-453f-a700-69d4e48f8448-metrics-tls podName:aa099128-e7b3-453f-a700-69d4e48f8448 nodeName:}" failed. No retries permitted until 2026-04-16 20:38:47.493762209 +0000 UTC m=+48.645858240 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/aa099128-e7b3-453f-a700-69d4e48f8448-metrics-tls") pod "dns-default-7lm8k" (UID: "aa099128-e7b3-453f-a700-69d4e48f8448") : secret "dns-default-metrics-tls" not found Apr 16 20:38:39.493971 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:39.493791 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35692de4-3b87-4697-b519-4f55d1e81778-cert podName:35692de4-3b87-4697-b519-4f55d1e81778 nodeName:}" failed. 
No retries permitted until 2026-04-16 20:38:47.4937846 +0000 UTC m=+48.645880632 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/35692de4-3b87-4697-b519-4f55d1e81778-cert") pod "ingress-canary-f9kkl" (UID: "35692de4-3b87-4697-b519-4f55d1e81778") : secret "canary-serving-cert" not found Apr 16 20:38:47.548287 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:47.548249 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35692de4-3b87-4697-b519-4f55d1e81778-cert\") pod \"ingress-canary-f9kkl\" (UID: \"35692de4-3b87-4697-b519-4f55d1e81778\") " pod="openshift-ingress-canary/ingress-canary-f9kkl" Apr 16 20:38:47.548713 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:47.548294 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aa099128-e7b3-453f-a700-69d4e48f8448-metrics-tls\") pod \"dns-default-7lm8k\" (UID: \"aa099128-e7b3-453f-a700-69d4e48f8448\") " pod="openshift-dns/dns-default-7lm8k" Apr 16 20:38:47.548713 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:47.548383 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:38:47.548713 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:47.548432 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa099128-e7b3-453f-a700-69d4e48f8448-metrics-tls podName:aa099128-e7b3-453f-a700-69d4e48f8448 nodeName:}" failed. No retries permitted until 2026-04-16 20:39:03.548418784 +0000 UTC m=+64.700514815 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/aa099128-e7b3-453f-a700-69d4e48f8448-metrics-tls") pod "dns-default-7lm8k" (UID: "aa099128-e7b3-453f-a700-69d4e48f8448") : secret "dns-default-metrics-tls" not found Apr 16 20:38:47.548713 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:47.548385 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:38:47.548713 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:38:47.548521 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35692de4-3b87-4697-b519-4f55d1e81778-cert podName:35692de4-3b87-4697-b519-4f55d1e81778 nodeName:}" failed. No retries permitted until 2026-04-16 20:39:03.54850566 +0000 UTC m=+64.700601691 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/35692de4-3b87-4697-b519-4f55d1e81778-cert") pod "ingress-canary-f9kkl" (UID: "35692de4-3b87-4697-b519-4f55d1e81778") : secret "canary-serving-cert" not found Apr 16 20:38:57.641531 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:38:57.641505 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sj6sh" Apr 16 20:39:03.553656 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:03.553608 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35692de4-3b87-4697-b519-4f55d1e81778-cert\") pod \"ingress-canary-f9kkl\" (UID: \"35692de4-3b87-4697-b519-4f55d1e81778\") " pod="openshift-ingress-canary/ingress-canary-f9kkl" Apr 16 20:39:03.554089 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:03.553667 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aa099128-e7b3-453f-a700-69d4e48f8448-metrics-tls\") pod \"dns-default-7lm8k\" (UID: 
\"aa099128-e7b3-453f-a700-69d4e48f8448\") " pod="openshift-dns/dns-default-7lm8k" Apr 16 20:39:03.554089 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:39:03.553758 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:39:03.554089 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:39:03.553764 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:39:03.554089 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:39:03.553806 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa099128-e7b3-453f-a700-69d4e48f8448-metrics-tls podName:aa099128-e7b3-453f-a700-69d4e48f8448 nodeName:}" failed. No retries permitted until 2026-04-16 20:39:35.553793876 +0000 UTC m=+96.705889908 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/aa099128-e7b3-453f-a700-69d4e48f8448-metrics-tls") pod "dns-default-7lm8k" (UID: "aa099128-e7b3-453f-a700-69d4e48f8448") : secret "dns-default-metrics-tls" not found Apr 16 20:39:03.554089 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:39:03.553879 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35692de4-3b87-4697-b519-4f55d1e81778-cert podName:35692de4-3b87-4697-b519-4f55d1e81778 nodeName:}" failed. No retries permitted until 2026-04-16 20:39:35.553813966 +0000 UTC m=+96.705909996 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/35692de4-3b87-4697-b519-4f55d1e81778-cert") pod "ingress-canary-f9kkl" (UID: "35692de4-3b87-4697-b519-4f55d1e81778") : secret "canary-serving-cert" not found Apr 16 20:39:04.157948 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:04.157913 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/422c9f50-4f45-46bc-9e9d-5c4f1c20c115-metrics-certs\") pod \"network-metrics-daemon-5jhhm\" (UID: \"422c9f50-4f45-46bc-9e9d-5c4f1c20c115\") " pod="openshift-multus/network-metrics-daemon-5jhhm" Apr 16 20:39:04.160183 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:04.160163 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 20:39:04.169093 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:39:04.169073 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 20:39:04.169181 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:39:04.169141 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/422c9f50-4f45-46bc-9e9d-5c4f1c20c115-metrics-certs podName:422c9f50-4f45-46bc-9e9d-5c4f1c20c115 nodeName:}" failed. No retries permitted until 2026-04-16 20:40:08.169119298 +0000 UTC m=+129.321215332 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/422c9f50-4f45-46bc-9e9d-5c4f1c20c115-metrics-certs") pod "network-metrics-daemon-5jhhm" (UID: "422c9f50-4f45-46bc-9e9d-5c4f1c20c115") : secret "metrics-daemon-secret" not found Apr 16 20:39:04.258486 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:04.258455 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nwsz5\" (UniqueName: \"kubernetes.io/projected/6492104e-0c2b-4f5b-bd8f-98d40e48a78e-kube-api-access-nwsz5\") pod \"network-check-target-84xkv\" (UID: \"6492104e-0c2b-4f5b-bd8f-98d40e48a78e\") " pod="openshift-network-diagnostics/network-check-target-84xkv" Apr 16 20:39:04.261089 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:04.261072 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 20:39:04.270898 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:04.270877 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 20:39:04.282650 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:04.282627 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwsz5\" (UniqueName: \"kubernetes.io/projected/6492104e-0c2b-4f5b-bd8f-98d40e48a78e-kube-api-access-nwsz5\") pod \"network-check-target-84xkv\" (UID: \"6492104e-0c2b-4f5b-bd8f-98d40e48a78e\") " pod="openshift-network-diagnostics/network-check-target-84xkv" Apr 16 20:39:04.338048 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:04.338031 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-484zf\"" Apr 16 20:39:04.345740 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:04.345718 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-84xkv" Apr 16 20:39:04.507680 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:04.507652 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-84xkv"] Apr 16 20:39:04.511059 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:39:04.511032 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6492104e_0c2b_4f5b_bd8f_98d40e48a78e.slice/crio-e66d1b253a5cb842a750fa85077a1abad8e98ce00def5911c404286b0945b9b0 WatchSource:0}: Error finding container e66d1b253a5cb842a750fa85077a1abad8e98ce00def5911c404286b0945b9b0: Status 404 returned error can't find the container with id e66d1b253a5cb842a750fa85077a1abad8e98ce00def5911c404286b0945b9b0 Apr 16 20:39:04.710354 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:04.710293 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-84xkv" event={"ID":"6492104e-0c2b-4f5b-bd8f-98d40e48a78e","Type":"ContainerStarted","Data":"e66d1b253a5cb842a750fa85077a1abad8e98ce00def5911c404286b0945b9b0"} Apr 16 20:39:07.716184 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:07.716153 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-84xkv" event={"ID":"6492104e-0c2b-4f5b-bd8f-98d40e48a78e","Type":"ContainerStarted","Data":"13143d6b71eb86f9068d898357ba3a5d349e25191c1e3d2f921cbb6a541cb320"} Apr 16 20:39:07.716451 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:07.716265 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-84xkv" Apr 16 20:39:07.734723 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:07.734685 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-84xkv" 
podStartSLOduration=65.646008314 podStartE2EDuration="1m8.734673653s" podCreationTimestamp="2026-04-16 20:37:59 +0000 UTC" firstStartedPulling="2026-04-16 20:39:04.51287189 +0000 UTC m=+65.664967921" lastFinishedPulling="2026-04-16 20:39:07.601537228 +0000 UTC m=+68.753633260" observedRunningTime="2026-04-16 20:39:07.734206888 +0000 UTC m=+68.886302945" watchObservedRunningTime="2026-04-16 20:39:07.734673653 +0000 UTC m=+68.886769706" Apr 16 20:39:35.566805 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:35.566691 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35692de4-3b87-4697-b519-4f55d1e81778-cert\") pod \"ingress-canary-f9kkl\" (UID: \"35692de4-3b87-4697-b519-4f55d1e81778\") " pod="openshift-ingress-canary/ingress-canary-f9kkl" Apr 16 20:39:35.566805 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:35.566734 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aa099128-e7b3-453f-a700-69d4e48f8448-metrics-tls\") pod \"dns-default-7lm8k\" (UID: \"aa099128-e7b3-453f-a700-69d4e48f8448\") " pod="openshift-dns/dns-default-7lm8k" Apr 16 20:39:35.567266 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:39:35.566820 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:39:35.567266 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:39:35.566822 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:39:35.567266 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:39:35.566878 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa099128-e7b3-453f-a700-69d4e48f8448-metrics-tls podName:aa099128-e7b3-453f-a700-69d4e48f8448 nodeName:}" failed. 
No retries permitted until 2026-04-16 20:40:39.56686464 +0000 UTC m=+160.718960672 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/aa099128-e7b3-453f-a700-69d4e48f8448-metrics-tls") pod "dns-default-7lm8k" (UID: "aa099128-e7b3-453f-a700-69d4e48f8448") : secret "dns-default-metrics-tls" not found Apr 16 20:39:35.567266 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:39:35.566891 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35692de4-3b87-4697-b519-4f55d1e81778-cert podName:35692de4-3b87-4697-b519-4f55d1e81778 nodeName:}" failed. No retries permitted until 2026-04-16 20:40:39.566885051 +0000 UTC m=+160.718981082 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/35692de4-3b87-4697-b519-4f55d1e81778-cert") pod "ingress-canary-f9kkl" (UID: "35692de4-3b87-4697-b519-4f55d1e81778") : secret "canary-serving-cert" not found Apr 16 20:39:38.720036 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:38.720008 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-84xkv" Apr 16 20:39:56.757505 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:56.757472 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-vn7tq"] Apr 16 20:39:56.760160 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:56.760145 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vn7tq" Apr 16 20:39:56.763917 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:56.763892 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 20:39:56.765165 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:56.765146 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 20:39:56.765256 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:56.765146 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 20:39:56.765256 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:56.765234 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-x9dq6\"" Apr 16 20:39:56.766165 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:56.765971 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 20:39:56.768448 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:56.768429 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-vn7tq"] Apr 16 20:39:56.858627 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:56.858595 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-x6rb4"] Apr 16 20:39:56.861337 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:56.861320 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-x6rb4" Apr 16 20:39:56.863506 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:56.863483 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 20:39:56.863641 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:56.863608 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 20:39:56.863723 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:56.863700 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-xjj57\"" Apr 16 20:39:56.863934 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:56.863918 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 20:39:56.863999 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:56.863923 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 20:39:56.868353 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:56.868322 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-x6rb4"] Apr 16 20:39:56.869099 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:56.869077 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 20:39:56.906735 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:56.906708 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxv99\" (UniqueName: \"kubernetes.io/projected/959a1948-886a-4bc5-bf25-72b0e2b30d8d-kube-api-access-hxv99\") pod \"cluster-monitoring-operator-75587bd455-vn7tq\" (UID: \"959a1948-886a-4bc5-bf25-72b0e2b30d8d\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vn7tq" Apr 16 20:39:56.906844 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:56.906743 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/959a1948-886a-4bc5-bf25-72b0e2b30d8d-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-vn7tq\" (UID: \"959a1948-886a-4bc5-bf25-72b0e2b30d8d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vn7tq" Apr 16 20:39:56.906844 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:56.906775 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/959a1948-886a-4bc5-bf25-72b0e2b30d8d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vn7tq\" (UID: \"959a1948-886a-4bc5-bf25-72b0e2b30d8d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vn7tq" Apr 16 20:39:57.007751 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:57.007679 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxv99\" (UniqueName: \"kubernetes.io/projected/959a1948-886a-4bc5-bf25-72b0e2b30d8d-kube-api-access-hxv99\") pod \"cluster-monitoring-operator-75587bd455-vn7tq\" (UID: \"959a1948-886a-4bc5-bf25-72b0e2b30d8d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vn7tq" Apr 16 20:39:57.007751 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:57.007719 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca797f13-b8b1-4f9e-8374-336cc1c934f4-serving-cert\") pod \"insights-operator-585dfdc468-x6rb4\" (UID: \"ca797f13-b8b1-4f9e-8374-336cc1c934f4\") " pod="openshift-insights/insights-operator-585dfdc468-x6rb4" Apr 16 20:39:57.007751 ip-10-0-134-79 
kubenswrapper[2575]: I0416 20:39:57.007747 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/959a1948-886a-4bc5-bf25-72b0e2b30d8d-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-vn7tq\" (UID: \"959a1948-886a-4bc5-bf25-72b0e2b30d8d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vn7tq" Apr 16 20:39:57.007973 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:57.007810 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ca797f13-b8b1-4f9e-8374-336cc1c934f4-snapshots\") pod \"insights-operator-585dfdc468-x6rb4\" (UID: \"ca797f13-b8b1-4f9e-8374-336cc1c934f4\") " pod="openshift-insights/insights-operator-585dfdc468-x6rb4" Apr 16 20:39:57.007973 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:57.007837 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca797f13-b8b1-4f9e-8374-336cc1c934f4-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-x6rb4\" (UID: \"ca797f13-b8b1-4f9e-8374-336cc1c934f4\") " pod="openshift-insights/insights-operator-585dfdc468-x6rb4" Apr 16 20:39:57.007973 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:57.007860 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/959a1948-886a-4bc5-bf25-72b0e2b30d8d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vn7tq\" (UID: \"959a1948-886a-4bc5-bf25-72b0e2b30d8d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vn7tq" Apr 16 20:39:57.007973 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:57.007881 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxmz2\" 
(UniqueName: \"kubernetes.io/projected/ca797f13-b8b1-4f9e-8374-336cc1c934f4-kube-api-access-sxmz2\") pod \"insights-operator-585dfdc468-x6rb4\" (UID: \"ca797f13-b8b1-4f9e-8374-336cc1c934f4\") " pod="openshift-insights/insights-operator-585dfdc468-x6rb4" Apr 16 20:39:57.008092 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:39:57.007986 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 20:39:57.008092 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:57.007982 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca797f13-b8b1-4f9e-8374-336cc1c934f4-service-ca-bundle\") pod \"insights-operator-585dfdc468-x6rb4\" (UID: \"ca797f13-b8b1-4f9e-8374-336cc1c934f4\") " pod="openshift-insights/insights-operator-585dfdc468-x6rb4" Apr 16 20:39:57.008092 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:39:57.008053 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/959a1948-886a-4bc5-bf25-72b0e2b30d8d-cluster-monitoring-operator-tls podName:959a1948-886a-4bc5-bf25-72b0e2b30d8d nodeName:}" failed. No retries permitted until 2026-04-16 20:39:57.508036556 +0000 UTC m=+118.660132593 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/959a1948-886a-4bc5-bf25-72b0e2b30d8d-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-vn7tq" (UID: "959a1948-886a-4bc5-bf25-72b0e2b30d8d") : secret "cluster-monitoring-operator-tls" not found Apr 16 20:39:57.008183 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:57.008110 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ca797f13-b8b1-4f9e-8374-336cc1c934f4-tmp\") pod \"insights-operator-585dfdc468-x6rb4\" (UID: \"ca797f13-b8b1-4f9e-8374-336cc1c934f4\") " pod="openshift-insights/insights-operator-585dfdc468-x6rb4" Apr 16 20:39:57.009601 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:57.009583 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/959a1948-886a-4bc5-bf25-72b0e2b30d8d-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-vn7tq\" (UID: \"959a1948-886a-4bc5-bf25-72b0e2b30d8d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vn7tq" Apr 16 20:39:57.015464 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:57.015443 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxv99\" (UniqueName: \"kubernetes.io/projected/959a1948-886a-4bc5-bf25-72b0e2b30d8d-kube-api-access-hxv99\") pod \"cluster-monitoring-operator-75587bd455-vn7tq\" (UID: \"959a1948-886a-4bc5-bf25-72b0e2b30d8d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vn7tq" Apr 16 20:39:57.108666 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:57.108632 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ca797f13-b8b1-4f9e-8374-336cc1c934f4-snapshots\") pod \"insights-operator-585dfdc468-x6rb4\" (UID: 
\"ca797f13-b8b1-4f9e-8374-336cc1c934f4\") " pod="openshift-insights/insights-operator-585dfdc468-x6rb4" Apr 16 20:39:57.108827 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:57.108729 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca797f13-b8b1-4f9e-8374-336cc1c934f4-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-x6rb4\" (UID: \"ca797f13-b8b1-4f9e-8374-336cc1c934f4\") " pod="openshift-insights/insights-operator-585dfdc468-x6rb4" Apr 16 20:39:57.108827 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:57.108789 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sxmz2\" (UniqueName: \"kubernetes.io/projected/ca797f13-b8b1-4f9e-8374-336cc1c934f4-kube-api-access-sxmz2\") pod \"insights-operator-585dfdc468-x6rb4\" (UID: \"ca797f13-b8b1-4f9e-8374-336cc1c934f4\") " pod="openshift-insights/insights-operator-585dfdc468-x6rb4" Apr 16 20:39:57.108934 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:57.108831 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca797f13-b8b1-4f9e-8374-336cc1c934f4-service-ca-bundle\") pod \"insights-operator-585dfdc468-x6rb4\" (UID: \"ca797f13-b8b1-4f9e-8374-336cc1c934f4\") " pod="openshift-insights/insights-operator-585dfdc468-x6rb4" Apr 16 20:39:57.108934 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:57.108886 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ca797f13-b8b1-4f9e-8374-336cc1c934f4-tmp\") pod \"insights-operator-585dfdc468-x6rb4\" (UID: \"ca797f13-b8b1-4f9e-8374-336cc1c934f4\") " pod="openshift-insights/insights-operator-585dfdc468-x6rb4" Apr 16 20:39:57.108934 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:57.108912 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/ca797f13-b8b1-4f9e-8374-336cc1c934f4-serving-cert\") pod \"insights-operator-585dfdc468-x6rb4\" (UID: \"ca797f13-b8b1-4f9e-8374-336cc1c934f4\") " pod="openshift-insights/insights-operator-585dfdc468-x6rb4" Apr 16 20:39:57.109195 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:57.109179 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ca797f13-b8b1-4f9e-8374-336cc1c934f4-tmp\") pod \"insights-operator-585dfdc468-x6rb4\" (UID: \"ca797f13-b8b1-4f9e-8374-336cc1c934f4\") " pod="openshift-insights/insights-operator-585dfdc468-x6rb4" Apr 16 20:39:57.109312 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:57.109295 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ca797f13-b8b1-4f9e-8374-336cc1c934f4-snapshots\") pod \"insights-operator-585dfdc468-x6rb4\" (UID: \"ca797f13-b8b1-4f9e-8374-336cc1c934f4\") " pod="openshift-insights/insights-operator-585dfdc468-x6rb4" Apr 16 20:39:57.109919 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:57.109898 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca797f13-b8b1-4f9e-8374-336cc1c934f4-service-ca-bundle\") pod \"insights-operator-585dfdc468-x6rb4\" (UID: \"ca797f13-b8b1-4f9e-8374-336cc1c934f4\") " pod="openshift-insights/insights-operator-585dfdc468-x6rb4" Apr 16 20:39:57.110198 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:57.110178 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca797f13-b8b1-4f9e-8374-336cc1c934f4-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-x6rb4\" (UID: \"ca797f13-b8b1-4f9e-8374-336cc1c934f4\") " pod="openshift-insights/insights-operator-585dfdc468-x6rb4" Apr 16 20:39:57.111562 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:57.111545 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca797f13-b8b1-4f9e-8374-336cc1c934f4-serving-cert\") pod \"insights-operator-585dfdc468-x6rb4\" (UID: \"ca797f13-b8b1-4f9e-8374-336cc1c934f4\") " pod="openshift-insights/insights-operator-585dfdc468-x6rb4" Apr 16 20:39:57.116080 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:57.116059 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxmz2\" (UniqueName: \"kubernetes.io/projected/ca797f13-b8b1-4f9e-8374-336cc1c934f4-kube-api-access-sxmz2\") pod \"insights-operator-585dfdc468-x6rb4\" (UID: \"ca797f13-b8b1-4f9e-8374-336cc1c934f4\") " pod="openshift-insights/insights-operator-585dfdc468-x6rb4" Apr 16 20:39:57.170977 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:57.170955 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-x6rb4" Apr 16 20:39:57.295951 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:57.295832 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-x6rb4"] Apr 16 20:39:57.298269 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:39:57.298242 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca797f13_b8b1_4f9e_8374_336cc1c934f4.slice/crio-bcf69e74b95c7b613b1fe6983e403b0282aed139ac0e111e502c1845cbd7f6ad WatchSource:0}: Error finding container bcf69e74b95c7b613b1fe6983e403b0282aed139ac0e111e502c1845cbd7f6ad: Status 404 returned error can't find the container with id bcf69e74b95c7b613b1fe6983e403b0282aed139ac0e111e502c1845cbd7f6ad Apr 16 20:39:57.512358 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:57.512318 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/959a1948-886a-4bc5-bf25-72b0e2b30d8d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vn7tq\" (UID: \"959a1948-886a-4bc5-bf25-72b0e2b30d8d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vn7tq" Apr 16 20:39:57.512513 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:39:57.512440 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 20:39:57.512513 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:39:57.512494 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/959a1948-886a-4bc5-bf25-72b0e2b30d8d-cluster-monitoring-operator-tls podName:959a1948-886a-4bc5-bf25-72b0e2b30d8d nodeName:}" failed. No retries permitted until 2026-04-16 20:39:58.512480608 +0000 UTC m=+119.664576639 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/959a1948-886a-4bc5-bf25-72b0e2b30d8d-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-vn7tq" (UID: "959a1948-886a-4bc5-bf25-72b0e2b30d8d") : secret "cluster-monitoring-operator-tls" not found Apr 16 20:39:57.806852 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:57.806812 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-x6rb4" event={"ID":"ca797f13-b8b1-4f9e-8374-336cc1c934f4","Type":"ContainerStarted","Data":"bcf69e74b95c7b613b1fe6983e403b0282aed139ac0e111e502c1845cbd7f6ad"} Apr 16 20:39:58.519732 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:58.519688 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/959a1948-886a-4bc5-bf25-72b0e2b30d8d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vn7tq\" (UID: \"959a1948-886a-4bc5-bf25-72b0e2b30d8d\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vn7tq" Apr 16 20:39:58.519922 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:39:58.519849 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 20:39:58.519988 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:39:58.519928 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/959a1948-886a-4bc5-bf25-72b0e2b30d8d-cluster-monitoring-operator-tls podName:959a1948-886a-4bc5-bf25-72b0e2b30d8d nodeName:}" failed. No retries permitted until 2026-04-16 20:40:00.519911539 +0000 UTC m=+121.672007574 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/959a1948-886a-4bc5-bf25-72b0e2b30d8d-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-vn7tq" (UID: "959a1948-886a-4bc5-bf25-72b0e2b30d8d") : secret "cluster-monitoring-operator-tls" not found Apr 16 20:39:59.811845 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:59.811804 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-x6rb4" event={"ID":"ca797f13-b8b1-4f9e-8374-336cc1c934f4","Type":"ContainerStarted","Data":"83c3d141ebc0a0b79bbfab4ab9d09ff7a1cf58c0ee755ee2dc31e3284a32da88"} Apr 16 20:39:59.827403 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:39:59.827355 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-x6rb4" podStartSLOduration=1.442812947 podStartE2EDuration="3.827339106s" podCreationTimestamp="2026-04-16 20:39:56 +0000 UTC" firstStartedPulling="2026-04-16 20:39:57.29992315 +0000 UTC m=+118.452019185" lastFinishedPulling="2026-04-16 20:39:59.684449308 +0000 UTC m=+120.836545344" observedRunningTime="2026-04-16 20:39:59.826502949 +0000 UTC m=+120.978599003" watchObservedRunningTime="2026-04-16 
20:39:59.827339106 +0000 UTC m=+120.979435160" Apr 16 20:40:00.532834 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:00.532793 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/959a1948-886a-4bc5-bf25-72b0e2b30d8d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vn7tq\" (UID: \"959a1948-886a-4bc5-bf25-72b0e2b30d8d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vn7tq" Apr 16 20:40:00.533024 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:40:00.532937 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 20:40:00.533024 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:40:00.532999 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/959a1948-886a-4bc5-bf25-72b0e2b30d8d-cluster-monitoring-operator-tls podName:959a1948-886a-4bc5-bf25-72b0e2b30d8d nodeName:}" failed. No retries permitted until 2026-04-16 20:40:04.532982709 +0000 UTC m=+125.685078753 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/959a1948-886a-4bc5-bf25-72b0e2b30d8d-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-vn7tq" (UID: "959a1948-886a-4bc5-bf25-72b0e2b30d8d") : secret "cluster-monitoring-operator-tls" not found Apr 16 20:40:02.300700 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:02.300663 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-h5z7m"] Apr 16 20:40:02.303340 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:02.303324 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-h5z7m" Apr 16 20:40:02.305516 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:02.305497 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-v4kn7\"" Apr 16 20:40:02.309562 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:02.309235 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-h5z7m"] Apr 16 20:40:02.445019 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:02.444979 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhvmv\" (UniqueName: \"kubernetes.io/projected/3d2801ad-c31b-4d08-8d50-b4fcf09686ce-kube-api-access-jhvmv\") pod \"network-check-source-8894fc9bd-h5z7m\" (UID: \"3d2801ad-c31b-4d08-8d50-b4fcf09686ce\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-h5z7m" Apr 16 20:40:02.546053 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:02.546005 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhvmv\" (UniqueName: \"kubernetes.io/projected/3d2801ad-c31b-4d08-8d50-b4fcf09686ce-kube-api-access-jhvmv\") pod \"network-check-source-8894fc9bd-h5z7m\" (UID: \"3d2801ad-c31b-4d08-8d50-b4fcf09686ce\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-h5z7m" Apr 16 20:40:02.554259 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:02.554199 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhvmv\" (UniqueName: \"kubernetes.io/projected/3d2801ad-c31b-4d08-8d50-b4fcf09686ce-kube-api-access-jhvmv\") pod \"network-check-source-8894fc9bd-h5z7m\" (UID: \"3d2801ad-c31b-4d08-8d50-b4fcf09686ce\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-h5z7m" Apr 16 20:40:02.612120 ip-10-0-134-79 kubenswrapper[2575]: I0416 
20:40:02.612095 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-h5z7m" Apr 16 20:40:02.721165 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:02.721131 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-h5z7m"] Apr 16 20:40:02.723958 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:40:02.723928 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d2801ad_c31b_4d08_8d50_b4fcf09686ce.slice/crio-e30c12b5c57a789743f6b5b964f3f4b94ccae6296338f4473de38ae1df9ac494 WatchSource:0}: Error finding container e30c12b5c57a789743f6b5b964f3f4b94ccae6296338f4473de38ae1df9ac494: Status 404 returned error can't find the container with id e30c12b5c57a789743f6b5b964f3f4b94ccae6296338f4473de38ae1df9ac494 Apr 16 20:40:02.818210 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:02.818135 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-h5z7m" event={"ID":"3d2801ad-c31b-4d08-8d50-b4fcf09686ce","Type":"ContainerStarted","Data":"3c61f52e5129d5968511b5023f23de22e18e0421665d15415f9706d3bac26a86"} Apr 16 20:40:02.818210 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:02.818173 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-h5z7m" event={"ID":"3d2801ad-c31b-4d08-8d50-b4fcf09686ce","Type":"ContainerStarted","Data":"e30c12b5c57a789743f6b5b964f3f4b94ccae6296338f4473de38ae1df9ac494"} Apr 16 20:40:02.833149 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:02.833112 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-h5z7m" podStartSLOduration=0.833098598 podStartE2EDuration="833.098598ms" podCreationTimestamp="2026-04-16 20:40:02 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:40:02.831829866 +0000 UTC m=+123.983925911" watchObservedRunningTime="2026-04-16 20:40:02.833098598 +0000 UTC m=+123.985194651" Apr 16 20:40:03.447268 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:03.447245 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-jq26v_838cdbbd-45af-4493-a167-65bd220c03c8/dns-node-resolver/0.log" Apr 16 20:40:03.628647 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:03.628600 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-vrpqp"] Apr 16 20:40:03.631592 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:03.631576 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vrpqp" Apr 16 20:40:03.634141 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:03.634122 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 16 20:40:03.634955 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:03.634941 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-2mkpg\"" Apr 16 20:40:03.635006 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:03.634949 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 16 20:40:03.640482 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:03.640462 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-vrpqp"] Apr 16 20:40:03.754445 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:03.754374 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx4gk\" (UniqueName: \"kubernetes.io/projected/01c82200-8490-468e-a926-734a11ac86ca-kube-api-access-mx4gk\") pod \"migrator-74bb7799d9-vrpqp\" (UID: \"01c82200-8490-468e-a926-734a11ac86ca\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vrpqp" Apr 16 20:40:03.855498 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:03.855468 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mx4gk\" (UniqueName: \"kubernetes.io/projected/01c82200-8490-468e-a926-734a11ac86ca-kube-api-access-mx4gk\") pod \"migrator-74bb7799d9-vrpqp\" (UID: \"01c82200-8490-468e-a926-734a11ac86ca\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vrpqp" Apr 16 20:40:03.862755 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:03.862730 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx4gk\" (UniqueName: \"kubernetes.io/projected/01c82200-8490-468e-a926-734a11ac86ca-kube-api-access-mx4gk\") pod \"migrator-74bb7799d9-vrpqp\" (UID: \"01c82200-8490-468e-a926-734a11ac86ca\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vrpqp" Apr 16 20:40:03.939666 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:03.939642 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vrpqp" Apr 16 20:40:04.049157 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:04.049129 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-vrpqp"] Apr 16 20:40:04.049436 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:04.049418 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-5z2mk_c00ff0b8-9f9c-418d-854c-b22bc6be761f/node-ca/0.log" Apr 16 20:40:04.051626 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:40:04.051591 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01c82200_8490_468e_a926_734a11ac86ca.slice/crio-ea0e452e03c399db7056a57d79c1ba91900cdc98f2870dbb31d8c43604ae2b85 WatchSource:0}: Error finding container ea0e452e03c399db7056a57d79c1ba91900cdc98f2870dbb31d8c43604ae2b85: Status 404 returned error can't find the container with id ea0e452e03c399db7056a57d79c1ba91900cdc98f2870dbb31d8c43604ae2b85 Apr 16 20:40:04.560557 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:04.560523 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/959a1948-886a-4bc5-bf25-72b0e2b30d8d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vn7tq\" (UID: \"959a1948-886a-4bc5-bf25-72b0e2b30d8d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vn7tq" Apr 16 20:40:04.560959 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:40:04.560670 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 20:40:04.560959 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:40:04.560725 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/959a1948-886a-4bc5-bf25-72b0e2b30d8d-cluster-monitoring-operator-tls podName:959a1948-886a-4bc5-bf25-72b0e2b30d8d nodeName:}" failed. No retries permitted until 2026-04-16 20:40:12.560712025 +0000 UTC m=+133.712808057 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/959a1948-886a-4bc5-bf25-72b0e2b30d8d-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-vn7tq" (UID: "959a1948-886a-4bc5-bf25-72b0e2b30d8d") : secret "cluster-monitoring-operator-tls" not found Apr 16 20:40:04.823040 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:04.822956 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vrpqp" event={"ID":"01c82200-8490-468e-a926-734a11ac86ca","Type":"ContainerStarted","Data":"ea0e452e03c399db7056a57d79c1ba91900cdc98f2870dbb31d8c43604ae2b85"} Apr 16 20:40:05.827067 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:05.826991 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vrpqp" event={"ID":"01c82200-8490-468e-a926-734a11ac86ca","Type":"ContainerStarted","Data":"574fb48fa580433d345ecc0cc9deed8c8011633b35451df96e41c9c986dab25f"} Apr 16 20:40:05.827067 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:05.827032 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vrpqp" event={"ID":"01c82200-8490-468e-a926-734a11ac86ca","Type":"ContainerStarted","Data":"f0c36dc0d78943cbc1affdf6ee6596a02bc8c0fe67f1c5e6d65a21f3dba0bf9c"} Apr 16 20:40:05.841048 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:05.841007 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-vrpqp" podStartSLOduration=1.359541605 podStartE2EDuration="2.8409942s" 
podCreationTimestamp="2026-04-16 20:40:03 +0000 UTC" firstStartedPulling="2026-04-16 20:40:04.056485156 +0000 UTC m=+125.208581190" lastFinishedPulling="2026-04-16 20:40:05.537937743 +0000 UTC m=+126.690033785" observedRunningTime="2026-04-16 20:40:05.840151306 +0000 UTC m=+126.992247360" watchObservedRunningTime="2026-04-16 20:40:05.8409942 +0000 UTC m=+126.993090252" Apr 16 20:40:07.786862 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:07.786828 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-dwwlr"] Apr 16 20:40:07.789666 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:07.789650 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-dwwlr" Apr 16 20:40:07.792216 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:07.792184 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 20:40:07.792343 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:07.792296 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 20:40:07.793215 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:07.793191 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 20:40:07.793215 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:07.793215 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 20:40:07.793369 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:07.793258 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-z79hm\"" Apr 16 20:40:07.795742 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:07.795721 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-service-ca/service-ca-865cb79987-dwwlr"]
Apr 16 20:40:07.883388 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:07.883357 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e12f7fba-4ffe-436e-9285-385519213492-signing-key\") pod \"service-ca-865cb79987-dwwlr\" (UID: \"e12f7fba-4ffe-436e-9285-385519213492\") " pod="openshift-service-ca/service-ca-865cb79987-dwwlr"
Apr 16 20:40:07.883592 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:07.883402 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e12f7fba-4ffe-436e-9285-385519213492-signing-cabundle\") pod \"service-ca-865cb79987-dwwlr\" (UID: \"e12f7fba-4ffe-436e-9285-385519213492\") " pod="openshift-service-ca/service-ca-865cb79987-dwwlr"
Apr 16 20:40:07.883592 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:07.883503 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6h94\" (UniqueName: \"kubernetes.io/projected/e12f7fba-4ffe-436e-9285-385519213492-kube-api-access-h6h94\") pod \"service-ca-865cb79987-dwwlr\" (UID: \"e12f7fba-4ffe-436e-9285-385519213492\") " pod="openshift-service-ca/service-ca-865cb79987-dwwlr"
Apr 16 20:40:07.984205 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:07.984177 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h6h94\" (UniqueName: \"kubernetes.io/projected/e12f7fba-4ffe-436e-9285-385519213492-kube-api-access-h6h94\") pod \"service-ca-865cb79987-dwwlr\" (UID: \"e12f7fba-4ffe-436e-9285-385519213492\") " pod="openshift-service-ca/service-ca-865cb79987-dwwlr"
Apr 16 20:40:07.984316 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:07.984238 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e12f7fba-4ffe-436e-9285-385519213492-signing-key\") pod \"service-ca-865cb79987-dwwlr\" (UID: \"e12f7fba-4ffe-436e-9285-385519213492\") " pod="openshift-service-ca/service-ca-865cb79987-dwwlr"
Apr 16 20:40:07.984316 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:07.984261 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e12f7fba-4ffe-436e-9285-385519213492-signing-cabundle\") pod \"service-ca-865cb79987-dwwlr\" (UID: \"e12f7fba-4ffe-436e-9285-385519213492\") " pod="openshift-service-ca/service-ca-865cb79987-dwwlr"
Apr 16 20:40:07.984826 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:07.984809 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e12f7fba-4ffe-436e-9285-385519213492-signing-cabundle\") pod \"service-ca-865cb79987-dwwlr\" (UID: \"e12f7fba-4ffe-436e-9285-385519213492\") " pod="openshift-service-ca/service-ca-865cb79987-dwwlr"
Apr 16 20:40:07.986510 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:07.986494 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e12f7fba-4ffe-436e-9285-385519213492-signing-key\") pod \"service-ca-865cb79987-dwwlr\" (UID: \"e12f7fba-4ffe-436e-9285-385519213492\") " pod="openshift-service-ca/service-ca-865cb79987-dwwlr"
Apr 16 20:40:07.992450 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:07.992425 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6h94\" (UniqueName: \"kubernetes.io/projected/e12f7fba-4ffe-436e-9285-385519213492-kube-api-access-h6h94\") pod \"service-ca-865cb79987-dwwlr\" (UID: \"e12f7fba-4ffe-436e-9285-385519213492\") " pod="openshift-service-ca/service-ca-865cb79987-dwwlr"
Apr 16 20:40:08.098688 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:08.098664 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-dwwlr"
Apr 16 20:40:08.185184 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:08.185149 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/422c9f50-4f45-46bc-9e9d-5c4f1c20c115-metrics-certs\") pod \"network-metrics-daemon-5jhhm\" (UID: \"422c9f50-4f45-46bc-9e9d-5c4f1c20c115\") " pod="openshift-multus/network-metrics-daemon-5jhhm"
Apr 16 20:40:08.185354 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:40:08.185329 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 20:40:08.186004 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:40:08.185418 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/422c9f50-4f45-46bc-9e9d-5c4f1c20c115-metrics-certs podName:422c9f50-4f45-46bc-9e9d-5c4f1c20c115 nodeName:}" failed. No retries permitted until 2026-04-16 20:42:10.185394335 +0000 UTC m=+251.337490373 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/422c9f50-4f45-46bc-9e9d-5c4f1c20c115-metrics-certs") pod "network-metrics-daemon-5jhhm" (UID: "422c9f50-4f45-46bc-9e9d-5c4f1c20c115") : secret "metrics-daemon-secret" not found
Apr 16 20:40:08.213503 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:08.213471 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-dwwlr"]
Apr 16 20:40:08.216953 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:40:08.216924 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode12f7fba_4ffe_436e_9285_385519213492.slice/crio-2101c488ddee24798413c10b3f2186c4bc52568b70dcd4d00581ad6600811540 WatchSource:0}: Error finding container 2101c488ddee24798413c10b3f2186c4bc52568b70dcd4d00581ad6600811540: Status 404 returned error can't find the container with id 2101c488ddee24798413c10b3f2186c4bc52568b70dcd4d00581ad6600811540
Apr 16 20:40:08.837641 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:08.837556 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-dwwlr" event={"ID":"e12f7fba-4ffe-436e-9285-385519213492","Type":"ContainerStarted","Data":"2101c488ddee24798413c10b3f2186c4bc52568b70dcd4d00581ad6600811540"}
Apr 16 20:40:10.844719 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:10.844688 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-dwwlr" event={"ID":"e12f7fba-4ffe-436e-9285-385519213492","Type":"ContainerStarted","Data":"171656688c355868bcfcbeba6f29390f85958e563616ed595c565609944e9d3a"}
Apr 16 20:40:10.860230 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:10.860187 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-dwwlr" podStartSLOduration=2.232364727 podStartE2EDuration="3.860172939s" podCreationTimestamp="2026-04-16 20:40:07 +0000 UTC" firstStartedPulling="2026-04-16 20:40:08.218743511 +0000 UTC m=+129.370839543" lastFinishedPulling="2026-04-16 20:40:09.846551725 +0000 UTC m=+130.998647755" observedRunningTime="2026-04-16 20:40:10.859473702 +0000 UTC m=+132.011569753" watchObservedRunningTime="2026-04-16 20:40:10.860172939 +0000 UTC m=+132.012268991"
Apr 16 20:40:12.622999 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:12.622964 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/959a1948-886a-4bc5-bf25-72b0e2b30d8d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vn7tq\" (UID: \"959a1948-886a-4bc5-bf25-72b0e2b30d8d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vn7tq"
Apr 16 20:40:12.623373 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:40:12.623075 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 20:40:12.623373 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:40:12.623137 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/959a1948-886a-4bc5-bf25-72b0e2b30d8d-cluster-monitoring-operator-tls podName:959a1948-886a-4bc5-bf25-72b0e2b30d8d nodeName:}" failed. No retries permitted until 2026-04-16 20:40:28.623122524 +0000 UTC m=+149.775218556 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/959a1948-886a-4bc5-bf25-72b0e2b30d8d-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-vn7tq" (UID: "959a1948-886a-4bc5-bf25-72b0e2b30d8d") : secret "cluster-monitoring-operator-tls" not found
Apr 16 20:40:28.638040 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:28.637985 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/959a1948-886a-4bc5-bf25-72b0e2b30d8d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vn7tq\" (UID: \"959a1948-886a-4bc5-bf25-72b0e2b30d8d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vn7tq"
Apr 16 20:40:28.640457 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:28.640431 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/959a1948-886a-4bc5-bf25-72b0e2b30d8d-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-vn7tq\" (UID: \"959a1948-886a-4bc5-bf25-72b0e2b30d8d\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vn7tq"
Apr 16 20:40:28.871041 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:28.871009 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-x9dq6\""
Apr 16 20:40:28.878885 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:28.878864 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vn7tq"
Apr 16 20:40:28.997183 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:28.997163 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-vn7tq"]
Apr 16 20:40:28.999472 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:40:28.999446 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod959a1948_886a_4bc5_bf25_72b0e2b30d8d.slice/crio-35a756ee66c497ec9ddc4b09a6e9250015152b58f0d9c6910adc12b0a880569c WatchSource:0}: Error finding container 35a756ee66c497ec9ddc4b09a6e9250015152b58f0d9c6910adc12b0a880569c: Status 404 returned error can't find the container with id 35a756ee66c497ec9ddc4b09a6e9250015152b58f0d9c6910adc12b0a880569c
Apr 16 20:40:29.897859 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:29.897820 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vn7tq" event={"ID":"959a1948-886a-4bc5-bf25-72b0e2b30d8d","Type":"ContainerStarted","Data":"35a756ee66c497ec9ddc4b09a6e9250015152b58f0d9c6910adc12b0a880569c"}
Apr 16 20:40:30.140748 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.140714 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-s5xct"]
Apr 16 20:40:30.144096 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.144079 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-s5xct"
Apr 16 20:40:30.147631 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.147592 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-khb4d\""
Apr 16 20:40:30.147760 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.147728 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 20:40:30.148536 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.148498 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 20:40:30.171561 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.171540 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-s5xct"]
Apr 16 20:40:30.216789 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.216761 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-58497df7cd-rdfqk"]
Apr 16 20:40:30.219928 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.219908 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-58497df7cd-rdfqk"
Apr 16 20:40:30.223222 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.223202 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 20:40:30.223468 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.223454 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-66wkg\""
Apr 16 20:40:30.223570 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.223553 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 20:40:30.223848 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.223833 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 20:40:30.229980 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.229960 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 20:40:30.237847 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.237827 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-58497df7cd-rdfqk"]
Apr 16 20:40:30.250265 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.250248 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a3cf8644-3194-4619-b219-a5991ba494bc-data-volume\") pod \"insights-runtime-extractor-s5xct\" (UID: \"a3cf8644-3194-4619-b219-a5991ba494bc\") " pod="openshift-insights/insights-runtime-extractor-s5xct"
Apr 16 20:40:30.250351 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.250285 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a3cf8644-3194-4619-b219-a5991ba494bc-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-s5xct\" (UID: \"a3cf8644-3194-4619-b219-a5991ba494bc\") " pod="openshift-insights/insights-runtime-extractor-s5xct"
Apr 16 20:40:30.250351 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.250334 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjn5r\" (UniqueName: \"kubernetes.io/projected/a3cf8644-3194-4619-b219-a5991ba494bc-kube-api-access-gjn5r\") pod \"insights-runtime-extractor-s5xct\" (UID: \"a3cf8644-3194-4619-b219-a5991ba494bc\") " pod="openshift-insights/insights-runtime-extractor-s5xct"
Apr 16 20:40:30.250416 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.250375 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a3cf8644-3194-4619-b219-a5991ba494bc-crio-socket\") pod \"insights-runtime-extractor-s5xct\" (UID: \"a3cf8644-3194-4619-b219-a5991ba494bc\") " pod="openshift-insights/insights-runtime-extractor-s5xct"
Apr 16 20:40:30.250416 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.250400 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a3cf8644-3194-4619-b219-a5991ba494bc-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-s5xct\" (UID: \"a3cf8644-3194-4619-b219-a5991ba494bc\") " pod="openshift-insights/insights-runtime-extractor-s5xct"
Apr 16 20:40:30.351125 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.351081 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjn5r\" (UniqueName: \"kubernetes.io/projected/a3cf8644-3194-4619-b219-a5991ba494bc-kube-api-access-gjn5r\") pod \"insights-runtime-extractor-s5xct\" (UID: \"a3cf8644-3194-4619-b219-a5991ba494bc\") " pod="openshift-insights/insights-runtime-extractor-s5xct"
Apr 16 20:40:30.351279 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.351182 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/05edfaa7-98a2-4fd0-9840-27fc19cabc02-registry-tls\") pod \"image-registry-58497df7cd-rdfqk\" (UID: \"05edfaa7-98a2-4fd0-9840-27fc19cabc02\") " pod="openshift-image-registry/image-registry-58497df7cd-rdfqk"
Apr 16 20:40:30.351279 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.351212 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/05edfaa7-98a2-4fd0-9840-27fc19cabc02-installation-pull-secrets\") pod \"image-registry-58497df7cd-rdfqk\" (UID: \"05edfaa7-98a2-4fd0-9840-27fc19cabc02\") " pod="openshift-image-registry/image-registry-58497df7cd-rdfqk"
Apr 16 20:40:30.351279 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.351242 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/05edfaa7-98a2-4fd0-9840-27fc19cabc02-trusted-ca\") pod \"image-registry-58497df7cd-rdfqk\" (UID: \"05edfaa7-98a2-4fd0-9840-27fc19cabc02\") " pod="openshift-image-registry/image-registry-58497df7cd-rdfqk"
Apr 16 20:40:30.351279 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.351270 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8srcd\" (UniqueName: \"kubernetes.io/projected/05edfaa7-98a2-4fd0-9840-27fc19cabc02-kube-api-access-8srcd\") pod \"image-registry-58497df7cd-rdfqk\" (UID: \"05edfaa7-98a2-4fd0-9840-27fc19cabc02\") " pod="openshift-image-registry/image-registry-58497df7cd-rdfqk"
Apr 16 20:40:30.351507 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.351376 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/05edfaa7-98a2-4fd0-9840-27fc19cabc02-ca-trust-extracted\") pod \"image-registry-58497df7cd-rdfqk\" (UID: \"05edfaa7-98a2-4fd0-9840-27fc19cabc02\") " pod="openshift-image-registry/image-registry-58497df7cd-rdfqk"
Apr 16 20:40:30.351507 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.351429 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a3cf8644-3194-4619-b219-a5991ba494bc-crio-socket\") pod \"insights-runtime-extractor-s5xct\" (UID: \"a3cf8644-3194-4619-b219-a5991ba494bc\") " pod="openshift-insights/insights-runtime-extractor-s5xct"
Apr 16 20:40:30.351507 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.351458 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a3cf8644-3194-4619-b219-a5991ba494bc-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-s5xct\" (UID: \"a3cf8644-3194-4619-b219-a5991ba494bc\") " pod="openshift-insights/insights-runtime-extractor-s5xct"
Apr 16 20:40:30.351652 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.351525 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/05edfaa7-98a2-4fd0-9840-27fc19cabc02-registry-certificates\") pod \"image-registry-58497df7cd-rdfqk\" (UID: \"05edfaa7-98a2-4fd0-9840-27fc19cabc02\") " pod="openshift-image-registry/image-registry-58497df7cd-rdfqk"
Apr 16 20:40:30.351652 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.351529 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a3cf8644-3194-4619-b219-a5991ba494bc-crio-socket\") pod \"insights-runtime-extractor-s5xct\" (UID: \"a3cf8644-3194-4619-b219-a5991ba494bc\") " pod="openshift-insights/insights-runtime-extractor-s5xct"
Apr 16 20:40:30.351652 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.351569 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a3cf8644-3194-4619-b219-a5991ba494bc-data-volume\") pod \"insights-runtime-extractor-s5xct\" (UID: \"a3cf8644-3194-4619-b219-a5991ba494bc\") " pod="openshift-insights/insights-runtime-extractor-s5xct"
Apr 16 20:40:30.351652 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.351595 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/05edfaa7-98a2-4fd0-9840-27fc19cabc02-bound-sa-token\") pod \"image-registry-58497df7cd-rdfqk\" (UID: \"05edfaa7-98a2-4fd0-9840-27fc19cabc02\") " pod="openshift-image-registry/image-registry-58497df7cd-rdfqk"
Apr 16 20:40:30.351839 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.351658 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a3cf8644-3194-4619-b219-a5991ba494bc-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-s5xct\" (UID: \"a3cf8644-3194-4619-b219-a5991ba494bc\") " pod="openshift-insights/insights-runtime-extractor-s5xct"
Apr 16 20:40:30.351839 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.351689 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/05edfaa7-98a2-4fd0-9840-27fc19cabc02-image-registry-private-configuration\") pod \"image-registry-58497df7cd-rdfqk\" (UID: \"05edfaa7-98a2-4fd0-9840-27fc19cabc02\") " pod="openshift-image-registry/image-registry-58497df7cd-rdfqk"
Apr 16 20:40:30.351940 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.351904 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a3cf8644-3194-4619-b219-a5991ba494bc-data-volume\") pod \"insights-runtime-extractor-s5xct\" (UID: \"a3cf8644-3194-4619-b219-a5991ba494bc\") " pod="openshift-insights/insights-runtime-extractor-s5xct"
Apr 16 20:40:30.352065 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.352040 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a3cf8644-3194-4619-b219-a5991ba494bc-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-s5xct\" (UID: \"a3cf8644-3194-4619-b219-a5991ba494bc\") " pod="openshift-insights/insights-runtime-extractor-s5xct"
Apr 16 20:40:30.354269 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.354248 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a3cf8644-3194-4619-b219-a5991ba494bc-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-s5xct\" (UID: \"a3cf8644-3194-4619-b219-a5991ba494bc\") " pod="openshift-insights/insights-runtime-extractor-s5xct"
Apr 16 20:40:30.365120 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.365095 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjn5r\" (UniqueName: \"kubernetes.io/projected/a3cf8644-3194-4619-b219-a5991ba494bc-kube-api-access-gjn5r\") pod \"insights-runtime-extractor-s5xct\" (UID: \"a3cf8644-3194-4619-b219-a5991ba494bc\") " pod="openshift-insights/insights-runtime-extractor-s5xct"
Apr 16 20:40:30.452678 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.452593 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-s5xct"
Apr 16 20:40:30.452825 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.452702 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/05edfaa7-98a2-4fd0-9840-27fc19cabc02-ca-trust-extracted\") pod \"image-registry-58497df7cd-rdfqk\" (UID: \"05edfaa7-98a2-4fd0-9840-27fc19cabc02\") " pod="openshift-image-registry/image-registry-58497df7cd-rdfqk"
Apr 16 20:40:30.452825 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.452757 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/05edfaa7-98a2-4fd0-9840-27fc19cabc02-registry-certificates\") pod \"image-registry-58497df7cd-rdfqk\" (UID: \"05edfaa7-98a2-4fd0-9840-27fc19cabc02\") " pod="openshift-image-registry/image-registry-58497df7cd-rdfqk"
Apr 16 20:40:30.452825 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.452786 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/05edfaa7-98a2-4fd0-9840-27fc19cabc02-bound-sa-token\") pod \"image-registry-58497df7cd-rdfqk\" (UID: \"05edfaa7-98a2-4fd0-9840-27fc19cabc02\") " pod="openshift-image-registry/image-registry-58497df7cd-rdfqk"
Apr 16 20:40:30.452985 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.452823 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/05edfaa7-98a2-4fd0-9840-27fc19cabc02-image-registry-private-configuration\") pod \"image-registry-58497df7cd-rdfqk\" (UID: \"05edfaa7-98a2-4fd0-9840-27fc19cabc02\") " pod="openshift-image-registry/image-registry-58497df7cd-rdfqk"
Apr 16 20:40:30.453039 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.453001 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/05edfaa7-98a2-4fd0-9840-27fc19cabc02-registry-tls\") pod \"image-registry-58497df7cd-rdfqk\" (UID: \"05edfaa7-98a2-4fd0-9840-27fc19cabc02\") " pod="openshift-image-registry/image-registry-58497df7cd-rdfqk"
Apr 16 20:40:30.453091 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.453037 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/05edfaa7-98a2-4fd0-9840-27fc19cabc02-installation-pull-secrets\") pod \"image-registry-58497df7cd-rdfqk\" (UID: \"05edfaa7-98a2-4fd0-9840-27fc19cabc02\") " pod="openshift-image-registry/image-registry-58497df7cd-rdfqk"
Apr 16 20:40:30.453091 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.453065 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/05edfaa7-98a2-4fd0-9840-27fc19cabc02-trusted-ca\") pod \"image-registry-58497df7cd-rdfqk\" (UID: \"05edfaa7-98a2-4fd0-9840-27fc19cabc02\") " pod="openshift-image-registry/image-registry-58497df7cd-rdfqk"
Apr 16 20:40:30.453186 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.453092 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8srcd\" (UniqueName: \"kubernetes.io/projected/05edfaa7-98a2-4fd0-9840-27fc19cabc02-kube-api-access-8srcd\") pod \"image-registry-58497df7cd-rdfqk\" (UID: \"05edfaa7-98a2-4fd0-9840-27fc19cabc02\") " pod="openshift-image-registry/image-registry-58497df7cd-rdfqk"
Apr 16 20:40:30.453186 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.453116 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/05edfaa7-98a2-4fd0-9840-27fc19cabc02-ca-trust-extracted\") pod \"image-registry-58497df7cd-rdfqk\" (UID: \"05edfaa7-98a2-4fd0-9840-27fc19cabc02\") " pod="openshift-image-registry/image-registry-58497df7cd-rdfqk"
Apr 16 20:40:30.453865 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.453842 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/05edfaa7-98a2-4fd0-9840-27fc19cabc02-registry-certificates\") pod \"image-registry-58497df7cd-rdfqk\" (UID: \"05edfaa7-98a2-4fd0-9840-27fc19cabc02\") " pod="openshift-image-registry/image-registry-58497df7cd-rdfqk"
Apr 16 20:40:30.453986 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.453961 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/05edfaa7-98a2-4fd0-9840-27fc19cabc02-trusted-ca\") pod \"image-registry-58497df7cd-rdfqk\" (UID: \"05edfaa7-98a2-4fd0-9840-27fc19cabc02\") " pod="openshift-image-registry/image-registry-58497df7cd-rdfqk"
Apr 16 20:40:30.455702 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.455680 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/05edfaa7-98a2-4fd0-9840-27fc19cabc02-image-registry-private-configuration\") pod \"image-registry-58497df7cd-rdfqk\" (UID: \"05edfaa7-98a2-4fd0-9840-27fc19cabc02\") " pod="openshift-image-registry/image-registry-58497df7cd-rdfqk"
Apr 16 20:40:30.455822 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.455807 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/05edfaa7-98a2-4fd0-9840-27fc19cabc02-installation-pull-secrets\") pod \"image-registry-58497df7cd-rdfqk\" (UID: \"05edfaa7-98a2-4fd0-9840-27fc19cabc02\") " pod="openshift-image-registry/image-registry-58497df7cd-rdfqk"
Apr 16 20:40:30.456016 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.455992 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/05edfaa7-98a2-4fd0-9840-27fc19cabc02-registry-tls\") pod \"image-registry-58497df7cd-rdfqk\" (UID: \"05edfaa7-98a2-4fd0-9840-27fc19cabc02\") " pod="openshift-image-registry/image-registry-58497df7cd-rdfqk"
Apr 16 20:40:30.466691 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.466668 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/05edfaa7-98a2-4fd0-9840-27fc19cabc02-bound-sa-token\") pod \"image-registry-58497df7cd-rdfqk\" (UID: \"05edfaa7-98a2-4fd0-9840-27fc19cabc02\") " pod="openshift-image-registry/image-registry-58497df7cd-rdfqk"
Apr 16 20:40:30.466983 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.466967 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8srcd\" (UniqueName: \"kubernetes.io/projected/05edfaa7-98a2-4fd0-9840-27fc19cabc02-kube-api-access-8srcd\") pod \"image-registry-58497df7cd-rdfqk\" (UID: \"05edfaa7-98a2-4fd0-9840-27fc19cabc02\") " pod="openshift-image-registry/image-registry-58497df7cd-rdfqk"
Apr 16 20:40:30.528312 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.527801 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-58497df7cd-rdfqk"
Apr 16 20:40:30.612152 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.612121 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-s5xct"]
Apr 16 20:40:30.681501 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.681472 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-58497df7cd-rdfqk"]
Apr 16 20:40:30.901478 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:30.901435 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-s5xct" event={"ID":"a3cf8644-3194-4619-b219-a5991ba494bc","Type":"ContainerStarted","Data":"7050e2fda862ef312f0d785f094d8e61f83f8bdcda058bd9b782cf3151333a91"}
Apr 16 20:40:31.079790 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:40:31.079754 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05edfaa7_98a2_4fd0_9840_27fc19cabc02.slice/crio-cc692ea67021f3283dc0cda2bd18e5e1653fc8dd1ef08c6c3f48de8c573d139b WatchSource:0}: Error finding container cc692ea67021f3283dc0cda2bd18e5e1653fc8dd1ef08c6c3f48de8c573d139b: Status 404 returned error can't find the container with id cc692ea67021f3283dc0cda2bd18e5e1653fc8dd1ef08c6c3f48de8c573d139b
Apr 16 20:40:31.905543 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:31.905504 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-s5xct" event={"ID":"a3cf8644-3194-4619-b219-a5991ba494bc","Type":"ContainerStarted","Data":"b822d9b4547fc9b47446a3a29fd662f31b195361748b47bc6ec049b39adff02a"}
Apr 16 20:40:31.905543 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:31.905545 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-s5xct" event={"ID":"a3cf8644-3194-4619-b219-a5991ba494bc","Type":"ContainerStarted","Data":"a6968d4814f68c38877e953e795f973989fb4d858a317876c8f097d3195f80c1"}
Apr 16 20:40:31.906741 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:31.906717 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-58497df7cd-rdfqk" event={"ID":"05edfaa7-98a2-4fd0-9840-27fc19cabc02","Type":"ContainerStarted","Data":"7116e3bd65e2635297349f2bedcea2f1d1bde36568d8f16c0bbe259b1fb18a0e"}
Apr 16 20:40:31.906741 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:31.906746 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-58497df7cd-rdfqk" event={"ID":"05edfaa7-98a2-4fd0-9840-27fc19cabc02","Type":"ContainerStarted","Data":"cc692ea67021f3283dc0cda2bd18e5e1653fc8dd1ef08c6c3f48de8c573d139b"}
Apr 16 20:40:31.906901 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:31.906787 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-58497df7cd-rdfqk"
Apr 16 20:40:31.907854 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:31.907833 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vn7tq" event={"ID":"959a1948-886a-4bc5-bf25-72b0e2b30d8d","Type":"ContainerStarted","Data":"53134cc1c5cde2ab371e389dbfad0f27c8c0f66d6553d2f75a5d8d15debfaa8c"}
Apr 16 20:40:31.930559 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:31.930490 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-58497df7cd-rdfqk" podStartSLOduration=1.930478685 podStartE2EDuration="1.930478685s" podCreationTimestamp="2026-04-16 20:40:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:40:31.929746097 +0000 UTC m=+153.081842150" watchObservedRunningTime="2026-04-16 20:40:31.930478685 +0000 UTC m=+153.082574737"
Apr 16 20:40:31.947836 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:31.947799 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-vn7tq" podStartSLOduration=33.81729027 podStartE2EDuration="35.947786817s" podCreationTimestamp="2026-04-16 20:39:56 +0000 UTC" firstStartedPulling="2026-04-16 20:40:29.001199908 +0000 UTC m=+150.153295939" lastFinishedPulling="2026-04-16 20:40:31.131696454 +0000 UTC m=+152.283792486" observedRunningTime="2026-04-16 20:40:31.947195566 +0000 UTC m=+153.099291630" watchObservedRunningTime="2026-04-16 20:40:31.947786817 +0000 UTC m=+153.099882897"
Apr 16 20:40:33.914899 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:33.914862 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-s5xct" event={"ID":"a3cf8644-3194-4619-b219-a5991ba494bc","Type":"ContainerStarted","Data":"a4c0b23e80b9a1479682fc493915287ef0325852dcfc817725b41c12cc326b0c"}
Apr 16 20:40:33.942481 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:33.942438 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-s5xct" podStartSLOduration=2.086633719 podStartE2EDuration="3.942425258s" podCreationTimestamp="2026-04-16 20:40:30 +0000 UTC" firstStartedPulling="2026-04-16 20:40:31.12288421 +0000 UTC m=+152.274980241" lastFinishedPulling="2026-04-16 20:40:32.978675748 +0000 UTC m=+154.130771780" observedRunningTime="2026-04-16 20:40:33.941044987 +0000 UTC m=+155.093141065" watchObservedRunningTime="2026-04-16 20:40:33.942425258 +0000 UTC m=+155.094521310"
Apr 16 20:40:34.747691 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:40:34.747652 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-7lm8k" podUID="aa099128-e7b3-453f-a700-69d4e48f8448"
Apr 16 20:40:34.758775 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:40:34.758746 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-f9kkl" podUID="35692de4-3b87-4697-b519-4f55d1e81778"
Apr 16 20:40:34.920273 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:34.919064 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7lm8k"
Apr 16 20:40:36.440345 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:40:36.440302 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-5jhhm" podUID="422c9f50-4f45-46bc-9e9d-5c4f1c20c115"
Apr 16 20:40:39.619607 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:39.619571 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aa099128-e7b3-453f-a700-69d4e48f8448-metrics-tls\") pod \"dns-default-7lm8k\" (UID: \"aa099128-e7b3-453f-a700-69d4e48f8448\") " pod="openshift-dns/dns-default-7lm8k"
Apr 16 20:40:39.620002 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:39.619645 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35692de4-3b87-4697-b519-4f55d1e81778-cert\") pod \"ingress-canary-f9kkl\" (UID: \"35692de4-3b87-4697-b519-4f55d1e81778\") " pod="openshift-ingress-canary/ingress-canary-f9kkl"
Apr 16 20:40:39.621947 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:39.621921 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aa099128-e7b3-453f-a700-69d4e48f8448-metrics-tls\") pod \"dns-default-7lm8k\" (UID: \"aa099128-e7b3-453f-a700-69d4e48f8448\") " pod="openshift-dns/dns-default-7lm8k"
Apr 16 20:40:39.622055 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:39.622032 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35692de4-3b87-4697-b519-4f55d1e81778-cert\") pod \"ingress-canary-f9kkl\" (UID: \"35692de4-3b87-4697-b519-4f55d1e81778\") " pod="openshift-ingress-canary/ingress-canary-f9kkl"
Apr 16 20:40:39.723047 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:39.723022 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wwh4f\""
Apr 16 20:40:39.731258 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:39.731236 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7lm8k"
Apr 16 20:40:39.849932 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:39.849898 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7lm8k"]
Apr 16 20:40:39.852760 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:40:39.852735 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa099128_e7b3_453f_a700_69d4e48f8448.slice/crio-a9422d452e72c182b480438f724ac31b0aa65fb6940c6ffa9a0dcbeca62dc50b WatchSource:0}: Error finding container a9422d452e72c182b480438f724ac31b0aa65fb6940c6ffa9a0dcbeca62dc50b: Status 404 returned error can't find the container with id a9422d452e72c182b480438f724ac31b0aa65fb6940c6ffa9a0dcbeca62dc50b
Apr 16 20:40:39.930754 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:39.930692 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7lm8k"
event={"ID":"aa099128-e7b3-453f-a700-69d4e48f8448","Type":"ContainerStarted","Data":"a9422d452e72c182b480438f724ac31b0aa65fb6940c6ffa9a0dcbeca62dc50b"} Apr 16 20:40:41.154970 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.154944 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-cfgsz"] Apr 16 20:40:41.165567 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.165536 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-cfgsz" Apr 16 20:40:41.169632 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.168653 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 20:40:41.169632 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.168987 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 20:40:41.169632 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.169179 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-fn5z7\"" Apr 16 20:40:41.169632 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.169331 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 20:40:41.169632 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.169492 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 20:40:41.232105 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.232080 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0-root\") pod \"node-exporter-cfgsz\" (UID: \"6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0\") " 
pod="openshift-monitoring/node-exporter-cfgsz" Apr 16 20:40:41.232218 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.232121 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cfgsz\" (UID: \"6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0\") " pod="openshift-monitoring/node-exporter-cfgsz" Apr 16 20:40:41.232218 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.232152 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0-node-exporter-textfile\") pod \"node-exporter-cfgsz\" (UID: \"6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0\") " pod="openshift-monitoring/node-exporter-cfgsz" Apr 16 20:40:41.232218 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.232184 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0-node-exporter-accelerators-collector-config\") pod \"node-exporter-cfgsz\" (UID: \"6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0\") " pod="openshift-monitoring/node-exporter-cfgsz" Apr 16 20:40:41.232387 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.232222 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0-node-exporter-wtmp\") pod \"node-exporter-cfgsz\" (UID: \"6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0\") " pod="openshift-monitoring/node-exporter-cfgsz" Apr 16 20:40:41.232387 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.232249 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0-metrics-client-ca\") pod \"node-exporter-cfgsz\" (UID: \"6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0\") " pod="openshift-monitoring/node-exporter-cfgsz" Apr 16 20:40:41.232387 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.232285 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0-sys\") pod \"node-exporter-cfgsz\" (UID: \"6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0\") " pod="openshift-monitoring/node-exporter-cfgsz" Apr 16 20:40:41.232387 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.232315 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0-node-exporter-tls\") pod \"node-exporter-cfgsz\" (UID: \"6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0\") " pod="openshift-monitoring/node-exporter-cfgsz" Apr 16 20:40:41.232387 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.232366 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxj2p\" (UniqueName: \"kubernetes.io/projected/6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0-kube-api-access-sxj2p\") pod \"node-exporter-cfgsz\" (UID: \"6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0\") " pod="openshift-monitoring/node-exporter-cfgsz" Apr 16 20:40:41.334012 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.333662 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0-root\") pod \"node-exporter-cfgsz\" (UID: \"6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0\") " pod="openshift-monitoring/node-exporter-cfgsz" Apr 16 20:40:41.334012 ip-10-0-134-79 
kubenswrapper[2575]: I0416 20:40:41.333711 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cfgsz\" (UID: \"6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0\") " pod="openshift-monitoring/node-exporter-cfgsz" Apr 16 20:40:41.334012 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.333741 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0-node-exporter-textfile\") pod \"node-exporter-cfgsz\" (UID: \"6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0\") " pod="openshift-monitoring/node-exporter-cfgsz" Apr 16 20:40:41.334012 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.333771 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0-node-exporter-accelerators-collector-config\") pod \"node-exporter-cfgsz\" (UID: \"6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0\") " pod="openshift-monitoring/node-exporter-cfgsz" Apr 16 20:40:41.334012 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.333811 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0-node-exporter-wtmp\") pod \"node-exporter-cfgsz\" (UID: \"6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0\") " pod="openshift-monitoring/node-exporter-cfgsz" Apr 16 20:40:41.334012 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.333838 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0-metrics-client-ca\") pod \"node-exporter-cfgsz\" (UID: 
\"6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0\") " pod="openshift-monitoring/node-exporter-cfgsz" Apr 16 20:40:41.334012 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.333864 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0-sys\") pod \"node-exporter-cfgsz\" (UID: \"6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0\") " pod="openshift-monitoring/node-exporter-cfgsz" Apr 16 20:40:41.334012 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.333875 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0-root\") pod \"node-exporter-cfgsz\" (UID: \"6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0\") " pod="openshift-monitoring/node-exporter-cfgsz" Apr 16 20:40:41.334012 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:40:41.333997 2575 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 20:40:41.334574 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:40:41.334056 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0-node-exporter-tls podName:6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0 nodeName:}" failed. No retries permitted until 2026-04-16 20:40:41.834036694 +0000 UTC m=+162.986132742 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0-node-exporter-tls") pod "node-exporter-cfgsz" (UID: "6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0") : secret "node-exporter-tls" not found Apr 16 20:40:41.334574 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.334105 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0-node-exporter-textfile\") pod \"node-exporter-cfgsz\" (UID: \"6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0\") " pod="openshift-monitoring/node-exporter-cfgsz" Apr 16 20:40:41.334574 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.333890 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0-node-exporter-tls\") pod \"node-exporter-cfgsz\" (UID: \"6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0\") " pod="openshift-monitoring/node-exporter-cfgsz" Apr 16 20:40:41.334574 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.334170 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sxj2p\" (UniqueName: \"kubernetes.io/projected/6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0-kube-api-access-sxj2p\") pod \"node-exporter-cfgsz\" (UID: \"6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0\") " pod="openshift-monitoring/node-exporter-cfgsz" Apr 16 20:40:41.334574 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.334245 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0-node-exporter-wtmp\") pod \"node-exporter-cfgsz\" (UID: \"6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0\") " pod="openshift-monitoring/node-exporter-cfgsz" Apr 16 20:40:41.334574 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.334305 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0-sys\") pod \"node-exporter-cfgsz\" (UID: \"6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0\") " pod="openshift-monitoring/node-exporter-cfgsz" Apr 16 20:40:41.335073 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.335031 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0-node-exporter-accelerators-collector-config\") pod \"node-exporter-cfgsz\" (UID: \"6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0\") " pod="openshift-monitoring/node-exporter-cfgsz" Apr 16 20:40:41.335073 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.335046 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0-metrics-client-ca\") pod \"node-exporter-cfgsz\" (UID: \"6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0\") " pod="openshift-monitoring/node-exporter-cfgsz" Apr 16 20:40:41.337838 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.337815 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cfgsz\" (UID: \"6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0\") " pod="openshift-monitoring/node-exporter-cfgsz" Apr 16 20:40:41.349183 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.349138 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxj2p\" (UniqueName: \"kubernetes.io/projected/6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0-kube-api-access-sxj2p\") pod \"node-exporter-cfgsz\" (UID: \"6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0\") " pod="openshift-monitoring/node-exporter-cfgsz" Apr 16 20:40:41.442113 
ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.442087 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-wvhq9"] Apr 16 20:40:41.444971 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.444954 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-wvhq9" Apr 16 20:40:41.450046 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.450025 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 20:40:41.450192 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.450029 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-t8chc\"" Apr 16 20:40:41.450292 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.450277 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 20:40:41.461846 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.461825 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-wvhq9"] Apr 16 20:40:41.535460 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.535440 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvbwk\" (UniqueName: \"kubernetes.io/projected/4d826043-195a-403c-a1c0-b18a4ddf86fa-kube-api-access-nvbwk\") pod \"downloads-6bcc868b7-wvhq9\" (UID: \"4d826043-195a-403c-a1c0-b18a4ddf86fa\") " pod="openshift-console/downloads-6bcc868b7-wvhq9" Apr 16 20:40:41.636858 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.636808 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nvbwk\" (UniqueName: \"kubernetes.io/projected/4d826043-195a-403c-a1c0-b18a4ddf86fa-kube-api-access-nvbwk\") pod \"downloads-6bcc868b7-wvhq9\" (UID: \"4d826043-195a-403c-a1c0-b18a4ddf86fa\") " 
pod="openshift-console/downloads-6bcc868b7-wvhq9" Apr 16 20:40:41.659120 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.659043 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvbwk\" (UniqueName: \"kubernetes.io/projected/4d826043-195a-403c-a1c0-b18a4ddf86fa-kube-api-access-nvbwk\") pod \"downloads-6bcc868b7-wvhq9\" (UID: \"4d826043-195a-403c-a1c0-b18a4ddf86fa\") " pod="openshift-console/downloads-6bcc868b7-wvhq9" Apr 16 20:40:41.753074 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.753045 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-wvhq9" Apr 16 20:40:41.838135 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.838103 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0-node-exporter-tls\") pod \"node-exporter-cfgsz\" (UID: \"6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0\") " pod="openshift-monitoring/node-exporter-cfgsz" Apr 16 20:40:41.840344 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.840317 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0-node-exporter-tls\") pod \"node-exporter-cfgsz\" (UID: \"6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0\") " pod="openshift-monitoring/node-exporter-cfgsz" Apr 16 20:40:41.914899 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.914876 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-wvhq9"] Apr 16 20:40:41.917291 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:40:41.917247 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d826043_195a_403c_a1c0_b18a4ddf86fa.slice/crio-275a27b2db85508739c486ba786bde7f90b4a31a6c28e8aa832413ef96949cb8 WatchSource:0}: 
Error finding container 275a27b2db85508739c486ba786bde7f90b4a31a6c28e8aa832413ef96949cb8: Status 404 returned error can't find the container with id 275a27b2db85508739c486ba786bde7f90b4a31a6c28e8aa832413ef96949cb8 Apr 16 20:40:41.936830 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.936794 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7lm8k" event={"ID":"aa099128-e7b3-453f-a700-69d4e48f8448","Type":"ContainerStarted","Data":"18faceae73c673b4d219510e13b133a52b51d1b15279b9e768d2670ee9bac202"} Apr 16 20:40:41.936947 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.936833 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7lm8k" event={"ID":"aa099128-e7b3-453f-a700-69d4e48f8448","Type":"ContainerStarted","Data":"7ec7744db6a368833b1131ca0a072e5d88ca7c77e448b5f9b5a4d7dfe4e1fa0b"} Apr 16 20:40:41.937039 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.937018 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-7lm8k" Apr 16 20:40:41.937808 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.937787 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-wvhq9" event={"ID":"4d826043-195a-403c-a1c0-b18a4ddf86fa","Type":"ContainerStarted","Data":"275a27b2db85508739c486ba786bde7f90b4a31a6c28e8aa832413ef96949cb8"} Apr 16 20:40:41.973198 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:41.973159 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-7lm8k" podStartSLOduration=129.689396132 podStartE2EDuration="2m10.973148686s" podCreationTimestamp="2026-04-16 20:38:31 +0000 UTC" firstStartedPulling="2026-04-16 20:40:39.854431889 +0000 UTC m=+161.006527920" lastFinishedPulling="2026-04-16 20:40:41.138184428 +0000 UTC m=+162.290280474" observedRunningTime="2026-04-16 20:40:41.970596509 +0000 UTC m=+163.122692564" watchObservedRunningTime="2026-04-16 
20:40:41.973148686 +0000 UTC m=+163.125244738" Apr 16 20:40:42.080370 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.080347 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-cfgsz" Apr 16 20:40:42.087813 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:40:42.087785 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e75c8c4_70f1_4a2c_ae99_b103bc04e1b0.slice/crio-82ff0055aa41a7357e0e864dc4af416b54cb358b9312b1c1eb573ce69e81df5f WatchSource:0}: Error finding container 82ff0055aa41a7357e0e864dc4af416b54cb358b9312b1c1eb573ce69e81df5f: Status 404 returned error can't find the container with id 82ff0055aa41a7357e0e864dc4af416b54cb358b9312b1c1eb573ce69e81df5f Apr 16 20:40:42.461494 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.461462 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 20:40:42.465114 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.465090 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.468960 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.468937 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 20:40:42.469098 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.469000 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 20:40:42.469098 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.469004 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-wzssz\"" Apr 16 20:40:42.469218 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.468945 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 20:40:42.469218 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.469147 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 20:40:42.469218 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.469209 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 20:40:42.469359 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.469210 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 20:40:42.469529 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.469512 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 20:40:42.469745 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.469730 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 20:40:42.469915 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.469893 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 20:40:42.504549 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.504509 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 20:40:42.543902 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.543841 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.543902 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.543901 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-config-volume\") pod \"alertmanager-main-0\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.544137 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.543930 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.544137 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.543960 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/228c50a5-2d6a-477d-ade1-7022cad32554-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.544137 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.544008 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.544137 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.544081 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/228c50a5-2d6a-477d-ade1-7022cad32554-tls-assets\") pod \"alertmanager-main-0\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.544137 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.544123 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/228c50a5-2d6a-477d-ade1-7022cad32554-config-out\") pod \"alertmanager-main-0\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.544334 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.544149 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kkc4\" (UniqueName: \"kubernetes.io/projected/228c50a5-2d6a-477d-ade1-7022cad32554-kube-api-access-8kkc4\") pod \"alertmanager-main-0\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.544334 ip-10-0-134-79 
kubenswrapper[2575]: I0416 20:40:42.544195 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/228c50a5-2d6a-477d-ade1-7022cad32554-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.544334 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.544227 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.544334 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.544261 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/228c50a5-2d6a-477d-ade1-7022cad32554-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.544334 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.544304 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-web-config\") pod \"alertmanager-main-0\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.544334 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.544329 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.645791 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.645236 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-web-config\") pod \"alertmanager-main-0\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.645791 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.645286 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.645791 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.645323 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.645791 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.645374 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-config-volume\") pod \"alertmanager-main-0\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.645791 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.645413 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.645791 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.645445 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/228c50a5-2d6a-477d-ade1-7022cad32554-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.645791 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.645474 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.645791 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.645509 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/228c50a5-2d6a-477d-ade1-7022cad32554-tls-assets\") pod \"alertmanager-main-0\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.645791 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.645532 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/228c50a5-2d6a-477d-ade1-7022cad32554-config-out\") pod \"alertmanager-main-0\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.645791 ip-10-0-134-79 
kubenswrapper[2575]: I0416 20:40:42.645554 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8kkc4\" (UniqueName: \"kubernetes.io/projected/228c50a5-2d6a-477d-ade1-7022cad32554-kube-api-access-8kkc4\") pod \"alertmanager-main-0\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.645791 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.645630 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/228c50a5-2d6a-477d-ade1-7022cad32554-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.645791 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.645664 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.645791 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.645698 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/228c50a5-2d6a-477d-ade1-7022cad32554-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.646542 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.646202 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/228c50a5-2d6a-477d-ade1-7022cad32554-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.647254 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.647220 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/228c50a5-2d6a-477d-ade1-7022cad32554-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.647608 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.647585 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/228c50a5-2d6a-477d-ade1-7022cad32554-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.651148 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.651105 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.651514 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.651468 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.653732 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.653578 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-config-volume\") pod \"alertmanager-main-0\" 
(UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.654355 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.654178 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.654835 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.654815 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.655208 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.655180 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/228c50a5-2d6a-477d-ade1-7022cad32554-config-out\") pod \"alertmanager-main-0\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.655726 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.655570 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.655726 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.655687 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-web-config\") pod \"alertmanager-main-0\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.655943 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.655904 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/228c50a5-2d6a-477d-ade1-7022cad32554-tls-assets\") pod \"alertmanager-main-0\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.663907 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.663856 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kkc4\" (UniqueName: \"kubernetes.io/projected/228c50a5-2d6a-477d-ade1-7022cad32554-kube-api-access-8kkc4\") pod \"alertmanager-main-0\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.777389 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.777286 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:40:42.949903 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.949869 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cfgsz" event={"ID":"6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0","Type":"ContainerStarted","Data":"82ff0055aa41a7357e0e864dc4af416b54cb358b9312b1c1eb573ce69e81df5f"} Apr 16 20:40:42.973404 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:42.973381 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 20:40:42.979235 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:40:42.979209 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod228c50a5_2d6a_477d_ade1_7022cad32554.slice/crio-2ae86bb612fa7410ba0681e5ebe75d3afe033c362a3215eaf8bef6b45799b773 WatchSource:0}: Error finding container 2ae86bb612fa7410ba0681e5ebe75d3afe033c362a3215eaf8bef6b45799b773: Status 404 returned error can't find the container with id 2ae86bb612fa7410ba0681e5ebe75d3afe033c362a3215eaf8bef6b45799b773 Apr 16 20:40:43.954713 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:43.954675 2575 generic.go:358] "Generic (PLEG): container finished" podID="6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0" containerID="d9222d71aaf75b2bc4a5dfb20204ffb7d49a65af51de1fd58f1310696027f299" exitCode=0 Apr 16 20:40:43.955150 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:43.954773 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cfgsz" event={"ID":"6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0","Type":"ContainerDied","Data":"d9222d71aaf75b2bc4a5dfb20204ffb7d49a65af51de1fd58f1310696027f299"} Apr 16 20:40:43.956025 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:43.955989 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"228c50a5-2d6a-477d-ade1-7022cad32554","Type":"ContainerStarted","Data":"2ae86bb612fa7410ba0681e5ebe75d3afe033c362a3215eaf8bef6b45799b773"} Apr 16 20:40:44.961343 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:44.961302 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cfgsz" event={"ID":"6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0","Type":"ContainerStarted","Data":"3a806143b60d284a87d5cb8d63fe1fa6da4f44bbef0dde013271cc7d728d180f"} Apr 16 20:40:44.961343 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:44.961345 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cfgsz" event={"ID":"6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0","Type":"ContainerStarted","Data":"0f44983e193f37f17f11893b406616f35a817d2825bf280d34ee22d3acf09ead"} Apr 16 20:40:44.962695 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:44.962663 2575 generic.go:358] "Generic (PLEG): container finished" podID="228c50a5-2d6a-477d-ade1-7022cad32554" containerID="361bf9ba0223ea7b0fe8dbbce147e9427da77c7e9d0df64698316821fd77cf1d" exitCode=0 Apr 16 20:40:44.962836 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:44.962724 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"228c50a5-2d6a-477d-ade1-7022cad32554","Type":"ContainerDied","Data":"361bf9ba0223ea7b0fe8dbbce147e9427da77c7e9d0df64698316821fd77cf1d"} Apr 16 20:40:44.984653 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:44.984595 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-cfgsz" podStartSLOduration=3.209835009 podStartE2EDuration="3.984579209s" podCreationTimestamp="2026-04-16 20:40:41 +0000 UTC" firstStartedPulling="2026-04-16 20:40:42.089368168 +0000 UTC m=+163.241464199" lastFinishedPulling="2026-04-16 20:40:42.864112361 +0000 UTC m=+164.016208399" observedRunningTime="2026-04-16 20:40:44.982545097 +0000 UTC m=+166.134641174" 
watchObservedRunningTime="2026-04-16 20:40:44.984579209 +0000 UTC m=+166.136675262" Apr 16 20:40:46.971936 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:46.971900 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"228c50a5-2d6a-477d-ade1-7022cad32554","Type":"ContainerStarted","Data":"f74a382365ce49ced8863cd182cb594ffc5a9ee05e19f9d0c0433f30086c27ea"} Apr 16 20:40:46.972277 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:46.971947 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"228c50a5-2d6a-477d-ade1-7022cad32554","Type":"ContainerStarted","Data":"c518d13d064cbc34c1df70d4d20f52fe265b9bf2eff47b121c030f967bec44be"} Apr 16 20:40:46.972277 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:46.971959 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"228c50a5-2d6a-477d-ade1-7022cad32554","Type":"ContainerStarted","Data":"020e9f44339d02ccfeee10ea865643dd445e2bebfc421359d592a39db3164fed"} Apr 16 20:40:46.972277 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:46.971967 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"228c50a5-2d6a-477d-ade1-7022cad32554","Type":"ContainerStarted","Data":"ab69550c971ae2b1b57ca02748ec3288cf41d4cf4105e7422cd8d1f20b2a448a"} Apr 16 20:40:46.972277 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:46.971976 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"228c50a5-2d6a-477d-ade1-7022cad32554","Type":"ContainerStarted","Data":"bac2eab82036961d1f36f8a1ba1d0f9b0bd5b47f6fab4d5c52c3701f757209d9"} Apr 16 20:40:47.320060 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:47.319976 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5dd8686788-v465s"] Apr 16 20:40:47.323058 ip-10-0-134-79 
kubenswrapper[2575]: I0416 20:40:47.323036 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5dd8686788-v465s" Apr 16 20:40:47.326465 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:47.326388 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 20:40:47.326465 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:47.326391 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-pdmrs\"" Apr 16 20:40:47.326465 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:47.326442 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 20:40:47.326733 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:47.326478 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 20:40:47.326733 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:47.326434 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 20:40:47.326733 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:47.326524 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 20:40:47.331867 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:47.331845 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5dd8686788-v465s"] Apr 16 20:40:47.391891 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:47.391864 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4501b828-24de-44b5-8e85-c0f813c98960-console-config\") pod \"console-5dd8686788-v465s\" (UID: \"4501b828-24de-44b5-8e85-c0f813c98960\") " 
pod="openshift-console/console-5dd8686788-v465s" Apr 16 20:40:47.392018 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:47.391901 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4501b828-24de-44b5-8e85-c0f813c98960-console-oauth-config\") pod \"console-5dd8686788-v465s\" (UID: \"4501b828-24de-44b5-8e85-c0f813c98960\") " pod="openshift-console/console-5dd8686788-v465s" Apr 16 20:40:47.392018 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:47.391929 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4501b828-24de-44b5-8e85-c0f813c98960-service-ca\") pod \"console-5dd8686788-v465s\" (UID: \"4501b828-24de-44b5-8e85-c0f813c98960\") " pod="openshift-console/console-5dd8686788-v465s" Apr 16 20:40:47.392103 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:47.392024 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4501b828-24de-44b5-8e85-c0f813c98960-console-serving-cert\") pod \"console-5dd8686788-v465s\" (UID: \"4501b828-24de-44b5-8e85-c0f813c98960\") " pod="openshift-console/console-5dd8686788-v465s" Apr 16 20:40:47.392103 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:47.392051 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4501b828-24de-44b5-8e85-c0f813c98960-oauth-serving-cert\") pod \"console-5dd8686788-v465s\" (UID: \"4501b828-24de-44b5-8e85-c0f813c98960\") " pod="openshift-console/console-5dd8686788-v465s" Apr 16 20:40:47.392103 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:47.392071 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq8tt\" (UniqueName: 
\"kubernetes.io/projected/4501b828-24de-44b5-8e85-c0f813c98960-kube-api-access-fq8tt\") pod \"console-5dd8686788-v465s\" (UID: \"4501b828-24de-44b5-8e85-c0f813c98960\") " pod="openshift-console/console-5dd8686788-v465s" Apr 16 20:40:47.421934 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:47.421909 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-f9kkl" Apr 16 20:40:47.424457 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:47.424439 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-7vq8v\"" Apr 16 20:40:47.433061 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:47.433046 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-f9kkl" Apr 16 20:40:47.494115 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:47.492742 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4501b828-24de-44b5-8e85-c0f813c98960-console-serving-cert\") pod \"console-5dd8686788-v465s\" (UID: \"4501b828-24de-44b5-8e85-c0f813c98960\") " pod="openshift-console/console-5dd8686788-v465s" Apr 16 20:40:47.494115 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:47.492784 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4501b828-24de-44b5-8e85-c0f813c98960-oauth-serving-cert\") pod \"console-5dd8686788-v465s\" (UID: \"4501b828-24de-44b5-8e85-c0f813c98960\") " pod="openshift-console/console-5dd8686788-v465s" Apr 16 20:40:47.494115 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:47.492820 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fq8tt\" (UniqueName: \"kubernetes.io/projected/4501b828-24de-44b5-8e85-c0f813c98960-kube-api-access-fq8tt\") pod 
\"console-5dd8686788-v465s\" (UID: \"4501b828-24de-44b5-8e85-c0f813c98960\") " pod="openshift-console/console-5dd8686788-v465s" Apr 16 20:40:47.494115 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:47.492918 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4501b828-24de-44b5-8e85-c0f813c98960-console-config\") pod \"console-5dd8686788-v465s\" (UID: \"4501b828-24de-44b5-8e85-c0f813c98960\") " pod="openshift-console/console-5dd8686788-v465s" Apr 16 20:40:47.494115 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:47.492969 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4501b828-24de-44b5-8e85-c0f813c98960-console-oauth-config\") pod \"console-5dd8686788-v465s\" (UID: \"4501b828-24de-44b5-8e85-c0f813c98960\") " pod="openshift-console/console-5dd8686788-v465s" Apr 16 20:40:47.494115 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:47.492990 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4501b828-24de-44b5-8e85-c0f813c98960-service-ca\") pod \"console-5dd8686788-v465s\" (UID: \"4501b828-24de-44b5-8e85-c0f813c98960\") " pod="openshift-console/console-5dd8686788-v465s" Apr 16 20:40:47.494527 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:47.494316 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4501b828-24de-44b5-8e85-c0f813c98960-console-config\") pod \"console-5dd8686788-v465s\" (UID: \"4501b828-24de-44b5-8e85-c0f813c98960\") " pod="openshift-console/console-5dd8686788-v465s" Apr 16 20:40:47.494527 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:47.494349 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/4501b828-24de-44b5-8e85-c0f813c98960-service-ca\") pod \"console-5dd8686788-v465s\" (UID: \"4501b828-24de-44b5-8e85-c0f813c98960\") " pod="openshift-console/console-5dd8686788-v465s" Apr 16 20:40:47.494922 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:47.494889 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4501b828-24de-44b5-8e85-c0f813c98960-oauth-serving-cert\") pod \"console-5dd8686788-v465s\" (UID: \"4501b828-24de-44b5-8e85-c0f813c98960\") " pod="openshift-console/console-5dd8686788-v465s" Apr 16 20:40:47.498066 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:47.497588 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4501b828-24de-44b5-8e85-c0f813c98960-console-serving-cert\") pod \"console-5dd8686788-v465s\" (UID: \"4501b828-24de-44b5-8e85-c0f813c98960\") " pod="openshift-console/console-5dd8686788-v465s" Apr 16 20:40:47.498066 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:47.498002 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4501b828-24de-44b5-8e85-c0f813c98960-console-oauth-config\") pod \"console-5dd8686788-v465s\" (UID: \"4501b828-24de-44b5-8e85-c0f813c98960\") " pod="openshift-console/console-5dd8686788-v465s" Apr 16 20:40:47.502999 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:47.502977 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq8tt\" (UniqueName: \"kubernetes.io/projected/4501b828-24de-44b5-8e85-c0f813c98960-kube-api-access-fq8tt\") pod \"console-5dd8686788-v465s\" (UID: \"4501b828-24de-44b5-8e85-c0f813c98960\") " pod="openshift-console/console-5dd8686788-v465s" Apr 16 20:40:47.558054 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:47.557884 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-canary/ingress-canary-f9kkl"] Apr 16 20:40:47.560059 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:40:47.560024 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35692de4_3b87_4697_b519_4f55d1e81778.slice/crio-2a33c10cc6296af7642a51e20dcf527847bd1889e922ba4d0c3fc3f36f224105 WatchSource:0}: Error finding container 2a33c10cc6296af7642a51e20dcf527847bd1889e922ba4d0c3fc3f36f224105: Status 404 returned error can't find the container with id 2a33c10cc6296af7642a51e20dcf527847bd1889e922ba4d0c3fc3f36f224105 Apr 16 20:40:47.631793 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:47.631757 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5dd8686788-v465s" Apr 16 20:40:47.786464 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:47.786431 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5dd8686788-v465s"] Apr 16 20:40:47.925564 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:40:47.925494 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4501b828_24de_44b5_8e85_c0f813c98960.slice/crio-170f54b7c2ffd08aecbcf11afc6fab97f58f5b49223aa4ab10a14e7e0a44b532 WatchSource:0}: Error finding container 170f54b7c2ffd08aecbcf11afc6fab97f58f5b49223aa4ab10a14e7e0a44b532: Status 404 returned error can't find the container with id 170f54b7c2ffd08aecbcf11afc6fab97f58f5b49223aa4ab10a14e7e0a44b532 Apr 16 20:40:47.975679 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:47.975651 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-f9kkl" event={"ID":"35692de4-3b87-4697-b519-4f55d1e81778","Type":"ContainerStarted","Data":"2a33c10cc6296af7642a51e20dcf527847bd1889e922ba4d0c3fc3f36f224105"} Apr 16 20:40:47.977029 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:47.977000 2575 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5dd8686788-v465s" event={"ID":"4501b828-24de-44b5-8e85-c0f813c98960","Type":"ContainerStarted","Data":"170f54b7c2ffd08aecbcf11afc6fab97f58f5b49223aa4ab10a14e7e0a44b532"} Apr 16 20:40:48.985253 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:48.985200 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"228c50a5-2d6a-477d-ade1-7022cad32554","Type":"ContainerStarted","Data":"41fafbc10d4eff0ba2984cde372ca0d4d2667863a66725ee4b932551ef387d11"} Apr 16 20:40:49.023247 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:49.023187 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.035122414 podStartE2EDuration="7.023167886s" podCreationTimestamp="2026-04-16 20:40:42 +0000 UTC" firstStartedPulling="2026-04-16 20:40:42.980983425 +0000 UTC m=+164.133079459" lastFinishedPulling="2026-04-16 20:40:47.969028886 +0000 UTC m=+169.121124931" observedRunningTime="2026-04-16 20:40:49.019024914 +0000 UTC m=+170.171120969" watchObservedRunningTime="2026-04-16 20:40:49.023167886 +0000 UTC m=+170.175263940" Apr 16 20:40:51.421778 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:51.421703 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jhhm"
Apr 16 20:40:51.953114 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:51.953083 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-7lm8k"
Apr 16 20:40:51.997407 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:51.997372 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5dd8686788-v465s" event={"ID":"4501b828-24de-44b5-8e85-c0f813c98960","Type":"ContainerStarted","Data":"44a45caea9b70fbe14f5263d90af7258c6f4db9ddcf4d8af2c39f8ad05f41a5b"}
Apr 16 20:40:51.998690 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:51.998663 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-f9kkl" event={"ID":"35692de4-3b87-4697-b519-4f55d1e81778","Type":"ContainerStarted","Data":"6c7b38304a64f270e6efe39e46b8d27c9606dac8188dead8cd49e9cba68dfbe2"}
Apr 16 20:40:52.015904 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:52.015842 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5dd8686788-v465s" podStartSLOduration=1.762955416 podStartE2EDuration="5.015826253s" podCreationTimestamp="2026-04-16 20:40:47 +0000 UTC" firstStartedPulling="2026-04-16 20:40:47.92786578 +0000 UTC m=+169.079961813" lastFinishedPulling="2026-04-16 20:40:51.180736605 +0000 UTC m=+172.332832650" observedRunningTime="2026-04-16 20:40:52.015395354 +0000 UTC m=+173.167491400" watchObservedRunningTime="2026-04-16 20:40:52.015826253 +0000 UTC m=+173.167922308"
Apr 16 20:40:52.032728 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:52.032689 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-f9kkl" podStartSLOduration=137.461548604 podStartE2EDuration="2m21.032677286s" podCreationTimestamp="2026-04-16 20:38:31 +0000 UTC" firstStartedPulling="2026-04-16 20:40:47.562156231 +0000 UTC m=+168.714252262" lastFinishedPulling="2026-04-16 20:40:51.13328491 +0000 UTC m=+172.285380944" observedRunningTime="2026-04-16 20:40:52.032103131 +0000 UTC m=+173.184199176" watchObservedRunningTime="2026-04-16 20:40:52.032677286 +0000 UTC m=+173.184773338"
Apr 16 20:40:52.914957 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:52.914928 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-58497df7cd-rdfqk"
Apr 16 20:40:57.631997 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:57.631953 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5dd8686788-v465s"
Apr 16 20:40:57.631997 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:57.632000 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5dd8686788-v465s"
Apr 16 20:40:57.637151 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:57.637125 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5dd8686788-v465s"
Apr 16 20:40:58.020918 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:40:58.020839 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5dd8686788-v465s"
Apr 16 20:41:00.022455 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:00.022425 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-wvhq9" event={"ID":"4d826043-195a-403c-a1c0-b18a4ddf86fa","Type":"ContainerStarted","Data":"25090072e8743be158ce170f680fb7c54f037a5825b68b348762c79ff7dfc1b9"}
Apr 16 20:41:00.022871 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:00.022595 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-wvhq9"
Apr 16 20:41:00.023840 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:00.023814 2575 patch_prober.go:28] interesting pod/downloads-6bcc868b7-wvhq9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.134.0.14:8080/\": dial tcp 10.134.0.14:8080: connect: connection refused" start-of-body=
Apr 16 20:41:00.023942 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:00.023858 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-6bcc868b7-wvhq9" podUID="4d826043-195a-403c-a1c0-b18a4ddf86fa" containerName="download-server" probeResult="failure" output="Get \"http://10.134.0.14:8080/\": dial tcp 10.134.0.14:8080: connect: connection refused"
Apr 16 20:41:01.039959 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:01.039927 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-wvhq9"
Apr 16 20:41:01.068331 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:01.068274 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-wvhq9" podStartSLOduration=2.096400603 podStartE2EDuration="20.06825383s" podCreationTimestamp="2026-04-16 20:40:41 +0000 UTC" firstStartedPulling="2026-04-16 20:40:41.919158216 +0000 UTC m=+163.071254246" lastFinishedPulling="2026-04-16 20:40:59.891011442 +0000 UTC m=+181.043107473" observedRunningTime="2026-04-16 20:41:00.059525339 +0000 UTC m=+181.211621391" watchObservedRunningTime="2026-04-16 20:41:01.06825383 +0000 UTC m=+182.220349881"
Apr 16 20:41:08.179599 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:08.179560 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5dd8686788-v465s"]
Apr 16 20:41:26.092079 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:26.092042 2575 generic.go:358] "Generic (PLEG): container finished" podID="ca797f13-b8b1-4f9e-8374-336cc1c934f4" containerID="83c3d141ebc0a0b79bbfab4ab9d09ff7a1cf58c0ee755ee2dc31e3284a32da88" exitCode=0
Apr 16 20:41:26.092574 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:26.092093 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-x6rb4" event={"ID":"ca797f13-b8b1-4f9e-8374-336cc1c934f4","Type":"ContainerDied","Data":"83c3d141ebc0a0b79bbfab4ab9d09ff7a1cf58c0ee755ee2dc31e3284a32da88"}
Apr 16 20:41:26.092574 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:26.092517 2575 scope.go:117] "RemoveContainer" containerID="83c3d141ebc0a0b79bbfab4ab9d09ff7a1cf58c0ee755ee2dc31e3284a32da88"
Apr 16 20:41:27.097105 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:27.097074 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-x6rb4" event={"ID":"ca797f13-b8b1-4f9e-8374-336cc1c934f4","Type":"ContainerStarted","Data":"d4a884a3070dd83e1cb9b7d405ea3f9bb69d15a9a3e68a035ab94a954f76c360"}
Apr 16 20:41:33.204772 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:33.204715 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5dd8686788-v465s" podUID="4501b828-24de-44b5-8e85-c0f813c98960" containerName="console" containerID="cri-o://44a45caea9b70fbe14f5263d90af7258c6f4db9ddcf4d8af2c39f8ad05f41a5b" gracePeriod=15
Apr 16 20:41:33.470272 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:33.470251 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5dd8686788-v465s_4501b828-24de-44b5-8e85-c0f813c98960/console/0.log"
Apr 16 20:41:33.470379 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:33.470319 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5dd8686788-v465s"
Apr 16 20:41:33.482862 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:33.482842 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4501b828-24de-44b5-8e85-c0f813c98960-oauth-serving-cert\") pod \"4501b828-24de-44b5-8e85-c0f813c98960\" (UID: \"4501b828-24de-44b5-8e85-c0f813c98960\") "
Apr 16 20:41:33.482932 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:33.482871 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4501b828-24de-44b5-8e85-c0f813c98960-console-config\") pod \"4501b828-24de-44b5-8e85-c0f813c98960\" (UID: \"4501b828-24de-44b5-8e85-c0f813c98960\") "
Apr 16 20:41:33.482932 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:33.482901 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4501b828-24de-44b5-8e85-c0f813c98960-console-oauth-config\") pod \"4501b828-24de-44b5-8e85-c0f813c98960\" (UID: \"4501b828-24de-44b5-8e85-c0f813c98960\") "
Apr 16 20:41:33.483000 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:33.482944 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4501b828-24de-44b5-8e85-c0f813c98960-console-serving-cert\") pod \"4501b828-24de-44b5-8e85-c0f813c98960\" (UID: \"4501b828-24de-44b5-8e85-c0f813c98960\") "
Apr 16 20:41:33.483000 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:33.482962 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fq8tt\" (UniqueName: \"kubernetes.io/projected/4501b828-24de-44b5-8e85-c0f813c98960-kube-api-access-fq8tt\") pod \"4501b828-24de-44b5-8e85-c0f813c98960\" (UID: \"4501b828-24de-44b5-8e85-c0f813c98960\") "
Apr 16 20:41:33.483000 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:33.482991 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4501b828-24de-44b5-8e85-c0f813c98960-service-ca\") pod \"4501b828-24de-44b5-8e85-c0f813c98960\" (UID: \"4501b828-24de-44b5-8e85-c0f813c98960\") "
Apr 16 20:41:33.483343 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:33.483253 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4501b828-24de-44b5-8e85-c0f813c98960-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4501b828-24de-44b5-8e85-c0f813c98960" (UID: "4501b828-24de-44b5-8e85-c0f813c98960"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:41:33.483485 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:33.483440 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4501b828-24de-44b5-8e85-c0f813c98960-service-ca" (OuterVolumeSpecName: "service-ca") pod "4501b828-24de-44b5-8e85-c0f813c98960" (UID: "4501b828-24de-44b5-8e85-c0f813c98960"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:41:33.483485 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:33.483448 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4501b828-24de-44b5-8e85-c0f813c98960-console-config" (OuterVolumeSpecName: "console-config") pod "4501b828-24de-44b5-8e85-c0f813c98960" (UID: "4501b828-24de-44b5-8e85-c0f813c98960"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:41:33.485111 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:33.485085 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4501b828-24de-44b5-8e85-c0f813c98960-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4501b828-24de-44b5-8e85-c0f813c98960" (UID: "4501b828-24de-44b5-8e85-c0f813c98960"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:41:33.485202 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:33.485149 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4501b828-24de-44b5-8e85-c0f813c98960-kube-api-access-fq8tt" (OuterVolumeSpecName: "kube-api-access-fq8tt") pod "4501b828-24de-44b5-8e85-c0f813c98960" (UID: "4501b828-24de-44b5-8e85-c0f813c98960"). InnerVolumeSpecName "kube-api-access-fq8tt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:41:33.485202 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:33.485170 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4501b828-24de-44b5-8e85-c0f813c98960-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4501b828-24de-44b5-8e85-c0f813c98960" (UID: "4501b828-24de-44b5-8e85-c0f813c98960"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:41:33.583782 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:33.583754 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4501b828-24de-44b5-8e85-c0f813c98960-oauth-serving-cert\") on node \"ip-10-0-134-79.ec2.internal\" DevicePath \"\""
Apr 16 20:41:33.583782 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:33.583784 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4501b828-24de-44b5-8e85-c0f813c98960-console-config\") on node \"ip-10-0-134-79.ec2.internal\" DevicePath \"\""
Apr 16 20:41:33.583966 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:33.583797 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4501b828-24de-44b5-8e85-c0f813c98960-console-oauth-config\") on node \"ip-10-0-134-79.ec2.internal\" DevicePath \"\""
Apr 16 20:41:33.583966 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:33.583811 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4501b828-24de-44b5-8e85-c0f813c98960-console-serving-cert\") on node \"ip-10-0-134-79.ec2.internal\" DevicePath \"\""
Apr 16 20:41:33.583966 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:33.583824 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fq8tt\" (UniqueName: \"kubernetes.io/projected/4501b828-24de-44b5-8e85-c0f813c98960-kube-api-access-fq8tt\") on node \"ip-10-0-134-79.ec2.internal\" DevicePath \"\""
Apr 16 20:41:33.583966 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:33.583836 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4501b828-24de-44b5-8e85-c0f813c98960-service-ca\") on node \"ip-10-0-134-79.ec2.internal\" DevicePath \"\""
Apr 16 20:41:34.117602 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:34.117579 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5dd8686788-v465s_4501b828-24de-44b5-8e85-c0f813c98960/console/0.log"
Apr 16 20:41:34.117781 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:34.117630 2575 generic.go:358] "Generic (PLEG): container finished" podID="4501b828-24de-44b5-8e85-c0f813c98960" containerID="44a45caea9b70fbe14f5263d90af7258c6f4db9ddcf4d8af2c39f8ad05f41a5b" exitCode=2
Apr 16 20:41:34.117781 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:34.117660 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5dd8686788-v465s" event={"ID":"4501b828-24de-44b5-8e85-c0f813c98960","Type":"ContainerDied","Data":"44a45caea9b70fbe14f5263d90af7258c6f4db9ddcf4d8af2c39f8ad05f41a5b"}
Apr 16 20:41:34.117781 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:34.117695 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5dd8686788-v465s"
Apr 16 20:41:34.117781 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:34.117703 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5dd8686788-v465s" event={"ID":"4501b828-24de-44b5-8e85-c0f813c98960","Type":"ContainerDied","Data":"170f54b7c2ffd08aecbcf11afc6fab97f58f5b49223aa4ab10a14e7e0a44b532"}
Apr 16 20:41:34.117781 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:34.117718 2575 scope.go:117] "RemoveContainer" containerID="44a45caea9b70fbe14f5263d90af7258c6f4db9ddcf4d8af2c39f8ad05f41a5b"
Apr 16 20:41:34.126123 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:34.126107 2575 scope.go:117] "RemoveContainer" containerID="44a45caea9b70fbe14f5263d90af7258c6f4db9ddcf4d8af2c39f8ad05f41a5b"
Apr 16 20:41:34.126382 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:41:34.126363 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44a45caea9b70fbe14f5263d90af7258c6f4db9ddcf4d8af2c39f8ad05f41a5b\": container with ID starting with 44a45caea9b70fbe14f5263d90af7258c6f4db9ddcf4d8af2c39f8ad05f41a5b not found: ID does not exist" containerID="44a45caea9b70fbe14f5263d90af7258c6f4db9ddcf4d8af2c39f8ad05f41a5b"
Apr 16 20:41:34.126452 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:34.126393 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44a45caea9b70fbe14f5263d90af7258c6f4db9ddcf4d8af2c39f8ad05f41a5b"} err="failed to get container status \"44a45caea9b70fbe14f5263d90af7258c6f4db9ddcf4d8af2c39f8ad05f41a5b\": rpc error: code = NotFound desc = could not find container \"44a45caea9b70fbe14f5263d90af7258c6f4db9ddcf4d8af2c39f8ad05f41a5b\": container with ID starting with 44a45caea9b70fbe14f5263d90af7258c6f4db9ddcf4d8af2c39f8ad05f41a5b not found: ID does not exist"
Apr 16 20:41:34.137716 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:34.137696 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5dd8686788-v465s"]
Apr 16 20:41:34.142855 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:34.142835 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5dd8686788-v465s"]
Apr 16 20:41:35.425282 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:41:35.425248 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4501b828-24de-44b5-8e85-c0f813c98960" path="/var/lib/kubelet/pods/4501b828-24de-44b5-8e85-c0f813c98960/volumes"
Apr 16 20:42:01.426402 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:01.426370 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 20:42:01.426991 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:01.426941 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="228c50a5-2d6a-477d-ade1-7022cad32554" containerName="alertmanager" containerID="cri-o://bac2eab82036961d1f36f8a1ba1d0f9b0bd5b47f6fab4d5c52c3701f757209d9" gracePeriod=120
Apr 16 20:42:01.427055 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:01.426993 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="228c50a5-2d6a-477d-ade1-7022cad32554" containerName="prom-label-proxy" containerID="cri-o://41fafbc10d4eff0ba2984cde372ca0d4d2667863a66725ee4b932551ef387d11" gracePeriod=120
Apr 16 20:42:01.427055 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:01.426989 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="228c50a5-2d6a-477d-ade1-7022cad32554" containerName="kube-rbac-proxy-web" containerID="cri-o://020e9f44339d02ccfeee10ea865643dd445e2bebfc421359d592a39db3164fed" gracePeriod=120
Apr 16 20:42:01.427055 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:01.427004 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="228c50a5-2d6a-477d-ade1-7022cad32554" containerName="config-reloader" containerID="cri-o://ab69550c971ae2b1b57ca02748ec3288cf41d4cf4105e7422cd8d1f20b2a448a" gracePeriod=120
Apr 16 20:42:01.427055 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:01.426956 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="228c50a5-2d6a-477d-ade1-7022cad32554" containerName="kube-rbac-proxy-metric" containerID="cri-o://f74a382365ce49ced8863cd182cb594ffc5a9ee05e19f9d0c0433f30086c27ea" gracePeriod=120
Apr 16 20:42:01.427232 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:01.427060 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="228c50a5-2d6a-477d-ade1-7022cad32554" containerName="kube-rbac-proxy" containerID="cri-o://c518d13d064cbc34c1df70d4d20f52fe265b9bf2eff47b121c030f967bec44be" gracePeriod=120
Apr 16 20:42:02.193761 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.193732 2575 generic.go:358] "Generic (PLEG): container finished" podID="228c50a5-2d6a-477d-ade1-7022cad32554" containerID="41fafbc10d4eff0ba2984cde372ca0d4d2667863a66725ee4b932551ef387d11" exitCode=0
Apr 16 20:42:02.193761 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.193756 2575 generic.go:358] "Generic (PLEG): container finished" podID="228c50a5-2d6a-477d-ade1-7022cad32554" containerID="c518d13d064cbc34c1df70d4d20f52fe265b9bf2eff47b121c030f967bec44be" exitCode=0
Apr 16 20:42:02.193761 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.193762 2575 generic.go:358] "Generic (PLEG): container finished" podID="228c50a5-2d6a-477d-ade1-7022cad32554" containerID="ab69550c971ae2b1b57ca02748ec3288cf41d4cf4105e7422cd8d1f20b2a448a" exitCode=0
Apr 16 20:42:02.193761 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.193768 2575 generic.go:358] "Generic (PLEG): container finished" podID="228c50a5-2d6a-477d-ade1-7022cad32554" containerID="bac2eab82036961d1f36f8a1ba1d0f9b0bd5b47f6fab4d5c52c3701f757209d9" exitCode=0
Apr 16 20:42:02.194019 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.193799 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"228c50a5-2d6a-477d-ade1-7022cad32554","Type":"ContainerDied","Data":"41fafbc10d4eff0ba2984cde372ca0d4d2667863a66725ee4b932551ef387d11"}
Apr 16 20:42:02.194019 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.193832 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"228c50a5-2d6a-477d-ade1-7022cad32554","Type":"ContainerDied","Data":"c518d13d064cbc34c1df70d4d20f52fe265b9bf2eff47b121c030f967bec44be"}
Apr 16 20:42:02.194019 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.193843 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"228c50a5-2d6a-477d-ade1-7022cad32554","Type":"ContainerDied","Data":"ab69550c971ae2b1b57ca02748ec3288cf41d4cf4105e7422cd8d1f20b2a448a"}
Apr 16 20:42:02.194019 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.193852 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"228c50a5-2d6a-477d-ade1-7022cad32554","Type":"ContainerDied","Data":"bac2eab82036961d1f36f8a1ba1d0f9b0bd5b47f6fab4d5c52c3701f757209d9"}
Apr 16 20:42:02.666496 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.666472 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 20:42:02.791133 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.791054 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/228c50a5-2d6a-477d-ade1-7022cad32554-metrics-client-ca\") pod \"228c50a5-2d6a-477d-ade1-7022cad32554\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") "
Apr 16 20:42:02.791133 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.791091 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-secret-alertmanager-kube-rbac-proxy-metric\") pod \"228c50a5-2d6a-477d-ade1-7022cad32554\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") "
Apr 16 20:42:02.791320 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.791135 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-secret-alertmanager-main-tls\") pod \"228c50a5-2d6a-477d-ade1-7022cad32554\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") "
Apr 16 20:42:02.791320 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.791174 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/228c50a5-2d6a-477d-ade1-7022cad32554-alertmanager-trusted-ca-bundle\") pod \"228c50a5-2d6a-477d-ade1-7022cad32554\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") "
Apr 16 20:42:02.791320 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.791211 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/228c50a5-2d6a-477d-ade1-7022cad32554-alertmanager-main-db\") pod \"228c50a5-2d6a-477d-ade1-7022cad32554\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") "
Apr 16 20:42:02.791320 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.791240 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/228c50a5-2d6a-477d-ade1-7022cad32554-tls-assets\") pod \"228c50a5-2d6a-477d-ade1-7022cad32554\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") "
Apr 16 20:42:02.791320 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.791271 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-cluster-tls-config\") pod \"228c50a5-2d6a-477d-ade1-7022cad32554\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") "
Apr 16 20:42:02.791320 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.791297 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kkc4\" (UniqueName: \"kubernetes.io/projected/228c50a5-2d6a-477d-ade1-7022cad32554-kube-api-access-8kkc4\") pod \"228c50a5-2d6a-477d-ade1-7022cad32554\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") "
Apr 16 20:42:02.791607 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.791334 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-secret-alertmanager-kube-rbac-proxy-web\") pod \"228c50a5-2d6a-477d-ade1-7022cad32554\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") "
Apr 16 20:42:02.791607 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.791364 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-secret-alertmanager-kube-rbac-proxy\") pod \"228c50a5-2d6a-477d-ade1-7022cad32554\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") "
Apr 16 20:42:02.791607 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.791390 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-config-volume\") pod \"228c50a5-2d6a-477d-ade1-7022cad32554\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") "
Apr 16 20:42:02.791607 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.791412 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/228c50a5-2d6a-477d-ade1-7022cad32554-config-out\") pod \"228c50a5-2d6a-477d-ade1-7022cad32554\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") "
Apr 16 20:42:02.791607 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.791449 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-web-config\") pod \"228c50a5-2d6a-477d-ade1-7022cad32554\" (UID: \"228c50a5-2d6a-477d-ade1-7022cad32554\") "
Apr 16 20:42:02.791607 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.791448 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/228c50a5-2d6a-477d-ade1-7022cad32554-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "228c50a5-2d6a-477d-ade1-7022cad32554" (UID: "228c50a5-2d6a-477d-ade1-7022cad32554"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:42:02.791945 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.791605 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/228c50a5-2d6a-477d-ade1-7022cad32554-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "228c50a5-2d6a-477d-ade1-7022cad32554" (UID: "228c50a5-2d6a-477d-ade1-7022cad32554"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:42:02.791945 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.791597 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/228c50a5-2d6a-477d-ade1-7022cad32554-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "228c50a5-2d6a-477d-ade1-7022cad32554" (UID: "228c50a5-2d6a-477d-ade1-7022cad32554"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:42:02.791945 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.791711 2575 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/228c50a5-2d6a-477d-ade1-7022cad32554-alertmanager-main-db\") on node \"ip-10-0-134-79.ec2.internal\" DevicePath \"\""
Apr 16 20:42:02.791945 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.791740 2575 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/228c50a5-2d6a-477d-ade1-7022cad32554-metrics-client-ca\") on node \"ip-10-0-134-79.ec2.internal\" DevicePath \"\""
Apr 16 20:42:02.791945 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.791761 2575 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/228c50a5-2d6a-477d-ade1-7022cad32554-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-134-79.ec2.internal\" DevicePath \"\""
Apr 16 20:42:02.794042 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.794011 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "228c50a5-2d6a-477d-ade1-7022cad32554" (UID: "228c50a5-2d6a-477d-ade1-7022cad32554"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:42:02.794217 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.794193 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/228c50a5-2d6a-477d-ade1-7022cad32554-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "228c50a5-2d6a-477d-ade1-7022cad32554" (UID: "228c50a5-2d6a-477d-ade1-7022cad32554"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:42:02.794583 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.794534 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/228c50a5-2d6a-477d-ade1-7022cad32554-kube-api-access-8kkc4" (OuterVolumeSpecName: "kube-api-access-8kkc4") pod "228c50a5-2d6a-477d-ade1-7022cad32554" (UID: "228c50a5-2d6a-477d-ade1-7022cad32554"). InnerVolumeSpecName "kube-api-access-8kkc4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:42:02.794830 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.794779 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-config-volume" (OuterVolumeSpecName: "config-volume") pod "228c50a5-2d6a-477d-ade1-7022cad32554" (UID: "228c50a5-2d6a-477d-ade1-7022cad32554"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:42:02.794830 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.794803 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "228c50a5-2d6a-477d-ade1-7022cad32554" (UID: "228c50a5-2d6a-477d-ade1-7022cad32554"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:42:02.794928 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.794896 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/228c50a5-2d6a-477d-ade1-7022cad32554-config-out" (OuterVolumeSpecName: "config-out") pod "228c50a5-2d6a-477d-ade1-7022cad32554" (UID: "228c50a5-2d6a-477d-ade1-7022cad32554"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:42:02.794928 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.794908 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "228c50a5-2d6a-477d-ade1-7022cad32554" (UID: "228c50a5-2d6a-477d-ade1-7022cad32554"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:42:02.795748 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.795729 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "228c50a5-2d6a-477d-ade1-7022cad32554" (UID: "228c50a5-2d6a-477d-ade1-7022cad32554"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:42:02.797949 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.797851 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "228c50a5-2d6a-477d-ade1-7022cad32554" (UID: "228c50a5-2d6a-477d-ade1-7022cad32554"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:42:02.804298 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.804276 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-web-config" (OuterVolumeSpecName: "web-config") pod "228c50a5-2d6a-477d-ade1-7022cad32554" (UID: "228c50a5-2d6a-477d-ade1-7022cad32554"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 20:42:02.892456 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.892434 2575 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-cluster-tls-config\") on node \"ip-10-0-134-79.ec2.internal\" DevicePath \"\""
Apr 16 20:42:02.892456 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.892456 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8kkc4\" (UniqueName: \"kubernetes.io/projected/228c50a5-2d6a-477d-ade1-7022cad32554-kube-api-access-8kkc4\") on node \"ip-10-0-134-79.ec2.internal\" DevicePath \"\""
Apr 16 20:42:02.892587 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.892467 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-134-79.ec2.internal\" DevicePath \"\""
Apr 16 20:42:02.892587 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.892477 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-134-79.ec2.internal\" DevicePath \"\""
Apr 16 20:42:02.892587 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.892486 2575 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-config-volume\") on node \"ip-10-0-134-79.ec2.internal\" DevicePath \"\""
Apr 16 20:42:02.892587 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.892494 2575 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/228c50a5-2d6a-477d-ade1-7022cad32554-config-out\") on node \"ip-10-0-134-79.ec2.internal\" DevicePath
\"\"" Apr 16 20:42:02.892587 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.892503 2575 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-web-config\") on node \"ip-10-0-134-79.ec2.internal\" DevicePath \"\"" Apr 16 20:42:02.892587 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.892512 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-134-79.ec2.internal\" DevicePath \"\"" Apr 16 20:42:02.892587 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.892523 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/228c50a5-2d6a-477d-ade1-7022cad32554-secret-alertmanager-main-tls\") on node \"ip-10-0-134-79.ec2.internal\" DevicePath \"\"" Apr 16 20:42:02.892587 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:02.892532 2575 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/228c50a5-2d6a-477d-ade1-7022cad32554-tls-assets\") on node \"ip-10-0-134-79.ec2.internal\" DevicePath \"\"" Apr 16 20:42:03.199048 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.199022 2575 generic.go:358] "Generic (PLEG): container finished" podID="228c50a5-2d6a-477d-ade1-7022cad32554" containerID="f74a382365ce49ced8863cd182cb594ffc5a9ee05e19f9d0c0433f30086c27ea" exitCode=0 Apr 16 20:42:03.199048 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.199044 2575 generic.go:358] "Generic (PLEG): container finished" podID="228c50a5-2d6a-477d-ade1-7022cad32554" containerID="020e9f44339d02ccfeee10ea865643dd445e2bebfc421359d592a39db3164fed" exitCode=0 Apr 16 20:42:03.199290 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.199107 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"228c50a5-2d6a-477d-ade1-7022cad32554","Type":"ContainerDied","Data":"f74a382365ce49ced8863cd182cb594ffc5a9ee05e19f9d0c0433f30086c27ea"} Apr 16 20:42:03.199290 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.199122 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.199290 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.199156 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"228c50a5-2d6a-477d-ade1-7022cad32554","Type":"ContainerDied","Data":"020e9f44339d02ccfeee10ea865643dd445e2bebfc421359d592a39db3164fed"} Apr 16 20:42:03.199290 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.199173 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"228c50a5-2d6a-477d-ade1-7022cad32554","Type":"ContainerDied","Data":"2ae86bb612fa7410ba0681e5ebe75d3afe033c362a3215eaf8bef6b45799b773"} Apr 16 20:42:03.199290 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.199192 2575 scope.go:117] "RemoveContainer" containerID="41fafbc10d4eff0ba2984cde372ca0d4d2667863a66725ee4b932551ef387d11" Apr 16 20:42:03.206539 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.206495 2575 scope.go:117] "RemoveContainer" containerID="f74a382365ce49ced8863cd182cb594ffc5a9ee05e19f9d0c0433f30086c27ea" Apr 16 20:42:03.212728 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.212712 2575 scope.go:117] "RemoveContainer" containerID="c518d13d064cbc34c1df70d4d20f52fe265b9bf2eff47b121c030f967bec44be" Apr 16 20:42:03.218644 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.218627 2575 scope.go:117] "RemoveContainer" containerID="020e9f44339d02ccfeee10ea865643dd445e2bebfc421359d592a39db3164fed" Apr 16 20:42:03.221866 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.221846 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 20:42:03.224935 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.224905 2575 scope.go:117] "RemoveContainer" containerID="ab69550c971ae2b1b57ca02748ec3288cf41d4cf4105e7422cd8d1f20b2a448a" Apr 16 20:42:03.227028 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.227009 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 20:42:03.230905 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.230891 2575 scope.go:117] "RemoveContainer" containerID="bac2eab82036961d1f36f8a1ba1d0f9b0bd5b47f6fab4d5c52c3701f757209d9" Apr 16 20:42:03.236968 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.236952 2575 scope.go:117] "RemoveContainer" containerID="361bf9ba0223ea7b0fe8dbbce147e9427da77c7e9d0df64698316821fd77cf1d" Apr 16 20:42:03.242505 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.242490 2575 scope.go:117] "RemoveContainer" containerID="41fafbc10d4eff0ba2984cde372ca0d4d2667863a66725ee4b932551ef387d11" Apr 16 20:42:03.242741 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:42:03.242723 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41fafbc10d4eff0ba2984cde372ca0d4d2667863a66725ee4b932551ef387d11\": container with ID starting with 41fafbc10d4eff0ba2984cde372ca0d4d2667863a66725ee4b932551ef387d11 not found: ID does not exist" containerID="41fafbc10d4eff0ba2984cde372ca0d4d2667863a66725ee4b932551ef387d11" Apr 16 20:42:03.242788 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.242748 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41fafbc10d4eff0ba2984cde372ca0d4d2667863a66725ee4b932551ef387d11"} err="failed to get container status \"41fafbc10d4eff0ba2984cde372ca0d4d2667863a66725ee4b932551ef387d11\": rpc error: code = NotFound desc = could not find container \"41fafbc10d4eff0ba2984cde372ca0d4d2667863a66725ee4b932551ef387d11\": container 
with ID starting with 41fafbc10d4eff0ba2984cde372ca0d4d2667863a66725ee4b932551ef387d11 not found: ID does not exist" Apr 16 20:42:03.242788 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.242763 2575 scope.go:117] "RemoveContainer" containerID="f74a382365ce49ced8863cd182cb594ffc5a9ee05e19f9d0c0433f30086c27ea" Apr 16 20:42:03.242964 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:42:03.242948 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f74a382365ce49ced8863cd182cb594ffc5a9ee05e19f9d0c0433f30086c27ea\": container with ID starting with f74a382365ce49ced8863cd182cb594ffc5a9ee05e19f9d0c0433f30086c27ea not found: ID does not exist" containerID="f74a382365ce49ced8863cd182cb594ffc5a9ee05e19f9d0c0433f30086c27ea" Apr 16 20:42:03.243002 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.242968 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f74a382365ce49ced8863cd182cb594ffc5a9ee05e19f9d0c0433f30086c27ea"} err="failed to get container status \"f74a382365ce49ced8863cd182cb594ffc5a9ee05e19f9d0c0433f30086c27ea\": rpc error: code = NotFound desc = could not find container \"f74a382365ce49ced8863cd182cb594ffc5a9ee05e19f9d0c0433f30086c27ea\": container with ID starting with f74a382365ce49ced8863cd182cb594ffc5a9ee05e19f9d0c0433f30086c27ea not found: ID does not exist" Apr 16 20:42:03.243002 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.242980 2575 scope.go:117] "RemoveContainer" containerID="c518d13d064cbc34c1df70d4d20f52fe265b9bf2eff47b121c030f967bec44be" Apr 16 20:42:03.243213 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:42:03.243194 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c518d13d064cbc34c1df70d4d20f52fe265b9bf2eff47b121c030f967bec44be\": container with ID starting with c518d13d064cbc34c1df70d4d20f52fe265b9bf2eff47b121c030f967bec44be not found: ID does 
not exist" containerID="c518d13d064cbc34c1df70d4d20f52fe265b9bf2eff47b121c030f967bec44be" Apr 16 20:42:03.243264 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.243215 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c518d13d064cbc34c1df70d4d20f52fe265b9bf2eff47b121c030f967bec44be"} err="failed to get container status \"c518d13d064cbc34c1df70d4d20f52fe265b9bf2eff47b121c030f967bec44be\": rpc error: code = NotFound desc = could not find container \"c518d13d064cbc34c1df70d4d20f52fe265b9bf2eff47b121c030f967bec44be\": container with ID starting with c518d13d064cbc34c1df70d4d20f52fe265b9bf2eff47b121c030f967bec44be not found: ID does not exist" Apr 16 20:42:03.243264 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.243227 2575 scope.go:117] "RemoveContainer" containerID="020e9f44339d02ccfeee10ea865643dd445e2bebfc421359d592a39db3164fed" Apr 16 20:42:03.243405 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:42:03.243390 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"020e9f44339d02ccfeee10ea865643dd445e2bebfc421359d592a39db3164fed\": container with ID starting with 020e9f44339d02ccfeee10ea865643dd445e2bebfc421359d592a39db3164fed not found: ID does not exist" containerID="020e9f44339d02ccfeee10ea865643dd445e2bebfc421359d592a39db3164fed" Apr 16 20:42:03.243452 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.243409 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"020e9f44339d02ccfeee10ea865643dd445e2bebfc421359d592a39db3164fed"} err="failed to get container status \"020e9f44339d02ccfeee10ea865643dd445e2bebfc421359d592a39db3164fed\": rpc error: code = NotFound desc = could not find container \"020e9f44339d02ccfeee10ea865643dd445e2bebfc421359d592a39db3164fed\": container with ID starting with 020e9f44339d02ccfeee10ea865643dd445e2bebfc421359d592a39db3164fed not found: ID does not exist" Apr 16 
20:42:03.243452 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.243422 2575 scope.go:117] "RemoveContainer" containerID="ab69550c971ae2b1b57ca02748ec3288cf41d4cf4105e7422cd8d1f20b2a448a" Apr 16 20:42:03.243651 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:42:03.243624 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab69550c971ae2b1b57ca02748ec3288cf41d4cf4105e7422cd8d1f20b2a448a\": container with ID starting with ab69550c971ae2b1b57ca02748ec3288cf41d4cf4105e7422cd8d1f20b2a448a not found: ID does not exist" containerID="ab69550c971ae2b1b57ca02748ec3288cf41d4cf4105e7422cd8d1f20b2a448a" Apr 16 20:42:03.243748 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.243656 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab69550c971ae2b1b57ca02748ec3288cf41d4cf4105e7422cd8d1f20b2a448a"} err="failed to get container status \"ab69550c971ae2b1b57ca02748ec3288cf41d4cf4105e7422cd8d1f20b2a448a\": rpc error: code = NotFound desc = could not find container \"ab69550c971ae2b1b57ca02748ec3288cf41d4cf4105e7422cd8d1f20b2a448a\": container with ID starting with ab69550c971ae2b1b57ca02748ec3288cf41d4cf4105e7422cd8d1f20b2a448a not found: ID does not exist" Apr 16 20:42:03.243748 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.243678 2575 scope.go:117] "RemoveContainer" containerID="bac2eab82036961d1f36f8a1ba1d0f9b0bd5b47f6fab4d5c52c3701f757209d9" Apr 16 20:42:03.243912 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:42:03.243896 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bac2eab82036961d1f36f8a1ba1d0f9b0bd5b47f6fab4d5c52c3701f757209d9\": container with ID starting with bac2eab82036961d1f36f8a1ba1d0f9b0bd5b47f6fab4d5c52c3701f757209d9 not found: ID does not exist" containerID="bac2eab82036961d1f36f8a1ba1d0f9b0bd5b47f6fab4d5c52c3701f757209d9" Apr 16 20:42:03.243962 
ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.243917 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bac2eab82036961d1f36f8a1ba1d0f9b0bd5b47f6fab4d5c52c3701f757209d9"} err="failed to get container status \"bac2eab82036961d1f36f8a1ba1d0f9b0bd5b47f6fab4d5c52c3701f757209d9\": rpc error: code = NotFound desc = could not find container \"bac2eab82036961d1f36f8a1ba1d0f9b0bd5b47f6fab4d5c52c3701f757209d9\": container with ID starting with bac2eab82036961d1f36f8a1ba1d0f9b0bd5b47f6fab4d5c52c3701f757209d9 not found: ID does not exist" Apr 16 20:42:03.243962 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.243938 2575 scope.go:117] "RemoveContainer" containerID="361bf9ba0223ea7b0fe8dbbce147e9427da77c7e9d0df64698316821fd77cf1d" Apr 16 20:42:03.244192 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:42:03.244176 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"361bf9ba0223ea7b0fe8dbbce147e9427da77c7e9d0df64698316821fd77cf1d\": container with ID starting with 361bf9ba0223ea7b0fe8dbbce147e9427da77c7e9d0df64698316821fd77cf1d not found: ID does not exist" containerID="361bf9ba0223ea7b0fe8dbbce147e9427da77c7e9d0df64698316821fd77cf1d" Apr 16 20:42:03.244258 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.244194 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"361bf9ba0223ea7b0fe8dbbce147e9427da77c7e9d0df64698316821fd77cf1d"} err="failed to get container status \"361bf9ba0223ea7b0fe8dbbce147e9427da77c7e9d0df64698316821fd77cf1d\": rpc error: code = NotFound desc = could not find container \"361bf9ba0223ea7b0fe8dbbce147e9427da77c7e9d0df64698316821fd77cf1d\": container with ID starting with 361bf9ba0223ea7b0fe8dbbce147e9427da77c7e9d0df64698316821fd77cf1d not found: ID does not exist" Apr 16 20:42:03.244258 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.244205 2575 scope.go:117] "RemoveContainer" 
containerID="41fafbc10d4eff0ba2984cde372ca0d4d2667863a66725ee4b932551ef387d11" Apr 16 20:42:03.244459 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.244387 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41fafbc10d4eff0ba2984cde372ca0d4d2667863a66725ee4b932551ef387d11"} err="failed to get container status \"41fafbc10d4eff0ba2984cde372ca0d4d2667863a66725ee4b932551ef387d11\": rpc error: code = NotFound desc = could not find container \"41fafbc10d4eff0ba2984cde372ca0d4d2667863a66725ee4b932551ef387d11\": container with ID starting with 41fafbc10d4eff0ba2984cde372ca0d4d2667863a66725ee4b932551ef387d11 not found: ID does not exist" Apr 16 20:42:03.244459 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.244436 2575 scope.go:117] "RemoveContainer" containerID="f74a382365ce49ced8863cd182cb594ffc5a9ee05e19f9d0c0433f30086c27ea" Apr 16 20:42:03.244667 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.244649 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f74a382365ce49ced8863cd182cb594ffc5a9ee05e19f9d0c0433f30086c27ea"} err="failed to get container status \"f74a382365ce49ced8863cd182cb594ffc5a9ee05e19f9d0c0433f30086c27ea\": rpc error: code = NotFound desc = could not find container \"f74a382365ce49ced8863cd182cb594ffc5a9ee05e19f9d0c0433f30086c27ea\": container with ID starting with f74a382365ce49ced8863cd182cb594ffc5a9ee05e19f9d0c0433f30086c27ea not found: ID does not exist" Apr 16 20:42:03.244723 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.244667 2575 scope.go:117] "RemoveContainer" containerID="c518d13d064cbc34c1df70d4d20f52fe265b9bf2eff47b121c030f967bec44be" Apr 16 20:42:03.244843 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.244827 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c518d13d064cbc34c1df70d4d20f52fe265b9bf2eff47b121c030f967bec44be"} err="failed to get container status 
\"c518d13d064cbc34c1df70d4d20f52fe265b9bf2eff47b121c030f967bec44be\": rpc error: code = NotFound desc = could not find container \"c518d13d064cbc34c1df70d4d20f52fe265b9bf2eff47b121c030f967bec44be\": container with ID starting with c518d13d064cbc34c1df70d4d20f52fe265b9bf2eff47b121c030f967bec44be not found: ID does not exist" Apr 16 20:42:03.244884 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.244842 2575 scope.go:117] "RemoveContainer" containerID="020e9f44339d02ccfeee10ea865643dd445e2bebfc421359d592a39db3164fed" Apr 16 20:42:03.245026 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.245010 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"020e9f44339d02ccfeee10ea865643dd445e2bebfc421359d592a39db3164fed"} err="failed to get container status \"020e9f44339d02ccfeee10ea865643dd445e2bebfc421359d592a39db3164fed\": rpc error: code = NotFound desc = could not find container \"020e9f44339d02ccfeee10ea865643dd445e2bebfc421359d592a39db3164fed\": container with ID starting with 020e9f44339d02ccfeee10ea865643dd445e2bebfc421359d592a39db3164fed not found: ID does not exist" Apr 16 20:42:03.245076 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.245026 2575 scope.go:117] "RemoveContainer" containerID="ab69550c971ae2b1b57ca02748ec3288cf41d4cf4105e7422cd8d1f20b2a448a" Apr 16 20:42:03.245213 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.245192 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab69550c971ae2b1b57ca02748ec3288cf41d4cf4105e7422cd8d1f20b2a448a"} err="failed to get container status \"ab69550c971ae2b1b57ca02748ec3288cf41d4cf4105e7422cd8d1f20b2a448a\": rpc error: code = NotFound desc = could not find container \"ab69550c971ae2b1b57ca02748ec3288cf41d4cf4105e7422cd8d1f20b2a448a\": container with ID starting with ab69550c971ae2b1b57ca02748ec3288cf41d4cf4105e7422cd8d1f20b2a448a not found: ID does not exist" Apr 16 20:42:03.245255 ip-10-0-134-79 
kubenswrapper[2575]: I0416 20:42:03.245214 2575 scope.go:117] "RemoveContainer" containerID="bac2eab82036961d1f36f8a1ba1d0f9b0bd5b47f6fab4d5c52c3701f757209d9" Apr 16 20:42:03.245433 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.245417 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bac2eab82036961d1f36f8a1ba1d0f9b0bd5b47f6fab4d5c52c3701f757209d9"} err="failed to get container status \"bac2eab82036961d1f36f8a1ba1d0f9b0bd5b47f6fab4d5c52c3701f757209d9\": rpc error: code = NotFound desc = could not find container \"bac2eab82036961d1f36f8a1ba1d0f9b0bd5b47f6fab4d5c52c3701f757209d9\": container with ID starting with bac2eab82036961d1f36f8a1ba1d0f9b0bd5b47f6fab4d5c52c3701f757209d9 not found: ID does not exist" Apr 16 20:42:03.245433 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.245431 2575 scope.go:117] "RemoveContainer" containerID="361bf9ba0223ea7b0fe8dbbce147e9427da77c7e9d0df64698316821fd77cf1d" Apr 16 20:42:03.245629 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.245602 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"361bf9ba0223ea7b0fe8dbbce147e9427da77c7e9d0df64698316821fd77cf1d"} err="failed to get container status \"361bf9ba0223ea7b0fe8dbbce147e9427da77c7e9d0df64698316821fd77cf1d\": rpc error: code = NotFound desc = could not find container \"361bf9ba0223ea7b0fe8dbbce147e9427da77c7e9d0df64698316821fd77cf1d\": container with ID starting with 361bf9ba0223ea7b0fe8dbbce147e9427da77c7e9d0df64698316821fd77cf1d not found: ID does not exist" Apr 16 20:42:03.253389 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.253367 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 20:42:03.253590 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.253579 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="228c50a5-2d6a-477d-ade1-7022cad32554" containerName="kube-rbac-proxy" Apr 
16 20:42:03.253648 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.253592 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="228c50a5-2d6a-477d-ade1-7022cad32554" containerName="kube-rbac-proxy" Apr 16 20:42:03.253648 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.253602 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="228c50a5-2d6a-477d-ade1-7022cad32554" containerName="kube-rbac-proxy-web" Apr 16 20:42:03.253648 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.253607 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="228c50a5-2d6a-477d-ade1-7022cad32554" containerName="kube-rbac-proxy-web" Apr 16 20:42:03.253648 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.253637 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="228c50a5-2d6a-477d-ade1-7022cad32554" containerName="alertmanager" Apr 16 20:42:03.253768 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.253644 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="228c50a5-2d6a-477d-ade1-7022cad32554" containerName="alertmanager" Apr 16 20:42:03.253768 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.253670 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="228c50a5-2d6a-477d-ade1-7022cad32554" containerName="kube-rbac-proxy-metric" Apr 16 20:42:03.253768 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.253675 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="228c50a5-2d6a-477d-ade1-7022cad32554" containerName="kube-rbac-proxy-metric" Apr 16 20:42:03.253768 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.253681 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="228c50a5-2d6a-477d-ade1-7022cad32554" containerName="prom-label-proxy" Apr 16 20:42:03.253768 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.253686 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="228c50a5-2d6a-477d-ade1-7022cad32554" 
containerName="prom-label-proxy" Apr 16 20:42:03.253768 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.253693 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4501b828-24de-44b5-8e85-c0f813c98960" containerName="console" Apr 16 20:42:03.253768 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.253699 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4501b828-24de-44b5-8e85-c0f813c98960" containerName="console" Apr 16 20:42:03.253768 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.253705 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="228c50a5-2d6a-477d-ade1-7022cad32554" containerName="config-reloader" Apr 16 20:42:03.253768 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.253710 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="228c50a5-2d6a-477d-ade1-7022cad32554" containerName="config-reloader" Apr 16 20:42:03.253768 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.253718 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="228c50a5-2d6a-477d-ade1-7022cad32554" containerName="init-config-reloader" Apr 16 20:42:03.253768 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.253724 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="228c50a5-2d6a-477d-ade1-7022cad32554" containerName="init-config-reloader" Apr 16 20:42:03.253768 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.253763 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="228c50a5-2d6a-477d-ade1-7022cad32554" containerName="kube-rbac-proxy-metric" Apr 16 20:42:03.253768 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.253770 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="228c50a5-2d6a-477d-ade1-7022cad32554" containerName="kube-rbac-proxy" Apr 16 20:42:03.254156 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.253776 2575 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="228c50a5-2d6a-477d-ade1-7022cad32554" containerName="prom-label-proxy" Apr 16 20:42:03.254156 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.253784 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="4501b828-24de-44b5-8e85-c0f813c98960" containerName="console" Apr 16 20:42:03.254156 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.253790 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="228c50a5-2d6a-477d-ade1-7022cad32554" containerName="kube-rbac-proxy-web" Apr 16 20:42:03.254156 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.253797 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="228c50a5-2d6a-477d-ade1-7022cad32554" containerName="alertmanager" Apr 16 20:42:03.254156 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.253803 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="228c50a5-2d6a-477d-ade1-7022cad32554" containerName="config-reloader" Apr 16 20:42:03.260049 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.260026 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.262434 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.262405 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 20:42:03.262513 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.262453 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-wzssz\"" Apr 16 20:42:03.262513 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.262497 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 20:42:03.262629 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.262524 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 20:42:03.262705 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.262687 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 20:42:03.262835 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.262819 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 20:42:03.262912 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.262887 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 20:42:03.262912 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.262894 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 20:42:03.263009 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.262949 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 20:42:03.267530 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.267509 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 20:42:03.270043 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.270023 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 20:42:03.294835 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.294814 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e9e47c0c-bfeb-4a60-b679-98cd214d053a-config-out\") pod \"alertmanager-main-0\" (UID: \"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.294918 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.294840 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e9e47c0c-bfeb-4a60-b679-98cd214d053a-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.294918 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.294857 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e9e47c0c-bfeb-4a60-b679-98cd214d053a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.294988 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.294918 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e9e47c0c-bfeb-4a60-b679-98cd214d053a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.294988 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.294955 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e9e47c0c-bfeb-4a60-b679-98cd214d053a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.294988 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.294980 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e9e47c0c-bfeb-4a60-b679-98cd214d053a-web-config\") pod \"alertmanager-main-0\" (UID: \"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.295078 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.295005 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl6cf\" (UniqueName: \"kubernetes.io/projected/e9e47c0c-bfeb-4a60-b679-98cd214d053a-kube-api-access-kl6cf\") pod \"alertmanager-main-0\" (UID: \"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.295078 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.295020 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e9e47c0c-bfeb-4a60-b679-98cd214d053a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.295078 ip-10-0-134-79 kubenswrapper[2575]: I0416 
20:42:03.295035 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e9e47c0c-bfeb-4a60-b679-98cd214d053a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.295078 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.295051 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e9e47c0c-bfeb-4a60-b679-98cd214d053a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.295206 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.295093 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e9e47c0c-bfeb-4a60-b679-98cd214d053a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.295206 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.295115 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e9e47c0c-bfeb-4a60-b679-98cd214d053a-config-volume\") pod \"alertmanager-main-0\" (UID: \"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.295206 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.295137 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/e9e47c0c-bfeb-4a60-b679-98cd214d053a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.396426 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.396395 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e9e47c0c-bfeb-4a60-b679-98cd214d053a-config-volume\") pod \"alertmanager-main-0\" (UID: \"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.396586 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.396456 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/e9e47c0c-bfeb-4a60-b679-98cd214d053a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.396586 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.396494 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e9e47c0c-bfeb-4a60-b679-98cd214d053a-config-out\") pod \"alertmanager-main-0\" (UID: \"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.396586 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.396519 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e9e47c0c-bfeb-4a60-b679-98cd214d053a-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.396586 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.396546 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e9e47c0c-bfeb-4a60-b679-98cd214d053a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.396818 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.396602 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9e47c0c-bfeb-4a60-b679-98cd214d053a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.396818 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.396656 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e9e47c0c-bfeb-4a60-b679-98cd214d053a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.396818 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.396785 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e9e47c0c-bfeb-4a60-b679-98cd214d053a-web-config\") pod \"alertmanager-main-0\" (UID: \"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.396956 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.396857 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kl6cf\" (UniqueName: \"kubernetes.io/projected/e9e47c0c-bfeb-4a60-b679-98cd214d053a-kube-api-access-kl6cf\") pod \"alertmanager-main-0\" (UID: \"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.396956 
ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.396891 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e9e47c0c-bfeb-4a60-b679-98cd214d053a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.396956 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.396915 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e9e47c0c-bfeb-4a60-b679-98cd214d053a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.396956 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.396949 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e9e47c0c-bfeb-4a60-b679-98cd214d053a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.397151 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.396985 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e9e47c0c-bfeb-4a60-b679-98cd214d053a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.397697 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.397344 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e9e47c0c-bfeb-4a60-b679-98cd214d053a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: 
\"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.397815 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.397763 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9e47c0c-bfeb-4a60-b679-98cd214d053a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.398033 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.398009 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/e9e47c0c-bfeb-4a60-b679-98cd214d053a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.399716 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.399694 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e9e47c0c-bfeb-4a60-b679-98cd214d053a-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.399716 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.399702 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e9e47c0c-bfeb-4a60-b679-98cd214d053a-config-volume\") pod \"alertmanager-main-0\" (UID: \"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.399869 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.399847 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e9e47c0c-bfeb-4a60-b679-98cd214d053a-config-out\") pod \"alertmanager-main-0\" (UID: 
\"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.400015 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.399986 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e9e47c0c-bfeb-4a60-b679-98cd214d053a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.400205 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.400187 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e9e47c0c-bfeb-4a60-b679-98cd214d053a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.400276 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.400192 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/e9e47c0c-bfeb-4a60-b679-98cd214d053a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.400437 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.400418 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e9e47c0c-bfeb-4a60-b679-98cd214d053a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.400437 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.400431 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/e9e47c0c-bfeb-4a60-b679-98cd214d053a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.401556 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.401535 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e9e47c0c-bfeb-4a60-b679-98cd214d053a-web-config\") pod \"alertmanager-main-0\" (UID: \"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.406391 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.406368 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl6cf\" (UniqueName: \"kubernetes.io/projected/e9e47c0c-bfeb-4a60-b679-98cd214d053a-kube-api-access-kl6cf\") pod \"alertmanager-main-0\" (UID: \"e9e47c0c-bfeb-4a60-b679-98cd214d053a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.424829 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.424809 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="228c50a5-2d6a-477d-ade1-7022cad32554" path="/var/lib/kubelet/pods/228c50a5-2d6a-477d-ade1-7022cad32554/volumes" Apr 16 20:42:03.569826 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.569799 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 20:42:03.695686 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:03.695654 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 20:42:03.701767 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:42:03.701742 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9e47c0c_bfeb_4a60_b679_98cd214d053a.slice/crio-922d22e9fd8b639a5b7296602a4c0c59c9e8ca10b90d0c22ca0a271efcabbf9e WatchSource:0}: Error finding container 922d22e9fd8b639a5b7296602a4c0c59c9e8ca10b90d0c22ca0a271efcabbf9e: Status 404 returned error can't find the container with id 922d22e9fd8b639a5b7296602a4c0c59c9e8ca10b90d0c22ca0a271efcabbf9e Apr 16 20:42:04.203599 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:04.203565 2575 generic.go:358] "Generic (PLEG): container finished" podID="e9e47c0c-bfeb-4a60-b679-98cd214d053a" containerID="853efc269ed83dab11e5d1c1f448b6af74004510b100107aba5ed7d3c848e830" exitCode=0 Apr 16 20:42:04.203773 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:04.203644 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e9e47c0c-bfeb-4a60-b679-98cd214d053a","Type":"ContainerDied","Data":"853efc269ed83dab11e5d1c1f448b6af74004510b100107aba5ed7d3c848e830"} Apr 16 20:42:04.203773 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:04.203665 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e9e47c0c-bfeb-4a60-b679-98cd214d053a","Type":"ContainerStarted","Data":"922d22e9fd8b639a5b7296602a4c0c59c9e8ca10b90d0c22ca0a271efcabbf9e"} Apr 16 20:42:05.210036 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.209986 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"e9e47c0c-bfeb-4a60-b679-98cd214d053a","Type":"ContainerStarted","Data":"208be881d1871f32d3209b3ea4db252a5ff4f90d02fa5812dcc849322558e93a"} Apr 16 20:42:05.210036 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.210038 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e9e47c0c-bfeb-4a60-b679-98cd214d053a","Type":"ContainerStarted","Data":"b398947605fff347ac97911102b7c430cad61fc6cdc3a09c37c10ceafb0680f0"} Apr 16 20:42:05.210459 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.210050 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e9e47c0c-bfeb-4a60-b679-98cd214d053a","Type":"ContainerStarted","Data":"cdaf0d31dba39e58b246fdfafc6b2171544273e489db7b59c0b77c55c4aa95e8"} Apr 16 20:42:05.210459 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.210058 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e9e47c0c-bfeb-4a60-b679-98cd214d053a","Type":"ContainerStarted","Data":"b99f6cf9c2928caffd47f1077dd208d69e5cd08d86fbc29c2b13413b13f40098"} Apr 16 20:42:05.210459 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.210067 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e9e47c0c-bfeb-4a60-b679-98cd214d053a","Type":"ContainerStarted","Data":"45696a0d7fdeac3be69865e41163a257719eca2e0e7c72dfed3fd5ec090e5846"} Apr 16 20:42:05.210459 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.210074 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"e9e47c0c-bfeb-4a60-b679-98cd214d053a","Type":"ContainerStarted","Data":"a589d45c7ab74b0675cd10131ddea9426b48a2d87cdc45255c7c40818d75818a"} Apr 16 20:42:05.234748 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.234703 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.23468943 podStartE2EDuration="2.23468943s" podCreationTimestamp="2026-04-16 20:42:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:42:05.23310965 +0000 UTC m=+246.385205713" watchObservedRunningTime="2026-04-16 20:42:05.23468943 +0000 UTC m=+246.386785482" Apr 16 20:42:05.448626 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.448575 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-56c74b8f5-kb62f"] Apr 16 20:42:05.451413 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.451391 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-56c74b8f5-kb62f" Apr 16 20:42:05.453735 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.453710 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 16 20:42:05.454187 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.454153 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 16 20:42:05.454291 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.454204 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 16 20:42:05.454291 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.454228 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 16 20:42:05.454748 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.454562 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 16 20:42:05.454748 ip-10-0-134-79 kubenswrapper[2575]: I0416 
20:42:05.454596 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-9572n\"" Apr 16 20:42:05.463465 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.463443 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 16 20:42:05.463574 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.463521 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-56c74b8f5-kb62f"] Apr 16 20:42:05.510922 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.510904 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b8c6\" (UniqueName: \"kubernetes.io/projected/a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20-kube-api-access-8b8c6\") pod \"telemeter-client-56c74b8f5-kb62f\" (UID: \"a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20\") " pod="openshift-monitoring/telemeter-client-56c74b8f5-kb62f" Apr 16 20:42:05.511030 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.510935 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20-secret-telemeter-client\") pod \"telemeter-client-56c74b8f5-kb62f\" (UID: \"a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20\") " pod="openshift-monitoring/telemeter-client-56c74b8f5-kb62f" Apr 16 20:42:05.511030 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.510977 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20-telemeter-client-tls\") pod \"telemeter-client-56c74b8f5-kb62f\" (UID: \"a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20\") " pod="openshift-monitoring/telemeter-client-56c74b8f5-kb62f" Apr 16 20:42:05.511030 ip-10-0-134-79 
kubenswrapper[2575]: I0416 20:42:05.510994 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20-federate-client-tls\") pod \"telemeter-client-56c74b8f5-kb62f\" (UID: \"a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20\") " pod="openshift-monitoring/telemeter-client-56c74b8f5-kb62f" Apr 16 20:42:05.511030 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.511010 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-56c74b8f5-kb62f\" (UID: \"a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20\") " pod="openshift-monitoring/telemeter-client-56c74b8f5-kb62f" Apr 16 20:42:05.511030 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.511030 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20-metrics-client-ca\") pod \"telemeter-client-56c74b8f5-kb62f\" (UID: \"a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20\") " pod="openshift-monitoring/telemeter-client-56c74b8f5-kb62f" Apr 16 20:42:05.511231 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.511048 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20-serving-certs-ca-bundle\") pod \"telemeter-client-56c74b8f5-kb62f\" (UID: \"a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20\") " pod="openshift-monitoring/telemeter-client-56c74b8f5-kb62f" Apr 16 20:42:05.511231 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.511112 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20-telemeter-trusted-ca-bundle\") pod \"telemeter-client-56c74b8f5-kb62f\" (UID: \"a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20\") " pod="openshift-monitoring/telemeter-client-56c74b8f5-kb62f" Apr 16 20:42:05.611885 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.611858 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20-metrics-client-ca\") pod \"telemeter-client-56c74b8f5-kb62f\" (UID: \"a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20\") " pod="openshift-monitoring/telemeter-client-56c74b8f5-kb62f" Apr 16 20:42:05.611982 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.611891 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20-serving-certs-ca-bundle\") pod \"telemeter-client-56c74b8f5-kb62f\" (UID: \"a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20\") " pod="openshift-monitoring/telemeter-client-56c74b8f5-kb62f" Apr 16 20:42:05.611982 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.611914 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20-telemeter-trusted-ca-bundle\") pod \"telemeter-client-56c74b8f5-kb62f\" (UID: \"a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20\") " pod="openshift-monitoring/telemeter-client-56c74b8f5-kb62f" Apr 16 20:42:05.611982 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.611944 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8b8c6\" (UniqueName: \"kubernetes.io/projected/a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20-kube-api-access-8b8c6\") pod \"telemeter-client-56c74b8f5-kb62f\" (UID: 
\"a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20\") " pod="openshift-monitoring/telemeter-client-56c74b8f5-kb62f" Apr 16 20:42:05.611982 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.611966 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20-secret-telemeter-client\") pod \"telemeter-client-56c74b8f5-kb62f\" (UID: \"a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20\") " pod="openshift-monitoring/telemeter-client-56c74b8f5-kb62f" Apr 16 20:42:05.612187 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.612008 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20-telemeter-client-tls\") pod \"telemeter-client-56c74b8f5-kb62f\" (UID: \"a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20\") " pod="openshift-monitoring/telemeter-client-56c74b8f5-kb62f" Apr 16 20:42:05.612187 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.612026 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20-federate-client-tls\") pod \"telemeter-client-56c74b8f5-kb62f\" (UID: \"a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20\") " pod="openshift-monitoring/telemeter-client-56c74b8f5-kb62f" Apr 16 20:42:05.612187 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.612051 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-56c74b8f5-kb62f\" (UID: \"a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20\") " pod="openshift-monitoring/telemeter-client-56c74b8f5-kb62f" Apr 16 20:42:05.612672 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.612651 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20-serving-certs-ca-bundle\") pod \"telemeter-client-56c74b8f5-kb62f\" (UID: \"a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20\") " pod="openshift-monitoring/telemeter-client-56c74b8f5-kb62f" Apr 16 20:42:05.612826 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.612671 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20-metrics-client-ca\") pod \"telemeter-client-56c74b8f5-kb62f\" (UID: \"a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20\") " pod="openshift-monitoring/telemeter-client-56c74b8f5-kb62f" Apr 16 20:42:05.612938 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.612917 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20-telemeter-trusted-ca-bundle\") pod \"telemeter-client-56c74b8f5-kb62f\" (UID: \"a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20\") " pod="openshift-monitoring/telemeter-client-56c74b8f5-kb62f" Apr 16 20:42:05.614475 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.614454 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20-federate-client-tls\") pod \"telemeter-client-56c74b8f5-kb62f\" (UID: \"a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20\") " pod="openshift-monitoring/telemeter-client-56c74b8f5-kb62f" Apr 16 20:42:05.614626 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.614599 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20-telemeter-client-tls\") pod \"telemeter-client-56c74b8f5-kb62f\" (UID: 
\"a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20\") " pod="openshift-monitoring/telemeter-client-56c74b8f5-kb62f"
Apr 16 20:42:05.614874 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.614857 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20-secret-telemeter-client\") pod \"telemeter-client-56c74b8f5-kb62f\" (UID: \"a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20\") " pod="openshift-monitoring/telemeter-client-56c74b8f5-kb62f"
Apr 16 20:42:05.614951 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.614928 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-56c74b8f5-kb62f\" (UID: \"a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20\") " pod="openshift-monitoring/telemeter-client-56c74b8f5-kb62f"
Apr 16 20:42:05.622452 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.622432 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b8c6\" (UniqueName: \"kubernetes.io/projected/a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20-kube-api-access-8b8c6\") pod \"telemeter-client-56c74b8f5-kb62f\" (UID: \"a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20\") " pod="openshift-monitoring/telemeter-client-56c74b8f5-kb62f"
Apr 16 20:42:05.764803 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.764737 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-56c74b8f5-kb62f"
Apr 16 20:42:05.879527 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:05.879504 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-56c74b8f5-kb62f"]
Apr 16 20:42:05.881750 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:42:05.881722 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3f7ed98_6bf3_4e8f_8ccf_2974cfe91f20.slice/crio-f8e8dcaa166b99943ba672baec39ee85cd011cb3304858db0d43efe4ebd68bab WatchSource:0}: Error finding container f8e8dcaa166b99943ba672baec39ee85cd011cb3304858db0d43efe4ebd68bab: Status 404 returned error can't find the container with id f8e8dcaa166b99943ba672baec39ee85cd011cb3304858db0d43efe4ebd68bab
Apr 16 20:42:06.214017 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:06.213981 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-56c74b8f5-kb62f" event={"ID":"a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20","Type":"ContainerStarted","Data":"f8e8dcaa166b99943ba672baec39ee85cd011cb3304858db0d43efe4ebd68bab"}
Apr 16 20:42:09.223524 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:09.223490 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-56c74b8f5-kb62f" event={"ID":"a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20","Type":"ContainerStarted","Data":"69863ee8a5defd5ebb35802e6d7e31807e14b9b6d46a174c78ea89a477b8128e"}
Apr 16 20:42:09.223524 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:09.223523 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-56c74b8f5-kb62f" event={"ID":"a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20","Type":"ContainerStarted","Data":"180d559df260792650afb2101146f9a9e77744603ee8a37c8328bfff650e90e6"}
Apr 16 20:42:09.223930 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:09.223534 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-56c74b8f5-kb62f" event={"ID":"a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20","Type":"ContainerStarted","Data":"77e71654101d891940aabbde0b8ae42f0064e4953db86b0a383efc9489d64b3c"}
Apr 16 20:42:09.245341 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:09.245291 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-56c74b8f5-kb62f" podStartSLOduration=1.7704092120000001 podStartE2EDuration="4.245276923s" podCreationTimestamp="2026-04-16 20:42:05 +0000 UTC" firstStartedPulling="2026-04-16 20:42:05.883609211 +0000 UTC m=+247.035705243" lastFinishedPulling="2026-04-16 20:42:08.358476923 +0000 UTC m=+249.510572954" observedRunningTime="2026-04-16 20:42:09.244210738 +0000 UTC m=+250.396306790" watchObservedRunningTime="2026-04-16 20:42:09.245276923 +0000 UTC m=+250.397372975"
Apr 16 20:42:10.250071 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:10.250032 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/422c9f50-4f45-46bc-9e9d-5c4f1c20c115-metrics-certs\") pod \"network-metrics-daemon-5jhhm\" (UID: \"422c9f50-4f45-46bc-9e9d-5c4f1c20c115\") " pod="openshift-multus/network-metrics-daemon-5jhhm"
Apr 16 20:42:10.252328 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:10.252308 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/422c9f50-4f45-46bc-9e9d-5c4f1c20c115-metrics-certs\") pod \"network-metrics-daemon-5jhhm\" (UID: \"422c9f50-4f45-46bc-9e9d-5c4f1c20c115\") " pod="openshift-multus/network-metrics-daemon-5jhhm"
Apr 16 20:42:10.324515 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:10.324482 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vdj2n\""
Apr 16 20:42:10.332849 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:10.332822 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5jhhm"
Apr 16 20:42:10.445824 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:10.445796 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5jhhm"]
Apr 16 20:42:10.448596 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:42:10.448571 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod422c9f50_4f45_46bc_9e9d_5c4f1c20c115.slice/crio-7e31f5cf92a468fa7849cfa7da7ab79bd7306762eba71098544f2de1b77732cf WatchSource:0}: Error finding container 7e31f5cf92a468fa7849cfa7da7ab79bd7306762eba71098544f2de1b77732cf: Status 404 returned error can't find the container with id 7e31f5cf92a468fa7849cfa7da7ab79bd7306762eba71098544f2de1b77732cf
Apr 16 20:42:11.231171 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:11.231129 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5jhhm" event={"ID":"422c9f50-4f45-46bc-9e9d-5c4f1c20c115","Type":"ContainerStarted","Data":"7e31f5cf92a468fa7849cfa7da7ab79bd7306762eba71098544f2de1b77732cf"}
Apr 16 20:42:12.235772 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:12.235738 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5jhhm" event={"ID":"422c9f50-4f45-46bc-9e9d-5c4f1c20c115","Type":"ContainerStarted","Data":"22a68b20fd4c2f4a2b2bdcff720316ab2acb9e047ac690c10d6d65013f38bf6b"}
Apr 16 20:42:12.235772 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:12.235772 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5jhhm" event={"ID":"422c9f50-4f45-46bc-9e9d-5c4f1c20c115","Type":"ContainerStarted","Data":"a002fa5af1dedc26dcb24a62d8e48ce655db2aa47acc0c40ebfdee7248943ed8"}
Apr 16 20:42:12.253657 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:12.253579 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-5jhhm" podStartSLOduration=252.171982526 podStartE2EDuration="4m13.253561159s" podCreationTimestamp="2026-04-16 20:37:59 +0000 UTC" firstStartedPulling="2026-04-16 20:42:10.450889934 +0000 UTC m=+251.602985964" lastFinishedPulling="2026-04-16 20:42:11.532468563 +0000 UTC m=+252.684564597" observedRunningTime="2026-04-16 20:42:12.252063752 +0000 UTC m=+253.404159806" watchObservedRunningTime="2026-04-16 20:42:12.253561159 +0000 UTC m=+253.405657212"
Apr 16 20:42:59.307733 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:59.307703 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sj6sh_98723067-9cd3-42a6-a577-2ecd3fc29ae9/ovn-acl-logging/0.log"
Apr 16 20:42:59.308514 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:59.308250 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sj6sh_98723067-9cd3-42a6-a577-2ecd3fc29ae9/ovn-acl-logging/0.log"
Apr 16 20:42:59.320437 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:42:59.320415 2575 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 20:44:04.437020 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:04.436988 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-hklnp"]
Apr 16 20:44:04.439932 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:04.439916 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hklnp"
Apr 16 20:44:04.442196 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:04.442180 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 20:44:04.446642 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:04.446603 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hklnp"]
Apr 16 20:44:04.566920 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:04.566889 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/aca421fe-4ae3-4942-acd4-e16928e3c32d-dbus\") pod \"global-pull-secret-syncer-hklnp\" (UID: \"aca421fe-4ae3-4942-acd4-e16928e3c32d\") " pod="kube-system/global-pull-secret-syncer-hklnp"
Apr 16 20:44:04.567083 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:04.566964 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/aca421fe-4ae3-4942-acd4-e16928e3c32d-kubelet-config\") pod \"global-pull-secret-syncer-hklnp\" (UID: \"aca421fe-4ae3-4942-acd4-e16928e3c32d\") " pod="kube-system/global-pull-secret-syncer-hklnp"
Apr 16 20:44:04.567083 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:04.566995 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/aca421fe-4ae3-4942-acd4-e16928e3c32d-original-pull-secret\") pod \"global-pull-secret-syncer-hklnp\" (UID: \"aca421fe-4ae3-4942-acd4-e16928e3c32d\") " pod="kube-system/global-pull-secret-syncer-hklnp"
Apr 16 20:44:04.668092 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:04.668060 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/aca421fe-4ae3-4942-acd4-e16928e3c32d-dbus\") pod \"global-pull-secret-syncer-hklnp\" (UID: \"aca421fe-4ae3-4942-acd4-e16928e3c32d\") " pod="kube-system/global-pull-secret-syncer-hklnp"
Apr 16 20:44:04.668237 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:04.668113 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/aca421fe-4ae3-4942-acd4-e16928e3c32d-kubelet-config\") pod \"global-pull-secret-syncer-hklnp\" (UID: \"aca421fe-4ae3-4942-acd4-e16928e3c32d\") " pod="kube-system/global-pull-secret-syncer-hklnp"
Apr 16 20:44:04.668237 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:04.668134 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/aca421fe-4ae3-4942-acd4-e16928e3c32d-original-pull-secret\") pod \"global-pull-secret-syncer-hklnp\" (UID: \"aca421fe-4ae3-4942-acd4-e16928e3c32d\") " pod="kube-system/global-pull-secret-syncer-hklnp"
Apr 16 20:44:04.668321 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:04.668251 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/aca421fe-4ae3-4942-acd4-e16928e3c32d-kubelet-config\") pod \"global-pull-secret-syncer-hklnp\" (UID: \"aca421fe-4ae3-4942-acd4-e16928e3c32d\") " pod="kube-system/global-pull-secret-syncer-hklnp"
Apr 16 20:44:04.668321 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:04.668251 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/aca421fe-4ae3-4942-acd4-e16928e3c32d-dbus\") pod \"global-pull-secret-syncer-hklnp\" (UID: \"aca421fe-4ae3-4942-acd4-e16928e3c32d\") " pod="kube-system/global-pull-secret-syncer-hklnp"
Apr 16 20:44:04.670181 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:04.670160 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/aca421fe-4ae3-4942-acd4-e16928e3c32d-original-pull-secret\") pod \"global-pull-secret-syncer-hklnp\" (UID: \"aca421fe-4ae3-4942-acd4-e16928e3c32d\") " pod="kube-system/global-pull-secret-syncer-hklnp"
Apr 16 20:44:04.748993 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:04.748918 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hklnp"
Apr 16 20:44:04.864306 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:04.864276 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hklnp"]
Apr 16 20:44:04.867511 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:44:04.867482 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaca421fe_4ae3_4942_acd4_e16928e3c32d.slice/crio-09e2846698121659f53718909d65b5a95449184490495e7c0d5815d44ed46d47 WatchSource:0}: Error finding container 09e2846698121659f53718909d65b5a95449184490495e7c0d5815d44ed46d47: Status 404 returned error can't find the container with id 09e2846698121659f53718909d65b5a95449184490495e7c0d5815d44ed46d47
Apr 16 20:44:04.869171 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:04.869151 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 20:44:05.545166 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:05.545127 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hklnp" event={"ID":"aca421fe-4ae3-4942-acd4-e16928e3c32d","Type":"ContainerStarted","Data":"09e2846698121659f53718909d65b5a95449184490495e7c0d5815d44ed46d47"}
Apr 16 20:44:09.557282 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:09.557192 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hklnp" event={"ID":"aca421fe-4ae3-4942-acd4-e16928e3c32d","Type":"ContainerStarted","Data":"aa4bd1cb7966f725a049f6827d1ffc041802bacedd9fcde3b8da31b08285e97a"}
Apr 16 20:44:09.571292 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:09.571243 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-hklnp" podStartSLOduration=1.327914171 podStartE2EDuration="5.571229165s" podCreationTimestamp="2026-04-16 20:44:04 +0000 UTC" firstStartedPulling="2026-04-16 20:44:04.869281547 +0000 UTC m=+366.021377578" lastFinishedPulling="2026-04-16 20:44:09.112596541 +0000 UTC m=+370.264692572" observedRunningTime="2026-04-16 20:44:09.570492666 +0000 UTC m=+370.722588716" watchObservedRunningTime="2026-04-16 20:44:09.571229165 +0000 UTC m=+370.723325218"
Apr 16 20:44:25.848398 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:25.848357 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c8ddd6487-c7tw2"]
Apr 16 20:44:25.851569 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:25.851549 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c8ddd6487-c7tw2"
Apr 16 20:44:25.854892 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:25.854857 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 16 20:44:25.855008 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:25.854900 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-vvcdm\""
Apr 16 20:44:25.855008 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:25.854912 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 16 20:44:25.855008 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:25.854942 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 16 20:44:25.855008 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:25.854903 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 16 20:44:25.857752 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:25.857727 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c8ddd6487-c7tw2"]
Apr 16 20:44:25.942672 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:25.942634 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/80d07f53-2501-4419-9be3-f64273e8de47-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5c8ddd6487-c7tw2\" (UID: \"80d07f53-2501-4419-9be3-f64273e8de47\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c8ddd6487-c7tw2"
Apr 16 20:44:25.942856 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:25.942812 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k5ws\" (UniqueName: \"kubernetes.io/projected/80d07f53-2501-4419-9be3-f64273e8de47-kube-api-access-8k5ws\") pod \"managed-serviceaccount-addon-agent-5c8ddd6487-c7tw2\" (UID: \"80d07f53-2501-4419-9be3-f64273e8de47\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c8ddd6487-c7tw2"
Apr 16 20:44:26.043949 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:26.043913 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/80d07f53-2501-4419-9be3-f64273e8de47-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5c8ddd6487-c7tw2\" (UID: \"80d07f53-2501-4419-9be3-f64273e8de47\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c8ddd6487-c7tw2"
Apr 16 20:44:26.044148 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:26.043963 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8k5ws\" (UniqueName: \"kubernetes.io/projected/80d07f53-2501-4419-9be3-f64273e8de47-kube-api-access-8k5ws\") pod \"managed-serviceaccount-addon-agent-5c8ddd6487-c7tw2\" (UID: \"80d07f53-2501-4419-9be3-f64273e8de47\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c8ddd6487-c7tw2"
Apr 16 20:44:26.046968 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:26.046944 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/80d07f53-2501-4419-9be3-f64273e8de47-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5c8ddd6487-c7tw2\" (UID: \"80d07f53-2501-4419-9be3-f64273e8de47\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c8ddd6487-c7tw2"
Apr 16 20:44:26.051815 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:26.051789 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k5ws\" (UniqueName: \"kubernetes.io/projected/80d07f53-2501-4419-9be3-f64273e8de47-kube-api-access-8k5ws\") pod \"managed-serviceaccount-addon-agent-5c8ddd6487-c7tw2\" (UID: \"80d07f53-2501-4419-9be3-f64273e8de47\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c8ddd6487-c7tw2"
Apr 16 20:44:26.179258 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:26.179158 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c8ddd6487-c7tw2"
Apr 16 20:44:26.292663 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:26.292571 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c8ddd6487-c7tw2"]
Apr 16 20:44:26.295192 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:44:26.295154 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80d07f53_2501_4419_9be3_f64273e8de47.slice/crio-50d8f9f794f6fa4adee5f852ef7eea07a98e9dfb280f8dbabd3ab725f55cafb2 WatchSource:0}: Error finding container 50d8f9f794f6fa4adee5f852ef7eea07a98e9dfb280f8dbabd3ab725f55cafb2: Status 404 returned error can't find the container with id 50d8f9f794f6fa4adee5f852ef7eea07a98e9dfb280f8dbabd3ab725f55cafb2
Apr 16 20:44:26.602417 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:26.602384 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c8ddd6487-c7tw2" event={"ID":"80d07f53-2501-4419-9be3-f64273e8de47","Type":"ContainerStarted","Data":"50d8f9f794f6fa4adee5f852ef7eea07a98e9dfb280f8dbabd3ab725f55cafb2"}
Apr 16 20:44:29.613532 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:29.613497 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c8ddd6487-c7tw2" event={"ID":"80d07f53-2501-4419-9be3-f64273e8de47","Type":"ContainerStarted","Data":"78bbd27f711686cc844026299b1f65f3db362564bc68d4adc3e3e6d90ae0dcf4"}
Apr 16 20:44:29.629067 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:29.629018 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5c8ddd6487-c7tw2" podStartSLOduration=2.317325686 podStartE2EDuration="4.629004033s" podCreationTimestamp="2026-04-16 20:44:25 +0000 UTC" firstStartedPulling="2026-04-16 20:44:26.297040574 +0000 UTC m=+387.449136616" lastFinishedPulling="2026-04-16 20:44:28.608718933 +0000 UTC m=+389.760814963" observedRunningTime="2026-04-16 20:44:29.628136164 +0000 UTC m=+390.780232231" watchObservedRunningTime="2026-04-16 20:44:29.629004033 +0000 UTC m=+390.781100086"
Apr 16 20:44:50.065064 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:50.065030 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmffvq"]
Apr 16 20:44:50.068343 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:50.068327 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmffvq"
Apr 16 20:44:50.070736 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:50.070710 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 20:44:50.070840 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:50.070779 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 20:44:50.071571 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:50.071556 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-7g5mm\""
Apr 16 20:44:50.074776 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:50.074755 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmffvq"]
Apr 16 20:44:50.100593 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:50.100568 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b8b04784-5ad9-4bed-aa49-a333ceb310dc-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmffvq\" (UID: \"b8b04784-5ad9-4bed-aa49-a333ceb310dc\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmffvq"
Apr 16 20:44:50.100707 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:50.100602 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b8b04784-5ad9-4bed-aa49-a333ceb310dc-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmffvq\" (UID: \"b8b04784-5ad9-4bed-aa49-a333ceb310dc\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmffvq"
Apr 16 20:44:50.100707 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:50.100675 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw5dv\" (UniqueName: \"kubernetes.io/projected/b8b04784-5ad9-4bed-aa49-a333ceb310dc-kube-api-access-dw5dv\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmffvq\" (UID: \"b8b04784-5ad9-4bed-aa49-a333ceb310dc\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmffvq"
Apr 16 20:44:50.201646 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:50.201593 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b8b04784-5ad9-4bed-aa49-a333ceb310dc-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmffvq\" (UID: \"b8b04784-5ad9-4bed-aa49-a333ceb310dc\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmffvq"
Apr 16 20:44:50.201745 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:50.201691 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b8b04784-5ad9-4bed-aa49-a333ceb310dc-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmffvq\" (UID: \"b8b04784-5ad9-4bed-aa49-a333ceb310dc\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmffvq"
Apr 16 20:44:50.201745 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:50.201732 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dw5dv\" (UniqueName: \"kubernetes.io/projected/b8b04784-5ad9-4bed-aa49-a333ceb310dc-kube-api-access-dw5dv\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmffvq\" (UID: \"b8b04784-5ad9-4bed-aa49-a333ceb310dc\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmffvq"
Apr 16 20:44:50.202024 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:50.202004 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b8b04784-5ad9-4bed-aa49-a333ceb310dc-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmffvq\" (UID: \"b8b04784-5ad9-4bed-aa49-a333ceb310dc\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmffvq"
Apr 16 20:44:50.202090 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:50.202027 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b8b04784-5ad9-4bed-aa49-a333ceb310dc-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmffvq\" (UID: \"b8b04784-5ad9-4bed-aa49-a333ceb310dc\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmffvq"
Apr 16 20:44:50.214693 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:50.214671 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw5dv\" (UniqueName: \"kubernetes.io/projected/b8b04784-5ad9-4bed-aa49-a333ceb310dc-kube-api-access-dw5dv\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmffvq\" (UID: \"b8b04784-5ad9-4bed-aa49-a333ceb310dc\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmffvq"
Apr 16 20:44:50.378306 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:50.378270 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmffvq"
Apr 16 20:44:50.501913 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:50.501886 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmffvq"]
Apr 16 20:44:50.504485 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:44:50.504457 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8b04784_5ad9_4bed_aa49_a333ceb310dc.slice/crio-606e08b197acf64f3ceeac82c67fc1d467329b87736563013213efb113bfd7aa WatchSource:0}: Error finding container 606e08b197acf64f3ceeac82c67fc1d467329b87736563013213efb113bfd7aa: Status 404 returned error can't find the container with id 606e08b197acf64f3ceeac82c67fc1d467329b87736563013213efb113bfd7aa
Apr 16 20:44:50.670572 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:50.670492 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmffvq" event={"ID":"b8b04784-5ad9-4bed-aa49-a333ceb310dc","Type":"ContainerStarted","Data":"606e08b197acf64f3ceeac82c67fc1d467329b87736563013213efb113bfd7aa"}
Apr 16 20:44:50.997556 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:50.997483 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-mndnn"]
Apr 16 20:44:51.000829 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:51.000808 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-mndnn"
Apr 16 20:44:51.003464 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:51.003437 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-ffpcd\""
Apr 16 20:44:51.003572 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:51.003437 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 16 20:44:51.004154 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:51.004136 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 16 20:44:51.006923 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:51.006900 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-mndnn"]
Apr 16 20:44:51.007234 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:51.007213 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f0d185c-3287-41ba-b3a6-efde93bd7a18-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-mndnn\" (UID: \"2f0d185c-3287-41ba-b3a6-efde93bd7a18\") " pod="cert-manager/cert-manager-webhook-597b96b99b-mndnn"
Apr 16 20:44:51.007336 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:51.007259 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6glg\" (UniqueName: \"kubernetes.io/projected/2f0d185c-3287-41ba-b3a6-efde93bd7a18-kube-api-access-v6glg\") pod \"cert-manager-webhook-597b96b99b-mndnn\" (UID: \"2f0d185c-3287-41ba-b3a6-efde93bd7a18\") " pod="cert-manager/cert-manager-webhook-597b96b99b-mndnn"
Apr 16 20:44:51.107745 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:51.107703 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f0d185c-3287-41ba-b3a6-efde93bd7a18-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-mndnn\" (UID: \"2f0d185c-3287-41ba-b3a6-efde93bd7a18\") " pod="cert-manager/cert-manager-webhook-597b96b99b-mndnn"
Apr 16 20:44:51.108130 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:51.107755 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v6glg\" (UniqueName: \"kubernetes.io/projected/2f0d185c-3287-41ba-b3a6-efde93bd7a18-kube-api-access-v6glg\") pod \"cert-manager-webhook-597b96b99b-mndnn\" (UID: \"2f0d185c-3287-41ba-b3a6-efde93bd7a18\") " pod="cert-manager/cert-manager-webhook-597b96b99b-mndnn"
Apr 16 20:44:51.115445 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:51.115415 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f0d185c-3287-41ba-b3a6-efde93bd7a18-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-mndnn\" (UID: \"2f0d185c-3287-41ba-b3a6-efde93bd7a18\") " pod="cert-manager/cert-manager-webhook-597b96b99b-mndnn"
Apr 16 20:44:51.115548 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:51.115534 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6glg\" (UniqueName: \"kubernetes.io/projected/2f0d185c-3287-41ba-b3a6-efde93bd7a18-kube-api-access-v6glg\") pod \"cert-manager-webhook-597b96b99b-mndnn\" (UID: \"2f0d185c-3287-41ba-b3a6-efde93bd7a18\") " pod="cert-manager/cert-manager-webhook-597b96b99b-mndnn"
Apr 16 20:44:51.313210 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:51.313141 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-mndnn"
Apr 16 20:44:51.448723 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:51.448610 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-mndnn"]
Apr 16 20:44:51.451015 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:44:51.450982 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f0d185c_3287_41ba_b3a6_efde93bd7a18.slice/crio-966cc50a3f09c77cd93e1cd503f18796ecf1d25e04e5c7894b7804a8d597256d WatchSource:0}: Error finding container 966cc50a3f09c77cd93e1cd503f18796ecf1d25e04e5c7894b7804a8d597256d: Status 404 returned error can't find the container with id 966cc50a3f09c77cd93e1cd503f18796ecf1d25e04e5c7894b7804a8d597256d
Apr 16 20:44:51.674856 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:51.674818 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-mndnn" event={"ID":"2f0d185c-3287-41ba-b3a6-efde93bd7a18","Type":"ContainerStarted","Data":"966cc50a3f09c77cd93e1cd503f18796ecf1d25e04e5c7894b7804a8d597256d"}
Apr 16 20:44:55.690086 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:55.690032 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-mndnn" event={"ID":"2f0d185c-3287-41ba-b3a6-efde93bd7a18","Type":"ContainerStarted","Data":"b7acfc2ddc7ffdcf6013ab0a6b129907fc8787337a4909402530e1a5e00c6da5"}
Apr 16 20:44:55.690538 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:55.690187 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-mndnn"
Apr 16 20:44:55.714133 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:55.713869 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-mndnn" podStartSLOduration=1.964079476 podStartE2EDuration="5.713853927s" podCreationTimestamp="2026-04-16 20:44:50 +0000 UTC" firstStartedPulling="2026-04-16 20:44:51.452824021 +0000 UTC m=+412.604920058" lastFinishedPulling="2026-04-16 20:44:55.202598472 +0000 UTC m=+416.354694509" observedRunningTime="2026-04-16 20:44:55.712643967 +0000 UTC m=+416.864740020" watchObservedRunningTime="2026-04-16 20:44:55.713853927 +0000 UTC m=+416.865949981"
Apr 16 20:44:57.697004 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:57.696967 2575 generic.go:358] "Generic (PLEG): container finished" podID="b8b04784-5ad9-4bed-aa49-a333ceb310dc" containerID="881bb2d83c98773f78d0de8cf920312ad8d099574effd7c9dd10ace197b7f8ba" exitCode=0
Apr 16 20:44:57.697355 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:44:57.697035 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmffvq" event={"ID":"b8b04784-5ad9-4bed-aa49-a333ceb310dc","Type":"ContainerDied","Data":"881bb2d83c98773f78d0de8cf920312ad8d099574effd7c9dd10ace197b7f8ba"}
Apr 16 20:45:00.706698 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:00.706656 2575 generic.go:358] "Generic (PLEG): container finished" podID="b8b04784-5ad9-4bed-aa49-a333ceb310dc" containerID="307319b2b76cd561889ebc8b5b533d39f437bc81e49cbb37a60244131826d729" exitCode=0
Apr 16 20:45:00.707112 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:00.706738 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmffvq" event={"ID":"b8b04784-5ad9-4bed-aa49-a333ceb310dc","Type":"ContainerDied","Data":"307319b2b76cd561889ebc8b5b533d39f437bc81e49cbb37a60244131826d729"}
Apr 16 20:45:00.848527 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:00.848502 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-6s6zx"]
Apr 16 20:45:00.851400 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:00.851384 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-6s6zx"
Apr 16 20:45:00.853696 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:00.853676 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-jfx22\""
Apr 16 20:45:00.859877 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:00.859857 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-6s6zx"]
Apr 16 20:45:00.888977 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:00.888953 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b5dae3ab-ddd9-4d53-ae8a-4ecdf8d8ed96-bound-sa-token\") pod \"cert-manager-759f64656b-6s6zx\" (UID: \"b5dae3ab-ddd9-4d53-ae8a-4ecdf8d8ed96\") " pod="cert-manager/cert-manager-759f64656b-6s6zx"
Apr 16 20:45:00.889071 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:00.889003 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnsbc\" (UniqueName: \"kubernetes.io/projected/b5dae3ab-ddd9-4d53-ae8a-4ecdf8d8ed96-kube-api-access-rnsbc\") pod \"cert-manager-759f64656b-6s6zx\" (UID: \"b5dae3ab-ddd9-4d53-ae8a-4ecdf8d8ed96\") " pod="cert-manager/cert-manager-759f64656b-6s6zx"
Apr 16 20:45:00.990349 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:00.990278 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rnsbc\" (UniqueName: \"kubernetes.io/projected/b5dae3ab-ddd9-4d53-ae8a-4ecdf8d8ed96-kube-api-access-rnsbc\") pod \"cert-manager-759f64656b-6s6zx\" (UID: \"b5dae3ab-ddd9-4d53-ae8a-4ecdf8d8ed96\") " pod="cert-manager/cert-manager-759f64656b-6s6zx"
Apr 16 20:45:00.990467 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:00.990348 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName:
\"kubernetes.io/projected/b5dae3ab-ddd9-4d53-ae8a-4ecdf8d8ed96-bound-sa-token\") pod \"cert-manager-759f64656b-6s6zx\" (UID: \"b5dae3ab-ddd9-4d53-ae8a-4ecdf8d8ed96\") " pod="cert-manager/cert-manager-759f64656b-6s6zx" Apr 16 20:45:00.998477 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:00.998458 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b5dae3ab-ddd9-4d53-ae8a-4ecdf8d8ed96-bound-sa-token\") pod \"cert-manager-759f64656b-6s6zx\" (UID: \"b5dae3ab-ddd9-4d53-ae8a-4ecdf8d8ed96\") " pod="cert-manager/cert-manager-759f64656b-6s6zx" Apr 16 20:45:00.998601 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:00.998585 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnsbc\" (UniqueName: \"kubernetes.io/projected/b5dae3ab-ddd9-4d53-ae8a-4ecdf8d8ed96-kube-api-access-rnsbc\") pod \"cert-manager-759f64656b-6s6zx\" (UID: \"b5dae3ab-ddd9-4d53-ae8a-4ecdf8d8ed96\") " pod="cert-manager/cert-manager-759f64656b-6s6zx" Apr 16 20:45:01.160547 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:01.160517 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-6s6zx" Apr 16 20:45:01.289923 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:01.289840 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-6s6zx"] Apr 16 20:45:01.292192 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:45:01.292153 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5dae3ab_ddd9_4d53_ae8a_4ecdf8d8ed96.slice/crio-117a714741375734cb9e3bb80770f1844e51f69b8c1c0a2ed359407cc2dbfbf6 WatchSource:0}: Error finding container 117a714741375734cb9e3bb80770f1844e51f69b8c1c0a2ed359407cc2dbfbf6: Status 404 returned error can't find the container with id 117a714741375734cb9e3bb80770f1844e51f69b8c1c0a2ed359407cc2dbfbf6 Apr 16 20:45:01.695555 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:01.695522 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-mndnn" Apr 16 20:45:01.711681 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:01.711651 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-6s6zx" event={"ID":"b5dae3ab-ddd9-4d53-ae8a-4ecdf8d8ed96","Type":"ContainerStarted","Data":"9e3322fad7662050b6928a6fbe860326127ebfed4794b0bf386d08dd2f17c86a"} Apr 16 20:45:01.712101 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:01.711687 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-6s6zx" event={"ID":"b5dae3ab-ddd9-4d53-ae8a-4ecdf8d8ed96","Type":"ContainerStarted","Data":"117a714741375734cb9e3bb80770f1844e51f69b8c1c0a2ed359407cc2dbfbf6"} Apr 16 20:45:07.732600 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:07.732569 2575 generic.go:358] "Generic (PLEG): container finished" podID="b8b04784-5ad9-4bed-aa49-a333ceb310dc" containerID="6f8a50832f3754c114a13762871142c965119c957557ea05f156bc2df5c16bed" exitCode=0 Apr 16 
20:45:07.732973 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:07.732655 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmffvq" event={"ID":"b8b04784-5ad9-4bed-aa49-a333ceb310dc","Type":"ContainerDied","Data":"6f8a50832f3754c114a13762871142c965119c957557ea05f156bc2df5c16bed"} Apr 16 20:45:07.751015 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:07.750975 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-6s6zx" podStartSLOduration=7.750964687 podStartE2EDuration="7.750964687s" podCreationTimestamp="2026-04-16 20:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:45:01.731219684 +0000 UTC m=+422.883315738" watchObservedRunningTime="2026-04-16 20:45:07.750964687 +0000 UTC m=+428.903060740" Apr 16 20:45:08.855809 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:08.855788 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmffvq" Apr 16 20:45:08.957253 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:08.957224 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b8b04784-5ad9-4bed-aa49-a333ceb310dc-bundle\") pod \"b8b04784-5ad9-4bed-aa49-a333ceb310dc\" (UID: \"b8b04784-5ad9-4bed-aa49-a333ceb310dc\") " Apr 16 20:45:08.957392 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:08.957274 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw5dv\" (UniqueName: \"kubernetes.io/projected/b8b04784-5ad9-4bed-aa49-a333ceb310dc-kube-api-access-dw5dv\") pod \"b8b04784-5ad9-4bed-aa49-a333ceb310dc\" (UID: \"b8b04784-5ad9-4bed-aa49-a333ceb310dc\") " Apr 16 20:45:08.957392 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:08.957320 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b8b04784-5ad9-4bed-aa49-a333ceb310dc-util\") pod \"b8b04784-5ad9-4bed-aa49-a333ceb310dc\" (UID: \"b8b04784-5ad9-4bed-aa49-a333ceb310dc\") " Apr 16 20:45:08.957670 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:08.957649 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8b04784-5ad9-4bed-aa49-a333ceb310dc-bundle" (OuterVolumeSpecName: "bundle") pod "b8b04784-5ad9-4bed-aa49-a333ceb310dc" (UID: "b8b04784-5ad9-4bed-aa49-a333ceb310dc"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:45:08.959440 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:08.959418 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8b04784-5ad9-4bed-aa49-a333ceb310dc-kube-api-access-dw5dv" (OuterVolumeSpecName: "kube-api-access-dw5dv") pod "b8b04784-5ad9-4bed-aa49-a333ceb310dc" (UID: "b8b04784-5ad9-4bed-aa49-a333ceb310dc"). InnerVolumeSpecName "kube-api-access-dw5dv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:45:08.961341 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:08.961322 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8b04784-5ad9-4bed-aa49-a333ceb310dc-util" (OuterVolumeSpecName: "util") pod "b8b04784-5ad9-4bed-aa49-a333ceb310dc" (UID: "b8b04784-5ad9-4bed-aa49-a333ceb310dc"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:45:09.057923 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:09.057868 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b8b04784-5ad9-4bed-aa49-a333ceb310dc-bundle\") on node \"ip-10-0-134-79.ec2.internal\" DevicePath \"\"" Apr 16 20:45:09.057923 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:09.057890 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dw5dv\" (UniqueName: \"kubernetes.io/projected/b8b04784-5ad9-4bed-aa49-a333ceb310dc-kube-api-access-dw5dv\") on node \"ip-10-0-134-79.ec2.internal\" DevicePath \"\"" Apr 16 20:45:09.057923 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:09.057902 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b8b04784-5ad9-4bed-aa49-a333ceb310dc-util\") on node \"ip-10-0-134-79.ec2.internal\" DevicePath \"\"" Apr 16 20:45:09.738873 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:09.738789 2575 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmffvq" event={"ID":"b8b04784-5ad9-4bed-aa49-a333ceb310dc","Type":"ContainerDied","Data":"606e08b197acf64f3ceeac82c67fc1d467329b87736563013213efb113bfd7aa"} Apr 16 20:45:09.738873 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:09.738833 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="606e08b197acf64f3ceeac82c67fc1d467329b87736563013213efb113bfd7aa" Apr 16 20:45:09.738873 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:09.738856 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fmffvq" Apr 16 20:45:14.289827 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:14.289795 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-6q45h"] Apr 16 20:45:14.290194 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:14.290081 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8b04784-5ad9-4bed-aa49-a333ceb310dc" containerName="pull" Apr 16 20:45:14.290194 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:14.290091 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8b04784-5ad9-4bed-aa49-a333ceb310dc" containerName="pull" Apr 16 20:45:14.290194 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:14.290108 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8b04784-5ad9-4bed-aa49-a333ceb310dc" containerName="extract" Apr 16 20:45:14.290194 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:14.290114 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8b04784-5ad9-4bed-aa49-a333ceb310dc" containerName="extract" Apr 16 20:45:14.290194 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:14.290124 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="b8b04784-5ad9-4bed-aa49-a333ceb310dc" containerName="util" Apr 16 20:45:14.290194 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:14.290130 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8b04784-5ad9-4bed-aa49-a333ceb310dc" containerName="util" Apr 16 20:45:14.290194 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:14.290172 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="b8b04784-5ad9-4bed-aa49-a333ceb310dc" containerName="extract" Apr 16 20:45:14.294200 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:14.294182 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-6q45h" Apr 16 20:45:14.296825 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:14.296803 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:45:14.297798 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:14.297779 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-t56zg\"" Apr 16 20:45:14.297909 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:14.297786 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 20:45:14.306674 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:14.306651 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-6q45h"] Apr 16 20:45:14.397134 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:14.397105 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5d2r\" (UniqueName: \"kubernetes.io/projected/33b595a5-5518-4df2-abf2-db9264869040-kube-api-access-s5d2r\") pod \"openshift-lws-operator-bfc7f696d-6q45h\" (UID: \"33b595a5-5518-4df2-abf2-db9264869040\") " 
pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-6q45h" Apr 16 20:45:14.397278 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:14.397156 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/33b595a5-5518-4df2-abf2-db9264869040-tmp\") pod \"openshift-lws-operator-bfc7f696d-6q45h\" (UID: \"33b595a5-5518-4df2-abf2-db9264869040\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-6q45h" Apr 16 20:45:14.498001 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:14.497974 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5d2r\" (UniqueName: \"kubernetes.io/projected/33b595a5-5518-4df2-abf2-db9264869040-kube-api-access-s5d2r\") pod \"openshift-lws-operator-bfc7f696d-6q45h\" (UID: \"33b595a5-5518-4df2-abf2-db9264869040\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-6q45h" Apr 16 20:45:14.498135 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:14.498021 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/33b595a5-5518-4df2-abf2-db9264869040-tmp\") pod \"openshift-lws-operator-bfc7f696d-6q45h\" (UID: \"33b595a5-5518-4df2-abf2-db9264869040\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-6q45h" Apr 16 20:45:14.498419 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:14.498402 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/33b595a5-5518-4df2-abf2-db9264869040-tmp\") pod \"openshift-lws-operator-bfc7f696d-6q45h\" (UID: \"33b595a5-5518-4df2-abf2-db9264869040\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-6q45h" Apr 16 20:45:14.506535 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:14.506514 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5d2r\" (UniqueName: 
\"kubernetes.io/projected/33b595a5-5518-4df2-abf2-db9264869040-kube-api-access-s5d2r\") pod \"openshift-lws-operator-bfc7f696d-6q45h\" (UID: \"33b595a5-5518-4df2-abf2-db9264869040\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-6q45h" Apr 16 20:45:14.603089 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:14.603067 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-6q45h" Apr 16 20:45:14.721718 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:14.721688 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-6q45h"] Apr 16 20:45:14.725123 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:45:14.725096 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33b595a5_5518_4df2_abf2_db9264869040.slice/crio-7b05f9eb2ecb455b06fbcc26ea8840604b243af0cacf147644ed62336fbcdf91 WatchSource:0}: Error finding container 7b05f9eb2ecb455b06fbcc26ea8840604b243af0cacf147644ed62336fbcdf91: Status 404 returned error can't find the container with id 7b05f9eb2ecb455b06fbcc26ea8840604b243af0cacf147644ed62336fbcdf91 Apr 16 20:45:14.757171 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:14.757138 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-6q45h" event={"ID":"33b595a5-5518-4df2-abf2-db9264869040","Type":"ContainerStarted","Data":"7b05f9eb2ecb455b06fbcc26ea8840604b243af0cacf147644ed62336fbcdf91"} Apr 16 20:45:17.767511 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:17.767472 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-6q45h" event={"ID":"33b595a5-5518-4df2-abf2-db9264869040","Type":"ContainerStarted","Data":"0081e43fccdaff18b9e16c440115424ddef3b05e816ae1f189296f75b020f6bb"} Apr 16 20:45:17.784254 ip-10-0-134-79 
kubenswrapper[2575]: I0416 20:45:17.784196 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-6q45h" podStartSLOduration=1.3321204039999999 podStartE2EDuration="3.784179361s" podCreationTimestamp="2026-04-16 20:45:14 +0000 UTC" firstStartedPulling="2026-04-16 20:45:14.726949812 +0000 UTC m=+435.879045843" lastFinishedPulling="2026-04-16 20:45:17.179008765 +0000 UTC m=+438.331104800" observedRunningTime="2026-04-16 20:45:17.781295213 +0000 UTC m=+438.933391266" watchObservedRunningTime="2026-04-16 20:45:17.784179361 +0000 UTC m=+438.936275416" Apr 16 20:45:32.760310 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:32.760275 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-7cd8df7dd5-k9tbf"] Apr 16 20:45:32.769547 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:32.769526 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-7cd8df7dd5-k9tbf" Apr 16 20:45:32.772451 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:32.772426 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 16 20:45:32.772570 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:32.772502 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 16 20:45:32.772570 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:32.772526 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 16 20:45:32.772570 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:32.772558 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 16 20:45:32.772745 ip-10-0-134-79 kubenswrapper[2575]: 
I0416 20:45:32.772502 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-lt7rc\"" Apr 16 20:45:32.778351 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:32.778331 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-7cd8df7dd5-k9tbf"] Apr 16 20:45:32.849418 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:32.849385 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dh6h\" (UniqueName: \"kubernetes.io/projected/8383b67d-5b7d-418f-bd5e-64050816cc24-kube-api-access-2dh6h\") pod \"opendatahub-operator-controller-manager-7cd8df7dd5-k9tbf\" (UID: \"8383b67d-5b7d-418f-bd5e-64050816cc24\") " pod="opendatahub/opendatahub-operator-controller-manager-7cd8df7dd5-k9tbf" Apr 16 20:45:32.849537 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:32.849423 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8383b67d-5b7d-418f-bd5e-64050816cc24-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7cd8df7dd5-k9tbf\" (UID: \"8383b67d-5b7d-418f-bd5e-64050816cc24\") " pod="opendatahub/opendatahub-operator-controller-manager-7cd8df7dd5-k9tbf" Apr 16 20:45:32.849537 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:32.849456 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8383b67d-5b7d-418f-bd5e-64050816cc24-webhook-cert\") pod \"opendatahub-operator-controller-manager-7cd8df7dd5-k9tbf\" (UID: \"8383b67d-5b7d-418f-bd5e-64050816cc24\") " pod="opendatahub/opendatahub-operator-controller-manager-7cd8df7dd5-k9tbf" Apr 16 20:45:32.949856 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:32.949826 2575 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8383b67d-5b7d-418f-bd5e-64050816cc24-webhook-cert\") pod \"opendatahub-operator-controller-manager-7cd8df7dd5-k9tbf\" (UID: \"8383b67d-5b7d-418f-bd5e-64050816cc24\") " pod="opendatahub/opendatahub-operator-controller-manager-7cd8df7dd5-k9tbf" Apr 16 20:45:32.949985 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:32.949896 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2dh6h\" (UniqueName: \"kubernetes.io/projected/8383b67d-5b7d-418f-bd5e-64050816cc24-kube-api-access-2dh6h\") pod \"opendatahub-operator-controller-manager-7cd8df7dd5-k9tbf\" (UID: \"8383b67d-5b7d-418f-bd5e-64050816cc24\") " pod="opendatahub/opendatahub-operator-controller-manager-7cd8df7dd5-k9tbf" Apr 16 20:45:32.949985 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:32.949918 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8383b67d-5b7d-418f-bd5e-64050816cc24-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7cd8df7dd5-k9tbf\" (UID: \"8383b67d-5b7d-418f-bd5e-64050816cc24\") " pod="opendatahub/opendatahub-operator-controller-manager-7cd8df7dd5-k9tbf" Apr 16 20:45:32.952235 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:32.952207 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8383b67d-5b7d-418f-bd5e-64050816cc24-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7cd8df7dd5-k9tbf\" (UID: \"8383b67d-5b7d-418f-bd5e-64050816cc24\") " pod="opendatahub/opendatahub-operator-controller-manager-7cd8df7dd5-k9tbf" Apr 16 20:45:32.952328 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:32.952254 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8383b67d-5b7d-418f-bd5e-64050816cc24-webhook-cert\") pod 
\"opendatahub-operator-controller-manager-7cd8df7dd5-k9tbf\" (UID: \"8383b67d-5b7d-418f-bd5e-64050816cc24\") " pod="opendatahub/opendatahub-operator-controller-manager-7cd8df7dd5-k9tbf" Apr 16 20:45:32.957442 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:32.957425 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dh6h\" (UniqueName: \"kubernetes.io/projected/8383b67d-5b7d-418f-bd5e-64050816cc24-kube-api-access-2dh6h\") pod \"opendatahub-operator-controller-manager-7cd8df7dd5-k9tbf\" (UID: \"8383b67d-5b7d-418f-bd5e-64050816cc24\") " pod="opendatahub/opendatahub-operator-controller-manager-7cd8df7dd5-k9tbf" Apr 16 20:45:33.080439 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:33.080354 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-7cd8df7dd5-k9tbf" Apr 16 20:45:33.206924 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:33.206902 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-7cd8df7dd5-k9tbf"] Apr 16 20:45:33.209153 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:45:33.209123 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8383b67d_5b7d_418f_bd5e_64050816cc24.slice/crio-f2a965c9e77bc2b0e4e5d299db403a9df389eda62862445e19892a4170b7be49 WatchSource:0}: Error finding container f2a965c9e77bc2b0e4e5d299db403a9df389eda62862445e19892a4170b7be49: Status 404 returned error can't find the container with id f2a965c9e77bc2b0e4e5d299db403a9df389eda62862445e19892a4170b7be49 Apr 16 20:45:33.816104 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:33.816068 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-7cd8df7dd5-k9tbf" 
event={"ID":"8383b67d-5b7d-418f-bd5e-64050816cc24","Type":"ContainerStarted","Data":"f2a965c9e77bc2b0e4e5d299db403a9df389eda62862445e19892a4170b7be49"} Apr 16 20:45:36.828155 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:36.828115 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-7cd8df7dd5-k9tbf" event={"ID":"8383b67d-5b7d-418f-bd5e-64050816cc24","Type":"ContainerStarted","Data":"a7d52d9b153545c76b3bbc8105dcd457723f74af5d80e9e99ffe414355be025e"} Apr 16 20:45:36.828155 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:36.828159 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-7cd8df7dd5-k9tbf" Apr 16 20:45:36.848016 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:36.847960 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-7cd8df7dd5-k9tbf" podStartSLOduration=2.299850523 podStartE2EDuration="4.847948166s" podCreationTimestamp="2026-04-16 20:45:32 +0000 UTC" firstStartedPulling="2026-04-16 20:45:33.210940353 +0000 UTC m=+454.363036397" lastFinishedPulling="2026-04-16 20:45:35.759038009 +0000 UTC m=+456.911134040" observedRunningTime="2026-04-16 20:45:36.846648218 +0000 UTC m=+457.998744265" watchObservedRunningTime="2026-04-16 20:45:36.847948166 +0000 UTC m=+458.000044263" Apr 16 20:45:47.832573 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:47.832545 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-7cd8df7dd5-k9tbf" Apr 16 20:45:49.868123 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:49.868087 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9nf6"] Apr 16 20:45:49.872448 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:49.872430 2575 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9nf6" Apr 16 20:45:49.875413 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:49.875389 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 20:45:49.875548 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:49.875476 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-7g5mm\"" Apr 16 20:45:49.876410 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:49.876385 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 20:45:49.885748 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:49.885718 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9nf6"] Apr 16 20:45:49.984267 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:49.984236 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b8d4df1-aaba-456e-b990-188b43159722-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9nf6\" (UID: \"6b8d4df1-aaba-456e-b990-188b43159722\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9nf6" Apr 16 20:45:49.984411 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:49.984273 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b8d4df1-aaba-456e-b990-188b43159722-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9nf6\" (UID: \"6b8d4df1-aaba-456e-b990-188b43159722\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9nf6" Apr 16 20:45:49.984411 
ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:49.984349 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8cnx\" (UniqueName: \"kubernetes.io/projected/6b8d4df1-aaba-456e-b990-188b43159722-kube-api-access-m8cnx\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9nf6\" (UID: \"6b8d4df1-aaba-456e-b990-188b43159722\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9nf6" Apr 16 20:45:50.084852 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:50.084822 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b8d4df1-aaba-456e-b990-188b43159722-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9nf6\" (UID: \"6b8d4df1-aaba-456e-b990-188b43159722\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9nf6" Apr 16 20:45:50.084980 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:50.084858 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b8d4df1-aaba-456e-b990-188b43159722-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9nf6\" (UID: \"6b8d4df1-aaba-456e-b990-188b43159722\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9nf6" Apr 16 20:45:50.084980 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:50.084896 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m8cnx\" (UniqueName: \"kubernetes.io/projected/6b8d4df1-aaba-456e-b990-188b43159722-kube-api-access-m8cnx\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9nf6\" (UID: \"6b8d4df1-aaba-456e-b990-188b43159722\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9nf6" Apr 16 20:45:50.085215 ip-10-0-134-79 
kubenswrapper[2575]: I0416 20:45:50.085198 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b8d4df1-aaba-456e-b990-188b43159722-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9nf6\" (UID: \"6b8d4df1-aaba-456e-b990-188b43159722\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9nf6" Apr 16 20:45:50.085253 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:50.085224 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b8d4df1-aaba-456e-b990-188b43159722-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9nf6\" (UID: \"6b8d4df1-aaba-456e-b990-188b43159722\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9nf6" Apr 16 20:45:50.092129 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:50.092105 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8cnx\" (UniqueName: \"kubernetes.io/projected/6b8d4df1-aaba-456e-b990-188b43159722-kube-api-access-m8cnx\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9nf6\" (UID: \"6b8d4df1-aaba-456e-b990-188b43159722\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9nf6" Apr 16 20:45:50.181657 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:50.181601 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9nf6" Apr 16 20:45:50.299937 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:50.299903 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9nf6"] Apr 16 20:45:50.303270 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:45:50.303237 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b8d4df1_aaba_456e_b990_188b43159722.slice/crio-491e6f1663446ac010936aed837842d452768ea19234609ab85a23404a43a2a3 WatchSource:0}: Error finding container 491e6f1663446ac010936aed837842d452768ea19234609ab85a23404a43a2a3: Status 404 returned error can't find the container with id 491e6f1663446ac010936aed837842d452768ea19234609ab85a23404a43a2a3 Apr 16 20:45:50.879876 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:50.879842 2575 generic.go:358] "Generic (PLEG): container finished" podID="6b8d4df1-aaba-456e-b990-188b43159722" containerID="bf4cd175088b3d1df9b9ece663a903059186d80858c1333dc08bbf9d26215fb5" exitCode=0 Apr 16 20:45:50.880284 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:50.879893 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9nf6" event={"ID":"6b8d4df1-aaba-456e-b990-188b43159722","Type":"ContainerDied","Data":"bf4cd175088b3d1df9b9ece663a903059186d80858c1333dc08bbf9d26215fb5"} Apr 16 20:45:50.880284 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:50.879914 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9nf6" event={"ID":"6b8d4df1-aaba-456e-b990-188b43159722","Type":"ContainerStarted","Data":"491e6f1663446ac010936aed837842d452768ea19234609ab85a23404a43a2a3"} Apr 16 20:45:51.069744 ip-10-0-134-79 kubenswrapper[2575]: I0416 
20:45:51.069714 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-588879f674-jw4ml"] Apr 16 20:45:51.073909 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:51.073888 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-588879f674-jw4ml" Apr 16 20:45:51.076214 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:51.076185 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 20:45:51.076321 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:51.076257 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 20:45:51.076396 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:51.076380 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-tlw27\"" Apr 16 20:45:51.076450 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:51.076385 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 16 20:45:51.076450 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:51.076442 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 16 20:45:51.083721 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:51.083698 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-588879f674-jw4ml"] Apr 16 20:45:51.094045 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:51.094020 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w74sh\" (UniqueName: \"kubernetes.io/projected/1ea57899-46bc-4856-96b3-14087e5176ba-kube-api-access-w74sh\") pod \"kube-auth-proxy-588879f674-jw4ml\" (UID: \"1ea57899-46bc-4856-96b3-14087e5176ba\") " 
pod="openshift-ingress/kube-auth-proxy-588879f674-jw4ml" Apr 16 20:45:51.094161 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:51.094062 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1ea57899-46bc-4856-96b3-14087e5176ba-tls-certs\") pod \"kube-auth-proxy-588879f674-jw4ml\" (UID: \"1ea57899-46bc-4856-96b3-14087e5176ba\") " pod="openshift-ingress/kube-auth-proxy-588879f674-jw4ml" Apr 16 20:45:51.094161 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:51.094117 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1ea57899-46bc-4856-96b3-14087e5176ba-tmp\") pod \"kube-auth-proxy-588879f674-jw4ml\" (UID: \"1ea57899-46bc-4856-96b3-14087e5176ba\") " pod="openshift-ingress/kube-auth-proxy-588879f674-jw4ml" Apr 16 20:45:51.194655 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:51.194541 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w74sh\" (UniqueName: \"kubernetes.io/projected/1ea57899-46bc-4856-96b3-14087e5176ba-kube-api-access-w74sh\") pod \"kube-auth-proxy-588879f674-jw4ml\" (UID: \"1ea57899-46bc-4856-96b3-14087e5176ba\") " pod="openshift-ingress/kube-auth-proxy-588879f674-jw4ml" Apr 16 20:45:51.194655 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:51.194581 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1ea57899-46bc-4856-96b3-14087e5176ba-tls-certs\") pod \"kube-auth-proxy-588879f674-jw4ml\" (UID: \"1ea57899-46bc-4856-96b3-14087e5176ba\") " pod="openshift-ingress/kube-auth-proxy-588879f674-jw4ml" Apr 16 20:45:51.194655 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:51.194633 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/1ea57899-46bc-4856-96b3-14087e5176ba-tmp\") pod \"kube-auth-proxy-588879f674-jw4ml\" (UID: \"1ea57899-46bc-4856-96b3-14087e5176ba\") " pod="openshift-ingress/kube-auth-proxy-588879f674-jw4ml" Apr 16 20:45:51.196890 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:51.196865 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1ea57899-46bc-4856-96b3-14087e5176ba-tmp\") pod \"kube-auth-proxy-588879f674-jw4ml\" (UID: \"1ea57899-46bc-4856-96b3-14087e5176ba\") " pod="openshift-ingress/kube-auth-proxy-588879f674-jw4ml" Apr 16 20:45:51.196995 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:51.196968 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1ea57899-46bc-4856-96b3-14087e5176ba-tls-certs\") pod \"kube-auth-proxy-588879f674-jw4ml\" (UID: \"1ea57899-46bc-4856-96b3-14087e5176ba\") " pod="openshift-ingress/kube-auth-proxy-588879f674-jw4ml" Apr 16 20:45:51.203814 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:51.203794 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w74sh\" (UniqueName: \"kubernetes.io/projected/1ea57899-46bc-4856-96b3-14087e5176ba-kube-api-access-w74sh\") pod \"kube-auth-proxy-588879f674-jw4ml\" (UID: \"1ea57899-46bc-4856-96b3-14087e5176ba\") " pod="openshift-ingress/kube-auth-proxy-588879f674-jw4ml" Apr 16 20:45:51.384649 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:51.384606 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-588879f674-jw4ml" Apr 16 20:45:51.499269 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:51.499096 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-588879f674-jw4ml"] Apr 16 20:45:51.501692 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:45:51.501666 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ea57899_46bc_4856_96b3_14087e5176ba.slice/crio-d13d7286734858da2e80102ffcbe86813cdcdbd02786e16c55f7202ba0b76def WatchSource:0}: Error finding container d13d7286734858da2e80102ffcbe86813cdcdbd02786e16c55f7202ba0b76def: Status 404 returned error can't find the container with id d13d7286734858da2e80102ffcbe86813cdcdbd02786e16c55f7202ba0b76def Apr 16 20:45:51.886915 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:51.886883 2575 generic.go:358] "Generic (PLEG): container finished" podID="6b8d4df1-aaba-456e-b990-188b43159722" containerID="38f0b8759ed7f749aaea01cdd92c1ec1f050cc5c0cf7680367977156c6b8ce87" exitCode=0 Apr 16 20:45:51.887397 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:51.886953 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9nf6" event={"ID":"6b8d4df1-aaba-456e-b990-188b43159722","Type":"ContainerDied","Data":"38f0b8759ed7f749aaea01cdd92c1ec1f050cc5c0cf7680367977156c6b8ce87"} Apr 16 20:45:51.888121 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:51.888097 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-588879f674-jw4ml" event={"ID":"1ea57899-46bc-4856-96b3-14087e5176ba","Type":"ContainerStarted","Data":"d13d7286734858da2e80102ffcbe86813cdcdbd02786e16c55f7202ba0b76def"} Apr 16 20:45:52.894540 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:52.894494 2575 generic.go:358] "Generic (PLEG): container finished" 
podID="6b8d4df1-aaba-456e-b990-188b43159722" containerID="4a2c0bb0d472beb2d0e88661a60b54086f7f6b1191e6457965d0d5c247cc77a7" exitCode=0 Apr 16 20:45:52.895058 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:52.894543 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9nf6" event={"ID":"6b8d4df1-aaba-456e-b990-188b43159722","Type":"ContainerDied","Data":"4a2c0bb0d472beb2d0e88661a60b54086f7f6b1191e6457965d0d5c247cc77a7"} Apr 16 20:45:53.530501 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:53.530464 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-sqh8k"] Apr 16 20:45:53.533642 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:53.533610 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-sqh8k" Apr 16 20:45:53.535937 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:53.535915 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 16 20:45:53.535937 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:53.535929 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-4q4r8\"" Apr 16 20:45:53.544820 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:53.544791 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-sqh8k"] Apr 16 20:45:53.609221 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:53.609197 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c74987a-f7b9-4c73-be86-2ca4d965965e-cert\") pod \"odh-model-controller-858dbf95b8-sqh8k\" (UID: \"0c74987a-f7b9-4c73-be86-2ca4d965965e\") " pod="opendatahub/odh-model-controller-858dbf95b8-sqh8k" Apr 16 20:45:53.609324 ip-10-0-134-79 
kubenswrapper[2575]: I0416 20:45:53.609272 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvlcv\" (UniqueName: \"kubernetes.io/projected/0c74987a-f7b9-4c73-be86-2ca4d965965e-kube-api-access-hvlcv\") pod \"odh-model-controller-858dbf95b8-sqh8k\" (UID: \"0c74987a-f7b9-4c73-be86-2ca4d965965e\") " pod="opendatahub/odh-model-controller-858dbf95b8-sqh8k" Apr 16 20:45:53.710243 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:53.710217 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hvlcv\" (UniqueName: \"kubernetes.io/projected/0c74987a-f7b9-4c73-be86-2ca4d965965e-kube-api-access-hvlcv\") pod \"odh-model-controller-858dbf95b8-sqh8k\" (UID: \"0c74987a-f7b9-4c73-be86-2ca4d965965e\") " pod="opendatahub/odh-model-controller-858dbf95b8-sqh8k" Apr 16 20:45:53.710367 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:53.710257 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c74987a-f7b9-4c73-be86-2ca4d965965e-cert\") pod \"odh-model-controller-858dbf95b8-sqh8k\" (UID: \"0c74987a-f7b9-4c73-be86-2ca4d965965e\") " pod="opendatahub/odh-model-controller-858dbf95b8-sqh8k" Apr 16 20:45:53.710407 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:45:53.710386 2575 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 16 20:45:53.710459 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:45:53.710448 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c74987a-f7b9-4c73-be86-2ca4d965965e-cert podName:0c74987a-f7b9-4c73-be86-2ca4d965965e nodeName:}" failed. No retries permitted until 2026-04-16 20:45:54.210428327 +0000 UTC m=+475.362524358 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0c74987a-f7b9-4c73-be86-2ca4d965965e-cert") pod "odh-model-controller-858dbf95b8-sqh8k" (UID: "0c74987a-f7b9-4c73-be86-2ca4d965965e") : secret "odh-model-controller-webhook-cert" not found Apr 16 20:45:53.726110 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:53.726089 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvlcv\" (UniqueName: \"kubernetes.io/projected/0c74987a-f7b9-4c73-be86-2ca4d965965e-kube-api-access-hvlcv\") pod \"odh-model-controller-858dbf95b8-sqh8k\" (UID: \"0c74987a-f7b9-4c73-be86-2ca4d965965e\") " pod="opendatahub/odh-model-controller-858dbf95b8-sqh8k" Apr 16 20:45:54.213243 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:54.213152 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c74987a-f7b9-4c73-be86-2ca4d965965e-cert\") pod \"odh-model-controller-858dbf95b8-sqh8k\" (UID: \"0c74987a-f7b9-4c73-be86-2ca4d965965e\") " pod="opendatahub/odh-model-controller-858dbf95b8-sqh8k" Apr 16 20:45:54.215759 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:54.215722 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0c74987a-f7b9-4c73-be86-2ca4d965965e-cert\") pod \"odh-model-controller-858dbf95b8-sqh8k\" (UID: \"0c74987a-f7b9-4c73-be86-2ca4d965965e\") " pod="opendatahub/odh-model-controller-858dbf95b8-sqh8k" Apr 16 20:45:54.451154 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:54.451121 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-sqh8k" Apr 16 20:45:54.648760 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:54.648739 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9nf6" Apr 16 20:45:54.717330 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:54.717303 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b8d4df1-aaba-456e-b990-188b43159722-bundle\") pod \"6b8d4df1-aaba-456e-b990-188b43159722\" (UID: \"6b8d4df1-aaba-456e-b990-188b43159722\") " Apr 16 20:45:54.717436 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:54.717357 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b8d4df1-aaba-456e-b990-188b43159722-util\") pod \"6b8d4df1-aaba-456e-b990-188b43159722\" (UID: \"6b8d4df1-aaba-456e-b990-188b43159722\") " Apr 16 20:45:54.717436 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:54.717388 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8cnx\" (UniqueName: \"kubernetes.io/projected/6b8d4df1-aaba-456e-b990-188b43159722-kube-api-access-m8cnx\") pod \"6b8d4df1-aaba-456e-b990-188b43159722\" (UID: \"6b8d4df1-aaba-456e-b990-188b43159722\") " Apr 16 20:45:54.718230 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:54.718205 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b8d4df1-aaba-456e-b990-188b43159722-bundle" (OuterVolumeSpecName: "bundle") pod "6b8d4df1-aaba-456e-b990-188b43159722" (UID: "6b8d4df1-aaba-456e-b990-188b43159722"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:45:54.719487 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:54.719462 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b8d4df1-aaba-456e-b990-188b43159722-kube-api-access-m8cnx" (OuterVolumeSpecName: "kube-api-access-m8cnx") pod "6b8d4df1-aaba-456e-b990-188b43159722" (UID: "6b8d4df1-aaba-456e-b990-188b43159722"). InnerVolumeSpecName "kube-api-access-m8cnx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:45:54.723363 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:54.723337 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b8d4df1-aaba-456e-b990-188b43159722-util" (OuterVolumeSpecName: "util") pod "6b8d4df1-aaba-456e-b990-188b43159722" (UID: "6b8d4df1-aaba-456e-b990-188b43159722"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:45:54.753509 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:54.753489 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-sqh8k"] Apr 16 20:45:54.755345 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:45:54.755319 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c74987a_f7b9_4c73_be86_2ca4d965965e.slice/crio-9631dffff96e1263b353c546123de77cc9d582822e8b1d7ac1a55ce76bbd8abc WatchSource:0}: Error finding container 9631dffff96e1263b353c546123de77cc9d582822e8b1d7ac1a55ce76bbd8abc: Status 404 returned error can't find the container with id 9631dffff96e1263b353c546123de77cc9d582822e8b1d7ac1a55ce76bbd8abc Apr 16 20:45:54.818633 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:54.818589 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6b8d4df1-aaba-456e-b990-188b43159722-bundle\") on node \"ip-10-0-134-79.ec2.internal\" 
DevicePath \"\"" Apr 16 20:45:54.818633 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:54.818633 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6b8d4df1-aaba-456e-b990-188b43159722-util\") on node \"ip-10-0-134-79.ec2.internal\" DevicePath \"\"" Apr 16 20:45:54.818794 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:54.818643 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m8cnx\" (UniqueName: \"kubernetes.io/projected/6b8d4df1-aaba-456e-b990-188b43159722-kube-api-access-m8cnx\") on node \"ip-10-0-134-79.ec2.internal\" DevicePath \"\"" Apr 16 20:45:54.905384 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:54.905304 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9nf6" event={"ID":"6b8d4df1-aaba-456e-b990-188b43159722","Type":"ContainerDied","Data":"491e6f1663446ac010936aed837842d452768ea19234609ab85a23404a43a2a3"} Apr 16 20:45:54.905384 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:54.905338 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835d9nf6" Apr 16 20:45:54.905559 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:54.905339 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="491e6f1663446ac010936aed837842d452768ea19234609ab85a23404a43a2a3" Apr 16 20:45:54.906659 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:54.906631 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-588879f674-jw4ml" event={"ID":"1ea57899-46bc-4856-96b3-14087e5176ba","Type":"ContainerStarted","Data":"ef6b7c1e4851c93230fcd126b1fb6a84c04d6ccb7c1627219b31d2eeb593172a"} Apr 16 20:45:54.907672 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:54.907646 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-sqh8k" event={"ID":"0c74987a-f7b9-4c73-be86-2ca4d965965e","Type":"ContainerStarted","Data":"9631dffff96e1263b353c546123de77cc9d582822e8b1d7ac1a55ce76bbd8abc"} Apr 16 20:45:54.922500 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:54.922461 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-588879f674-jw4ml" podStartSLOduration=0.734392005 podStartE2EDuration="3.922450127s" podCreationTimestamp="2026-04-16 20:45:51 +0000 UTC" firstStartedPulling="2026-04-16 20:45:51.503568688 +0000 UTC m=+472.655664719" lastFinishedPulling="2026-04-16 20:45:54.691626806 +0000 UTC m=+475.843722841" observedRunningTime="2026-04-16 20:45:54.921738989 +0000 UTC m=+476.073835260" watchObservedRunningTime="2026-04-16 20:45:54.922450127 +0000 UTC m=+476.074546180" Apr 16 20:45:57.920471 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:57.920430 2575 generic.go:358] "Generic (PLEG): container finished" podID="0c74987a-f7b9-4c73-be86-2ca4d965965e" containerID="a8402741d187c44c6d0dc9a77ea8f7da3ef56a23349282260dd525e6c53df0ea" exitCode=1 Apr 16 20:45:57.920856 
ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:57.920517 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-sqh8k" event={"ID":"0c74987a-f7b9-4c73-be86-2ca4d965965e","Type":"ContainerDied","Data":"a8402741d187c44c6d0dc9a77ea8f7da3ef56a23349282260dd525e6c53df0ea"} Apr 16 20:45:57.920856 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:57.920766 2575 scope.go:117] "RemoveContainer" containerID="a8402741d187c44c6d0dc9a77ea8f7da3ef56a23349282260dd525e6c53df0ea" Apr 16 20:45:58.925532 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:58.925490 2575 generic.go:358] "Generic (PLEG): container finished" podID="0c74987a-f7b9-4c73-be86-2ca4d965965e" containerID="4f62a12c09b3d5391f32db6b31a47791b97ce92fcc262c0fe1cbef5d7b7e89d6" exitCode=1 Apr 16 20:45:58.925952 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:58.925575 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-sqh8k" event={"ID":"0c74987a-f7b9-4c73-be86-2ca4d965965e","Type":"ContainerDied","Data":"4f62a12c09b3d5391f32db6b31a47791b97ce92fcc262c0fe1cbef5d7b7e89d6"} Apr 16 20:45:58.925952 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:58.925644 2575 scope.go:117] "RemoveContainer" containerID="a8402741d187c44c6d0dc9a77ea8f7da3ef56a23349282260dd525e6c53df0ea" Apr 16 20:45:58.925952 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:58.925802 2575 scope.go:117] "RemoveContainer" containerID="4f62a12c09b3d5391f32db6b31a47791b97ce92fcc262c0fe1cbef5d7b7e89d6" Apr 16 20:45:58.926104 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:45:58.926004 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-sqh8k_opendatahub(0c74987a-f7b9-4c73-be86-2ca4d965965e)\"" pod="opendatahub/odh-model-controller-858dbf95b8-sqh8k" podUID="0c74987a-f7b9-4c73-be86-2ca4d965965e" Apr 16 
20:45:59.262009 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:59.261938 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-6dwth"] Apr 16 20:45:59.262436 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:59.262420 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b8d4df1-aaba-456e-b990-188b43159722" containerName="util" Apr 16 20:45:59.262488 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:59.262439 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b8d4df1-aaba-456e-b990-188b43159722" containerName="util" Apr 16 20:45:59.262488 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:59.262458 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b8d4df1-aaba-456e-b990-188b43159722" containerName="extract" Apr 16 20:45:59.262488 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:59.262466 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b8d4df1-aaba-456e-b990-188b43159722" containerName="extract" Apr 16 20:45:59.262589 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:59.262489 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b8d4df1-aaba-456e-b990-188b43159722" containerName="pull" Apr 16 20:45:59.262589 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:59.262498 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b8d4df1-aaba-456e-b990-188b43159722" containerName="pull" Apr 16 20:45:59.262589 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:59.262585 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="6b8d4df1-aaba-456e-b990-188b43159722" containerName="extract" Apr 16 20:45:59.267045 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:59.267018 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-6dwth" Apr 16 20:45:59.269657 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:59.269633 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-zxcxx\"" Apr 16 20:45:59.269785 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:59.269702 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 16 20:45:59.282261 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:59.282241 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-6dwth"] Apr 16 20:45:59.355106 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:59.355078 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvq89\" (UniqueName: \"kubernetes.io/projected/3ae7c350-6f35-4b15-9616-e9b40f9a6c7f-kube-api-access-jvq89\") pod \"kserve-controller-manager-856948b99f-6dwth\" (UID: \"3ae7c350-6f35-4b15-9616-e9b40f9a6c7f\") " pod="opendatahub/kserve-controller-manager-856948b99f-6dwth" Apr 16 20:45:59.355263 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:59.355114 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ae7c350-6f35-4b15-9616-e9b40f9a6c7f-cert\") pod \"kserve-controller-manager-856948b99f-6dwth\" (UID: \"3ae7c350-6f35-4b15-9616-e9b40f9a6c7f\") " pod="opendatahub/kserve-controller-manager-856948b99f-6dwth" Apr 16 20:45:59.455567 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:59.455537 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jvq89\" (UniqueName: \"kubernetes.io/projected/3ae7c350-6f35-4b15-9616-e9b40f9a6c7f-kube-api-access-jvq89\") pod \"kserve-controller-manager-856948b99f-6dwth\" (UID: 
\"3ae7c350-6f35-4b15-9616-e9b40f9a6c7f\") " pod="opendatahub/kserve-controller-manager-856948b99f-6dwth" Apr 16 20:45:59.455768 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:59.455586 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ae7c350-6f35-4b15-9616-e9b40f9a6c7f-cert\") pod \"kserve-controller-manager-856948b99f-6dwth\" (UID: \"3ae7c350-6f35-4b15-9616-e9b40f9a6c7f\") " pod="opendatahub/kserve-controller-manager-856948b99f-6dwth" Apr 16 20:45:59.455768 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:45:59.455711 2575 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 16 20:45:59.455882 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:45:59.455774 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ae7c350-6f35-4b15-9616-e9b40f9a6c7f-cert podName:3ae7c350-6f35-4b15-9616-e9b40f9a6c7f nodeName:}" failed. No retries permitted until 2026-04-16 20:45:59.955757449 +0000 UTC m=+481.107853481 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3ae7c350-6f35-4b15-9616-e9b40f9a6c7f-cert") pod "kserve-controller-manager-856948b99f-6dwth" (UID: "3ae7c350-6f35-4b15-9616-e9b40f9a6c7f") : secret "kserve-webhook-server-cert" not found Apr 16 20:45:59.469824 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:59.469791 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvq89\" (UniqueName: \"kubernetes.io/projected/3ae7c350-6f35-4b15-9616-e9b40f9a6c7f-kube-api-access-jvq89\") pod \"kserve-controller-manager-856948b99f-6dwth\" (UID: \"3ae7c350-6f35-4b15-9616-e9b40f9a6c7f\") " pod="opendatahub/kserve-controller-manager-856948b99f-6dwth" Apr 16 20:45:59.930115 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:59.930089 2575 scope.go:117] "RemoveContainer" containerID="4f62a12c09b3d5391f32db6b31a47791b97ce92fcc262c0fe1cbef5d7b7e89d6" Apr 16 20:45:59.930465 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:45:59.930283 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-sqh8k_opendatahub(0c74987a-f7b9-4c73-be86-2ca4d965965e)\"" pod="opendatahub/odh-model-controller-858dbf95b8-sqh8k" podUID="0c74987a-f7b9-4c73-be86-2ca4d965965e" Apr 16 20:45:59.960673 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:59.960633 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3ae7c350-6f35-4b15-9616-e9b40f9a6c7f-cert\") pod \"kserve-controller-manager-856948b99f-6dwth\" (UID: \"3ae7c350-6f35-4b15-9616-e9b40f9a6c7f\") " pod="opendatahub/kserve-controller-manager-856948b99f-6dwth" Apr 16 20:45:59.962939 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:45:59.962916 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/3ae7c350-6f35-4b15-9616-e9b40f9a6c7f-cert\") pod \"kserve-controller-manager-856948b99f-6dwth\" (UID: \"3ae7c350-6f35-4b15-9616-e9b40f9a6c7f\") " pod="opendatahub/kserve-controller-manager-856948b99f-6dwth" Apr 16 20:46:00.177203 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:00.177165 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-6dwth" Apr 16 20:46:00.301465 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:00.301436 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-6dwth"] Apr 16 20:46:00.303368 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:46:00.303343 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ae7c350_6f35_4b15_9616_e9b40f9a6c7f.slice/crio-ac587eaa652c831ac2436da44bb8e782a38a7d55c6cd920ec10059ffa19ccacd WatchSource:0}: Error finding container ac587eaa652c831ac2436da44bb8e782a38a7d55c6cd920ec10059ffa19ccacd: Status 404 returned error can't find the container with id ac587eaa652c831ac2436da44bb8e782a38a7d55c6cd920ec10059ffa19ccacd Apr 16 20:46:00.934218 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:00.934184 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-6dwth" event={"ID":"3ae7c350-6f35-4b15-9616-e9b40f9a6c7f","Type":"ContainerStarted","Data":"ac587eaa652c831ac2436da44bb8e782a38a7d55c6cd920ec10059ffa19ccacd"} Apr 16 20:46:02.942110 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:02.942072 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-6dwth" event={"ID":"3ae7c350-6f35-4b15-9616-e9b40f9a6c7f","Type":"ContainerStarted","Data":"ffc8301d7795dc6ab352c50d0d567a4cc3eaba466ef53b00bc27cb591a349471"} Apr 16 20:46:02.942457 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:02.942191 2575 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-6dwth" Apr 16 20:46:02.986232 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:02.986148 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-6dwth" podStartSLOduration=1.5630000229999998 podStartE2EDuration="3.986136217s" podCreationTimestamp="2026-04-16 20:45:59 +0000 UTC" firstStartedPulling="2026-04-16 20:46:00.304752637 +0000 UTC m=+481.456848672" lastFinishedPulling="2026-04-16 20:46:02.727888835 +0000 UTC m=+483.879984866" observedRunningTime="2026-04-16 20:46:02.984171833 +0000 UTC m=+484.136267887" watchObservedRunningTime="2026-04-16 20:46:02.986136217 +0000 UTC m=+484.138232270" Apr 16 20:46:04.067581 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:04.067550 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vgcjc"] Apr 16 20:46:04.071029 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:04.071012 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vgcjc" Apr 16 20:46:04.074459 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:04.074436 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 20:46:04.074459 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:04.074454 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 20:46:04.075407 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:04.075391 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-7g5mm\"" Apr 16 20:46:04.086471 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:04.086450 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vgcjc"] Apr 16 20:46:04.095132 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:04.095110 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9e444b3b-343a-4a7b-bbfb-57333a869305-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vgcjc\" (UID: \"9e444b3b-343a-4a7b-bbfb-57333a869305\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vgcjc" Apr 16 20:46:04.095250 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:04.095146 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9e444b3b-343a-4a7b-bbfb-57333a869305-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vgcjc\" (UID: \"9e444b3b-343a-4a7b-bbfb-57333a869305\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vgcjc" Apr 16 20:46:04.095250 
ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:04.095196 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9txc8\" (UniqueName: \"kubernetes.io/projected/9e444b3b-343a-4a7b-bbfb-57333a869305-kube-api-access-9txc8\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vgcjc\" (UID: \"9e444b3b-343a-4a7b-bbfb-57333a869305\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vgcjc" Apr 16 20:46:04.195785 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:04.195754 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9e444b3b-343a-4a7b-bbfb-57333a869305-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vgcjc\" (UID: \"9e444b3b-343a-4a7b-bbfb-57333a869305\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vgcjc" Apr 16 20:46:04.195935 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:04.195791 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9e444b3b-343a-4a7b-bbfb-57333a869305-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vgcjc\" (UID: \"9e444b3b-343a-4a7b-bbfb-57333a869305\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vgcjc" Apr 16 20:46:04.195935 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:04.195819 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9txc8\" (UniqueName: \"kubernetes.io/projected/9e444b3b-343a-4a7b-bbfb-57333a869305-kube-api-access-9txc8\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vgcjc\" (UID: \"9e444b3b-343a-4a7b-bbfb-57333a869305\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vgcjc" Apr 16 20:46:04.196215 ip-10-0-134-79 
kubenswrapper[2575]: I0416 20:46:04.196191 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9e444b3b-343a-4a7b-bbfb-57333a869305-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vgcjc\" (UID: \"9e444b3b-343a-4a7b-bbfb-57333a869305\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vgcjc" Apr 16 20:46:04.196252 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:04.196213 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9e444b3b-343a-4a7b-bbfb-57333a869305-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vgcjc\" (UID: \"9e444b3b-343a-4a7b-bbfb-57333a869305\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vgcjc" Apr 16 20:46:04.207415 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:04.207392 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9txc8\" (UniqueName: \"kubernetes.io/projected/9e444b3b-343a-4a7b-bbfb-57333a869305-kube-api-access-9txc8\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vgcjc\" (UID: \"9e444b3b-343a-4a7b-bbfb-57333a869305\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vgcjc" Apr 16 20:46:04.380033 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:04.380002 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vgcjc" Apr 16 20:46:04.451491 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:04.451467 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-sqh8k" Apr 16 20:46:04.451908 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:04.451890 2575 scope.go:117] "RemoveContainer" containerID="4f62a12c09b3d5391f32db6b31a47791b97ce92fcc262c0fe1cbef5d7b7e89d6" Apr 16 20:46:04.452289 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:46:04.452181 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-sqh8k_opendatahub(0c74987a-f7b9-4c73-be86-2ca4d965965e)\"" pod="opendatahub/odh-model-controller-858dbf95b8-sqh8k" podUID="0c74987a-f7b9-4c73-be86-2ca4d965965e" Apr 16 20:46:04.506715 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:04.506689 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vgcjc"] Apr 16 20:46:04.508454 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:46:04.508429 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e444b3b_343a_4a7b_bbfb_57333a869305.slice/crio-fb02362bd0e36fd3afc53fec0e4dd6a3276b0fd5176f19ce2ac804ab83acc388 WatchSource:0}: Error finding container fb02362bd0e36fd3afc53fec0e4dd6a3276b0fd5176f19ce2ac804ab83acc388: Status 404 returned error can't find the container with id fb02362bd0e36fd3afc53fec0e4dd6a3276b0fd5176f19ce2ac804ab83acc388 Apr 16 20:46:04.949843 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:04.949810 2575 generic.go:358] "Generic (PLEG): container finished" podID="9e444b3b-343a-4a7b-bbfb-57333a869305" 
containerID="18c23714255ab099a5f0ad5dc08ca5aa8721361b79c5a42ed83aae806305437e" exitCode=0 Apr 16 20:46:04.950009 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:04.949899 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vgcjc" event={"ID":"9e444b3b-343a-4a7b-bbfb-57333a869305","Type":"ContainerDied","Data":"18c23714255ab099a5f0ad5dc08ca5aa8721361b79c5a42ed83aae806305437e"} Apr 16 20:46:04.950009 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:04.949939 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vgcjc" event={"ID":"9e444b3b-343a-4a7b-bbfb-57333a869305","Type":"ContainerStarted","Data":"fb02362bd0e36fd3afc53fec0e4dd6a3276b0fd5176f19ce2ac804ab83acc388"} Apr 16 20:46:05.258498 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:05.258468 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-l6lw7"] Apr 16 20:46:05.264313 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:05.264294 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-l6lw7" Apr 16 20:46:05.266786 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:05.266768 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 16 20:46:05.267206 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:05.267175 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-vst2l\"" Apr 16 20:46:05.267350 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:05.267334 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 16 20:46:05.272848 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:05.272826 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-l6lw7"] Apr 16 20:46:05.303769 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:05.303742 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/a2554baa-b481-44f0-96d3-a0f5d5127d81-operator-config\") pod \"servicemesh-operator3-55f49c5f94-l6lw7\" (UID: \"a2554baa-b481-44f0-96d3-a0f5d5127d81\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-l6lw7" Apr 16 20:46:05.303769 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:05.303771 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfwqt\" (UniqueName: \"kubernetes.io/projected/a2554baa-b481-44f0-96d3-a0f5d5127d81-kube-api-access-tfwqt\") pod \"servicemesh-operator3-55f49c5f94-l6lw7\" (UID: \"a2554baa-b481-44f0-96d3-a0f5d5127d81\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-l6lw7" Apr 16 20:46:05.404269 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:05.404246 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/a2554baa-b481-44f0-96d3-a0f5d5127d81-operator-config\") pod \"servicemesh-operator3-55f49c5f94-l6lw7\" (UID: \"a2554baa-b481-44f0-96d3-a0f5d5127d81\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-l6lw7" Apr 16 20:46:05.404360 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:05.404278 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tfwqt\" (UniqueName: \"kubernetes.io/projected/a2554baa-b481-44f0-96d3-a0f5d5127d81-kube-api-access-tfwqt\") pod \"servicemesh-operator3-55f49c5f94-l6lw7\" (UID: \"a2554baa-b481-44f0-96d3-a0f5d5127d81\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-l6lw7" Apr 16 20:46:05.406662 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:05.406642 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/a2554baa-b481-44f0-96d3-a0f5d5127d81-operator-config\") pod \"servicemesh-operator3-55f49c5f94-l6lw7\" (UID: \"a2554baa-b481-44f0-96d3-a0f5d5127d81\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-l6lw7" Apr 16 20:46:05.413296 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:05.413278 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfwqt\" (UniqueName: \"kubernetes.io/projected/a2554baa-b481-44f0-96d3-a0f5d5127d81-kube-api-access-tfwqt\") pod \"servicemesh-operator3-55f49c5f94-l6lw7\" (UID: \"a2554baa-b481-44f0-96d3-a0f5d5127d81\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-l6lw7" Apr 16 20:46:05.573956 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:05.573891 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-l6lw7" Apr 16 20:46:05.697330 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:05.697301 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-l6lw7"] Apr 16 20:46:05.698604 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:46:05.698581 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2554baa_b481_44f0_96d3_a0f5d5127d81.slice/crio-f46b4cde0110192ab62cd272bb39570f29e3cc75e2cd5c33b741f24faf88abbe WatchSource:0}: Error finding container f46b4cde0110192ab62cd272bb39570f29e3cc75e2cd5c33b741f24faf88abbe: Status 404 returned error can't find the container with id f46b4cde0110192ab62cd272bb39570f29e3cc75e2cd5c33b741f24faf88abbe Apr 16 20:46:05.954357 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:05.954324 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-l6lw7" event={"ID":"a2554baa-b481-44f0-96d3-a0f5d5127d81","Type":"ContainerStarted","Data":"f46b4cde0110192ab62cd272bb39570f29e3cc75e2cd5c33b741f24faf88abbe"} Apr 16 20:46:06.960863 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:06.960825 2575 generic.go:358] "Generic (PLEG): container finished" podID="9e444b3b-343a-4a7b-bbfb-57333a869305" containerID="67bba6c6f774493cc1d44540e880ada638ed41838db16803d6a28cc0ee35912b" exitCode=0 Apr 16 20:46:06.961292 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:06.960880 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vgcjc" event={"ID":"9e444b3b-343a-4a7b-bbfb-57333a869305","Type":"ContainerDied","Data":"67bba6c6f774493cc1d44540e880ada638ed41838db16803d6a28cc0ee35912b"} Apr 16 20:46:07.966787 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:07.966747 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vgcjc" event={"ID":"9e444b3b-343a-4a7b-bbfb-57333a869305","Type":"ContainerStarted","Data":"d5d50d4e0d265d593a784ddb6a21c805fc9a5d775133de9b01f9ec800536df50"} Apr 16 20:46:07.986992 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:07.986943 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vgcjc" podStartSLOduration=2.942949862 podStartE2EDuration="3.98692914s" podCreationTimestamp="2026-04-16 20:46:04 +0000 UTC" firstStartedPulling="2026-04-16 20:46:04.950812845 +0000 UTC m=+486.102908876" lastFinishedPulling="2026-04-16 20:46:05.994792121 +0000 UTC m=+487.146888154" observedRunningTime="2026-04-16 20:46:07.98506777 +0000 UTC m=+489.137163825" watchObservedRunningTime="2026-04-16 20:46:07.98692914 +0000 UTC m=+489.139025408" Apr 16 20:46:08.972460 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:08.972425 2575 generic.go:358] "Generic (PLEG): container finished" podID="9e444b3b-343a-4a7b-bbfb-57333a869305" containerID="d5d50d4e0d265d593a784ddb6a21c805fc9a5d775133de9b01f9ec800536df50" exitCode=0 Apr 16 20:46:08.972853 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:08.972505 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vgcjc" event={"ID":"9e444b3b-343a-4a7b-bbfb-57333a869305","Type":"ContainerDied","Data":"d5d50d4e0d265d593a784ddb6a21c805fc9a5d775133de9b01f9ec800536df50"} Apr 16 20:46:08.973911 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:08.973885 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-l6lw7" event={"ID":"a2554baa-b481-44f0-96d3-a0f5d5127d81","Type":"ContainerStarted","Data":"da4d885b020563840b931c173e73fdfe7975d0053f6eeb14070c490daff494e5"} Apr 16 20:46:08.974017 ip-10-0-134-79 kubenswrapper[2575]: I0416 
20:46:08.973999 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-l6lw7" Apr 16 20:46:09.020951 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:09.020908 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-l6lw7" podStartSLOduration=1.693570652 podStartE2EDuration="4.020895955s" podCreationTimestamp="2026-04-16 20:46:05 +0000 UTC" firstStartedPulling="2026-04-16 20:46:05.700991049 +0000 UTC m=+486.853087080" lastFinishedPulling="2026-04-16 20:46:08.028316348 +0000 UTC m=+489.180412383" observedRunningTime="2026-04-16 20:46:09.019585933 +0000 UTC m=+490.171681999" watchObservedRunningTime="2026-04-16 20:46:09.020895955 +0000 UTC m=+490.172992008" Apr 16 20:46:10.097515 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:10.097492 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vgcjc" Apr 16 20:46:10.143829 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:10.143800 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9e444b3b-343a-4a7b-bbfb-57333a869305-bundle\") pod \"9e444b3b-343a-4a7b-bbfb-57333a869305\" (UID: \"9e444b3b-343a-4a7b-bbfb-57333a869305\") " Apr 16 20:46:10.143935 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:10.143887 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9e444b3b-343a-4a7b-bbfb-57333a869305-util\") pod \"9e444b3b-343a-4a7b-bbfb-57333a869305\" (UID: \"9e444b3b-343a-4a7b-bbfb-57333a869305\") " Apr 16 20:46:10.143935 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:10.143919 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9txc8\" (UniqueName: 
\"kubernetes.io/projected/9e444b3b-343a-4a7b-bbfb-57333a869305-kube-api-access-9txc8\") pod \"9e444b3b-343a-4a7b-bbfb-57333a869305\" (UID: \"9e444b3b-343a-4a7b-bbfb-57333a869305\") " Apr 16 20:46:10.144736 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:10.144707 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e444b3b-343a-4a7b-bbfb-57333a869305-bundle" (OuterVolumeSpecName: "bundle") pod "9e444b3b-343a-4a7b-bbfb-57333a869305" (UID: "9e444b3b-343a-4a7b-bbfb-57333a869305"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:46:10.145863 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:10.145841 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e444b3b-343a-4a7b-bbfb-57333a869305-kube-api-access-9txc8" (OuterVolumeSpecName: "kube-api-access-9txc8") pod "9e444b3b-343a-4a7b-bbfb-57333a869305" (UID: "9e444b3b-343a-4a7b-bbfb-57333a869305"). InnerVolumeSpecName "kube-api-access-9txc8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:46:10.150538 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:10.150515 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e444b3b-343a-4a7b-bbfb-57333a869305-util" (OuterVolumeSpecName: "util") pod "9e444b3b-343a-4a7b-bbfb-57333a869305" (UID: "9e444b3b-343a-4a7b-bbfb-57333a869305"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:46:10.245247 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:10.245196 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9e444b3b-343a-4a7b-bbfb-57333a869305-util\") on node \"ip-10-0-134-79.ec2.internal\" DevicePath \"\"" Apr 16 20:46:10.245247 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:10.245217 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9txc8\" (UniqueName: \"kubernetes.io/projected/9e444b3b-343a-4a7b-bbfb-57333a869305-kube-api-access-9txc8\") on node \"ip-10-0-134-79.ec2.internal\" DevicePath \"\"" Apr 16 20:46:10.245247 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:10.245227 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9e444b3b-343a-4a7b-bbfb-57333a869305-bundle\") on node \"ip-10-0-134-79.ec2.internal\" DevicePath \"\"" Apr 16 20:46:10.982821 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:10.982778 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vgcjc" event={"ID":"9e444b3b-343a-4a7b-bbfb-57333a869305","Type":"ContainerDied","Data":"fb02362bd0e36fd3afc53fec0e4dd6a3276b0fd5176f19ce2ac804ab83acc388"} Apr 16 20:46:10.982821 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:10.982813 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb02362bd0e36fd3afc53fec0e4dd6a3276b0fd5176f19ce2ac804ab83acc388" Apr 16 20:46:10.982821 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:10.982820 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2vgcjc" Apr 16 20:46:14.451325 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:14.451296 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-sqh8k" Apr 16 20:46:14.451711 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:14.451697 2575 scope.go:117] "RemoveContainer" containerID="4f62a12c09b3d5391f32db6b31a47791b97ce92fcc262c0fe1cbef5d7b7e89d6" Apr 16 20:46:15.000779 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:15.000751 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-sqh8k" event={"ID":"0c74987a-f7b9-4c73-be86-2ca4d965965e","Type":"ContainerStarted","Data":"4430ac12a713db86acdd44e44aaf2b15d1b9604482622161aa351b7a45cff788"} Apr 16 20:46:15.000976 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:15.000957 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-sqh8k" Apr 16 20:46:15.018563 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:15.018518 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-sqh8k" podStartSLOduration=1.898322761 podStartE2EDuration="22.018505075s" podCreationTimestamp="2026-04-16 20:45:53 +0000 UTC" firstStartedPulling="2026-04-16 20:45:54.756793504 +0000 UTC m=+475.908889538" lastFinishedPulling="2026-04-16 20:46:14.876975818 +0000 UTC m=+496.029071852" observedRunningTime="2026-04-16 20:46:15.016860395 +0000 UTC m=+496.168956461" watchObservedRunningTime="2026-04-16 20:46:15.018505075 +0000 UTC m=+496.170601128" Apr 16 20:46:16.864372 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:16.864338 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-prkw5"] Apr 16 20:46:16.864830 ip-10-0-134-79 kubenswrapper[2575]: 
I0416 20:46:16.864814 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e444b3b-343a-4a7b-bbfb-57333a869305" containerName="extract" Apr 16 20:46:16.864883 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:16.864833 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e444b3b-343a-4a7b-bbfb-57333a869305" containerName="extract" Apr 16 20:46:16.864883 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:16.864855 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e444b3b-343a-4a7b-bbfb-57333a869305" containerName="pull" Apr 16 20:46:16.864883 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:16.864864 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e444b3b-343a-4a7b-bbfb-57333a869305" containerName="pull" Apr 16 20:46:16.864883 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:16.864879 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e444b3b-343a-4a7b-bbfb-57333a869305" containerName="util" Apr 16 20:46:16.865044 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:16.864888 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e444b3b-343a-4a7b-bbfb-57333a869305" containerName="util" Apr 16 20:46:16.865044 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:16.864960 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e444b3b-343a-4a7b-bbfb-57333a869305" containerName="extract" Apr 16 20:46:16.868299 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:16.868278 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-prkw5"
Apr 16 20:46:16.870545 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:16.870521 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\""
Apr 16 20:46:16.870545 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:16.870540 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\""
Apr 16 20:46:16.870717 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:16.870543 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 16 20:46:16.870717 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:16.870532 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\""
Apr 16 20:46:16.870717 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:16.870708 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-w75cj\""
Apr 16 20:46:16.878154 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:16.878130 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-prkw5"]
Apr 16 20:46:16.902287 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:16.902262 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e5e72258-f292-4870-9758-dbc28b87afc4-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-prkw5\" (UID: \"e5e72258-f292-4870-9758-dbc28b87afc4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-prkw5"
Apr 16 20:46:16.902406 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:16.902296 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/e5e72258-f292-4870-9758-dbc28b87afc4-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-prkw5\" (UID: \"e5e72258-f292-4870-9758-dbc28b87afc4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-prkw5"
Apr 16 20:46:16.902406 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:16.902322 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/e5e72258-f292-4870-9758-dbc28b87afc4-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-prkw5\" (UID: \"e5e72258-f292-4870-9758-dbc28b87afc4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-prkw5"
Apr 16 20:46:16.902406 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:16.902373 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e5e72258-f292-4870-9758-dbc28b87afc4-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-prkw5\" (UID: \"e5e72258-f292-4870-9758-dbc28b87afc4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-prkw5"
Apr 16 20:46:16.902541 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:16.902438 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/e5e72258-f292-4870-9758-dbc28b87afc4-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-prkw5\" (UID: \"e5e72258-f292-4870-9758-dbc28b87afc4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-prkw5"
Apr 16 20:46:16.902541 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:16.902526 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6sps\" (UniqueName: \"kubernetes.io/projected/e5e72258-f292-4870-9758-dbc28b87afc4-kube-api-access-g6sps\") pod \"istiod-openshift-gateway-55ff986f96-prkw5\" (UID: \"e5e72258-f292-4870-9758-dbc28b87afc4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-prkw5"
Apr 16 20:46:16.902633 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:16.902576 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/e5e72258-f292-4870-9758-dbc28b87afc4-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-prkw5\" (UID: \"e5e72258-f292-4870-9758-dbc28b87afc4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-prkw5"
Apr 16 20:46:17.003623 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:17.003592 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/e5e72258-f292-4870-9758-dbc28b87afc4-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-prkw5\" (UID: \"e5e72258-f292-4870-9758-dbc28b87afc4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-prkw5"
Apr 16 20:46:17.003801 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:17.003658 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/e5e72258-f292-4870-9758-dbc28b87afc4-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-prkw5\" (UID: \"e5e72258-f292-4870-9758-dbc28b87afc4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-prkw5"
Apr 16 20:46:17.003801 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:17.003682 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e5e72258-f292-4870-9758-dbc28b87afc4-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-prkw5\" (UID: \"e5e72258-f292-4870-9758-dbc28b87afc4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-prkw5"
Apr 16 20:46:17.003801 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:17.003717 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/e5e72258-f292-4870-9758-dbc28b87afc4-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-prkw5\" (UID: \"e5e72258-f292-4870-9758-dbc28b87afc4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-prkw5"
Apr 16 20:46:17.003801 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:17.003776 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g6sps\" (UniqueName: \"kubernetes.io/projected/e5e72258-f292-4870-9758-dbc28b87afc4-kube-api-access-g6sps\") pod \"istiod-openshift-gateway-55ff986f96-prkw5\" (UID: \"e5e72258-f292-4870-9758-dbc28b87afc4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-prkw5"
Apr 16 20:46:17.003999 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:17.003821 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/e5e72258-f292-4870-9758-dbc28b87afc4-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-prkw5\" (UID: \"e5e72258-f292-4870-9758-dbc28b87afc4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-prkw5"
Apr 16 20:46:17.003999 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:17.003859 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e5e72258-f292-4870-9758-dbc28b87afc4-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-prkw5\" (UID: \"e5e72258-f292-4870-9758-dbc28b87afc4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-prkw5"
Apr 16 20:46:17.004539 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:17.004468 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/e5e72258-f292-4870-9758-dbc28b87afc4-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-prkw5\" (UID: \"e5e72258-f292-4870-9758-dbc28b87afc4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-prkw5"
Apr 16 20:46:17.006598 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:17.006571 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/e5e72258-f292-4870-9758-dbc28b87afc4-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-prkw5\" (UID: \"e5e72258-f292-4870-9758-dbc28b87afc4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-prkw5"
Apr 16 20:46:17.006789 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:17.006757 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/e5e72258-f292-4870-9758-dbc28b87afc4-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-prkw5\" (UID: \"e5e72258-f292-4870-9758-dbc28b87afc4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-prkw5"
Apr 16 20:46:17.006891 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:17.006823 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e5e72258-f292-4870-9758-dbc28b87afc4-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-prkw5\" (UID: \"e5e72258-f292-4870-9758-dbc28b87afc4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-prkw5"
Apr 16 20:46:17.006943 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:17.006929 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/e5e72258-f292-4870-9758-dbc28b87afc4-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-prkw5\" (UID: \"e5e72258-f292-4870-9758-dbc28b87afc4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-prkw5"
Apr 16 20:46:17.011358 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:17.011335 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/e5e72258-f292-4870-9758-dbc28b87afc4-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-prkw5\" (UID: \"e5e72258-f292-4870-9758-dbc28b87afc4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-prkw5"
Apr 16 20:46:17.012448 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:17.012427 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6sps\" (UniqueName: \"kubernetes.io/projected/e5e72258-f292-4870-9758-dbc28b87afc4-kube-api-access-g6sps\") pod \"istiod-openshift-gateway-55ff986f96-prkw5\" (UID: \"e5e72258-f292-4870-9758-dbc28b87afc4\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-prkw5"
Apr 16 20:46:17.179284 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:17.179188 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-prkw5"
Apr 16 20:46:17.347932 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:17.347897 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-prkw5"]
Apr 16 20:46:17.351456 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:46:17.351429 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5e72258_f292_4870_9758_dbc28b87afc4.slice/crio-b8a49c815c1227753028eda795942b8baf7c5e18213b406acad3c9ee25de8077 WatchSource:0}: Error finding container b8a49c815c1227753028eda795942b8baf7c5e18213b406acad3c9ee25de8077: Status 404 returned error can't find the container with id b8a49c815c1227753028eda795942b8baf7c5e18213b406acad3c9ee25de8077
Apr 16 20:46:18.012687 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:18.012649 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-prkw5" event={"ID":"e5e72258-f292-4870-9758-dbc28b87afc4","Type":"ContainerStarted","Data":"b8a49c815c1227753028eda795942b8baf7c5e18213b406acad3c9ee25de8077"}
Apr 16 20:46:19.980512 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:19.980487 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-l6lw7"
Apr 16 20:46:19.981645 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:19.981592 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 16 20:46:19.981757 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:19.981672 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"}
Apr 16 20:46:21.025499 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:21.025451 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-prkw5" event={"ID":"e5e72258-f292-4870-9758-dbc28b87afc4","Type":"ContainerStarted","Data":"92c84205d613eed13c0fc9bbafc674cbb03ed5014e7d29769fab365f885c3bde"}
Apr 16 20:46:21.026031 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:21.026012 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-prkw5"
Apr 16 20:46:21.027297 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:21.027229 2575 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-prkw5 container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=
Apr 16 20:46:21.027297 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:21.027280 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-prkw5" podUID="e5e72258-f292-4870-9758-dbc28b87afc4" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:46:21.058913 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:21.058863 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-prkw5" podStartSLOduration=2.431188578 podStartE2EDuration="5.058843917s" podCreationTimestamp="2026-04-16 20:46:16 +0000 UTC" firstStartedPulling="2026-04-16 20:46:17.353732479 +0000 UTC m=+498.505828511" lastFinishedPulling="2026-04-16 20:46:19.981387816 +0000 UTC m=+501.133483850" observedRunningTime="2026-04-16 20:46:21.05539669 +0000 UTC m=+502.207492744" watchObservedRunningTime="2026-04-16 20:46:21.058843917 +0000 UTC m=+502.210939971"
Apr 16 20:46:22.029625 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:22.029589 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-prkw5"
Apr 16 20:46:26.006167 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:26.006136 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-sqh8k"
Apr 16 20:46:33.950519 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:46:33.950485 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-6dwth"
Apr 16 20:47:26.317757 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:47:26.317722 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-zbv5g"]
Apr 16 20:47:26.320911 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:47:26.320894 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-zbv5g"
Apr 16 20:47:26.323197 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:47:26.323165 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 16 20:47:26.323330 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:47:26.323193 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 16 20:47:26.323990 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:47:26.323971 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-zwhfp\""
Apr 16 20:47:26.333319 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:47:26.333295 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-zbv5g"]
Apr 16 20:47:26.371855 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:47:26.371822 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvkjk\" (UniqueName: \"kubernetes.io/projected/eb8d4c4b-51c3-489f-9ec0-32637206d9e0-kube-api-access-vvkjk\") pod \"authorino-operator-657f44b778-zbv5g\" (UID: \"eb8d4c4b-51c3-489f-9ec0-32637206d9e0\") " pod="kuadrant-system/authorino-operator-657f44b778-zbv5g"
Apr 16 20:47:26.473057 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:47:26.473032 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vvkjk\" (UniqueName: \"kubernetes.io/projected/eb8d4c4b-51c3-489f-9ec0-32637206d9e0-kube-api-access-vvkjk\") pod \"authorino-operator-657f44b778-zbv5g\" (UID: \"eb8d4c4b-51c3-489f-9ec0-32637206d9e0\") " pod="kuadrant-system/authorino-operator-657f44b778-zbv5g"
Apr 16 20:47:26.486379 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:47:26.486361 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvkjk\" (UniqueName: \"kubernetes.io/projected/eb8d4c4b-51c3-489f-9ec0-32637206d9e0-kube-api-access-vvkjk\") pod \"authorino-operator-657f44b778-zbv5g\" (UID: \"eb8d4c4b-51c3-489f-9ec0-32637206d9e0\") " pod="kuadrant-system/authorino-operator-657f44b778-zbv5g"
Apr 16 20:47:26.637204 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:47:26.637177 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-zbv5g"
Apr 16 20:47:26.775052 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:47:26.775026 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb8d4c4b_51c3_489f_9ec0_32637206d9e0.slice/crio-e0f646f3ecf5728575e47abc7e46eba24f7f5e40828406080fcea7d596958a14 WatchSource:0}: Error finding container e0f646f3ecf5728575e47abc7e46eba24f7f5e40828406080fcea7d596958a14: Status 404 returned error can't find the container with id e0f646f3ecf5728575e47abc7e46eba24f7f5e40828406080fcea7d596958a14
Apr 16 20:47:26.776514 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:47:26.776484 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-zbv5g"]
Apr 16 20:47:27.251508 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:47:27.251473 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-zbv5g" event={"ID":"eb8d4c4b-51c3-489f-9ec0-32637206d9e0","Type":"ContainerStarted","Data":"e0f646f3ecf5728575e47abc7e46eba24f7f5e40828406080fcea7d596958a14"}
Apr 16 20:47:29.260139 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:47:29.260107 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-zbv5g" event={"ID":"eb8d4c4b-51c3-489f-9ec0-32637206d9e0","Type":"ContainerStarted","Data":"a6e01b5e264d1e4f3e033102d19418158ceffab96edb1b2640945ec162ec48ee"}
Apr 16 20:47:29.260521 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:47:29.260242 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-zbv5g"
Apr 16 20:47:29.278454 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:47:29.278404 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-zbv5g" podStartSLOduration=1.859901606 podStartE2EDuration="3.278389115s" podCreationTimestamp="2026-04-16 20:47:26 +0000 UTC" firstStartedPulling="2026-04-16 20:47:26.777062835 +0000 UTC m=+567.929158870" lastFinishedPulling="2026-04-16 20:47:28.195550338 +0000 UTC m=+569.347646379" observedRunningTime="2026-04-16 20:47:29.277718902 +0000 UTC m=+570.429814991" watchObservedRunningTime="2026-04-16 20:47:29.278389115 +0000 UTC m=+570.430485168"
Apr 16 20:47:40.265909 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:47:40.265876 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-zbv5g"
Apr 16 20:47:59.345565 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:47:59.345530 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sj6sh_98723067-9cd3-42a6-a577-2ecd3fc29ae9/ovn-acl-logging/0.log"
Apr 16 20:47:59.346353 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:47:59.346336 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sj6sh_98723067-9cd3-42a6-a577-2ecd3fc29ae9/ovn-acl-logging/0.log"
Apr 16 20:48:26.682361 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:26.682329 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-fqj4q"]
Apr 16 20:48:26.685708 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:26.685687 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-fqj4q"
Apr 16 20:48:26.687812 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:26.687790 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 16 20:48:26.687918 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:26.687873 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-c6jk2\""
Apr 16 20:48:26.692722 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:26.692695 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-fqj4q"]
Apr 16 20:48:26.753217 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:26.753187 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/79e7354b-7da6-43ab-856c-0cffb59c5f7f-config-file\") pod \"limitador-limitador-7d549b5b-fqj4q\" (UID: \"79e7354b-7da6-43ab-856c-0cffb59c5f7f\") " pod="kuadrant-system/limitador-limitador-7d549b5b-fqj4q"
Apr 16 20:48:26.753375 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:26.753237 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd2pd\" (UniqueName: \"kubernetes.io/projected/79e7354b-7da6-43ab-856c-0cffb59c5f7f-kube-api-access-kd2pd\") pod \"limitador-limitador-7d549b5b-fqj4q\" (UID: \"79e7354b-7da6-43ab-856c-0cffb59c5f7f\") " pod="kuadrant-system/limitador-limitador-7d549b5b-fqj4q"
Apr 16 20:48:26.777112 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:26.777084 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-fqj4q"]
Apr 16 20:48:26.854631 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:26.854593 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/79e7354b-7da6-43ab-856c-0cffb59c5f7f-config-file\") pod \"limitador-limitador-7d549b5b-fqj4q\" (UID: \"79e7354b-7da6-43ab-856c-0cffb59c5f7f\") " pod="kuadrant-system/limitador-limitador-7d549b5b-fqj4q"
Apr 16 20:48:26.854793 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:26.854748 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kd2pd\" (UniqueName: \"kubernetes.io/projected/79e7354b-7da6-43ab-856c-0cffb59c5f7f-kube-api-access-kd2pd\") pod \"limitador-limitador-7d549b5b-fqj4q\" (UID: \"79e7354b-7da6-43ab-856c-0cffb59c5f7f\") " pod="kuadrant-system/limitador-limitador-7d549b5b-fqj4q"
Apr 16 20:48:26.855184 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:26.855165 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/79e7354b-7da6-43ab-856c-0cffb59c5f7f-config-file\") pod \"limitador-limitador-7d549b5b-fqj4q\" (UID: \"79e7354b-7da6-43ab-856c-0cffb59c5f7f\") " pod="kuadrant-system/limitador-limitador-7d549b5b-fqj4q"
Apr 16 20:48:26.862823 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:26.862801 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd2pd\" (UniqueName: \"kubernetes.io/projected/79e7354b-7da6-43ab-856c-0cffb59c5f7f-kube-api-access-kd2pd\") pod \"limitador-limitador-7d549b5b-fqj4q\" (UID: \"79e7354b-7da6-43ab-856c-0cffb59c5f7f\") " pod="kuadrant-system/limitador-limitador-7d549b5b-fqj4q"
Apr 16 20:48:26.996910 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:26.996826 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-fqj4q"
Apr 16 20:48:27.121968 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:27.121942 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-fqj4q"]
Apr 16 20:48:27.124027 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:48:27.123995 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79e7354b_7da6_43ab_856c_0cffb59c5f7f.slice/crio-33556fb769e009b693b38029c65653bc1a4cd2bd308147b533bdba7957550a74 WatchSource:0}: Error finding container 33556fb769e009b693b38029c65653bc1a4cd2bd308147b533bdba7957550a74: Status 404 returned error can't find the container with id 33556fb769e009b693b38029c65653bc1a4cd2bd308147b533bdba7957550a74
Apr 16 20:48:27.469330 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:27.469293 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-fqj4q" event={"ID":"79e7354b-7da6-43ab-856c-0cffb59c5f7f","Type":"ContainerStarted","Data":"33556fb769e009b693b38029c65653bc1a4cd2bd308147b533bdba7957550a74"}
Apr 16 20:48:30.481460 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:30.481427 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-fqj4q" event={"ID":"79e7354b-7da6-43ab-856c-0cffb59c5f7f","Type":"ContainerStarted","Data":"a11b0b2aabf3563729991630fa5c3e3ec744fd1b3b1a51f7d668e4d51199f042"}
Apr 16 20:48:30.481931 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:30.481547 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-fqj4q"
Apr 16 20:48:30.499587 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:30.499544 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-fqj4q" podStartSLOduration=1.9313132560000001 podStartE2EDuration="4.499533539s" podCreationTimestamp="2026-04-16 20:48:26 +0000 UTC" firstStartedPulling="2026-04-16 20:48:27.125742747 +0000 UTC m=+628.277838777" lastFinishedPulling="2026-04-16 20:48:29.693963016 +0000 UTC m=+630.846059060" observedRunningTime="2026-04-16 20:48:30.497649937 +0000 UTC m=+631.649745995" watchObservedRunningTime="2026-04-16 20:48:30.499533539 +0000 UTC m=+631.651629654"
Apr 16 20:48:41.485472 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:41.485439 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-fqj4q"
Apr 16 20:48:42.622446 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:42.622412 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-fqj4q"]
Apr 16 20:48:42.622903 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:42.622605 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-fqj4q" podUID="79e7354b-7da6-43ab-856c-0cffb59c5f7f" containerName="limitador" containerID="cri-o://a11b0b2aabf3563729991630fa5c3e3ec744fd1b3b1a51f7d668e4d51199f042" gracePeriod=30
Apr 16 20:48:43.175032 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:43.175006 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-fqj4q"
Apr 16 20:48:43.286136 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:43.286038 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kd2pd\" (UniqueName: \"kubernetes.io/projected/79e7354b-7da6-43ab-856c-0cffb59c5f7f-kube-api-access-kd2pd\") pod \"79e7354b-7da6-43ab-856c-0cffb59c5f7f\" (UID: \"79e7354b-7da6-43ab-856c-0cffb59c5f7f\") "
Apr 16 20:48:43.286136 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:43.286137 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/79e7354b-7da6-43ab-856c-0cffb59c5f7f-config-file\") pod \"79e7354b-7da6-43ab-856c-0cffb59c5f7f\" (UID: \"79e7354b-7da6-43ab-856c-0cffb59c5f7f\") "
Apr 16 20:48:43.286460 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:43.286440 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79e7354b-7da6-43ab-856c-0cffb59c5f7f-config-file" (OuterVolumeSpecName: "config-file") pod "79e7354b-7da6-43ab-856c-0cffb59c5f7f" (UID: "79e7354b-7da6-43ab-856c-0cffb59c5f7f"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 20:48:43.288228 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:43.288199 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79e7354b-7da6-43ab-856c-0cffb59c5f7f-kube-api-access-kd2pd" (OuterVolumeSpecName: "kube-api-access-kd2pd") pod "79e7354b-7da6-43ab-856c-0cffb59c5f7f" (UID: "79e7354b-7da6-43ab-856c-0cffb59c5f7f"). InnerVolumeSpecName "kube-api-access-kd2pd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 20:48:43.386860 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:43.386812 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kd2pd\" (UniqueName: \"kubernetes.io/projected/79e7354b-7da6-43ab-856c-0cffb59c5f7f-kube-api-access-kd2pd\") on node \"ip-10-0-134-79.ec2.internal\" DevicePath \"\""
Apr 16 20:48:43.386860 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:43.386856 2575 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/79e7354b-7da6-43ab-856c-0cffb59c5f7f-config-file\") on node \"ip-10-0-134-79.ec2.internal\" DevicePath \"\""
Apr 16 20:48:43.524938 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:43.524907 2575 generic.go:358] "Generic (PLEG): container finished" podID="79e7354b-7da6-43ab-856c-0cffb59c5f7f" containerID="a11b0b2aabf3563729991630fa5c3e3ec744fd1b3b1a51f7d668e4d51199f042" exitCode=0
Apr 16 20:48:43.525093 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:43.524977 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-fqj4q"
Apr 16 20:48:43.525093 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:43.524987 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-fqj4q" event={"ID":"79e7354b-7da6-43ab-856c-0cffb59c5f7f","Type":"ContainerDied","Data":"a11b0b2aabf3563729991630fa5c3e3ec744fd1b3b1a51f7d668e4d51199f042"}
Apr 16 20:48:43.525093 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:43.525023 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-fqj4q" event={"ID":"79e7354b-7da6-43ab-856c-0cffb59c5f7f","Type":"ContainerDied","Data":"33556fb769e009b693b38029c65653bc1a4cd2bd308147b533bdba7957550a74"}
Apr 16 20:48:43.525093 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:43.525038 2575 scope.go:117] "RemoveContainer" containerID="a11b0b2aabf3563729991630fa5c3e3ec744fd1b3b1a51f7d668e4d51199f042"
Apr 16 20:48:43.533526 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:43.533507 2575 scope.go:117] "RemoveContainer" containerID="a11b0b2aabf3563729991630fa5c3e3ec744fd1b3b1a51f7d668e4d51199f042"
Apr 16 20:48:43.533778 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:48:43.533762 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a11b0b2aabf3563729991630fa5c3e3ec744fd1b3b1a51f7d668e4d51199f042\": container with ID starting with a11b0b2aabf3563729991630fa5c3e3ec744fd1b3b1a51f7d668e4d51199f042 not found: ID does not exist" containerID="a11b0b2aabf3563729991630fa5c3e3ec744fd1b3b1a51f7d668e4d51199f042"
Apr 16 20:48:43.533880 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:43.533785 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a11b0b2aabf3563729991630fa5c3e3ec744fd1b3b1a51f7d668e4d51199f042"} err="failed to get container status \"a11b0b2aabf3563729991630fa5c3e3ec744fd1b3b1a51f7d668e4d51199f042\": rpc error: code = NotFound desc = could not find container \"a11b0b2aabf3563729991630fa5c3e3ec744fd1b3b1a51f7d668e4d51199f042\": container with ID starting with a11b0b2aabf3563729991630fa5c3e3ec744fd1b3b1a51f7d668e4d51199f042 not found: ID does not exist"
Apr 16 20:48:43.541026 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:43.540978 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-fqj4q"]
Apr 16 20:48:43.544732 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:43.544708 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-fqj4q"]
Apr 16 20:48:45.427118 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:45.427085 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79e7354b-7da6-43ab-856c-0cffb59c5f7f" path="/var/lib/kubelet/pods/79e7354b-7da6-43ab-856c-0cffb59c5f7f/volumes"
Apr 16 20:48:47.814164 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:47.814133 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-hzkrk"]
Apr 16 20:48:47.814502 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:47.814481 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="79e7354b-7da6-43ab-856c-0cffb59c5f7f" containerName="limitador"
Apr 16 20:48:47.814502 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:47.814492 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="79e7354b-7da6-43ab-856c-0cffb59c5f7f" containerName="limitador"
Apr 16 20:48:47.814572 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:47.814564 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="79e7354b-7da6-43ab-856c-0cffb59c5f7f" containerName="limitador"
Apr 16 20:48:47.819147 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:47.819130 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-hzkrk"
Apr 16 20:48:47.821887 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:47.821866 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-h929s\""
Apr 16 20:48:47.822061 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:47.821869 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\""
Apr 16 20:48:47.823841 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:47.823816 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-hzkrk"]
Apr 16 20:48:47.925089 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:47.925052 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84xqh\" (UniqueName: \"kubernetes.io/projected/39552f32-9c95-4b91-87d8-62e0cc489b30-kube-api-access-84xqh\") pod \"postgres-868db5846d-hzkrk\" (UID: \"39552f32-9c95-4b91-87d8-62e0cc489b30\") " pod="opendatahub/postgres-868db5846d-hzkrk"
Apr 16 20:48:47.925246 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:47.925129 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/39552f32-9c95-4b91-87d8-62e0cc489b30-data\") pod \"postgres-868db5846d-hzkrk\" (UID: \"39552f32-9c95-4b91-87d8-62e0cc489b30\") " pod="opendatahub/postgres-868db5846d-hzkrk"
Apr 16 20:48:48.026404 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:48.026373 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84xqh\" (UniqueName: \"kubernetes.io/projected/39552f32-9c95-4b91-87d8-62e0cc489b30-kube-api-access-84xqh\") pod \"postgres-868db5846d-hzkrk\" (UID: \"39552f32-9c95-4b91-87d8-62e0cc489b30\") " pod="opendatahub/postgres-868db5846d-hzkrk"
Apr 16 20:48:48.026563 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:48.026451 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/39552f32-9c95-4b91-87d8-62e0cc489b30-data\") pod \"postgres-868db5846d-hzkrk\" (UID: \"39552f32-9c95-4b91-87d8-62e0cc489b30\") " pod="opendatahub/postgres-868db5846d-hzkrk"
Apr 16 20:48:48.026820 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:48.026802 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/39552f32-9c95-4b91-87d8-62e0cc489b30-data\") pod \"postgres-868db5846d-hzkrk\" (UID: \"39552f32-9c95-4b91-87d8-62e0cc489b30\") " pod="opendatahub/postgres-868db5846d-hzkrk"
Apr 16 20:48:48.034690 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:48.034668 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-84xqh\" (UniqueName: \"kubernetes.io/projected/39552f32-9c95-4b91-87d8-62e0cc489b30-kube-api-access-84xqh\") pod \"postgres-868db5846d-hzkrk\" (UID: \"39552f32-9c95-4b91-87d8-62e0cc489b30\") " pod="opendatahub/postgres-868db5846d-hzkrk"
Apr 16 20:48:48.133040 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:48.133014 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-hzkrk"
Apr 16 20:48:48.256373 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:48.256347 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-hzkrk"]
Apr 16 20:48:48.257400 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:48:48.257373 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39552f32_9c95_4b91_87d8_62e0cc489b30.slice/crio-0fedb7adc62f9f61d934c3133102cd356c7866e3cb4bad8f9d2f44a97b633fde WatchSource:0}: Error finding container 0fedb7adc62f9f61d934c3133102cd356c7866e3cb4bad8f9d2f44a97b633fde: Status 404 returned error can't find the container with id 0fedb7adc62f9f61d934c3133102cd356c7866e3cb4bad8f9d2f44a97b633fde
Apr 16 20:48:48.546011 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:48.545924 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-hzkrk" event={"ID":"39552f32-9c95-4b91-87d8-62e0cc489b30","Type":"ContainerStarted","Data":"0fedb7adc62f9f61d934c3133102cd356c7866e3cb4bad8f9d2f44a97b633fde"}
Apr 16 20:48:53.566103 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:53.566065 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-hzkrk" event={"ID":"39552f32-9c95-4b91-87d8-62e0cc489b30","Type":"ContainerStarted","Data":"7553b93838a9c047a68e1a40765468c9f5fe15c46749cebed06bd4b33750b5ad"}
Apr 16 20:48:53.566502 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:53.566197 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-hzkrk"
Apr 16 20:48:53.581810 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:53.581765 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-hzkrk" podStartSLOduration=1.941687771 podStartE2EDuration="6.5817537s" podCreationTimestamp="2026-04-16 20:48:47 +0000 UTC"
firstStartedPulling="2026-04-16 20:48:48.258641383 +0000 UTC m=+649.410737415" lastFinishedPulling="2026-04-16 20:48:52.898707298 +0000 UTC m=+654.050803344" observedRunningTime="2026-04-16 20:48:53.579139104 +0000 UTC m=+654.731235157" watchObservedRunningTime="2026-04-16 20:48:53.5817537 +0000 UTC m=+654.733849772" Apr 16 20:48:59.598458 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:48:59.598429 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-hzkrk" Apr 16 20:49:00.562416 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:00.562387 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-7dd6f68878-vwf72"] Apr 16 20:49:00.568751 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:00.568725 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-7dd6f68878-vwf72" Apr 16 20:49:00.573230 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:00.573205 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-nj46q\"" Apr 16 20:49:00.573230 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:00.573219 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 16 20:49:00.573230 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:00.573229 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 16 20:49:00.583469 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:00.583448 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7dd6f68878-vwf72"] Apr 16 20:49:00.603407 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:00.603378 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-66f6fc69c-hzfqq"] Apr 16 20:49:00.607120 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:00.607097 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-66f6fc69c-hzfqq" Apr 16 20:49:00.609683 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:00.609664 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-c92v8\"" Apr 16 20:49:00.619792 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:00.619771 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-66f6fc69c-hzfqq"] Apr 16 20:49:00.639493 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:00.639469 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6c5f\" (UniqueName: \"kubernetes.io/projected/d00e39ac-5117-458a-ad5d-0347a6726c5e-kube-api-access-b6c5f\") pod \"maas-controller-66f6fc69c-hzfqq\" (UID: \"d00e39ac-5117-458a-ad5d-0347a6726c5e\") " pod="opendatahub/maas-controller-66f6fc69c-hzfqq" Apr 16 20:49:00.740305 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:00.740268 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b6c5f\" (UniqueName: \"kubernetes.io/projected/d00e39ac-5117-458a-ad5d-0347a6726c5e-kube-api-access-b6c5f\") pod \"maas-controller-66f6fc69c-hzfqq\" (UID: \"d00e39ac-5117-458a-ad5d-0347a6726c5e\") " pod="opendatahub/maas-controller-66f6fc69c-hzfqq" Apr 16 20:49:00.740455 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:00.740341 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/1615a55e-a03a-486b-a159-640068aa7bda-maas-api-tls\") pod \"maas-api-7dd6f68878-vwf72\" (UID: \"1615a55e-a03a-486b-a159-640068aa7bda\") " pod="opendatahub/maas-api-7dd6f68878-vwf72" Apr 16 20:49:00.740455 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:00.740393 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5jbh\" (UniqueName: 
\"kubernetes.io/projected/1615a55e-a03a-486b-a159-640068aa7bda-kube-api-access-s5jbh\") pod \"maas-api-7dd6f68878-vwf72\" (UID: \"1615a55e-a03a-486b-a159-640068aa7bda\") " pod="opendatahub/maas-api-7dd6f68878-vwf72" Apr 16 20:49:00.748509 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:00.748484 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6c5f\" (UniqueName: \"kubernetes.io/projected/d00e39ac-5117-458a-ad5d-0347a6726c5e-kube-api-access-b6c5f\") pod \"maas-controller-66f6fc69c-hzfqq\" (UID: \"d00e39ac-5117-458a-ad5d-0347a6726c5e\") " pod="opendatahub/maas-controller-66f6fc69c-hzfqq" Apr 16 20:49:00.841478 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:00.841454 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5jbh\" (UniqueName: \"kubernetes.io/projected/1615a55e-a03a-486b-a159-640068aa7bda-kube-api-access-s5jbh\") pod \"maas-api-7dd6f68878-vwf72\" (UID: \"1615a55e-a03a-486b-a159-640068aa7bda\") " pod="opendatahub/maas-api-7dd6f68878-vwf72" Apr 16 20:49:00.841595 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:00.841510 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/1615a55e-a03a-486b-a159-640068aa7bda-maas-api-tls\") pod \"maas-api-7dd6f68878-vwf72\" (UID: \"1615a55e-a03a-486b-a159-640068aa7bda\") " pod="opendatahub/maas-api-7dd6f68878-vwf72" Apr 16 20:49:00.841669 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:49:00.841593 2575 secret.go:189] Couldn't get secret opendatahub/maas-api-serving-cert: secret "maas-api-serving-cert" not found Apr 16 20:49:00.841710 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:49:00.841679 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1615a55e-a03a-486b-a159-640068aa7bda-maas-api-tls podName:1615a55e-a03a-486b-a159-640068aa7bda nodeName:}" failed. 
No retries permitted until 2026-04-16 20:49:01.341660914 +0000 UTC m=+662.493756953 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "maas-api-tls" (UniqueName: "kubernetes.io/secret/1615a55e-a03a-486b-a159-640068aa7bda-maas-api-tls") pod "maas-api-7dd6f68878-vwf72" (UID: "1615a55e-a03a-486b-a159-640068aa7bda") : secret "maas-api-serving-cert" not found Apr 16 20:49:00.858548 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:00.858524 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5jbh\" (UniqueName: \"kubernetes.io/projected/1615a55e-a03a-486b-a159-640068aa7bda-kube-api-access-s5jbh\") pod \"maas-api-7dd6f68878-vwf72\" (UID: \"1615a55e-a03a-486b-a159-640068aa7bda\") " pod="opendatahub/maas-api-7dd6f68878-vwf72" Apr 16 20:49:00.917959 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:00.917934 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-66f6fc69c-hzfqq" Apr 16 20:49:01.042632 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:01.042590 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-66f6fc69c-hzfqq"] Apr 16 20:49:01.044599 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:49:01.044564 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd00e39ac_5117_458a_ad5d_0347a6726c5e.slice/crio-7abfe097ebbc236f03f12d1d2a6b6b6769d9d8763f778cfbab60244e323bdf99 WatchSource:0}: Error finding container 7abfe097ebbc236f03f12d1d2a6b6b6769d9d8763f778cfbab60244e323bdf99: Status 404 returned error can't find the container with id 7abfe097ebbc236f03f12d1d2a6b6b6769d9d8763f778cfbab60244e323bdf99 Apr 16 20:49:01.345825 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:01.345788 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: 
\"kubernetes.io/secret/1615a55e-a03a-486b-a159-640068aa7bda-maas-api-tls\") pod \"maas-api-7dd6f68878-vwf72\" (UID: \"1615a55e-a03a-486b-a159-640068aa7bda\") " pod="opendatahub/maas-api-7dd6f68878-vwf72" Apr 16 20:49:01.348054 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:01.348036 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/1615a55e-a03a-486b-a159-640068aa7bda-maas-api-tls\") pod \"maas-api-7dd6f68878-vwf72\" (UID: \"1615a55e-a03a-486b-a159-640068aa7bda\") " pod="opendatahub/maas-api-7dd6f68878-vwf72" Apr 16 20:49:01.432346 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:01.432313 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-5c54fc4f76-htwxx"] Apr 16 20:49:01.437190 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:01.437168 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-5c54fc4f76-htwxx" Apr 16 20:49:01.442974 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:01.442950 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-5c54fc4f76-htwxx"] Apr 16 20:49:01.478808 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:01.478784 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-7dd6f68878-vwf72" Apr 16 20:49:01.547077 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:01.547046 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9srn\" (UniqueName: \"kubernetes.io/projected/c7c8855e-db5b-4727-a24f-5f2d3a6e416e-kube-api-access-r9srn\") pod \"maas-api-5c54fc4f76-htwxx\" (UID: \"c7c8855e-db5b-4727-a24f-5f2d3a6e416e\") " pod="opendatahub/maas-api-5c54fc4f76-htwxx" Apr 16 20:49:01.547298 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:01.547240 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/c7c8855e-db5b-4727-a24f-5f2d3a6e416e-maas-api-tls\") pod \"maas-api-5c54fc4f76-htwxx\" (UID: \"c7c8855e-db5b-4727-a24f-5f2d3a6e416e\") " pod="opendatahub/maas-api-5c54fc4f76-htwxx" Apr 16 20:49:01.594981 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:01.594932 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-66f6fc69c-hzfqq" event={"ID":"d00e39ac-5117-458a-ad5d-0347a6726c5e","Type":"ContainerStarted","Data":"7abfe097ebbc236f03f12d1d2a6b6b6769d9d8763f778cfbab60244e323bdf99"} Apr 16 20:49:01.604860 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:01.604834 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7dd6f68878-vwf72"] Apr 16 20:49:01.607028 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:49:01.607003 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1615a55e_a03a_486b_a159_640068aa7bda.slice/crio-43fd0f07a353bf525fee6c03b8e901f609c4cc3df03d9703f9b5ed336bb4a759 WatchSource:0}: Error finding container 43fd0f07a353bf525fee6c03b8e901f609c4cc3df03d9703f9b5ed336bb4a759: Status 404 returned error can't find the container with id 43fd0f07a353bf525fee6c03b8e901f609c4cc3df03d9703f9b5ed336bb4a759 Apr 
16 20:49:01.648290 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:01.648258 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/c7c8855e-db5b-4727-a24f-5f2d3a6e416e-maas-api-tls\") pod \"maas-api-5c54fc4f76-htwxx\" (UID: \"c7c8855e-db5b-4727-a24f-5f2d3a6e416e\") " pod="opendatahub/maas-api-5c54fc4f76-htwxx" Apr 16 20:49:01.648670 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:01.648643 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9srn\" (UniqueName: \"kubernetes.io/projected/c7c8855e-db5b-4727-a24f-5f2d3a6e416e-kube-api-access-r9srn\") pod \"maas-api-5c54fc4f76-htwxx\" (UID: \"c7c8855e-db5b-4727-a24f-5f2d3a6e416e\") " pod="opendatahub/maas-api-5c54fc4f76-htwxx" Apr 16 20:49:01.653125 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:01.653101 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/c7c8855e-db5b-4727-a24f-5f2d3a6e416e-maas-api-tls\") pod \"maas-api-5c54fc4f76-htwxx\" (UID: \"c7c8855e-db5b-4727-a24f-5f2d3a6e416e\") " pod="opendatahub/maas-api-5c54fc4f76-htwxx" Apr 16 20:49:01.656939 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:01.656914 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9srn\" (UniqueName: \"kubernetes.io/projected/c7c8855e-db5b-4727-a24f-5f2d3a6e416e-kube-api-access-r9srn\") pod \"maas-api-5c54fc4f76-htwxx\" (UID: \"c7c8855e-db5b-4727-a24f-5f2d3a6e416e\") " pod="opendatahub/maas-api-5c54fc4f76-htwxx" Apr 16 20:49:01.749487 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:01.749448 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-5c54fc4f76-htwxx" Apr 16 20:49:01.896422 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:01.896395 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-5c54fc4f76-htwxx"] Apr 16 20:49:01.899566 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:49:01.899534 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7c8855e_db5b_4727_a24f_5f2d3a6e416e.slice/crio-7ef78c06361d530356f6c8a2ae4d791dc9e30f158eedf25cbe55434358718d25 WatchSource:0}: Error finding container 7ef78c06361d530356f6c8a2ae4d791dc9e30f158eedf25cbe55434358718d25: Status 404 returned error can't find the container with id 7ef78c06361d530356f6c8a2ae4d791dc9e30f158eedf25cbe55434358718d25 Apr 16 20:49:02.602640 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:02.602581 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7dd6f68878-vwf72" event={"ID":"1615a55e-a03a-486b-a159-640068aa7bda","Type":"ContainerStarted","Data":"43fd0f07a353bf525fee6c03b8e901f609c4cc3df03d9703f9b5ed336bb4a759"} Apr 16 20:49:02.606021 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:02.605992 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-5c54fc4f76-htwxx" event={"ID":"c7c8855e-db5b-4727-a24f-5f2d3a6e416e","Type":"ContainerStarted","Data":"7ef78c06361d530356f6c8a2ae4d791dc9e30f158eedf25cbe55434358718d25"} Apr 16 20:49:04.616558 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:04.616528 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7dd6f68878-vwf72" event={"ID":"1615a55e-a03a-486b-a159-640068aa7bda","Type":"ContainerStarted","Data":"85f0787b3926c85cece7515a8fc639265b2c959ea0b74548df45ddea36727677"} Apr 16 20:49:04.616892 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:04.616658 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="opendatahub/maas-api-7dd6f68878-vwf72" Apr 16 20:49:04.632264 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:04.632224 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-7dd6f68878-vwf72" podStartSLOduration=1.748660891 podStartE2EDuration="4.632207977s" podCreationTimestamp="2026-04-16 20:49:00 +0000 UTC" firstStartedPulling="2026-04-16 20:49:01.608312687 +0000 UTC m=+662.760408718" lastFinishedPulling="2026-04-16 20:49:04.49185976 +0000 UTC m=+665.643955804" observedRunningTime="2026-04-16 20:49:04.630933112 +0000 UTC m=+665.783029166" watchObservedRunningTime="2026-04-16 20:49:04.632207977 +0000 UTC m=+665.784304030" Apr 16 20:49:05.621130 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:05.621086 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-66f6fc69c-hzfqq" event={"ID":"d00e39ac-5117-458a-ad5d-0347a6726c5e","Type":"ContainerStarted","Data":"d965a252dd65d1e59c1e38a298f518e5e75e2c9c396b167ac1016c343818a6ae"} Apr 16 20:49:05.621568 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:05.621309 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-66f6fc69c-hzfqq" Apr 16 20:49:05.622509 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:05.622483 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-5c54fc4f76-htwxx" event={"ID":"c7c8855e-db5b-4727-a24f-5f2d3a6e416e","Type":"ContainerStarted","Data":"d5a1e6b8f3c6b9ca14b3ed08062ee760b9633fdc6ae02fdf76547980b7f75002"} Apr 16 20:49:05.622639 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:05.622600 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-5c54fc4f76-htwxx" Apr 16 20:49:05.640129 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:05.640092 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-66f6fc69c-hzfqq" podStartSLOduration=2.193686681 
podStartE2EDuration="5.640081637s" podCreationTimestamp="2026-04-16 20:49:00 +0000 UTC" firstStartedPulling="2026-04-16 20:49:01.046023376 +0000 UTC m=+662.198119407" lastFinishedPulling="2026-04-16 20:49:04.492418332 +0000 UTC m=+665.644514363" observedRunningTime="2026-04-16 20:49:05.638470944 +0000 UTC m=+666.790566997" watchObservedRunningTime="2026-04-16 20:49:05.640081637 +0000 UTC m=+666.792177730" Apr 16 20:49:05.654625 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:05.654582 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-5c54fc4f76-htwxx" podStartSLOduration=2.057741225 podStartE2EDuration="4.654570586s" podCreationTimestamp="2026-04-16 20:49:01 +0000 UTC" firstStartedPulling="2026-04-16 20:49:01.901500364 +0000 UTC m=+663.053596409" lastFinishedPulling="2026-04-16 20:49:04.498329738 +0000 UTC m=+665.650425770" observedRunningTime="2026-04-16 20:49:05.652581558 +0000 UTC m=+666.804677612" watchObservedRunningTime="2026-04-16 20:49:05.654570586 +0000 UTC m=+666.806666640" Apr 16 20:49:10.627770 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:10.627742 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-7dd6f68878-vwf72" Apr 16 20:49:11.633692 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:11.633663 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-5c54fc4f76-htwxx" Apr 16 20:49:11.677434 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:11.677405 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-7dd6f68878-vwf72"] Apr 16 20:49:11.677680 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:11.677633 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-7dd6f68878-vwf72" podUID="1615a55e-a03a-486b-a159-640068aa7bda" containerName="maas-api" containerID="cri-o://85f0787b3926c85cece7515a8fc639265b2c959ea0b74548df45ddea36727677" gracePeriod=30 
Apr 16 20:49:11.913992 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:11.913968 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-7dd6f68878-vwf72" Apr 16 20:49:11.936182 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:11.936152 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/1615a55e-a03a-486b-a159-640068aa7bda-maas-api-tls\") pod \"1615a55e-a03a-486b-a159-640068aa7bda\" (UID: \"1615a55e-a03a-486b-a159-640068aa7bda\") " Apr 16 20:49:11.936318 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:11.936240 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5jbh\" (UniqueName: \"kubernetes.io/projected/1615a55e-a03a-486b-a159-640068aa7bda-kube-api-access-s5jbh\") pod \"1615a55e-a03a-486b-a159-640068aa7bda\" (UID: \"1615a55e-a03a-486b-a159-640068aa7bda\") " Apr 16 20:49:11.938224 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:11.938190 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1615a55e-a03a-486b-a159-640068aa7bda-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "1615a55e-a03a-486b-a159-640068aa7bda" (UID: "1615a55e-a03a-486b-a159-640068aa7bda"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:49:11.938338 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:11.938315 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1615a55e-a03a-486b-a159-640068aa7bda-kube-api-access-s5jbh" (OuterVolumeSpecName: "kube-api-access-s5jbh") pod "1615a55e-a03a-486b-a159-640068aa7bda" (UID: "1615a55e-a03a-486b-a159-640068aa7bda"). InnerVolumeSpecName "kube-api-access-s5jbh". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:49:12.037000 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:12.036971 2575 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/1615a55e-a03a-486b-a159-640068aa7bda-maas-api-tls\") on node \"ip-10-0-134-79.ec2.internal\" DevicePath \"\"" Apr 16 20:49:12.037000 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:12.036997 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s5jbh\" (UniqueName: \"kubernetes.io/projected/1615a55e-a03a-486b-a159-640068aa7bda-kube-api-access-s5jbh\") on node \"ip-10-0-134-79.ec2.internal\" DevicePath \"\"" Apr 16 20:49:12.649359 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:12.649321 2575 generic.go:358] "Generic (PLEG): container finished" podID="1615a55e-a03a-486b-a159-640068aa7bda" containerID="85f0787b3926c85cece7515a8fc639265b2c959ea0b74548df45ddea36727677" exitCode=0 Apr 16 20:49:12.649790 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:12.649389 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-7dd6f68878-vwf72" Apr 16 20:49:12.649790 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:12.649401 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7dd6f68878-vwf72" event={"ID":"1615a55e-a03a-486b-a159-640068aa7bda","Type":"ContainerDied","Data":"85f0787b3926c85cece7515a8fc639265b2c959ea0b74548df45ddea36727677"} Apr 16 20:49:12.649790 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:12.649438 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7dd6f68878-vwf72" event={"ID":"1615a55e-a03a-486b-a159-640068aa7bda","Type":"ContainerDied","Data":"43fd0f07a353bf525fee6c03b8e901f609c4cc3df03d9703f9b5ed336bb4a759"} Apr 16 20:49:12.649790 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:12.649453 2575 scope.go:117] "RemoveContainer" containerID="85f0787b3926c85cece7515a8fc639265b2c959ea0b74548df45ddea36727677" Apr 16 20:49:12.658027 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:12.658010 2575 scope.go:117] "RemoveContainer" containerID="85f0787b3926c85cece7515a8fc639265b2c959ea0b74548df45ddea36727677" Apr 16 20:49:12.658296 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:49:12.658277 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85f0787b3926c85cece7515a8fc639265b2c959ea0b74548df45ddea36727677\": container with ID starting with 85f0787b3926c85cece7515a8fc639265b2c959ea0b74548df45ddea36727677 not found: ID does not exist" containerID="85f0787b3926c85cece7515a8fc639265b2c959ea0b74548df45ddea36727677" Apr 16 20:49:12.658349 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:12.658305 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85f0787b3926c85cece7515a8fc639265b2c959ea0b74548df45ddea36727677"} err="failed to get container status \"85f0787b3926c85cece7515a8fc639265b2c959ea0b74548df45ddea36727677\": rpc error: code = NotFound desc = could 
not find container \"85f0787b3926c85cece7515a8fc639265b2c959ea0b74548df45ddea36727677\": container with ID starting with 85f0787b3926c85cece7515a8fc639265b2c959ea0b74548df45ddea36727677 not found: ID does not exist" Apr 16 20:49:12.669975 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:12.669952 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-7dd6f68878-vwf72"] Apr 16 20:49:12.673208 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:12.673189 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-7dd6f68878-vwf72"] Apr 16 20:49:13.426652 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:13.426605 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1615a55e-a03a-486b-a159-640068aa7bda" path="/var/lib/kubelet/pods/1615a55e-a03a-486b-a159-640068aa7bda/volumes" Apr 16 20:49:16.633505 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:16.633477 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-66f6fc69c-hzfqq" Apr 16 20:49:16.926707 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:16.926627 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-764b68564c-6rs2g"] Apr 16 20:49:16.927051 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:16.927034 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1615a55e-a03a-486b-a159-640068aa7bda" containerName="maas-api" Apr 16 20:49:16.927051 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:16.927053 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1615a55e-a03a-486b-a159-640068aa7bda" containerName="maas-api" Apr 16 20:49:16.927202 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:16.927166 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1615a55e-a03a-486b-a159-640068aa7bda" containerName="maas-api" Apr 16 20:49:16.931385 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:16.931370 2575 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="opendatahub/maas-controller-764b68564c-6rs2g" Apr 16 20:49:16.936491 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:16.936472 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-764b68564c-6rs2g"] Apr 16 20:49:16.978933 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:16.978904 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtf94\" (UniqueName: \"kubernetes.io/projected/e7ae7521-5349-4ffe-ae75-31b325bb8640-kube-api-access-wtf94\") pod \"maas-controller-764b68564c-6rs2g\" (UID: \"e7ae7521-5349-4ffe-ae75-31b325bb8640\") " pod="opendatahub/maas-controller-764b68564c-6rs2g" Apr 16 20:49:17.079283 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:17.079257 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtf94\" (UniqueName: \"kubernetes.io/projected/e7ae7521-5349-4ffe-ae75-31b325bb8640-kube-api-access-wtf94\") pod \"maas-controller-764b68564c-6rs2g\" (UID: \"e7ae7521-5349-4ffe-ae75-31b325bb8640\") " pod="opendatahub/maas-controller-764b68564c-6rs2g" Apr 16 20:49:17.087719 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:17.087689 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtf94\" (UniqueName: \"kubernetes.io/projected/e7ae7521-5349-4ffe-ae75-31b325bb8640-kube-api-access-wtf94\") pod \"maas-controller-764b68564c-6rs2g\" (UID: \"e7ae7521-5349-4ffe-ae75-31b325bb8640\") " pod="opendatahub/maas-controller-764b68564c-6rs2g" Apr 16 20:49:17.242651 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:17.242557 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-764b68564c-6rs2g" Apr 16 20:49:17.570801 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:17.570776 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-764b68564c-6rs2g"] Apr 16 20:49:17.572785 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:49:17.572757 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7ae7521_5349_4ffe_ae75_31b325bb8640.slice/crio-2214378990a607ffeaef5e07781543f38b2564eb860f89e3311c103878be9510 WatchSource:0}: Error finding container 2214378990a607ffeaef5e07781543f38b2564eb860f89e3311c103878be9510: Status 404 returned error can't find the container with id 2214378990a607ffeaef5e07781543f38b2564eb860f89e3311c103878be9510 Apr 16 20:49:17.574453 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:17.574435 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:49:17.669941 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:17.669909 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-764b68564c-6rs2g" event={"ID":"e7ae7521-5349-4ffe-ae75-31b325bb8640","Type":"ContainerStarted","Data":"2214378990a607ffeaef5e07781543f38b2564eb860f89e3311c103878be9510"} Apr 16 20:49:18.674319 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:18.674283 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-764b68564c-6rs2g" event={"ID":"e7ae7521-5349-4ffe-ae75-31b325bb8640","Type":"ContainerStarted","Data":"9ba03cf71620906e71c51f0efe3fbba3f7823bb9aad45bf00f70016cfc7ae2cb"} Apr 16 20:49:18.674797 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:18.674415 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-764b68564c-6rs2g" Apr 16 20:49:18.691342 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:18.691292 2575 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-764b68564c-6rs2g" podStartSLOduration=2.367811072 podStartE2EDuration="2.691277688s" podCreationTimestamp="2026-04-16 20:49:16 +0000 UTC" firstStartedPulling="2026-04-16 20:49:17.57455562 +0000 UTC m=+678.726651652" lastFinishedPulling="2026-04-16 20:49:17.898022223 +0000 UTC m=+679.050118268" observedRunningTime="2026-04-16 20:49:18.688589353 +0000 UTC m=+679.840685405" watchObservedRunningTime="2026-04-16 20:49:18.691277688 +0000 UTC m=+679.843373740" Apr 16 20:49:29.684170 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:29.684137 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-764b68564c-6rs2g" Apr 16 20:49:29.721643 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:29.721602 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-66f6fc69c-hzfqq"] Apr 16 20:49:29.721850 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:29.721813 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-66f6fc69c-hzfqq" podUID="d00e39ac-5117-458a-ad5d-0347a6726c5e" containerName="manager" containerID="cri-o://d965a252dd65d1e59c1e38a298f518e5e75e2c9c396b167ac1016c343818a6ae" gracePeriod=10 Apr 16 20:49:29.963115 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:29.963094 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-66f6fc69c-hzfqq" Apr 16 20:49:30.087587 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:30.087558 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6c5f\" (UniqueName: \"kubernetes.io/projected/d00e39ac-5117-458a-ad5d-0347a6726c5e-kube-api-access-b6c5f\") pod \"d00e39ac-5117-458a-ad5d-0347a6726c5e\" (UID: \"d00e39ac-5117-458a-ad5d-0347a6726c5e\") " Apr 16 20:49:30.089633 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:30.089590 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d00e39ac-5117-458a-ad5d-0347a6726c5e-kube-api-access-b6c5f" (OuterVolumeSpecName: "kube-api-access-b6c5f") pod "d00e39ac-5117-458a-ad5d-0347a6726c5e" (UID: "d00e39ac-5117-458a-ad5d-0347a6726c5e"). InnerVolumeSpecName "kube-api-access-b6c5f". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:49:30.189234 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:30.189205 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b6c5f\" (UniqueName: \"kubernetes.io/projected/d00e39ac-5117-458a-ad5d-0347a6726c5e-kube-api-access-b6c5f\") on node \"ip-10-0-134-79.ec2.internal\" DevicePath \"\"" Apr 16 20:49:30.719522 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:30.719491 2575 generic.go:358] "Generic (PLEG): container finished" podID="d00e39ac-5117-458a-ad5d-0347a6726c5e" containerID="d965a252dd65d1e59c1e38a298f518e5e75e2c9c396b167ac1016c343818a6ae" exitCode=0 Apr 16 20:49:30.719929 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:30.719537 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-66f6fc69c-hzfqq" event={"ID":"d00e39ac-5117-458a-ad5d-0347a6726c5e","Type":"ContainerDied","Data":"d965a252dd65d1e59c1e38a298f518e5e75e2c9c396b167ac1016c343818a6ae"} Apr 16 20:49:30.719929 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:30.719550 2575 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="opendatahub/maas-controller-66f6fc69c-hzfqq" Apr 16 20:49:30.719929 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:30.719559 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-66f6fc69c-hzfqq" event={"ID":"d00e39ac-5117-458a-ad5d-0347a6726c5e","Type":"ContainerDied","Data":"7abfe097ebbc236f03f12d1d2a6b6b6769d9d8763f778cfbab60244e323bdf99"} Apr 16 20:49:30.719929 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:30.719573 2575 scope.go:117] "RemoveContainer" containerID="d965a252dd65d1e59c1e38a298f518e5e75e2c9c396b167ac1016c343818a6ae" Apr 16 20:49:30.728256 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:30.728054 2575 scope.go:117] "RemoveContainer" containerID="d965a252dd65d1e59c1e38a298f518e5e75e2c9c396b167ac1016c343818a6ae" Apr 16 20:49:30.728307 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:49:30.728287 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d965a252dd65d1e59c1e38a298f518e5e75e2c9c396b167ac1016c343818a6ae\": container with ID starting with d965a252dd65d1e59c1e38a298f518e5e75e2c9c396b167ac1016c343818a6ae not found: ID does not exist" containerID="d965a252dd65d1e59c1e38a298f518e5e75e2c9c396b167ac1016c343818a6ae" Apr 16 20:49:30.728341 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:30.728308 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d965a252dd65d1e59c1e38a298f518e5e75e2c9c396b167ac1016c343818a6ae"} err="failed to get container status \"d965a252dd65d1e59c1e38a298f518e5e75e2c9c396b167ac1016c343818a6ae\": rpc error: code = NotFound desc = could not find container \"d965a252dd65d1e59c1e38a298f518e5e75e2c9c396b167ac1016c343818a6ae\": container with ID starting with d965a252dd65d1e59c1e38a298f518e5e75e2c9c396b167ac1016c343818a6ae not found: ID does not exist" Apr 16 20:49:30.743759 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:30.743735 2575 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-66f6fc69c-hzfqq"] Apr 16 20:49:30.748777 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:30.748756 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-66f6fc69c-hzfqq"] Apr 16 20:49:31.426904 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:31.426871 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d00e39ac-5117-458a-ad5d-0347a6726c5e" path="/var/lib/kubelet/pods/d00e39ac-5117-458a-ad5d-0347a6726c5e/volumes" Apr 16 20:49:43.764533 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:43.764500 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl"] Apr 16 20:49:43.765077 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:43.765055 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d00e39ac-5117-458a-ad5d-0347a6726c5e" containerName="manager" Apr 16 20:49:43.765154 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:43.765080 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d00e39ac-5117-458a-ad5d-0347a6726c5e" containerName="manager" Apr 16 20:49:43.765208 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:43.765185 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d00e39ac-5117-458a-ad5d-0347a6726c5e" containerName="manager" Apr 16 20:49:43.768711 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:43.768688 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl" Apr 16 20:49:43.771198 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:43.771175 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 16 20:49:43.771321 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:43.771225 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 16 20:49:43.771321 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:43.771263 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 16 20:49:43.771321 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:43.771310 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-fbmqf\"" Apr 16 20:49:43.775420 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:43.775399 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl"] Apr 16 20:49:43.905666 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:43.905632 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vbsp\" (UniqueName: \"kubernetes.io/projected/bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34-kube-api-access-8vbsp\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl\" (UID: \"bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl" Apr 16 20:49:43.905818 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:43.905688 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl\" (UID: 
\"bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl" Apr 16 20:49:43.905818 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:43.905718 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl\" (UID: \"bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl" Apr 16 20:49:43.905818 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:43.905759 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl\" (UID: \"bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl" Apr 16 20:49:43.905818 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:43.905788 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl\" (UID: \"bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl" Apr 16 20:49:43.905818 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:43.905815 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl\" (UID: \"bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34\") " 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl" Apr 16 20:49:44.006787 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:44.006757 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8vbsp\" (UniqueName: \"kubernetes.io/projected/bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34-kube-api-access-8vbsp\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl\" (UID: \"bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl" Apr 16 20:49:44.006949 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:44.006798 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl\" (UID: \"bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl" Apr 16 20:49:44.006949 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:44.006815 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl\" (UID: \"bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl" Apr 16 20:49:44.006949 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:44.006842 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl\" (UID: \"bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl" Apr 16 20:49:44.006949 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:44.006861 
2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl\" (UID: \"bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl" Apr 16 20:49:44.007173 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:44.006971 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl\" (UID: \"bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl" Apr 16 20:49:44.007230 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:44.007215 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl\" (UID: \"bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl" Apr 16 20:49:44.007290 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:44.007270 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl\" (UID: \"bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl" Apr 16 20:49:44.007349 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:44.007328 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl\" (UID: \"bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl" Apr 16 20:49:44.009150 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:44.009124 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl\" (UID: \"bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl" Apr 16 20:49:44.009276 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:44.009261 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl\" (UID: \"bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl" Apr 16 20:49:44.017699 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:44.017641 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vbsp\" (UniqueName: \"kubernetes.io/projected/bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34-kube-api-access-8vbsp\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl\" (UID: \"bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl" Apr 16 20:49:44.079346 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:44.079319 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl" Apr 16 20:49:44.206812 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:44.206788 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl"] Apr 16 20:49:44.209431 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:49:44.209394 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf8aaf8d_3bcc_4bbf_a159_5cfc08bd1b34.slice/crio-dbf0e927600618e1c43c9347d8f07cc3fbb254e1ef5d04c4d9924805bf3423f0 WatchSource:0}: Error finding container dbf0e927600618e1c43c9347d8f07cc3fbb254e1ef5d04c4d9924805bf3423f0: Status 404 returned error can't find the container with id dbf0e927600618e1c43c9347d8f07cc3fbb254e1ef5d04c4d9924805bf3423f0 Apr 16 20:49:44.769638 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:44.769586 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl" event={"ID":"bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34","Type":"ContainerStarted","Data":"dbf0e927600618e1c43c9347d8f07cc3fbb254e1ef5d04c4d9924805bf3423f0"} Apr 16 20:49:50.793669 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:50.793582 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl" event={"ID":"bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34","Type":"ContainerStarted","Data":"d269c041187f17142a7f398e8dcdd56bacc64d11ed080e651f3a102dcfac850e"} Apr 16 20:49:55.141912 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:55.141873 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-85tcp"] Apr 16 20:49:55.144655 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:55.144635 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-85tcp" Apr 16 20:49:55.147009 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:55.146989 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 16 20:49:55.154021 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:55.154000 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-85tcp"] Apr 16 20:49:55.311888 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:55.311850 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/da2c7c48-b450-456e-9eda-94e1fb15a151-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-85tcp\" (UID: \"da2c7c48-b450-456e-9eda-94e1fb15a151\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-85tcp" Apr 16 20:49:55.312080 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:55.311909 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmj7t\" (UniqueName: \"kubernetes.io/projected/da2c7c48-b450-456e-9eda-94e1fb15a151-kube-api-access-hmj7t\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-85tcp\" (UID: \"da2c7c48-b450-456e-9eda-94e1fb15a151\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-85tcp" Apr 16 20:49:55.312080 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:55.312007 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/da2c7c48-b450-456e-9eda-94e1fb15a151-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-85tcp\" (UID: \"da2c7c48-b450-456e-9eda-94e1fb15a151\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-85tcp" Apr 16 20:49:55.312080 ip-10-0-134-79 
kubenswrapper[2575]: I0416 20:49:55.312058 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/da2c7c48-b450-456e-9eda-94e1fb15a151-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-85tcp\" (UID: \"da2c7c48-b450-456e-9eda-94e1fb15a151\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-85tcp" Apr 16 20:49:55.312238 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:55.312134 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da2c7c48-b450-456e-9eda-94e1fb15a151-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-85tcp\" (UID: \"da2c7c48-b450-456e-9eda-94e1fb15a151\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-85tcp" Apr 16 20:49:55.312238 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:55.312176 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/da2c7c48-b450-456e-9eda-94e1fb15a151-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-85tcp\" (UID: \"da2c7c48-b450-456e-9eda-94e1fb15a151\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-85tcp" Apr 16 20:49:55.413525 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:55.413441 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da2c7c48-b450-456e-9eda-94e1fb15a151-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-85tcp\" (UID: \"da2c7c48-b450-456e-9eda-94e1fb15a151\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-85tcp" Apr 16 20:49:55.413525 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:55.413485 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/da2c7c48-b450-456e-9eda-94e1fb15a151-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-85tcp\" (UID: \"da2c7c48-b450-456e-9eda-94e1fb15a151\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-85tcp" Apr 16 20:49:55.413525 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:55.413515 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/da2c7c48-b450-456e-9eda-94e1fb15a151-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-85tcp\" (UID: \"da2c7c48-b450-456e-9eda-94e1fb15a151\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-85tcp" Apr 16 20:49:55.413829 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:55.413537 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hmj7t\" (UniqueName: \"kubernetes.io/projected/da2c7c48-b450-456e-9eda-94e1fb15a151-kube-api-access-hmj7t\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-85tcp\" (UID: \"da2c7c48-b450-456e-9eda-94e1fb15a151\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-85tcp" Apr 16 20:49:55.413829 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:55.413562 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/da2c7c48-b450-456e-9eda-94e1fb15a151-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-85tcp\" (UID: \"da2c7c48-b450-456e-9eda-94e1fb15a151\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-85tcp" Apr 16 20:49:55.413829 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:55.413583 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/da2c7c48-b450-456e-9eda-94e1fb15a151-dshm\") pod 
\"premium-simulated-simulated-premium-kserve-f5df4587b-85tcp\" (UID: \"da2c7c48-b450-456e-9eda-94e1fb15a151\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-85tcp" Apr 16 20:49:55.413984 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:55.413924 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da2c7c48-b450-456e-9eda-94e1fb15a151-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-85tcp\" (UID: \"da2c7c48-b450-456e-9eda-94e1fb15a151\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-85tcp" Apr 16 20:49:55.413984 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:55.413956 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/da2c7c48-b450-456e-9eda-94e1fb15a151-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-85tcp\" (UID: \"da2c7c48-b450-456e-9eda-94e1fb15a151\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-85tcp" Apr 16 20:49:55.414059 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:55.413985 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/da2c7c48-b450-456e-9eda-94e1fb15a151-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-85tcp\" (UID: \"da2c7c48-b450-456e-9eda-94e1fb15a151\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-85tcp" Apr 16 20:49:55.415827 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:55.415807 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/da2c7c48-b450-456e-9eda-94e1fb15a151-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-85tcp\" (UID: \"da2c7c48-b450-456e-9eda-94e1fb15a151\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-85tcp" Apr 16 20:49:55.416196 
ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:55.416177 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/da2c7c48-b450-456e-9eda-94e1fb15a151-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-85tcp\" (UID: \"da2c7c48-b450-456e-9eda-94e1fb15a151\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-85tcp" Apr 16 20:49:55.421265 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:55.421244 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmj7t\" (UniqueName: \"kubernetes.io/projected/da2c7c48-b450-456e-9eda-94e1fb15a151-kube-api-access-hmj7t\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-85tcp\" (UID: \"da2c7c48-b450-456e-9eda-94e1fb15a151\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-85tcp" Apr 16 20:49:55.454997 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:55.454971 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-85tcp" Apr 16 20:49:55.819946 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:55.819872 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-85tcp"] Apr 16 20:49:55.823991 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:49:55.823963 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda2c7c48_b450_456e_9eda_94e1fb15a151.slice/crio-fb466b7d41734f4b8d4483d6eae6d525c6643386205cb90cc5852a8625bb85ac WatchSource:0}: Error finding container fb466b7d41734f4b8d4483d6eae6d525c6643386205cb90cc5852a8625bb85ac: Status 404 returned error can't find the container with id fb466b7d41734f4b8d4483d6eae6d525c6643386205cb90cc5852a8625bb85ac Apr 16 20:49:56.821189 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:56.821102 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-85tcp" event={"ID":"da2c7c48-b450-456e-9eda-94e1fb15a151","Type":"ContainerStarted","Data":"a616f01459c11f126c0352bea4612ae1bd18525a027b8a61592a1f7c035c0a26"} Apr 16 20:49:56.821189 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:56.821137 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-85tcp" event={"ID":"da2c7c48-b450-456e-9eda-94e1fb15a151","Type":"ContainerStarted","Data":"fb466b7d41734f4b8d4483d6eae6d525c6643386205cb90cc5852a8625bb85ac"} Apr 16 20:49:56.822432 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:56.822405 2575 generic.go:358] "Generic (PLEG): container finished" podID="bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34" containerID="d269c041187f17142a7f398e8dcdd56bacc64d11ed080e651f3a102dcfac850e" exitCode=0 Apr 16 20:49:56.822512 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:49:56.822492 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl" event={"ID":"bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34","Type":"ContainerDied","Data":"d269c041187f17142a7f398e8dcdd56bacc64d11ed080e651f3a102dcfac850e"} Apr 16 20:50:01.847741 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:01.847698 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl" event={"ID":"bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34","Type":"ContainerStarted","Data":"5b19051a160fbbc9efed8f3dfc81ce12c9efb01699e3903a1278d5f2c8477530"} Apr 16 20:50:01.848148 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:01.847913 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl" Apr 16 20:50:01.874848 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:01.874794 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl" podStartSLOduration=2.134561121 podStartE2EDuration="18.874781232s" podCreationTimestamp="2026-04-16 20:49:43 +0000 UTC" firstStartedPulling="2026-04-16 20:49:44.211257911 +0000 UTC m=+705.363353942" lastFinishedPulling="2026-04-16 20:50:00.951478011 +0000 UTC m=+722.103574053" observedRunningTime="2026-04-16 20:50:01.872492653 +0000 UTC m=+723.024588731" watchObservedRunningTime="2026-04-16 20:50:01.874781232 +0000 UTC m=+723.026877284" Apr 16 20:50:05.861849 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:05.861811 2575 generic.go:358] "Generic (PLEG): container finished" podID="da2c7c48-b450-456e-9eda-94e1fb15a151" containerID="a616f01459c11f126c0352bea4612ae1bd18525a027b8a61592a1f7c035c0a26" exitCode=0 Apr 16 20:50:05.862217 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:05.861882 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-85tcp" 
event={"ID":"da2c7c48-b450-456e-9eda-94e1fb15a151","Type":"ContainerDied","Data":"a616f01459c11f126c0352bea4612ae1bd18525a027b8a61592a1f7c035c0a26"} Apr 16 20:50:06.843829 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:06.843795 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-7jbgc"] Apr 16 20:50:06.847393 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:06.847371 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-7jbgc" Apr 16 20:50:06.849574 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:06.849554 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 16 20:50:06.857288 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:06.857260 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-7jbgc"] Apr 16 20:50:06.867926 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:06.867900 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-85tcp" event={"ID":"da2c7c48-b450-456e-9eda-94e1fb15a151","Type":"ContainerStarted","Data":"402ce0d9a4126ec1b5968b32394532c81d8d0790f0d3572bc13133162133d2d7"} Apr 16 20:50:06.868292 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:06.868150 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-85tcp" Apr 16 20:50:06.884507 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:06.884467 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-85tcp" podStartSLOduration=11.684431545 podStartE2EDuration="11.884453967s" podCreationTimestamp="2026-04-16 20:49:55 +0000 UTC" firstStartedPulling="2026-04-16 20:50:05.862443702 +0000 UTC m=+727.014539736" 
lastFinishedPulling="2026-04-16 20:50:06.062466124 +0000 UTC m=+727.214562158" observedRunningTime="2026-04-16 20:50:06.883580205 +0000 UTC m=+728.035676256" watchObservedRunningTime="2026-04-16 20:50:06.884453967 +0000 UTC m=+728.036550046" Apr 16 20:50:06.932185 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:06.932150 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e550b2d5-51c7-4204-851f-bb41e5eba7fb-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-7jbgc\" (UID: \"e550b2d5-51c7-4204-851f-bb41e5eba7fb\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-7jbgc" Apr 16 20:50:06.932336 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:06.932205 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx4b5\" (UniqueName: \"kubernetes.io/projected/e550b2d5-51c7-4204-851f-bb41e5eba7fb-kube-api-access-hx4b5\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-7jbgc\" (UID: \"e550b2d5-51c7-4204-851f-bb41e5eba7fb\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-7jbgc" Apr 16 20:50:06.932336 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:06.932241 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e550b2d5-51c7-4204-851f-bb41e5eba7fb-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-7jbgc\" (UID: \"e550b2d5-51c7-4204-851f-bb41e5eba7fb\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-7jbgc" Apr 16 20:50:06.932464 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:06.932329 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e550b2d5-51c7-4204-851f-bb41e5eba7fb-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-7jbgc\" (UID: 
\"e550b2d5-51c7-4204-851f-bb41e5eba7fb\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-7jbgc" Apr 16 20:50:06.932464 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:06.932358 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e550b2d5-51c7-4204-851f-bb41e5eba7fb-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-7jbgc\" (UID: \"e550b2d5-51c7-4204-851f-bb41e5eba7fb\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-7jbgc" Apr 16 20:50:06.932559 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:06.932463 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e550b2d5-51c7-4204-851f-bb41e5eba7fb-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-7jbgc\" (UID: \"e550b2d5-51c7-4204-851f-bb41e5eba7fb\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-7jbgc" Apr 16 20:50:07.033469 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:07.033432 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e550b2d5-51c7-4204-851f-bb41e5eba7fb-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-7jbgc\" (UID: \"e550b2d5-51c7-4204-851f-bb41e5eba7fb\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-7jbgc" Apr 16 20:50:07.033642 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:07.033485 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e550b2d5-51c7-4204-851f-bb41e5eba7fb-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-7jbgc\" (UID: \"e550b2d5-51c7-4204-851f-bb41e5eba7fb\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-7jbgc" Apr 16 20:50:07.033642 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:07.033509 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-hx4b5\" (UniqueName: \"kubernetes.io/projected/e550b2d5-51c7-4204-851f-bb41e5eba7fb-kube-api-access-hx4b5\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-7jbgc\" (UID: \"e550b2d5-51c7-4204-851f-bb41e5eba7fb\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-7jbgc" Apr 16 20:50:07.033642 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:07.033531 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e550b2d5-51c7-4204-851f-bb41e5eba7fb-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-7jbgc\" (UID: \"e550b2d5-51c7-4204-851f-bb41e5eba7fb\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-7jbgc" Apr 16 20:50:07.033968 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:07.033578 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e550b2d5-51c7-4204-851f-bb41e5eba7fb-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-7jbgc\" (UID: \"e550b2d5-51c7-4204-851f-bb41e5eba7fb\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-7jbgc" Apr 16 20:50:07.034056 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:07.033985 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e550b2d5-51c7-4204-851f-bb41e5eba7fb-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-7jbgc\" (UID: \"e550b2d5-51c7-4204-851f-bb41e5eba7fb\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-7jbgc" Apr 16 20:50:07.034056 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:07.033936 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e550b2d5-51c7-4204-851f-bb41e5eba7fb-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-7jbgc\" (UID: \"e550b2d5-51c7-4204-851f-bb41e5eba7fb\") " 
pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-7jbgc" Apr 16 20:50:07.034056 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:07.033914 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e550b2d5-51c7-4204-851f-bb41e5eba7fb-model-cache\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-7jbgc\" (UID: \"e550b2d5-51c7-4204-851f-bb41e5eba7fb\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-7jbgc" Apr 16 20:50:07.034189 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:07.033943 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e550b2d5-51c7-4204-851f-bb41e5eba7fb-home\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-7jbgc\" (UID: \"e550b2d5-51c7-4204-851f-bb41e5eba7fb\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-7jbgc" Apr 16 20:50:07.036073 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:07.036053 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e550b2d5-51c7-4204-851f-bb41e5eba7fb-dshm\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-7jbgc\" (UID: \"e550b2d5-51c7-4204-851f-bb41e5eba7fb\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-7jbgc" Apr 16 20:50:07.036505 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:07.036486 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e550b2d5-51c7-4204-851f-bb41e5eba7fb-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-6d5965695-7jbgc\" (UID: \"e550b2d5-51c7-4204-851f-bb41e5eba7fb\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-7jbgc" Apr 16 20:50:07.040924 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:07.040905 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx4b5\" (UniqueName: \"kubernetes.io/projected/e550b2d5-51c7-4204-851f-bb41e5eba7fb-kube-api-access-hx4b5\") pod 
\"e2e-trlp-test-simulated-kserve-6d5965695-7jbgc\" (UID: \"e550b2d5-51c7-4204-851f-bb41e5eba7fb\") " pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-7jbgc" Apr 16 20:50:07.158259 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:07.158176 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-7jbgc" Apr 16 20:50:07.295545 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:07.295519 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-6d5965695-7jbgc"] Apr 16 20:50:07.305163 ip-10-0-134-79 kubenswrapper[2575]: W0416 20:50:07.305114 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode550b2d5_51c7_4204_851f_bb41e5eba7fb.slice/crio-0e105aa8e3f7b5d86b53e5f03f634ea6f569894e4416877626d7849ba676b680 WatchSource:0}: Error finding container 0e105aa8e3f7b5d86b53e5f03f634ea6f569894e4416877626d7849ba676b680: Status 404 returned error can't find the container with id 0e105aa8e3f7b5d86b53e5f03f634ea6f569894e4416877626d7849ba676b680 Apr 16 20:50:07.873002 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:07.872966 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-7jbgc" event={"ID":"e550b2d5-51c7-4204-851f-bb41e5eba7fb","Type":"ContainerStarted","Data":"7343ff9a1a4eb70979fdaaab05741b18740a34ed911cbdc9dfe02c4602d0a274"} Apr 16 20:50:07.873002 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:07.873003 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-7jbgc" event={"ID":"e550b2d5-51c7-4204-851f-bb41e5eba7fb","Type":"ContainerStarted","Data":"0e105aa8e3f7b5d86b53e5f03f634ea6f569894e4416877626d7849ba676b680"} Apr 16 20:50:12.863876 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:12.863844 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl" Apr 16 20:50:12.895394 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:12.895315 2575 generic.go:358] "Generic (PLEG): container finished" podID="e550b2d5-51c7-4204-851f-bb41e5eba7fb" containerID="7343ff9a1a4eb70979fdaaab05741b18740a34ed911cbdc9dfe02c4602d0a274" exitCode=0 Apr 16 20:50:12.895394 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:12.895358 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-7jbgc" event={"ID":"e550b2d5-51c7-4204-851f-bb41e5eba7fb","Type":"ContainerDied","Data":"7343ff9a1a4eb70979fdaaab05741b18740a34ed911cbdc9dfe02c4602d0a274"} Apr 16 20:50:13.909999 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:13.909964 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-7jbgc" event={"ID":"e550b2d5-51c7-4204-851f-bb41e5eba7fb","Type":"ContainerStarted","Data":"e5012223a42fa3f567605f7699c872aac72efe6fdec154b051e11ee564251cd2"} Apr 16 20:50:13.910393 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:13.910232 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-7jbgc" Apr 16 20:50:13.928970 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:13.928923 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-7jbgc" podStartSLOduration=7.495053513 podStartE2EDuration="7.928910108s" podCreationTimestamp="2026-04-16 20:50:06 +0000 UTC" firstStartedPulling="2026-04-16 20:50:12.895984002 +0000 UTC m=+734.048080039" lastFinishedPulling="2026-04-16 20:50:13.329840594 +0000 UTC m=+734.481936634" observedRunningTime="2026-04-16 20:50:13.926894869 +0000 UTC m=+735.078990922" watchObservedRunningTime="2026-04-16 20:50:13.928910108 +0000 UTC m=+735.081006160" Apr 16 20:50:17.885582 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:17.885548 2575 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-85tcp" Apr 16 20:50:24.926986 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:50:24.926954 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-6d5965695-7jbgc" Apr 16 20:52:44.031467 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:52:44.031428 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-764b68564c-6rs2g"] Apr 16 20:52:44.031918 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:52:44.031683 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-764b68564c-6rs2g" podUID="e7ae7521-5349-4ffe-ae75-31b325bb8640" containerName="manager" containerID="cri-o://9ba03cf71620906e71c51f0efe3fbba3f7823bb9aad45bf00f70016cfc7ae2cb" gracePeriod=10 Apr 16 20:52:44.274125 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:52:44.274103 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-764b68564c-6rs2g" Apr 16 20:52:44.380309 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:52:44.380281 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtf94\" (UniqueName: \"kubernetes.io/projected/e7ae7521-5349-4ffe-ae75-31b325bb8640-kube-api-access-wtf94\") pod \"e7ae7521-5349-4ffe-ae75-31b325bb8640\" (UID: \"e7ae7521-5349-4ffe-ae75-31b325bb8640\") " Apr 16 20:52:44.382288 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:52:44.382266 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7ae7521-5349-4ffe-ae75-31b325bb8640-kube-api-access-wtf94" (OuterVolumeSpecName: "kube-api-access-wtf94") pod "e7ae7521-5349-4ffe-ae75-31b325bb8640" (UID: "e7ae7521-5349-4ffe-ae75-31b325bb8640"). InnerVolumeSpecName "kube-api-access-wtf94". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:52:44.469726 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:52:44.469690 2575 generic.go:358] "Generic (PLEG): container finished" podID="e7ae7521-5349-4ffe-ae75-31b325bb8640" containerID="9ba03cf71620906e71c51f0efe3fbba3f7823bb9aad45bf00f70016cfc7ae2cb" exitCode=0 Apr 16 20:52:44.469866 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:52:44.469750 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-764b68564c-6rs2g" Apr 16 20:52:44.469866 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:52:44.469777 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-764b68564c-6rs2g" event={"ID":"e7ae7521-5349-4ffe-ae75-31b325bb8640","Type":"ContainerDied","Data":"9ba03cf71620906e71c51f0efe3fbba3f7823bb9aad45bf00f70016cfc7ae2cb"} Apr 16 20:52:44.469866 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:52:44.469815 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-764b68564c-6rs2g" event={"ID":"e7ae7521-5349-4ffe-ae75-31b325bb8640","Type":"ContainerDied","Data":"2214378990a607ffeaef5e07781543f38b2564eb860f89e3311c103878be9510"} Apr 16 20:52:44.469866 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:52:44.469834 2575 scope.go:117] "RemoveContainer" containerID="9ba03cf71620906e71c51f0efe3fbba3f7823bb9aad45bf00f70016cfc7ae2cb" Apr 16 20:52:44.478460 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:52:44.478440 2575 scope.go:117] "RemoveContainer" containerID="9ba03cf71620906e71c51f0efe3fbba3f7823bb9aad45bf00f70016cfc7ae2cb" Apr 16 20:52:44.478763 ip-10-0-134-79 kubenswrapper[2575]: E0416 20:52:44.478742 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ba03cf71620906e71c51f0efe3fbba3f7823bb9aad45bf00f70016cfc7ae2cb\": container with ID starting with 9ba03cf71620906e71c51f0efe3fbba3f7823bb9aad45bf00f70016cfc7ae2cb not found: ID 
does not exist" containerID="9ba03cf71620906e71c51f0efe3fbba3f7823bb9aad45bf00f70016cfc7ae2cb" Apr 16 20:52:44.478825 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:52:44.478772 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ba03cf71620906e71c51f0efe3fbba3f7823bb9aad45bf00f70016cfc7ae2cb"} err="failed to get container status \"9ba03cf71620906e71c51f0efe3fbba3f7823bb9aad45bf00f70016cfc7ae2cb\": rpc error: code = NotFound desc = could not find container \"9ba03cf71620906e71c51f0efe3fbba3f7823bb9aad45bf00f70016cfc7ae2cb\": container with ID starting with 9ba03cf71620906e71c51f0efe3fbba3f7823bb9aad45bf00f70016cfc7ae2cb not found: ID does not exist" Apr 16 20:52:44.481195 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:52:44.481178 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wtf94\" (UniqueName: \"kubernetes.io/projected/e7ae7521-5349-4ffe-ae75-31b325bb8640-kube-api-access-wtf94\") on node \"ip-10-0-134-79.ec2.internal\" DevicePath \"\"" Apr 16 20:52:44.491890 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:52:44.491870 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-764b68564c-6rs2g"] Apr 16 20:52:44.493782 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:52:44.493765 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-764b68564c-6rs2g"] Apr 16 20:52:45.426457 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:52:45.426413 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7ae7521-5349-4ffe-ae75-31b325bb8640" path="/var/lib/kubelet/pods/e7ae7521-5349-4ffe-ae75-31b325bb8640/volumes" Apr 16 20:52:59.371952 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:52:59.371926 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sj6sh_98723067-9cd3-42a6-a577-2ecd3fc29ae9/ovn-acl-logging/0.log" Apr 16 20:52:59.374111 ip-10-0-134-79 kubenswrapper[2575]: I0416 
20:52:59.374085 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sj6sh_98723067-9cd3-42a6-a577-2ecd3fc29ae9/ovn-acl-logging/0.log" Apr 16 20:57:59.398088 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:57:59.398059 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sj6sh_98723067-9cd3-42a6-a577-2ecd3fc29ae9/ovn-acl-logging/0.log" Apr 16 20:57:59.401466 ip-10-0-134-79 kubenswrapper[2575]: I0416 20:57:59.401443 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sj6sh_98723067-9cd3-42a6-a577-2ecd3fc29ae9/ovn-acl-logging/0.log" Apr 16 21:02:59.425143 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:02:59.425114 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sj6sh_98723067-9cd3-42a6-a577-2ecd3fc29ae9/ovn-acl-logging/0.log" Apr 16 21:02:59.432114 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:02:59.432094 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sj6sh_98723067-9cd3-42a6-a577-2ecd3fc29ae9/ovn-acl-logging/0.log" Apr 16 21:07:59.451671 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:07:59.451644 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sj6sh_98723067-9cd3-42a6-a577-2ecd3fc29ae9/ovn-acl-logging/0.log" Apr 16 21:07:59.459855 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:07:59.459826 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sj6sh_98723067-9cd3-42a6-a577-2ecd3fc29ae9/ovn-acl-logging/0.log" Apr 16 21:12:59.479053 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:12:59.479024 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sj6sh_98723067-9cd3-42a6-a577-2ecd3fc29ae9/ovn-acl-logging/0.log" Apr 16 21:12:59.486871 ip-10-0-134-79 kubenswrapper[2575]: I0416 
21:12:59.486853 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sj6sh_98723067-9cd3-42a6-a577-2ecd3fc29ae9/ovn-acl-logging/0.log" Apr 16 21:13:43.991007 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:13:43.990972 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-6dwth_3ae7c350-6f35-4b15-9616-e9b40f9a6c7f/manager/0.log" Apr 16 21:13:44.113489 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:13:44.113458 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-5c54fc4f76-htwxx_c7c8855e-db5b-4727-a24f-5f2d3a6e416e/maas-api/0.log" Apr 16 21:13:44.371103 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:13:44.371069 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-sqh8k_0c74987a-f7b9-4c73-be86-2ca4d965965e/manager/2.log" Apr 16 21:13:44.604177 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:13:44.604149 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-7cd8df7dd5-k9tbf_8383b67d-5b7d-418f-bd5e-64050816cc24/manager/0.log" Apr 16 21:13:44.821675 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:13:44.821596 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-hzkrk_39552f32-9c95-4b91-87d8-62e0cc489b30/postgres/0.log" Apr 16 21:13:46.188175 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:13:46.188138 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-zbv5g_eb8d4c4b-51c3-489f-9ec0-32637206d9e0/manager/0.log" Apr 16 21:13:47.328201 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:13:47.328167 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-prkw5_e5e72258-f292-4870-9758-dbc28b87afc4/discovery/0.log" Apr 16 21:13:47.551481 ip-10-0-134-79 kubenswrapper[2575]: I0416 
21:13:47.551446 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-588879f674-jw4ml_1ea57899-46bc-4856-96b3-14087e5176ba/kube-auth-proxy/0.log" Apr 16 21:13:48.382261 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:13:48.382173 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-6d5965695-7jbgc_e550b2d5-51c7-4204-851f-bb41e5eba7fb/main/0.log" Apr 16 21:13:48.389835 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:13:48.389805 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-6d5965695-7jbgc_e550b2d5-51c7-4204-851f-bb41e5eba7fb/storage-initializer/0.log" Apr 16 21:13:48.501508 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:13:48.501482 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl_bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34/storage-initializer/0.log" Apr 16 21:13:48.508980 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:13:48.508953 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcccsgkl_bf8aaf8d-3bcc-4bbf-a159-5cfc08bd1b34/main/0.log" Apr 16 21:13:48.735539 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:13:48.735467 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-f5df4587b-85tcp_da2c7c48-b450-456e-9eda-94e1fb15a151/storage-initializer/0.log" Apr 16 21:13:48.743187 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:13:48.743159 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-f5df4587b-85tcp_da2c7c48-b450-456e-9eda-94e1fb15a151/main/0.log" Apr 16 21:13:55.277118 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:13:55.277085 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-hklnp_aca421fe-4ae3-4942-acd4-e16928e3c32d/global-pull-secret-syncer/0.log" Apr 16 21:13:55.411410 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:13:55.411360 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-mhc66_8b75823b-b08a-433e-8dc2-46c97484a213/konnectivity-agent/0.log" Apr 16 21:13:55.488597 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:13:55.488567 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-79.ec2.internal_2b4e123e8fb6c71c1bd1455d7a290beb/haproxy/0.log" Apr 16 21:13:59.690018 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:13:59.689984 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-zbv5g_eb8d4c4b-51c3-489f-9ec0-32637206d9e0/manager/0.log" Apr 16 21:14:01.480202 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:01.480167 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e9e47c0c-bfeb-4a60-b679-98cd214d053a/alertmanager/0.log" Apr 16 21:14:01.507480 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:01.507442 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e9e47c0c-bfeb-4a60-b679-98cd214d053a/config-reloader/0.log" Apr 16 21:14:01.536012 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:01.535986 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e9e47c0c-bfeb-4a60-b679-98cd214d053a/kube-rbac-proxy-web/0.log" Apr 16 21:14:01.563235 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:01.563172 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e9e47c0c-bfeb-4a60-b679-98cd214d053a/kube-rbac-proxy/0.log" Apr 16 21:14:01.587781 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:01.587760 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e9e47c0c-bfeb-4a60-b679-98cd214d053a/kube-rbac-proxy-metric/0.log" Apr 16 21:14:01.619572 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:01.619554 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e9e47c0c-bfeb-4a60-b679-98cd214d053a/prom-label-proxy/0.log" Apr 16 21:14:01.643246 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:01.643215 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_e9e47c0c-bfeb-4a60-b679-98cd214d053a/init-config-reloader/0.log" Apr 16 21:14:01.699168 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:01.699141 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-vn7tq_959a1948-886a-4bc5-bf25-72b0e2b30d8d/cluster-monitoring-operator/0.log" Apr 16 21:14:01.863475 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:01.863439 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cfgsz_6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0/node-exporter/0.log" Apr 16 21:14:01.882945 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:01.882924 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cfgsz_6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0/kube-rbac-proxy/0.log" Apr 16 21:14:01.905989 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:01.905970 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cfgsz_6e75c8c4-70f1-4a2c-ae99-b103bc04e1b0/init-textfile/0.log" Apr 16 21:14:02.437878 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:02.437795 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-56c74b8f5-kb62f_a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20/telemeter-client/0.log" Apr 16 21:14:02.459252 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:02.459229 2575 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-56c74b8f5-kb62f_a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20/reload/0.log" Apr 16 21:14:02.480233 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:02.480214 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-56c74b8f5-kb62f_a3f7ed98-6bf3-4e8f-8ccf-2974cfe91f20/kube-rbac-proxy/0.log" Apr 16 21:14:04.130668 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:04.130637 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gr7vb/perf-node-gather-daemonset-cc7ps"] Apr 16 21:14:04.131067 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:04.131043 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7ae7521-5349-4ffe-ae75-31b325bb8640" containerName="manager" Apr 16 21:14:04.131067 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:04.131067 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7ae7521-5349-4ffe-ae75-31b325bb8640" containerName="manager" Apr 16 21:14:04.131147 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:04.131122 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="e7ae7521-5349-4ffe-ae75-31b325bb8640" containerName="manager" Apr 16 21:14:04.134409 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:04.134386 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-cc7ps"
Apr 16 21:14:04.136863 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:04.136841 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gr7vb\"/\"kube-root-ca.crt\""
Apr 16 21:14:04.136975 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:04.136885 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gr7vb\"/\"openshift-service-ca.crt\""
Apr 16 21:14:04.137694 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:04.137678 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-gr7vb\"/\"default-dockercfg-55qmz\""
Apr 16 21:14:04.140988 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:04.140969 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gr7vb/perf-node-gather-daemonset-cc7ps"]
Apr 16 21:14:04.303151 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:04.303121 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a31c0fa8-3f7c-466d-aa0b-4284feb526ec-sys\") pod \"perf-node-gather-daemonset-cc7ps\" (UID: \"a31c0fa8-3f7c-466d-aa0b-4284feb526ec\") " pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-cc7ps"
Apr 16 21:14:04.303328 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:04.303158 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjcqq\" (UniqueName: \"kubernetes.io/projected/a31c0fa8-3f7c-466d-aa0b-4284feb526ec-kube-api-access-qjcqq\") pod \"perf-node-gather-daemonset-cc7ps\" (UID: \"a31c0fa8-3f7c-466d-aa0b-4284feb526ec\") " pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-cc7ps"
Apr 16 21:14:04.303328 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:04.303244 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a31c0fa8-3f7c-466d-aa0b-4284feb526ec-lib-modules\") pod \"perf-node-gather-daemonset-cc7ps\" (UID: \"a31c0fa8-3f7c-466d-aa0b-4284feb526ec\") " pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-cc7ps"
Apr 16 21:14:04.303328 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:04.303288 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a31c0fa8-3f7c-466d-aa0b-4284feb526ec-proc\") pod \"perf-node-gather-daemonset-cc7ps\" (UID: \"a31c0fa8-3f7c-466d-aa0b-4284feb526ec\") " pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-cc7ps"
Apr 16 21:14:04.303328 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:04.303324 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a31c0fa8-3f7c-466d-aa0b-4284feb526ec-podres\") pod \"perf-node-gather-daemonset-cc7ps\" (UID: \"a31c0fa8-3f7c-466d-aa0b-4284feb526ec\") " pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-cc7ps"
Apr 16 21:14:04.404311 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:04.404238 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a31c0fa8-3f7c-466d-aa0b-4284feb526ec-lib-modules\") pod \"perf-node-gather-daemonset-cc7ps\" (UID: \"a31c0fa8-3f7c-466d-aa0b-4284feb526ec\") " pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-cc7ps"
Apr 16 21:14:04.404311 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:04.404271 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a31c0fa8-3f7c-466d-aa0b-4284feb526ec-proc\") pod \"perf-node-gather-daemonset-cc7ps\" (UID: \"a31c0fa8-3f7c-466d-aa0b-4284feb526ec\") " pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-cc7ps"
Apr 16 21:14:04.404311 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:04.404299 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a31c0fa8-3f7c-466d-aa0b-4284feb526ec-podres\") pod \"perf-node-gather-daemonset-cc7ps\" (UID: \"a31c0fa8-3f7c-466d-aa0b-4284feb526ec\") " pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-cc7ps"
Apr 16 21:14:04.404540 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:04.404370 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a31c0fa8-3f7c-466d-aa0b-4284feb526ec-proc\") pod \"perf-node-gather-daemonset-cc7ps\" (UID: \"a31c0fa8-3f7c-466d-aa0b-4284feb526ec\") " pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-cc7ps"
Apr 16 21:14:04.404540 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:04.404379 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a31c0fa8-3f7c-466d-aa0b-4284feb526ec-sys\") pod \"perf-node-gather-daemonset-cc7ps\" (UID: \"a31c0fa8-3f7c-466d-aa0b-4284feb526ec\") " pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-cc7ps"
Apr 16 21:14:04.404540 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:04.404399 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a31c0fa8-3f7c-466d-aa0b-4284feb526ec-lib-modules\") pod \"perf-node-gather-daemonset-cc7ps\" (UID: \"a31c0fa8-3f7c-466d-aa0b-4284feb526ec\") " pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-cc7ps"
Apr 16 21:14:04.404540 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:04.404424 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjcqq\" (UniqueName: \"kubernetes.io/projected/a31c0fa8-3f7c-466d-aa0b-4284feb526ec-kube-api-access-qjcqq\") pod \"perf-node-gather-daemonset-cc7ps\" (UID: \"a31c0fa8-3f7c-466d-aa0b-4284feb526ec\") " pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-cc7ps"
Apr 16 21:14:04.404540 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:04.404428 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a31c0fa8-3f7c-466d-aa0b-4284feb526ec-sys\") pod \"perf-node-gather-daemonset-cc7ps\" (UID: \"a31c0fa8-3f7c-466d-aa0b-4284feb526ec\") " pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-cc7ps"
Apr 16 21:14:04.404540 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:04.404474 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a31c0fa8-3f7c-466d-aa0b-4284feb526ec-podres\") pod \"perf-node-gather-daemonset-cc7ps\" (UID: \"a31c0fa8-3f7c-466d-aa0b-4284feb526ec\") " pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-cc7ps"
Apr 16 21:14:04.413391 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:04.413368 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjcqq\" (UniqueName: \"kubernetes.io/projected/a31c0fa8-3f7c-466d-aa0b-4284feb526ec-kube-api-access-qjcqq\") pod \"perf-node-gather-daemonset-cc7ps\" (UID: \"a31c0fa8-3f7c-466d-aa0b-4284feb526ec\") " pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-cc7ps"
Apr 16 21:14:04.445379 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:04.445359 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-cc7ps"
Apr 16 21:14:04.566056 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:04.566027 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gr7vb/perf-node-gather-daemonset-cc7ps"]
Apr 16 21:14:04.567554 ip-10-0-134-79 kubenswrapper[2575]: W0416 21:14:04.567526 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda31c0fa8_3f7c_466d_aa0b_4284feb526ec.slice/crio-d2c5737b645f7a413e0b4b650fe744765d5d60568f6bc969d98998e13f919cc2 WatchSource:0}: Error finding container d2c5737b645f7a413e0b4b650fe744765d5d60568f6bc969d98998e13f919cc2: Status 404 returned error can't find the container with id d2c5737b645f7a413e0b4b650fe744765d5d60568f6bc969d98998e13f919cc2
Apr 16 21:14:04.569169 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:04.569144 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 21:14:04.741766 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:04.741744 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-wvhq9_4d826043-195a-403c-a1c0-b18a4ddf86fa/download-server/0.log"
Apr 16 21:14:04.992670 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:04.992564 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-cc7ps" event={"ID":"a31c0fa8-3f7c-466d-aa0b-4284feb526ec","Type":"ContainerStarted","Data":"c7a6b1f808c096645acc0380d8bd68d76e46dfbcc9392283a209e5ec8b430ac5"}
Apr 16 21:14:04.992670 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:04.992602 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-cc7ps" event={"ID":"a31c0fa8-3f7c-466d-aa0b-4284feb526ec","Type":"ContainerStarted","Data":"d2c5737b645f7a413e0b4b650fe744765d5d60568f6bc969d98998e13f919cc2"}
Apr 16 21:14:04.992670 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:04.992667 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-cc7ps"
Apr 16 21:14:05.009978 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:05.009938 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-cc7ps" podStartSLOduration=1.009927467 podStartE2EDuration="1.009927467s" podCreationTimestamp="2026-04-16 21:14:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:14:05.008892573 +0000 UTC m=+2166.160988629" watchObservedRunningTime="2026-04-16 21:14:05.009927467 +0000 UTC m=+2166.162023519"
Apr 16 21:14:06.047696 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:06.047665 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7lm8k_aa099128-e7b3-453f-a700-69d4e48f8448/dns/0.log"
Apr 16 21:14:06.066836 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:06.066805 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7lm8k_aa099128-e7b3-453f-a700-69d4e48f8448/kube-rbac-proxy/0.log"
Apr 16 21:14:06.233556 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:06.233523 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-jq26v_838cdbbd-45af-4493-a167-65bd220c03c8/dns-node-resolver/0.log"
Apr 16 21:14:06.706763 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:06.706726 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-58497df7cd-rdfqk_05edfaa7-98a2-4fd0-9840-27fc19cabc02/registry/0.log"
Apr 16 21:14:06.751848 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:06.751824 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-5z2mk_c00ff0b8-9f9c-418d-854c-b22bc6be761f/node-ca/0.log"
Apr 16 21:14:07.693186 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:07.693152 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-prkw5_e5e72258-f292-4870-9758-dbc28b87afc4/discovery/0.log"
Apr 16 21:14:07.749564 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:07.749529 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-588879f674-jw4ml_1ea57899-46bc-4856-96b3-14087e5176ba/kube-auth-proxy/0.log"
Apr 16 21:14:08.396410 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:08.396381 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-f9kkl_35692de4-3b87-4697-b519-4f55d1e81778/serve-healthcheck-canary/0.log"
Apr 16 21:14:08.860040 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:08.860012 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-x6rb4_ca797f13-b8b1-4f9e-8374-336cc1c934f4/insights-operator/0.log"
Apr 16 21:14:08.860438 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:08.860193 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-x6rb4_ca797f13-b8b1-4f9e-8374-336cc1c934f4/insights-operator/1.log"
Apr 16 21:14:08.956853 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:08.956832 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-s5xct_a3cf8644-3194-4619-b219-a5991ba494bc/kube-rbac-proxy/0.log"
Apr 16 21:14:08.977695 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:08.977676 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-s5xct_a3cf8644-3194-4619-b219-a5991ba494bc/exporter/0.log"
Apr 16 21:14:08.999663 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:08.999639 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-s5xct_a3cf8644-3194-4619-b219-a5991ba494bc/extractor/0.log"
Apr 16 21:14:11.009376 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:11.009344 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-gr7vb/perf-node-gather-daemonset-cc7ps"
Apr 16 21:14:11.044053 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:11.044023 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-6dwth_3ae7c350-6f35-4b15-9616-e9b40f9a6c7f/manager/0.log"
Apr 16 21:14:11.086632 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:11.086591 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-5c54fc4f76-htwxx_c7c8855e-db5b-4727-a24f-5f2d3a6e416e/maas-api/0.log"
Apr 16 21:14:11.168644 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:11.168602 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-sqh8k_0c74987a-f7b9-4c73-be86-2ca4d965965e/manager/1.log"
Apr 16 21:14:11.188869 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:11.188843 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-sqh8k_0c74987a-f7b9-4c73-be86-2ca4d965965e/manager/2.log"
Apr 16 21:14:11.255482 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:11.255448 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-7cd8df7dd5-k9tbf_8383b67d-5b7d-418f-bd5e-64050816cc24/manager/0.log"
Apr 16 21:14:11.307317 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:11.307221 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-hzkrk_39552f32-9c95-4b91-87d8-62e0cc489b30/postgres/0.log"
Apr 16 21:14:12.517108 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:12.516923 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-6q45h_33b595a5-5518-4df2-abf2-db9264869040/openshift-lws-operator/0.log"
Apr 16 21:14:16.935235 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:16.935192 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-vrpqp_01c82200-8490-468e-a926-734a11ac86ca/migrator/0.log"
Apr 16 21:14:16.960448 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:16.960425 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-vrpqp_01c82200-8490-468e-a926-734a11ac86ca/graceful-termination/0.log"
Apr 16 21:14:18.636356 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:18.636326 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jltbp_03f24485-95a9-4251-9d14-8bcb63f82514/kube-multus-additional-cni-plugins/0.log"
Apr 16 21:14:18.748674 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:18.748646 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jltbp_03f24485-95a9-4251-9d14-8bcb63f82514/egress-router-binary-copy/0.log"
Apr 16 21:14:18.776833 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:18.776810 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jltbp_03f24485-95a9-4251-9d14-8bcb63f82514/cni-plugins/0.log"
Apr 16 21:14:18.804943 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:18.804921 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jltbp_03f24485-95a9-4251-9d14-8bcb63f82514/bond-cni-plugin/0.log"
Apr 16 21:14:18.831876 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:18.831854 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jltbp_03f24485-95a9-4251-9d14-8bcb63f82514/routeoverride-cni/0.log"
Apr 16 21:14:18.878480 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:18.878456 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jltbp_03f24485-95a9-4251-9d14-8bcb63f82514/whereabouts-cni-bincopy/0.log"
Apr 16 21:14:18.925318 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:18.925267 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jltbp_03f24485-95a9-4251-9d14-8bcb63f82514/whereabouts-cni/0.log"
Apr 16 21:14:19.484402 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:19.484367 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ghlvc_64e562da-5321-4973-a365-e8c0d198b8cc/kube-multus/0.log"
Apr 16 21:14:19.510382 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:19.510307 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-5jhhm_422c9f50-4f45-46bc-9e9d-5c4f1c20c115/network-metrics-daemon/0.log"
Apr 16 21:14:19.533319 ip-10-0-134-79 kubenswrapper[2575]: I0416 21:14:19.533289 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-5jhhm_422c9f50-4f45-46bc-9e9d-5c4f1c20c115/kube-rbac-proxy/0.log"